The research aims to estimate the reliability function of the two-parameter Weibull distribution using the parametric methods (MLM, MOM, RRYM, RRXM, NWLSM), as well as the nonparametric methods (EM, PLEM, EKMEM, WEKM, MKMM, WMR, MMO, MMT). Simulation was used for the comparison, with different sample sizes (20, 40, 60, 80, 100), to identify the best estimation methods based on the Integrated Mean Squared Error (IMSE) statistical indicator. The research arrived at a proposed modified weight (1) and a proposed modified weight (2) for the weighted empirical Kaplan-Meier estimator method (WEKM), and concluded that the best parametric method for estimating the reliability function is the Maximum Likelihood Method (MLM), while the best nonparametric method is the empirical method (EM).
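The contrast between the parametric and nonparametric estimates described above can be sketched in a few lines: the closed-form Weibull reliability R(t) = exp(-(t/scale)^shape) against the plain empirical survival estimate, scored by a trapezoidal-rule IMSE. This is a minimal stdlib sketch with hypothetical parameter values and sample sizes, not the paper's simulation design.

```python
import math
import random

def weibull_reliability(t, shape, scale):
    """Closed-form Weibull reliability: R(t) = exp(-(t/scale)^shape)."""
    return math.exp(-((t / scale) ** shape))

def empirical_reliability(t, sample):
    """Empirical (nonparametric) estimate: fraction of failure times exceeding t."""
    return sum(1 for x in sample if x > t) / len(sample)

def imse(est, true, grid):
    """Integrated squared error of an estimator over a grid (trapezoidal rule)."""
    sq = [(est(t) - true(t)) ** 2 for t in grid]
    h = grid[1] - grid[0]
    return h * (sum(sq) - 0.5 * (sq[0] + sq[-1]))

random.seed(1)
shape, scale, n = 2.0, 1.5, 100          # hypothetical true parameters and sample size
sample = [random.weibullvariate(scale, shape) for _ in range(n)]

grid = [0.05 * i for i in range(1, 60)]
true_R = lambda t: weibull_reliability(t, shape, scale)
err = imse(lambda t: empirical_reliability(t, sample), true_R, grid)
print(round(err, 4))
```

With larger samples the empirical curve tracks the true reliability more closely and the IMSE shrinks, which is the basis of the comparison criterion.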
The problem of multicollinearity is one of the most common problems in regression analysis; it concerns, to a large extent, the internal correlation between the explanatory variables. The problem appears especially in economics and applied research. Multicollinearity has a negative effect on the regression model, such as an inflated variance and unstable parameter estimates when the Ordinary Least Squares (OLS) method is used. Therefore, other methods were used to estimate the parameters of the negative binomial model, including the Ridge Regression estimator and the Liu-type estimator. The negative binomial regression model is a nonlinear…
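The variance inflation that motivates ridge-type estimators can be seen directly in a 2x2 example: as the correlation between two standardized predictors approaches 1, the diagonal of (X'X)^{-1} explodes, while adding a ridge constant k to the diagonal keeps (X'X + kI)^{-1} well behaved. A minimal sketch with an assumed correlation and ridge constant:

```python
# Ridge shrinkage on a nearly collinear 2x2 design: why (X'X + kI)^{-1}
# is better conditioned than (X'X)^{-1} under multicollinearity.

def inv2(m):
    """Inverse of a 2x2 matrix [[a, b], [c, d]]."""
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

# X'X for two standardized, highly correlated predictors (r = 0.999, hypothetical).
r = 0.999
xtx = [[1.0, r], [r, 1.0]]

ols_inv = inv2(xtx)                                  # blows up as r -> 1
k = 0.1                                              # ridge constant (assumed)
ridge_inv = inv2([[1.0 + k, r], [r, 1.0 + k]])       # (X'X + kI)^{-1}

# Diagonal entries are proportional to the estimators' variances.
print(round(ols_inv[0][0], 1), round(ridge_inv[0][0], 2))
```

The same conditioning argument carries over to the negative binomial model, where the ridge and Liu-type corrections are applied to the information matrix rather than to X'X directly.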
Long memory analysis is one of the most active areas in econometrics and time series, where various methods have been introduced to identify and estimate the long-memory parameter in partially integrated time series. One of the most common models used to represent time series that have long memory is the ARFIMA (Autoregressive Fractionally Integrated Moving Average) model, whose differencing order is a fractional number called the fractional parameter. To analyze and determine the ARFIMA model, the fractional parameter must be estimated, and there are many methods for fractional parameter estimation. In this research, the estimation methods were divided into indirect methods, where the Hurst parameter is estimated first…
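The fractional differencing at the heart of ARFIMA can be made concrete: the operator (1-B)^d expands into an infinite binomial series whose coefficients follow the recursion pi_0 = 1, pi_j = pi_{j-1}(j-1-d)/j. A stdlib sketch of applying it to a short series (not any particular estimator from the research):

```python
# Fractional differencing (1 - B)^d via its binomial expansion -- the core of
# ARFIMA's fractional integration.

def frac_diff(x, d):
    """Apply (1-B)^d using pi_0 = 1, pi_j = pi_{j-1} * (j - 1 - d) / j."""
    n = len(x)
    pi = [1.0]
    for j in range(1, n):
        pi.append(pi[-1] * (j - 1 - d) / j)
    return [sum(pi[j] * x[t - j] for j in range(t + 1)) for t in range(n)]

series = [1.0, 3.0, 2.0, 5.0, 4.0]
print(frac_diff(series, 0.4))   # a fractional order between 0 and 0.5
print(frac_diff(series, 1.0))   # d = 1 recovers the ordinary first difference
```

For integer d the coefficients terminate (d = 1 gives 1, -1, 0, 0, ...), so ordinary differencing is the special case; for fractional d the slowly decaying weights are exactly what produces the long-memory behavior.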
The root-mean-square radii (proton, neutron, matter, and charge), energy levels, inelastic longitudinal form factors, reduced transition probabilities from the ground state to the first excited 2+ state of even-even isotopes, quadrupole moments, quadrupole deformation parameters, and occupation numbers for the calcium isotopes A = 42, 44, 46, 48, 50 are computed using the fp model space and the FPBM interaction. The 40Ca nucleus is regarded as the inert core for all isotopes in this model space, with the valence nucleons moving throughout the fp-shell model space involving the 1f7/2, 2p3/2, 1f5/2, and 2p1/2 orbits. The model space is used to present calculations using the FPBM interaction…
The way used to estimate fuzzy reliability differs according to the nature of the failure-time information dealt with in this research. The failure-time information has no probability distribution to describe it and, in addition, is fuzzy in nature. The research includes fuzzy reliability estimation over three periods: the first from 1986 to 2013, the second from 2013 to 2033, and the third from 2033 to 2066. Four failure times were chosen to identify the trapezoidal fuzzy membership function represented in the previous years, after taking into consideration the estimates of most researchers, professional geologists, and the technician in charge of maintaining the Mosul Dam project. B…
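A trapezoidal fuzzy membership function of the kind referred to above is defined by four knots a <= b <= c <= d: zero outside [a, d], rising linearly on [a, b], equal to one on the plateau [b, c], and falling linearly on [c, d]. A generic sketch; the knots below reuse the period boundaries from the abstract purely for illustration and are not the paper's fitted values:

```python
# Trapezoidal fuzzy membership mu(x) with knots a <= b <= c <= d (all distinct here).

def trapezoid(x, a, b, c, d):
    if x <= a or x >= d:
        return 0.0          # outside the support
    if b <= x <= c:
        return 1.0          # plateau: full membership
    if x < b:
        return (x - a) / (b - a)   # rising edge
    return (d - x) / (d - c)       # falling edge

# Hypothetical knots over a time axis of years.
knots = (1986, 2013, 2033, 2066)
print(trapezoid(2023, *knots), trapezoid(2000, *knots))
```

Fuzzy reliability estimates are then propagated through this membership function rather than through a probability distribution.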
This work deals with the Kumaraswamy distribution. Kumaraswamy (1976, 1978) worked with well-known probability distribution functions such as the normal, beta, and log-normal, but in 1980 Kumaraswamy developed a more general probability density function for doubly bounded random processes, which is known as Kumaraswamy's distribution. Classical maximum likelihood and Bayes estimators are used to estimate the unknown shape parameter (b). Reliability functions are obtained using symmetric loss functions with three types of informative priors: two single priors and one double prior. In addition, a comparison is made of the performance of these estimators with respect to the numerical solution, which is found using the expansion method. The…
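The classical maximum likelihood side of this comparison is tractable in closed form: for Kumaraswamy(a, b) with pdf f(x) = a b x^{a-1}(1-x^a)^{b-1} and reliability R(x) = (1-x^a)^b, setting the derivative of the log-likelihood in b to zero gives b_hat = -n / sum(log(1 - x_i^a)) when a is known. A stdlib sketch with illustrative parameter values (not the paper's Bayes or expansion-method computations):

```python
import math
import random

# Kumaraswamy(a, b) on (0, 1): reliability R(x) = (1 - x^a)^b, and the
# closed-form MLE of b for known a: b_hat = -n / sum(log(1 - x_i^a)).

def kw_reliability(x, a, b):
    return (1.0 - x ** a) ** b

def kw_sample(n, a, b, rng):
    """Inverse-CDF sampling: x = (1 - (1 - u)^(1/b))^(1/a)."""
    return [(1.0 - (1.0 - rng.random()) ** (1.0 / b)) ** (1.0 / a)
            for _ in range(n)]

def kw_mle_b(sample, a):
    n = len(sample)
    return -n / sum(math.log(1.0 - x ** a) for x in sample)

rng = random.Random(7)
a_true, b_true = 2.0, 3.0          # hypothetical shape parameters
data = kw_sample(5000, a_true, b_true, rng)
print(round(kw_mle_b(data, a_true), 2))   # should be near b_true = 3.0
```

The closed form follows because -log(1 - X^a) is exponential with rate b, so its sample mean estimates 1/b.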
... Show MoreThe logistic regression model regarded as the important regression Models ,where of the most interesting subjects in recent studies due to taking character more advanced in the process of statistical analysis .
The ordinary estimating methods is failed in dealing with data that consist of the presence of outlier values and hence on the absence of such that have undesirable effect on the result. &nbs
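For reference, the ordinary (non-robust) fit that the abstract says breaks down under outliers is maximum likelihood for P(y=1|x) = 1/(1 + exp(-(b0 + b1 x))); a minimal single-predictor version via gradient ascent on the log-likelihood, with a hypothetical simulated data set:

```python
import math
import random

# Plain (non-robust) maximum-likelihood logistic regression, one predictor,
# fitted by gradient ascent. Illustrative only -- robust alternatives would
# downweight observations with outlying residuals.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, lr=0.1, iters=2000):
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(iters):
        g0 = sum(y - sigmoid(b0 + b1 * x) for x, y in zip(xs, ys)) / n
        g1 = sum((y - sigmoid(b0 + b1 * x)) * x for x, y in zip(xs, ys)) / n
        b0 += lr * g0    # ascend the mean log-likelihood gradient
        b1 += lr * g1
    return b0, b1

rng = random.Random(5)
xs = [rng.gauss(0, 1) for _ in range(1500)]
ys = [1 if rng.random() < sigmoid(-0.5 + 1.5 * x) else 0 for x in xs]
b0, b1 = fit_logistic(xs, ys)
print(round(b0, 1), round(b1, 1))
```

Injecting a few mislabeled points with extreme x values into this clean setup visibly drags b1, which is the failure mode robust estimators address.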
This research aims to study dimension-reduction methods that overcome the curse of dimensionality when traditional methods fail to provide a good estimation of the parameters, so this problem must be dealt with directly. Two methods were used to solve the problem of high-dimensional data: the non-classical Sliced Inverse Regression (SIR) method, together with a proposed weighted standard SIR (WSIR) method, and Principal Component Analysis (PCA), which is the general method used for reducing dimensions. Both SIR and PCA are based on forming linear combinations of a subset of the original explanatory variables, which may suffer from the problem of heterogeneity and the problem of linear…
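The "linear combinations" step of SIR can be sketched in its simplest setting: with two independent, standardized predictors the whitening step is trivial, and the effective direction is the leading eigenvector of the between-slice covariance of the slice means of X (slicing on the sorted response). The model y = x1 + 2*x2 + noise below is hypothetical, and this is a two-predictor toy, not the WSIR proposal:

```python
import math
import random

# Minimal Sliced Inverse Regression for two independent standardized predictors.

def top_eigvec_2x2(A, B, C):
    """Leading unit eigenvector of the symmetric matrix [[A, B], [B, C]]."""
    lam = 0.5 * (A + C) + math.sqrt((0.5 * (A - C)) ** 2 + B ** 2)
    v = (B, lam - A) if abs(B) > 1e-12 else ((1.0, 0.0) if A >= C else (0.0, 1.0))
    norm = math.hypot(*v)
    return (v[0] / norm, v[1] / norm)

def sir_direction(xs, ys, n_slices=10):
    order = sorted(range(len(ys)), key=lambda i: ys[i])   # slice on sorted y
    size = len(ys) // n_slices
    A = B = C = 0.0
    for h in range(n_slices):
        idx = order[h * size:(h + 1) * size]
        m1 = sum(xs[i][0] for i in idx) / len(idx)        # slice mean of x1
        m2 = sum(xs[i][1] for i in idx) / len(idx)        # slice mean of x2
        w = len(idx) / len(ys)
        A += w * m1 * m1; B += w * m1 * m2; C += w * m2 * m2
    return top_eigvec_2x2(A, B, C)

rng = random.Random(3)
xs = [(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(4000)]
ys = [x1 + 2.0 * x2 + rng.gauss(0, 0.5) for x1, x2 in xs]
d = sir_direction(xs, ys)
print(round(d[0], 2), round(d[1], 2))   # should align with (1, 2)/sqrt(5)
```

PCA, by contrast, would eigendecompose the covariance of X alone, ignoring y; SIR's use of slice means of X given y is what makes it a supervised reduction.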
ABSTRACT:
This study is concerned with the estimation of constant and time-varying parameters in non-linear ordinary differential equations that do not have analytical solutions. The estimation is done in a multi-stage method, where constant and time-varying parameters are estimated in a straightforward sequential way over several stages. In the first stage, the model of the differential equations is converted to a regression model that includes the state variables with their derivatives; the state variables and their derivatives are then estimated by a penalized-splines method, and the estimates are substituted into the regression model. In the second stage, the pseudo-least-squares method was used to estimate…
Abstract
The non-homogeneous Poisson process is considered one of the statistical subjects that has importance in other sciences and wide application in different areas, such as queueing systems, repairable systems, computer and communication systems, and reliability theory, among many others. It is also used in modeling phenomena that occur in a non-fixed way over time (all events that change with time).
This research deals with some of the basic concepts related to the non-homogeneous Poisson process. The research carries out two models of the non-homogeneous Poisson process, the power-law model and the Musa-Okumoto model, to estimate the…
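The power-law model mentioned above (the Crow/AMSAA form) has intensity lambda(t) = (beta/theta)(t/theta)^{beta-1} and mean function m(t) = (t/theta)^beta, and for data observed up to the n-th failure time T the MLEs are beta_hat = n / sum_{i<n} ln(T/t_i) and theta_hat = T / n^{1/beta_hat}. A stdlib sketch with illustrative parameters, using inversion of the mean function to simulate event times:

```python
import math
import random

# Power-law NHPP: simulate failure-truncated event times, then recover the
# parameters with the standard closed-form MLEs.

def simulate_power_law(n, beta, theta, rng):
    """If E_k are unit-rate Poisson arrival times, t_k = theta * E_k**(1/beta)
    are event times of the NHPP with mean function (t/theta)**beta."""
    e, times = 0.0, []
    for _ in range(n):
        e += rng.expovariate(1.0)
        times.append(theta * e ** (1.0 / beta))
    return times

def mle_power_law(times):
    n, T = len(times), times[-1]
    beta_hat = n / sum(math.log(T / t) for t in times[:-1])
    theta_hat = T / n ** (1.0 / beta_hat)
    return beta_hat, theta_hat

rng = random.Random(42)
times = simulate_power_law(500, beta=0.6, theta=2.0, rng=rng)
b_hat, th_hat = mle_power_law(times)
print(round(b_hat, 2), round(th_hat, 2))
```

beta < 1 gives a decreasing intensity (reliability growth), beta > 1 an increasing one, which is why this model is a standard choice for repairable systems.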
Gross domestic product (GDP) is an important measure of the size of an economy's production. Economists use this term to determine the extent of decline and growth in countries' economies. It is also used to rank countries and compare them to each other. The research aims at describing and analyzing GDP during the period from 1980 to 2015, for both the public and private sectors, and then forecasting GDP in subsequent years until 2025. To achieve this goal, two methods were used: the first is linear and nonlinear regression; the second is time-series analysis with Box-Jenkins models. The statistical packages (Minitab17, GRETLW32) were used to extract the results, and the two methods were then compared. T…
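The Box-Jenkins side of the comparison can be illustrated with the smallest member of that model family, an AR(1): estimate the coefficient by conditional least squares, then iterate it forward for the forecasts. This is a toy sketch on simulated data, not the GDP series or the Minitab/GRETL output:

```python
import random

# Toy Box-Jenkins workflow on an AR(1): fit phi by conditional least squares
# on a mean-centered series, then forecast by repeated multiplication.

def fit_ar1(x):
    """phi_hat = sum x_t * x_{t-1} / sum x_{t-1}^2 (series assumed centered)."""
    num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
    den = sum(x[t - 1] ** 2 for t in range(1, len(x)))
    return num / den

def forecast_ar1(last, phi, steps):
    out = []
    for _ in range(steps):
        last *= phi          # h-step forecast: phi**h * last observation
        out.append(last)
    return out

rng = random.Random(0)
phi_true, x = 0.7, [0.0]
for _ in range(2000):                     # simulate x_t = 0.7 x_{t-1} + e_t
    x.append(phi_true * x[-1] + rng.gauss(0.0, 1.0))

phi_hat = fit_ar1(x)
print(round(phi_hat, 2), [round(v, 2) for v in forecast_ar1(x[-1], phi_hat, 3)])
```

Real Box-Jenkins practice adds the identification and diagnostic-checking stages (differencing for trend, ACF/PACF inspection, residual tests) that a trending series like GDP requires.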