In this research, several estimators of the hazard function are introduced. These estimators are based on a nonparametric method, namely kernel estimation for censored data with varying bandwidth and boundary kernels. Two types of bandwidth are used: local bandwidth and global bandwidth. Moreover, four boundary kernel functions are used, namely Rectangle, Epanechnikov, Biquadratic, and Triquadratic, and the proposed function was employed with all of them. Two different simulation techniques are also used in two experiments to compare these estimators. In most cases, the results show that the local bandwidth is the best choice for all types of boundary kernel functions and suggest that the 2xRectangle and 2xEpanechnikov methods give the best results compared to the other estimators.
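As a minimal illustration of the general idea, not the authors' exact estimator, the sketch below smooths Nelson-Aalen increments for right-censored data with an Epanechnikov kernel and a single global bandwidth; the variable names and the simulated data are assumptions.

```python
# Minimal sketch: kernel-smoothed hazard estimate for right-censored data,
# using an Epanechnikov kernel and one global bandwidth (illustrative only).
import numpy as np

def epanechnikov(u):
    """Epanechnikov kernel K(u) = 0.75 * (1 - u^2) on [-1, 1]."""
    return np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u**2), 0.0)

def kernel_hazard(t_grid, times, events, bandwidth):
    """Smooth the Nelson-Aalen increments d_i / n_i with a kernel of width `bandwidth`."""
    order = np.argsort(times)
    times, events = times[order], events[order]
    n = len(times)
    at_risk = n - np.arange(n)            # number still at risk just before each time
    increments = events / at_risk         # Nelson-Aalen jump at each observed time
    u = (t_grid[:, None] - times[None, :]) / bandwidth
    return (epanechnikov(u) * increments).sum(axis=1) / bandwidth

# Example: exponential lifetimes with random right censoring
rng = np.random.default_rng(0)
life = rng.exponential(1.0, 200)
cens = rng.exponential(1.5, 200)
obs = np.minimum(life, cens)
delta = (life <= cens).astype(float)
grid = np.linspace(0.05, 2.0, 50)
print(kernel_hazard(grid, obs, delta, bandwidth=0.3)[:5])
```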
This research aims to estimate stock returns according to the Rough Set Theory approach, to test its effectiveness and accuracy in predicting stock returns, to assess its potential in the field of financial markets, and to rationalize investor decisions. The research sample totals 10 companies traded on the Iraq Stock Exchange. The results showed a remarkable application of Rough Set Theory to data reduction, contributing to the rationalization of investment decisions. The most prominent conclusion is the capability of Rough Set Theory to deal with financial data and to be applied to forecasting stock returns. The research provides those interested in investing in stocks in financial …
The research aims to apply one of the techniques of management accounting, the quality function deployment technique, to the men's leather shoe product Model (79043) in the General Company for Textile and Leather Industries. This is done by determining the basic requirements of the customer and then designing the characteristics and specifications of the product according to the customer's preferences, so that the customer's voice is reflected in the technical characteristics of the product, while taking into account the products of competing companies, in order to achieve maximum customer satisfaction, the highest quality, and the lowest costs. Hence the importance of the research, which indicates …
Shear and compressional wave velocities, coupled with other petrophysical data, are vital in determining the magnitude of dynamic moduli in geomechanical studies and hydrocarbon reservoir characterization. However, due to field practices and high running costs, shear wave velocity may not be available in all wells. In this paper, a statistical multivariate regression method is presented to predict the shear wave velocity of the Khasib formation, Amara oil fields, located in the south-east of Iraq, using well-log compressional wave velocity, neutron porosity, and density. The accuracy of the proposed correlation has been compared to other correlations. The results show that the presented model provides accurate …
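For illustration only, the sketch below fits a multivariate linear regression of the form Vs = b0 + b1*Vp + b2*NPHI + b3*RHOB by ordinary least squares; the synthetic well-log values and coefficients are assumptions, not the correlation derived in the paper.

```python
# Illustrative multivariate regression of shear velocity on Vp, neutron
# porosity, and bulk density, fitted by ordinary least squares.
import numpy as np

rng = np.random.default_rng(1)
n = 300
vp = rng.uniform(2500.0, 4500.0, n)      # compressional velocity, m/s
nphi = rng.uniform(0.05, 0.35, n)        # neutron porosity, fraction
rhob = rng.uniform(2.1, 2.7, n)          # bulk density, g/cc
vs = 0.55 * vp - 800.0 * nphi + 150.0 * rhob + rng.normal(0.0, 50.0, n)

X = np.column_stack([np.ones(n), vp, nphi, rhob])
coeffs, *_ = np.linalg.lstsq(X, vs, rcond=None)
vs_pred = X @ coeffs
rmse = np.sqrt(np.mean((vs - vs_pred) ** 2))
print("coefficients:", np.round(coeffs, 3), "RMSE:", round(rmse, 2))
```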
Abstract:
In this research, we used different methods to estimate the scale parameter of the exponential distribution, such as the maximum likelihood estimator, the moments estimator, and the Bayes estimator under six different cases for the prior distribution of the scale parameter: the Levy distribution, the Gumbel type-II distribution, the inverse chi-square distribution, the inverse gamma distribution, an improper prior, and the … distribution
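As a hedged sketch of the kind of comparison described, the code below computes the maximum likelihood, method-of-moments, and Bayes estimates of the exponential scale parameter; the inverse-gamma prior is one of the six prior choices mentioned, and its hyperparameters a, b are illustrative assumptions.

```python
# Three estimators of the exponential scale parameter theta (mean = theta).
import numpy as np

rng = np.random.default_rng(2)
theta_true = 2.0
x = rng.exponential(theta_true, 50)
n, s = len(x), x.sum()

theta_mle = s / n                        # maximum likelihood estimator (sample mean)
theta_mom = x.mean()                     # method-of-moments estimator (same for the exponential)

a, b = 3.0, 4.0                          # assumed inverse-gamma hyperparameters
theta_bayes = (b + s) / (a + n - 1.0)    # posterior mean: posterior is IG(a + n, b + sum(x))

print(f"MLE={theta_mle:.3f}  MoM={theta_mom:.3f}  Bayes(IG prior)={theta_bayes:.3f}")
```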
The 3-parameter Weibull distribution is used as a failure model, since this distribution is appropriate when the failure rate is somewhat high at the start of operation and decreases with increasing time.
On the practical side, a comparison was made between shrinkage and maximum likelihood estimators of the parameters and the reliability function using simulation. We conclude that the shrinkage estimators of the parameters are better than the maximum likelihood estimators, but the maximum likelihood estimator of the reliability function is better, using the statistical measures MAPE and MSE and different sample sizes.
Note: ns = small sample; nm = median sample.
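The sketch below illustrates the comparison idea under stated assumptions: 3-parameter Weibull failure times are simulated, estimated by maximum likelihood, and a simple linear shrinkage toward a prior guess, theta_shr = k*theta_ML + (1-k)*theta_0, is formed; the weight k, the prior guess, and the true parameters are illustrative, and the authors' exact shrinkage form is not reproduced here.

```python
# Monte Carlo comparison of ML and a simple shrinkage estimator for the
# 3-parameter Weibull (illustrative parameter values and shrinkage weight).
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(3)
shape_true, loc_true, scale_true = 1.8, 0.5, 2.0
truth = np.array([shape_true, loc_true, scale_true])
prior_guess = np.array([1.5, 0.4, 1.8])     # assumed prior values (theta_0)
k = 0.7                                     # assumed shrinkage weight

mse_ml = np.zeros(3)
mse_shr = np.zeros(3)
reps, n = 200, 30                           # ns: a small sample size
for _ in range(reps):
    x = weibull_min.rvs(shape_true, loc=loc_true, scale=scale_true,
                        size=n, random_state=rng)
    theta_ml = np.array(weibull_min.fit(x))           # (shape, loc, scale) by ML
    theta_shr = k * theta_ml + (1 - k) * prior_guess  # shrink toward the prior guess
    mse_ml += (theta_ml - truth) ** 2 / reps
    mse_shr += (theta_shr - truth) ** 2 / reps

print("MSE (ML):       ", np.round(mse_ml, 4))
print("MSE (shrinkage):", np.round(mse_shr, 4))
```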
Ghrelin and leptin are hunger hormones related to type 2 diabetes mellitus (T2DM), whose pathogenesis involves abnormal insulin secretion and insulin resistance (IR). The aim of this study is to evaluate ghrelin and leptin concentrations in blood and to specify the relationship of these hormones, as dependent variables, with some biochemical and clinical measurements in T2DM patients. In this study, forty-one T2DM and forty-three non-diabetic (non-DM) subjects, aged between 40 and 60 years and of normal weight, were enrolled. Fasting serum ghrelin and leptin were estimated by enzyme-linked immunosorbent assay (ELISA). In our results, ghrelin was significantly increased and leptin was significantly decreased in T2DM patients …
Abstract
Ordinary Least Squares (OLS) estimation is distinguished from Maximum Likelihood (ML) estimation in that the exact moments of the OLS estimators are known and can be found, while for ML they are unknown; however, approximations to their biases, correct to O(n⁻¹), can be obtained by standard methods. In our research, expressions for approximations to the biases of the ML estimators (the regression coefficients and the scale parameter) of the linear type-1 Extreme Value Regression Model for largest values are presented, using an approach that depends on finding the first, second, and third derivatives of the log-likelihood.
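For context, a widely used general form of such an O(n⁻¹) bias approximation is the Cox-Snell expression, stated below in general notation rather than for the specific extreme value model of the paper; the kappa terms are joint cumulants of log-likelihood derivatives and the superscripted kappas are elements of the inverse of the expected information matrix K = {-kappa_rt}.

```latex
\[
  B(\hat{\theta}_s)
  = \sum_{r}\sum_{t}\sum_{u}
    \kappa^{sr}\,\kappa^{tu}
    \left( \tfrac{1}{2}\,\kappa_{rtu} + \kappa_{rt,u} \right)
  + O(n^{-2}),
\]
\[
  \kappa_{rt} = E\!\left[\frac{\partial^{2}\ell}{\partial\theta_r\,\partial\theta_t}\right],\qquad
  \kappa_{rtu} = E\!\left[\frac{\partial^{3}\ell}{\partial\theta_r\,\partial\theta_t\,\partial\theta_u}\right],\qquad
  \kappa_{rt,u} = E\!\left[\frac{\partial^{2}\ell}{\partial\theta_r\,\partial\theta_t}\,
                           \frac{\partial\ell}{\partial\theta_u}\right].
\]
```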
In this research, one of the most important and widely used models is considered: the linear mixed model, which is widely used to analyze longitudinal data characterized by repeated measures. The linear mixed model is estimated by two methods (parametric and nonparametric), which are used to estimate the conditional mean and the marginal mean of the model. A comparison between a number of models is made to get the best model to represent the mean wind speed in Iraq. The application concerns 8 meteorological stations in Iraq that were selected randomly; monthly data on wind speed over ten years were taken and then averaged over each month of the corresponding year, so we get …
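The following sketch is not the authors' exact specification; it fits a linear mixed model for monthly wind speed with a random intercept per station using statsmodels, and contrasts the marginal mean (fixed effects only) with the conditional mean (fixed plus predicted random effects). The column names (speed, month, station) and the synthetic data are assumptions.

```python
# Linear mixed model with a station-level random intercept (illustrative data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
rows = []
for st in [f"st{i}" for i in range(1, 9)]:            # 8 meteorological stations
    u = rng.normal(0.0, 0.8)                          # station-level random intercept
    for month in range(1, 13):
        speed = 3.0 + 0.1 * month + u + rng.normal(0.0, 0.5)
        rows.append({"station": st, "month": month, "speed": speed})
data = pd.DataFrame(rows)

model = smf.mixedlm("speed ~ month", data, groups=data["station"])
result = model.fit()

marginal = result.fe_params["Intercept"] + result.fe_params["month"] * data["month"]
conditional = result.fittedvalues                     # fixed effects + predicted random effects
print(result.fe_params)
print("marginal vs conditional (first 3):")
print(marginal.head(3).values, conditional.head(3).values)
```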
In this study, we made a comparison between the LASSO and SCAD methods, which are two penalization methods for dealing with partial quantile regression models. The Nadaraya-Watson kernel was used to estimate the nonparametric part, and the rule-of-thumb method was used to estimate the smoothing bandwidth (h). The penalty methods proved to be efficient in estimating the regression coefficients, but the SCAD method was the best according to the mean squared error (MSE) criterion, after estimating the missing data using the mean imputation method.
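As a minimal sketch of the smoothing step only (the penalized LASSO/SCAD quantile-regression step is not shown), the code below implements the Nadaraya-Watson estimator with a Silverman-style rule-of-thumb bandwidth; the Gaussian kernel and the test data are assumptions.

```python
# Nadaraya-Watson kernel regression with a rule-of-thumb bandwidth.
import numpy as np

def rule_of_thumb_bandwidth(x):
    """h = 1.06 * std(x) * n^(-1/5), a common rule-of-thumb choice."""
    return 1.06 * np.std(x, ddof=1) * len(x) ** (-0.2)

def nadaraya_watson(x_grid, x, y, h):
    """Gaussian-kernel weighted local average of y at each grid point."""
    u = (x_grid[:, None] - x[None, :]) / h
    w = np.exp(-0.5 * u**2)
    return (w * y).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(5)
x = rng.uniform(0.0, 3.0, 200)
y = np.sin(2.0 * x) + rng.normal(0.0, 0.2, 200)
h = rule_of_thumb_bandwidth(x)
grid = np.linspace(0.1, 2.9, 10)
print("h =", round(h, 3))
print(np.round(nadaraya_watson(grid, x, y, h), 3))
```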
In recent years, many researchers have developed methods to estimate the self-similarity and long-memory parameter best known as the Hurst parameter. In this paper, we compare nine different methods. Most of them use the slope of deviations to find an estimate of the Hurst parameter, such as Rescaled Range (R/S), Aggregate Variance (AV), and Absolute Moments (AM), while some depend on a filtration technique, such as Discrete Variations (DV), Variance Versus Level using wavelets (VVL), and Second-Order Discrete Derivative using Wavelets (SODDW). The comparison was set up as a simulation study to find the most efficient method through MASE. The results of the simulation experiments showed that the performance of the meth …
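The sketch below illustrates one of the listed methods, the rescaled-range (R/S) estimator, which takes the slope of log(R/S) against log(window size) as the Hurst estimate; the window sizes and the white-noise test series are illustrative assumptions.

```python
# Rescaled-range (R/S) estimate of the Hurst parameter.
import numpy as np

def rs_hurst(x, window_sizes):
    """Estimate H as the slope of log mean(R/S) versus log window size."""
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_vals = []
        for start in range(0, len(x) - n + 1, n):
            block = x[start:start + n]
            dev = np.cumsum(block - block.mean())
            r = dev.max() - dev.min()            # range of cumulative deviations
            s = block.std(ddof=1)                # block standard deviation
            if s > 0:
                rs_vals.append(r / s)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs_vals)))
    slope, _ = np.polyfit(log_n, log_rs, 1)
    return slope

rng = np.random.default_rng(6)
white_noise = rng.normal(size=4096)              # true H should be near 0.5
print("estimated H:", round(rs_hurst(white_noise, [16, 32, 64, 128, 256, 512]), 3))
```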