A mixture model is used to model data that come from more than one component. In recent years, it has become an effective tool for drawing inferences about the complex data we encounter in real life, and it can also serve as a powerful confirmatory tool for classifying observations based on the similarities among them. In this paper, several mixture regression-based methods were applied under the assumption that the data come from a finite number of components. These methods were compared according to their results in estimating the component parameters, and observation membership was also inferred and assessed for each method. The results showed that the flexible mixture model outperformed the others.
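As a minimal illustration of the kind of mixture regression compared in that abstract, the sketch below fits a two-component mixture of linear regressions with the EM algorithm; the simulated data, the two-component assumption, and the starting values are illustrative choices, not the paper's data or its flexible mixture model.

import numpy as np

rng = np.random.default_rng(0)

# Simulated data from two regression components (illustrative only)
n = 400
x = rng.uniform(-2, 2, n)
z = rng.random(n) < 0.5                       # latent component labels
y = np.where(z, 1.0 + 2.0 * x, -1.0 - 1.5 * x) + rng.normal(0, 0.4, n)
X = np.column_stack([np.ones(n), x])          # design matrix with intercept

K = 2
beta = np.array([[0.5, 1.0], [-0.5, -1.0]])   # starting coefficients (K x p)
sigma = np.array([1.0, 1.0])                  # component error standard deviations
pi = np.array([0.5, 0.5])                     # mixing proportions

for _ in range(200):
    # E-step: membership probabilities; the common Gaussian constant cancels
    dens = np.empty((n, K))
    for k in range(K):
        resid = y - X @ beta[k]
        dens[:, k] = pi[k] * np.exp(-0.5 * (resid / sigma[k]) ** 2) / sigma[k]
    resp = dens / dens.sum(axis=1, keepdims=True)

    # M-step: weighted least squares and variance update per component
    for k in range(K):
        w = resp[:, k]
        beta[k] = np.linalg.solve((X * w[:, None]).T @ X, X.T @ (w * y))
        resid = y - X @ beta[k]
        sigma[k] = np.sqrt((w * resid ** 2).sum() / w.sum())
    pi = resp.mean(axis=0)

# Estimated component parameters and inferred memberships
print("coefficients:", beta.round(2))
print("mixing proportions:", pi.round(2))
print("memberships (first 10):", (resp[:, 1] > 0.5).astype(int)[:10])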
This research deals with a shrinkage method for principal components similar to the one used in multiple regression, the Least Absolute Shrinkage and Selection Operator (LASSO). The goal here is to build uncorrelated linear combinations from only a subset of the explanatory variables that may suffer from multicollinearity, instead of taking all (K) of them. The shrinkage forces some coefficients to equal zero by imposing a restriction on them through a tuning parameter (t), which balances the amounts of bias and variance on one side while keeping the percentage of explained variance of these components acceptable. This was demonstrated by the MSE criterion in the regression case and the percent explained variance …
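The sketch below illustrates the general idea only, under stated assumptions: sklearn's Lasso penalty weight alpha stands in for the paper's tuning parameter t, and the collinear data are simulated, not the paper's.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Lasso, LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(1)

# Predictors built to be strongly collinear (illustrative data)
n, K = 200, 6
base = rng.normal(size=(n, 2))
X = base @ rng.normal(size=(2, K)) + 0.1 * rng.normal(size=(n, K))
y = X[:, 0] - 0.5 * X[:, 1] + rng.normal(0, 1.0, n)

# Step 1: uncorrelated linear combinations (principal components)
pca = PCA()
Z = pca.fit_transform(X)                      # component scores
print("explained variance ratio:", pca.explained_variance_ratio_.round(3))

# Step 2: L1 shrinkage of the component coefficients; the penalty
# zeroes out weak components, keeping only a subset of them
lasso_pc = Lasso(alpha=0.1).fit(Z, y)
print("shrunken PC coefficients:", lasso_pc.coef_.round(3))
print("components kept:", np.flatnonzero(lasso_pc.coef_))

# Step 3: compare by the MSE criterion against ordinary PC regression
ols_pc = LinearRegression().fit(Z, y)
print("MSE, all components :", mean_squared_error(y, ols_pc.predict(Z)))
print("MSE, shrunken subset:", mean_squared_error(y, lasso_pc.predict(Z)))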
This research studies paired data models with mixed random parameters, which contain two types of parameters, one random and the other fixed. The random parameter arises from differences in the marginal tendencies of the cross-sections, while the fixed parameter arises from differences in the constant terms; the random errors of each section exhibit heteroscedasticity in addition to first-order serial correlation. The main objective of this research is to use efficient methods suited to paired data in the case of small samples, and to achieve this goal, the feasible generalized least squares …
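The following is a simplified feasible GLS sketch under stated assumptions (a common slope, one AR(1) coefficient shared across sections, and per-section error variances); it only illustrates the quasi-differencing and weighting idea, not the paper's mixed random-parameter estimator.

import numpy as np

rng = np.random.default_rng(2)

# Illustrative paired data: N cross-sections observed over T periods
N, T = 10, 8
x = rng.normal(size=(N, T))
y = np.empty((N, T))
for i in range(N):
    e = np.zeros(T)
    for t in range(1, T):
        # AR(1) errors with a variance that differs by section
        e[t] = 0.5 * e[t - 1] + rng.normal(0, 0.5 + 0.1 * i)
    y[i] = 1.0 + 1.5 * x[i] + e

def ols(X, y):
    return np.linalg.solve(X.T @ X, X.T @ y)

# Step 1: pooled OLS to obtain residuals
X = np.column_stack([np.ones(N * T), x.ravel()])
b_ols = ols(X, y.ravel())
resid = (y.ravel() - X @ b_ols).reshape(N, T)

# Step 2: estimate the AR(1) coefficient and per-section variances
rho = np.sum(resid[:, 1:] * resid[:, :-1]) / np.sum(resid[:, :-1] ** 2)
u = resid[:, 1:] - rho * resid[:, :-1]          # quasi-differenced residuals
sig2 = u.var(axis=1)                            # groupwise heteroscedasticity

# Step 3: FGLS = quasi-difference each section, then weight by 1/sigma_i
rows_y, rows_X = [], []
for i in range(N):
    yi = y[i, 1:] - rho * y[i, :-1]
    Xi = np.column_stack([np.full(T - 1, 1 - rho), x[i, 1:] - rho * x[i, :-1]])
    w = 1.0 / np.sqrt(sig2[i])
    rows_y.append(w * yi)
    rows_X.append(w * Xi)
b_fgls = ols(np.vstack(rows_X), np.concatenate(rows_y))
print("pooled OLS:", b_ols.round(3))
print("FGLS      :", b_fgls.round(3))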
This paper aims at presenting a comparison between objective and subjective tests. It attempts to shed light on these two kinds of tests and to draw a comparison between them using techniques suitable for each.
The paper compares the techniques used by objective and subjective tests respectively, the time and effort required by each type, the extent to which each type can be reliable, and the skills each type is suitable for measuring.
The paper shows that objective tests, unlike subjective ones, encourage guessing. Objective tests are used to test specific areas of language …
Background: The lip lengthening procedure is one of the surgical options for correcting a gummy smile in patients with a short upper lip. Methods: A comparative clinical study was conducted on 15 patients requiring a lip lengthening procedure for the esthetic correction of excessive gingival exposure with a gummy smile. A scalpel was used in seven patients and a diode laser in the remaining eight patients. Under infiltration anesthesia, a strip of mucosa about one cm wide was excised at the vestibular depth, and the mucosa of the lip was sutured to the alveolar mucosa. Results: The diode laser group demonstrated less postoperative pain and swelling. Regarding postoperative ecchymosis, three patients in the scalpel group developed ecchymosis and no cases …
The current study was designed to compare some vital markers in the sera of diabetic and neuropathy patients by estimating Adipsin, fasting blood glucose (FBG), glycated hemoglobin (HbA1c), the Homeostasis Model Assessment index of insulin resistance (HOMA-IR), cholesterol, high-density lipoprotein (HDL), triglycerides (TG), low-density lipoprotein (LDL), and very low-density lipoprotein (VLDL) in the sera of Iraqi patients with diabetes and neuropathy. A total of ninety subjects were divided into three groups: group I (30 diabetic males with neuropathy), group II (30 diabetic males without neuropathy), and 30 healthy subjects employed as a control group. The results showed a significant decline in Adipsin levels (p>0.05) in the neuropathy and T2DM groups …
Nevertheless, the banking compliance function has become one of the most important functions in the banking sector, given its character as an internal control tool over executive management, departments, subsidiaries, etc. in any bank, and over their compliance with applicable rules, recommendations, and legislation, in addition to estimating and limiting risks and controlling anti-money laundering. Thus, these functions, which cover the main concept of banking compliance, keep the bank from falling under any sanctions.
Data encryption translates data into another form or symbol so that only people with access to the secret key or a password can read it. Encrypted data are generally referred to as ciphertext, while unencrypted data are known as plaintext. Entropy can be used as a measure of the number of bits needed to code the data of an image; as the pixel values within an image are dispersed over more gray levels, the entropy increases. The aim of this research is to compare CAST-128 with a proposed adaptive key against the RSA encryption method for video frames, to determine the more accurate method with the highest entropy. The first method is achieved by applying CAST-128 and …
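As a small illustration of the entropy criterion described in that abstract (not the paper's code), the sketch below computes the Shannon entropy of an 8-bit grayscale image from its gray-level histogram; the two sample images are assumptions chosen to show the extremes of the measure.

import numpy as np

def image_entropy(img):
    # Shannon entropy (bits per pixel) of an 8-bit grayscale image
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()                 # gray-level probabilities
    p = p[p > 0]                          # skip empty bins (0*log0 = 0)
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(3)
flat   = np.full((64, 64), 128, dtype=np.uint8)           # single gray level
spread = rng.integers(0, 256, (64, 64), dtype=np.uint8)   # values spread over all levels

print(image_entropy(flat))    # 0.0 -> no uncertainty at all
print(image_entropy(spread))  # close to 8 bits -> pixel values widely dispersed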
Abstract
In this study, we compare the autoregressive approximations (the Yule-Walker equations, least squares, forward-backward least squares, and Burg's geometric and harmonic methods) to determine the optimal approximation to time series generated from a first-order non-invertible moving average process and a fractionally integrated noise process, with several values of d (d = 0.15, 0.25, 0.35, 0.45) and different sample sizes (small, medium, large) for the two processes. We rely on the figure-of-merit function proposed by Shibata in 1980 to determine the theoretical optimal order according to the minimum …
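A minimal sketch of one of the approximations named above, the Yule-Walker equations, applied to a simulated non-invertible MA(1) series; the value theta = 1, the sample size, and the AR orders are illustrative assumptions, and the study's other methods (least squares, forward-backward, Burg) and Shibata's figure of merit are not reproduced here.

import numpy as np
from scipy.linalg import toeplitz

rng = np.random.default_rng(4)

# Non-invertible MA(1): y_t = e_t + theta * e_{t-1} with theta = 1 (illustrative)
n, theta = 500, 1.0
e = rng.normal(size=n + 1)
y = e[1:] + theta * e[:-1]

def yule_walker_ar(series, p):
    # Solve the Yule-Walker equations for an AR(p) approximation
    s = series - series.mean()
    m = len(s)
    gamma = np.array([np.dot(s[: m - k], s[k:]) / m for k in range(p + 1)])
    phi = np.linalg.solve(toeplitz(gamma[:p]), gamma[1 : p + 1])
    sigma2 = gamma[0] - phi @ gamma[1 : p + 1]
    return phi, sigma2

for p in (2, 5, 10):
    phi, s2 = yule_walker_ar(y, p)
    print(f"AR({p}): first coefficients {phi[:3].round(3)}, innovation variance {s2:.3f}")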
The Weibull distribution is considered one of the Type-I Generalized Extreme Value (GEV) distributions, and it plays a crucial role in modeling extreme events in various fields such as hydrology, finance, and the environmental sciences. Bayesian methods play a strong, decisive role in estimating the parameters of the GEV distribution because of their ability to incorporate prior knowledge and handle small sample sizes effectively. In this research, we compare several shrinkage Bayesian estimation methods based on the squared error and linear exponential (LINEX) loss functions; they were adopted and compared by the Monte Carlo simulation method. The performance of these methods is assessed based on their accuracy and computational efficiency in estimating …
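For reference, the Bayes estimators induced by the two loss functions named above take the standard forms below (general forms only, not the paper's specific shrinkage estimators); here a denotes the LINEX shape parameter and x the observed sample.

\hat{\theta}_{SE} = E(\theta \mid \mathbf{x}), \qquad
L_{LINEX}(\Delta) = e^{a\Delta} - a\Delta - 1, \quad \Delta = \hat{\theta} - \theta, \qquad
\hat{\theta}_{LINEX} = -\frac{1}{a}\,\ln E\!\left(e^{-a\theta} \mid \mathbf{x}\right), \quad a \neq 0.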