A mixture model is used to model data that come from more than one component. In recent years, it has become an effective tool for drawing inferences about the complex data we encounter in real life. Moreover, it can serve as a powerful confirmatory tool for classifying observations based on the similarities among them. In this paper, several mixture regression-based methods were applied under the assumption that the data come from a finite number of components. These methods were compared according to their results in estimating the component parameters. Observation membership was also inferred and assessed for each method. The results showed that the flexible mixture model outperformed the others in most simulation scenarios according to the integrated mean squared error and the integrated classification error.
The aim of this study is to estimate the parameters and the reliability function of the Kumaraswamy distribution with two positive parameters (a, b > 0), a continuous probability distribution that shares many characteristics with the beta distribution and offers additional advantages.
The shape of the density function and the most important characteristics of this distribution are explained, and the two parameters (a, b) and the reliability function are estimated using the maximum likelihood (MLE) and Bayes methods. Simulation experiments are conducted to examine the behaviour of the estimation methods for different sample sizes based on the mean squared error criterion; the results show that the Bayes method is better.
This research study focuses on fundamental questions. It deals with the content of the critical reviews through which the most important sources that undermine the renaissance of Islamic human society can be identified: the vague fetishization of the sacred between religion, politics, and ideas, in which members of society live and which raised the dust of closed-minded fanaticism that led to the return of deviant extremism; and oppression, violence, and domination, all of which are professional tools in the manufacture of the ideology of extremism in all its forms. Here, the idea matured into giving a human, intellectual, and philosophical overview that mixed the required drawing in the formulation of scientific cont
The support vector machine (SVM) is a type of supervised learning model that can be used for classification or regression depending on the dataset. SVM classifies data points by determining the best hyperplane between two or more groups. Working with enormous datasets, however, can lead to a variety of issues, including poor accuracy and long run times. In this research, SVM was extended by applying several kernel transformations: linear, polynomial, radial basis, and multi-layer kernels. The non-linear SVM classification model was illustrated and summarized in an algorithm using the kernel trick. The proposed method was examined using three simulation datasets with different sample sizes.
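The kernels named above can be sketched directly. The hyperparameter defaults below (polynomial degree, RBF gamma, and the tanh constants of the multi-layer/sigmoid kernel) are illustrative assumptions, not the paper's settings.

```python
import math

def linear_k(x, y):
    # inner product in input space
    return sum(a * b for a, b in zip(x, y))

def poly_k(x, y, d=3, c=1.0):
    # polynomial kernel (x·y + c)^d
    return (linear_k(x, y) + c) ** d

def rbf_k(x, y, gamma=0.5):
    # radial basis (Gaussian) kernel exp(-gamma * ||x - y||^2)
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def mlp_k(x, y, kappa=1.0, c=-1.0):
    # "multi-layer" (sigmoid) kernel tanh(kappa * x·y + c)
    return math.tanh(kappa * linear_k(x, y) + c)

def gram(X, kernel):
    # the kernel trick: pairwise similarities without an explicit feature map
    return [[kernel(xi, xj) for xj in X] for xi in X]
```

An SVM solver only ever touches the data through the Gram matrix, which is why swapping `kernel` changes the decision boundary without changing the optimizer.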
An approximate solution of the linear system of integral equations of both Fredholm (SFIEs) and Volterra (SIEs) types has been derived using Taylor series expansion. The solution is essentially
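As a toy illustration of the Taylor-series idea, consider the single Volterra equation u(x) = f(x) + λ∫₀ˣ u(t) dt with constant kernel K(x, t) = 1 (a hypothetical special case, far simpler than the paper's linear system). Differentiating gives u′(x) = f′(x) + λu(x), so the derivatives of u at 0, and hence its Taylor polynomial, follow by recursion. Names below are illustrative.

```python
import math

def volterra_taylor(lam, f_derivs, n_terms):
    """Taylor-series solution of u(x) = f(x) + lam * ∫_0^x u(t) dt
    (kernel K(x, t) = 1, an illustrative special case).
    f_derivs[k] holds f^(k)(0); the recursion is
    u^(k+1)(0) = f^(k+1)(0) + lam * u^(k)(0)."""
    d = [f_derivs[0]]                      # u(0) = f(0)
    for k in range(n_terms - 1):
        d.append(f_derivs[k + 1] + lam * d[k])
    coeffs = [dk / math.factorial(k) for k, dk in enumerate(d)]
    return lambda x: sum(c * x ** k for k, c in enumerate(coeffs))
```

With f(x) = 1 and λ = 1 the exact solution is u(x) = eˣ, which the truncated series reproduces to high accuracy.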
In this paper, complexes of the Schiff base Methyl 6-[2-((diphenylmethylene)amino)-2-(4-hydroxyphenyl)acetamido]-2,2-dimethyl-5-oxo-1-thia-4-azabicyclo[3.2.0]heptane-3-carboxylate (L) with Cobalt(II), Nickel(II), Copper(II), and Zinc(II) have been prepared. The compounds were characterized by different means such as FT-IR, UV-Vis, magnetic moment, elemental microanalysis (C.H.N), atomic absorption, and molar conductance. The spectral study shows that the complexes were obtained as monomeric structures and that the metal-centre moieties are two-coordinated with octahedral geometry, except for the Co complexes, which exist in a tetrahedral geometry. Hyper Chem-8.0.7
Radiation therapy plays an important role in improving breast cancer cases. In order to obtain an appropriate estimate of the number of radiation doses given to the patient after tumor removal, some nonparametric regression methods were compared. The kernel method was used via the Nadaraya-Watson estimator to find the estimated regression function for smoothing the data, based on a smoothing parameter h chosen according to the Normal Scale Method (NSM), the Least Squares Cross-Validation method (LSCV), and the Golden Rate Method (GRM). These methods were compared by simulation for samples of three sizes; the NSM proved to be the best according to the average of the Mean Squares Error criterion, and the LSCV proved to be the best according to the average of the Mean Absolute Error criterion.
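A minimal sketch of the Nadaraya-Watson estimator with a Gaussian kernel, together with the normal-scale (rule-of-thumb) bandwidth h = 1.06·s·n^(−1/5) that NSM-style selection uses; function names are illustrative, not from the paper.

```python
import math

def nw_estimate(x0, xs, ys, h):
    """Nadaraya-Watson kernel regression at x0 with a Gaussian kernel:
    m(x0) = Σ K((x0 - xi)/h) * yi / Σ K((x0 - xi)/h)."""
    w = [math.exp(-((x0 - x) / h) ** 2 / 2) for x in xs]
    return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)

def normal_scale_h(xs):
    """Normal-scale (rule-of-thumb) bandwidth h = 1.06 * sd * n^(-1/5)."""
    n = len(xs)
    m = sum(xs) / n
    sd = math.sqrt(sum((x - m) ** 2 for x in xs) / (n - 1))
    return 1.06 * sd * n ** (-0.2)
```

LSCV-style selection would instead pick h by minimizing the leave-one-out prediction error of `nw_estimate` over a grid of candidate bandwidths.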
Dust is a common cause of health risks and also a driver of climate change, one of the most threatening problems facing humans. In the recent decade, climate change in Iraq, typified by increased droughts and desertification, has generated numerous environmental issues. This study forecasts dust in five central Iraqi districts using machine learning, within a supervised learning framework of five regression algorithms. It was assessed using a dataset from the Iraqi Meteorological Organization and Seismology (IMOS). Simulation results show that the gradient boosting regressor (GBR) has a mean square error of 8.345 and a total accuracy ratio of 91.65%. Moreover, the results show that the decision tree (DT), where the mean square error is 8.965, c
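Gradient boosting, as used by the GBR above, can be sketched in pure Python with depth-one regression trees (stumps) fitted to residuals. This is a didactic toy under squared-error loss, not the study's model or data; all names are illustrative.

```python
def fit_stump(xs, ys):
    """Best single-split regression stump minimizing squared error."""
    best = None
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        err = (sum((y - ml) ** 2 for y in left)
               + sum((y - mr) ** 2 for y in right))
        if best is None or err < best[0]:
            best = (err, t, ml, mr)
    _, t, ml, mr = best
    return lambda x: ml if x <= t else mr

def gradient_boost(xs, ys, rounds=80, lr=0.1):
    """Boost stumps on residuals (the gradient of squared-error loss)."""
    f0 = sum(ys) / len(ys)                 # initial constant prediction
    stumps = []
    pred = [f0] * len(xs)
    for _ in range(rounds):
        resid = [y - p for y, p in zip(ys, pred)]
        s = fit_stump(xs, resid)
        stumps.append(s)
        pred = [p + lr * s(x) for p, x in zip(pred, xs)]
    return lambda x: f0 + lr * sum(s(x) for s in stumps)
```

Each round shrinks the residual by a constant factor, which is why many weak stumps with a small learning rate converge to a strong predictor.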
In high-dimensional semiparametric regression, balancing accuracy and interpretability often requires combining dimension reduction with variable selection. This study introduces two novel methods for dimension reduction in additive partial linear models: (i) minimum average variance estimation (MAVE) combined with the adaptive least absolute shrinkage and selection operator (MAVE-ALASSO) and (ii) MAVE with smoothly clipped absolute deviation (MAVE-SCAD). These methods leverage the flexibility of MAVE for sufficient dimension reduction while incorporating adaptive penalties to ensure sparse and interpretable models. The performance of both methods is evaluated through simulations using the mean squared error and variable selection criteria
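The SCAD penalty behind MAVE-SCAD is a concrete piecewise formula, quadratic in the middle and flat in the tail so that large coefficients are not shrunk. A small sketch with the conventional a = 3.7 (the function name is illustrative):

```python
def scad_penalty(beta, lam, a=3.7):
    """SCAD penalty of Fan and Li: linear near zero (like the lasso),
    quadratic blending in the middle, constant for |beta| > a*lam."""
    b = abs(beta)
    if b <= lam:
        return lam * b
    if b <= a * lam:
        return (2 * a * lam * b - b * b - lam * lam) / (2 * (a - 1))
    return lam * lam * (a + 1) / 2
```

The flat tail is what distinguishes SCAD from the lasso: past a·λ the penalty adds no further shrinkage, reducing bias on large effects.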
The artificial neural network methodology is an important and relatively new approach for building models for analysis, data evaluation, forecasting, and control without depending on a pre-specified model or a classical statistical method to describe the behavior of the statistical phenomenon. The methodology works by simulating the data to reach a robust optimum model that represents the phenomenon and can be used at any time and under any conditions. The Box-Jenkins (ARMAX) approach was used for comparison. This paper depends on the received power to build a robust model for forecasting, analysis, and control of the sod power; the received power comes from
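A minimal ARMAX-flavoured sketch: the model y_t = φ·y_{t−1} + β·x_t fitted by ordinary least squares via the 2×2 normal equations. This omits the moving-average term of a full Box-Jenkins ARMAX model and uses hypothetical names; it is only meant to make the regression structure concrete.

```python
def fit_armax11(y, x):
    """Least-squares fit of the simplified model y_t = phi*y_{t-1} + beta*x_t
    (no MA term; an illustrative sketch, not full Box-Jenkins estimation)."""
    Y = y[1:]
    A = [(y[t - 1], x[t]) for t in range(1, len(y))]   # regressor pairs
    # 2x2 normal equations: [s11 s12; s12 s22][phi; beta] = [r1; r2]
    s11 = sum(a * a for a, _ in A)
    s12 = sum(a * b for a, b in A)
    s22 = sum(b * b for _, b in A)
    r1 = sum(a * yy for (a, _), yy in zip(A, Y))
    r2 = sum(b * yy for (_, b), yy in zip(A, Y))
    det = s11 * s22 - s12 * s12
    phi = (r1 * s22 - s12 * r2) / det
    beta = (s11 * r2 - s12 * r1) / det
    return phi, beta
```

On noiseless data generated from known (φ, β) the fit recovers the parameters exactly, which is a useful sanity check before moving to real series.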
This research reviews the least absolute deviations (LAD) method, based on linear programming, for estimating the parameters of the simple linear regression model, and gives an overview of this model. The absolute-deviations method was modeled using a proposed measure of dispersion, and a simple linear regression model was constructed based on that measure. The object of the work is to obtain estimates that are not affected by abnormal (outlying) values, using a numerical method with the fewest possible iterations.
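The LP formulation minimizes Σ|y_i − b0 − b1·x_i| by splitting each residual into nonnegative slack variables. For simple regression, an optimal LAD line can always be taken to pass through two of the observations (a vertex of that LP), so a brute-force pair search yields the same estimates. A pure-Python sketch of that observation (names hypothetical; a real LP solver scales better):

```python
def lad_line(xs, ys):
    """Least-absolute-deviations fit of y = b0 + b1*x.
    Enumerates all point pairs, since an optimal LAD line passes
    through at least two data points (an LP vertex solution)."""
    n = len(xs)
    best = None
    for i in range(n):
        for j in range(i + 1, n):
            if xs[i] == xs[j]:
                continue
            b1 = (ys[j] - ys[i]) / (xs[j] - xs[i])
            b0 = ys[i] - b1 * xs[i]
            # sum of absolute deviations for this candidate line
            sad = sum(abs(y - b0 - b1 * x) for x, y in zip(xs, ys))
            if best is None or sad < best[0]:
                best = (sad, b0, b1)
    return best[1], best[2]
```

Unlike least squares, the LAD fit is pulled very little by a single gross outlier, which is exactly the robustness to abnormal values the abstract describes.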