The Bayesian approach to classification and regression trees is attractive because it exploits prior information about the explanatory variables and yields posterior information at each node during tree construction. Although Bayesian estimates are generally accurate, the logistic model remains a strong competitor for binary responses owing to its flexibility and mathematical representation. Three methods are therefore used to process the data in this research: the logistic model, the classification and regression tree (CART) model, and the Bayesian regression tree model, each formulated as an additive model over nonparametric component functions. The methods are compared in terms of classification accuracy, measured by the misclassification error, and estimation accuracy, measured by the root mean squared error (RMSE). They are applied to data on 200 patients with diabetes aged 15 years and below, sampled from the Children's Hospital in Al-Eskan / Baghdad.
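As a minimal, hypothetical sketch of this kind of comparison (not the paper's implementation or its diabetes data), the following Python snippet fits a logistic model and a CART-style tree to a synthetic binary response and reports the misclassification error for each; all names and settings are illustrative assumptions.

```python
# Illustrative only: compare a logistic model and a CART-style tree
# by misclassification error on a synthetic binary dataset.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for a sample of size 200 with a binary response.
X, y = make_classification(n_samples=200, n_features=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, model in [("logistic", LogisticRegression(max_iter=1000)),
                    ("CART", DecisionTreeClassifier(max_depth=4, random_state=0))]:
    model.fit(X_tr, y_tr)
    error = np.mean(model.predict(X_te) != y_te)  # misclassification error
    print(f"{name}: misclassification error = {error:.3f}")
```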
Interval methods for verified integration of initial value problems (IVPs) for ODEs have been used for more than 40 years. For many classes of IVPs, these methods have the ability to compute guaranteed error bounds for the flow of an ODE, where traditional methods provide only approximations to a solution. Overestimation, however, is a potential drawback of verified methods. For some problems, the computed error bounds become overly pessimistic, or integration even breaks down. The dependency problem and the wrapping effect are particular sources of overestimation in interval computations. Berz (see [1]) and his co-workers have developed Taylor model methods, which extend interval arithmetic with symbolic computations; the latter are an effective means of reducing both sources of overestimation.
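To make the dependency problem concrete, here is a small hypothetical Python illustration with a toy interval type (not a verified library): naive interval subtraction treats both operands as independent, so x - x does not evaluate to the exact value 0.

```python
# Toy illustration of the dependency problem in naive interval arithmetic.
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float

    def __sub__(self, other):
        # Naive rule: [a, b] - [c, d] = [a - d, b - c]. It ignores that both
        # operands may denote the same variable -- the dependency problem.
        return Interval(self.lo - other.hi, self.hi - other.lo)

x = Interval(1.0, 2.0)
print(x - x)  # Interval(lo=-1.0, hi=1.0), although x - x is exactly 0
```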
The problem of missing data represents a major obstacle for researchers in data analysis, since it recurs in all fields of study, including social, medical, astronomical, and clinical experiments.
The presence of this problem in the data under study may negatively influence the analysis and lead to misleading, heavily biased conclusions. Despite the efficiency of wavelet methods, they too are affected by missing data, which additionally degrades the accuracy of estimation.
The unstable and uncertain nature of natural rubber prices makes them highly volatile and prone to outliers, which can have a significant impact on both modeling and forecasting. To tackle this issue, the author recommends a hybrid model that combines the autoregressive (AR) and Generalized Autoregressive Conditional Heteroscedasticity (GARCH) models. The model utilizes the Huber weighting function so that the forecast value of rubber prices remains sustainable even in the presence of outliers. The study aims to develop a sustainable model and forecast daily prices for a 12-day period by analyzing 2683 daily price observations of Standard Malaysian Rubber Grade 20 (SMR 20) in Malaysia. The analysis incorporates two dispersion measurements.
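As a hedged sketch of the ingredients involved (not the author's hybrid procedure), the snippet below fits an AR(1)-GARCH(1,1) model with the `arch` package and then computes standard Huber weights that downweight outlying standardized residuals; the simulated series and the tuning constant are illustrative assumptions.

```python
# Illustrative sketch: AR(1)-GARCH(1,1) fit plus Huber weights on residuals.
import numpy as np
from arch import arch_model

rng = np.random.default_rng(0)
returns = rng.standard_t(df=4, size=2683)  # stand-in for SMR 20 returns

res = arch_model(returns, mean="AR", lags=1, vol="GARCH", p=1, q=1).fit(disp="off")
z = np.asarray(res.std_resid)
z = z[~np.isnan(z)]                        # drop the AR(1) burn-in residual

k = 1.345                                  # common Huber tuning constant
weights = np.where(np.abs(z) <= k, 1.0, k / np.abs(z))  # Huber weight function
print("share of downweighted observations:", np.mean(weights < 1))
```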
In this paper, we are mainly concerned with estimating the cascade reliability model (2+1) based on the inverted exponential distribution and comparing the estimation methods used. The maximum likelihood estimator and the uniformly minimum variance unbiased estimator are used to estimate the strengths and the stress, k = 1, 2, 3, respectively. Then, using the unbiased estimators, we propose a preliminary test single-stage shrinkage (PTSSS) estimator for the case where prior knowledge of the scale parameter is available as an initial value from past experience. The mean squared error (MSE) of the proposed estimator is derived in order to compare the methods. Numerical results on the behavior of the considered estimators are reported.
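For reference, a commonly used parameterization of the inverted exponential distribution (assumed here, since the paper's exact notation is not shown) has density and distribution function

```latex
f(x;\theta) = \frac{\theta}{x^{2}}\, e^{-\theta/x},
\qquad
F(x;\theta) = e^{-\theta/x},
\qquad x > 0,\ \theta > 0 ,
```

so the MLE of the scale parameter \theta from a sample x_1, ..., x_n is n / \sum_i (1/x_i), obtained by setting the derivative of the log-likelihood n\log\theta - 2\sum_i \log x_i - \theta \sum_i (1/x_i) to zero.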
Non-additive measures and the corresponding integrals were originally introduced by Choquet in 1953 (1) and independently defined by Sugeno in 1974 (2) in order to extend the classical measure by replacing the additivity property with non-additivity. An important feature of non-additive measures and fuzzy integrals is that they can represent the importance of individual information sources and the interactions among them. There are many applications of non-additive measures and fuzzy integrals, such as image processing, multi-criteria decision making, information fusion, classification, and pattern recognition. This paper presents a mathematical model for discussing an application of non-additive measures and the corresponding integrals.
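A minimal sketch of the discrete Choquet integral may help fix ideas; the two-source measure below is made up for illustration and is not taken from the paper.

```python
# Discrete Choquet integral of per-source scores w.r.t. a non-additive
# measure mu, given as a dict over frozensets of source indices.

def choquet(values, mu):
    order = sorted(range(len(values)), key=lambda i: values[i])  # ascending
    total, prev = 0.0, 0.0
    for rank, i in enumerate(order):
        coalition = frozenset(order[rank:])   # sources scoring >= values[i]
        total += (values[i] - prev) * mu[coalition]
        prev = values[i]
    return total

mu = {frozenset({0, 1}): 1.0,  # the whole set of sources
      frozenset({0}): 0.3,     # importance of source 0 alone
      frozenset({1}): 0.6}     # importance of source 1 alone
print(choquet([0.8, 0.5], mu))  # 0.5 * 1.0 + (0.8 - 0.5) * 0.3 = 0.59
```

Because mu({0}) + mu({1}) may differ from mu({0, 1}), the measure can express positive or negative interaction between the sources, which an additive weighted mean cannot.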
In this paper, the research attempts to extend the use of parametric and nonparametric estimators for estimating the median effective dose (ED50) in quantal bioassay and to compare these methods. Three estimators were chosen for comparison: the Spearman-Karber estimator, the moving average estimator, and the extreme effective dose estimator. Minimum chi-square was used as a parametric method. The estimators are compared by calculating the mean squared error of ED50 for each of them and comparing it with the optimal mean squared error.
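As an illustration of the first of these estimators, the snippet below implements the textbook Spearman-Karber formula for ED50 on made-up dose-response data; it assumes doses on the log scale and monotone response proportions running from 0 to 1, and is not the paper's implementation.

```python
# Spearman-Karber estimator of (log) ED50 for a quantal bioassay:
# sum over adjacent dose pairs of (p[i+1] - p[i]) times the dose midpoint.

def spearman_karber(log_doses, props):
    return sum((props[i + 1] - props[i]) * (log_doses[i] + log_doses[i + 1]) / 2
               for i in range(len(props) - 1))

log_doses = [0.0, 1.0, 2.0, 3.0, 4.0]          # illustrative log-doses
props = [0.0, 0.10, 0.45, 0.85, 1.0]           # observed response proportions
print("log ED50 =", spearman_karber(log_doses, props))  # 2.1
```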
Deep learning algorithms have recently achieved great success, especially in the field of computer vision. This research describes a classification method applied to a dataset of multiple types of images (Synthetic Aperture Radar (SAR) images and non-SAR images). For this classification, transfer learning followed by fine-tuning was used, with architectures pre-trained on the well-known ImageNet image database. The VGG16 model was used as a feature extractor, and a new classifier was trained on the extracted features. The input data consist of five classes: the SAR image class (houses) and the non-SAR image classes (cats, dogs, horses, and humans).
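A minimal Keras sketch of this setup (the layer sizes and input shape are assumptions, not the authors' exact configuration) freezes the ImageNet-pre-trained VGG16 convolutional base as a feature extractor and trains a new five-class head on top; for fine-tuning, some top convolutional layers would later be unfrozen and retrained at a low learning rate.

```python
# Transfer learning sketch: frozen VGG16 base + new 5-class classifier head.
from tensorflow import keras
from tensorflow.keras import layers

base = keras.applications.VGG16(weights="imagenet", include_top=False,
                                input_shape=(224, 224, 3))
base.trainable = False  # use VGG16 purely as a feature extractor

model = keras.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(256, activation="relu"),
    layers.Dense(5, activation="softmax"),  # houses/cats/dogs/horses/humans
])
model.compile(optimizer="adam", loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```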
In the current paper, the effect of fear in a three-species Beddington–DeAngelis food chain model is investigated. A three-species food chain model incorporating a Beddington–DeAngelis functional response is proposed, where the growth rates in the first and second levels decrease due to the existence of a predator in the upper level. The existence, uniqueness, and boundedness of the solution of the model are studied. All possible equilibrium points are determined. The local as well as global stability of the system is investigated. The persistence conditions of the system are established. The local bifurcation analysis of the system is carried out. Finally, numerical simulations are used to confirm the analytical results.
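For context, commonly used forms of the two ingredients named above are sketched in generic notation; these are standard textbook forms assumed for illustration, not the paper's exact equations.

```latex
% Beddington–DeAngelis functional response of a predator y feeding on prey x:
g(x, y) = \frac{\alpha x}{a + b x + c y},
\qquad
% fear effect: the growth rate r of a lower level is reduced by the density
% of the predator above it, with k >= 0 measuring the level of fear:
r \;\longmapsto\; \frac{r}{1 + k y}.
```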
In this article, we study a single stochastic process model for evaluating asset pricing and stock returns, one of the Lévy models built on a Brownian subordinator, namely the normal inverse Gaussian (NIG) process. The article aims to estimate the parameters of this model using two methods, the method of moments estimator (MME) and the maximum likelihood estimator (MLE), and then to employ those parameter estimates in studying stock returns and evaluating asset pricing for both the United Bank and the Bank of the North, whose data were taken from the Iraq Stock Exchange. The results showed a preference for MLE over MME based on the mean squared error as the comparison criterion.
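As a hedged illustration (synthetic data, not the banks' series from the Iraq Stock Exchange), the snippet below simulates NIG-distributed returns and recovers the parameters by maximum likelihood with SciPy's norminvgauss, which parameterizes the NIG law by shape parameters (a, b) with |b| < a plus location and scale.

```python
# Simulate NIG returns and fit the parameters by MLE.
import numpy as np
from scipy.stats import norminvgauss

rng = np.random.default_rng(1)
returns = norminvgauss.rvs(a=2.0, b=0.3, loc=0.0, scale=0.01,
                           size=1000, random_state=rng)

a, b, loc, scale = norminvgauss.fit(returns)  # maximum likelihood fit
print(f"MLE estimates: a={a:.3f}, b={b:.3f}, loc={loc:.5f}, scale={scale:.5f}")
```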
A Cox regression model has been used to estimate a proportional hazards model for patients with hepatitis recorded at the Gastrointestinal and Hepatic Diseases Hospital in Iraq during 2002-2005. The data consist of (age, gender, survival time, terminal status). The Kaplan-Meier method has been applied to estimate the survival function and the hazard function.
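A minimal sketch with the `lifelines` package shows the two steps described; the column names and the handful of synthetic records are assumptions, not the hospital data.

```python
# Cox proportional hazards fit plus a Kaplan-Meier survival estimate.
import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter

df = pd.DataFrame({
    "age": [34, 50, 42, 61, 28, 55],
    "gender": [0, 1, 0, 1, 1, 0],
    "time": [12, 7, 30, 3, 25, 18],  # survival time
    "event": [1, 1, 0, 1, 0, 1],     # 1 = terminal event observed, 0 = censored
})

cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
cph.print_summary()                   # hazard ratios for age and gender

km = KaplanMeierFitter().fit(df["time"], event_observed=df["event"])
print(km.survival_function_)          # Kaplan-Meier estimate of S(t)
```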