The Bayesian approach promises to combine features of regression analysis, classification trees, and tree ensembles: the explanatory variables are handled jointly, and posterior information is obtained at each node during the construction of the classification tree. Although Bayesian estimates are generally accurate, the logistic model remains a strong competitor for binary responses because of its flexibility and simple mathematical form. Three methods are therefore used to analyse the data: the logistic model, the classification and regression tree model, and the Bayesian regression tree model. In this research these methods are compared when the model is formed as an additive function of several nonparametric functions. The trade-off between the models is assessed by classification accuracy, measured by the misclassification error, and by estimation accuracy, measured by the root mean square error (RMSE). The methods are applied to data on diabetic patients aged 15 years and below; a sample of size 200 was drawn from the Children Hospital in Al-Eskan / Baghdad.
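A minimal sketch of the kind of comparison described above, assuming synthetic data and scikit-learn models rather than the authors' diabetes dataset and exact implementation:

```python
# Sketch: compare logistic regression and a classification tree on a binary
# response by misclassification error (hypothetical data, not the study's).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))                      # 200 cases, 4 explanatory variables
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=200) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, model in [("logistic", LogisticRegression(max_iter=1000)),
                    ("CART", DecisionTreeClassifier(max_depth=3, random_state=0))]:
    model.fit(X_tr, y_tr)
    misclass = np.mean(model.predict(X_te) != y_te)  # misclassification error
    print(f"{name}: misclassification error = {misclass:.3f}")
```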
Gray-Scale Image Brightness/Contrast Enhancement with the Multi-Model Histogram Linear Contrast Stretching (MMHLCS) Method
The internet, unlike traditional means of communication, has the flexibility to stimulate the user and allows the user to develop it further. Perhaps the reason for the superiority of the internet over traditional media is the possibility of change and of moving from one stage to another in a short period. This means that the internet is able to move from use, to the development of use, and then to the development of means and innovation, since innovation on the internet is a logical product of the interaction of the user with the network. The internet takes up all proposals and ideas and does not ignore any of them, however simple. This is represented in social networking sites, which in fact reflect personal emotions …
This work involves the manufacturing of MAX phase materials, including V2AlC and Cr2AlC, by powder metallurgy; these form a new class of materials characterized by regular crystal lattices. The corrosion behavior of these materials was investigated with a potentiostat to estimate corrosion resistance and was compared with the most resistant reference material, SS 316L. The experiments were carried out in a 0.01 N NaOH solution at four temperatures in the range 30–60 °C. Polarization resistance values calculated with the Stern-Geary equation indicated that the MAX phase materials are more resistant than SS 316L. Cyclic polarization tests also confirmed …
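For reference, the Stern-Geary relation used to obtain the corrosion current from the polarization resistance has the standard form below (symbols as commonly defined; the abstract does not give the authors' exact notation):

```latex
% Stern-Geary equation: corrosion current density from polarization resistance
% i_corr : corrosion current density, R_p : polarization resistance,
% beta_a, beta_c : anodic and cathodic Tafel slopes
\[
  i_{\mathrm{corr}} \;=\; \frac{B}{R_p},
  \qquad
  B \;=\; \frac{\beta_a\,\beta_c}{2.303\,(\beta_a + \beta_c)}
\]
```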
In recent years, Bitcoin has become the most widely used blockchain platform in business and finance. The goal of this work is to find a viable prediction model that combines, and perhaps improves on, the available models. The techniques used in this paper include exponential smoothing, ARIMA, artificial neural network (ANN) models, and forecast combination models. The study's most striking finding is that artificial intelligence models improve the results of the combined prediction models. The second key finding is that a strong combination forecasting model should be used, one that responds to the multiple fluctuations in the Bitcoin time series and improves the error. Based on the results, the prediction …
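A minimal, hypothetical sketch of forecast combination of the kind described, assuming a synthetic series, arbitrary model orders, and equal weights rather than the paper's data and combination scheme:

```python
# Sketch: combine ARIMA and exponential-smoothing forecasts by simple averaging
# on a random-walk series standing in for Bitcoin prices (hypothetical data).
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.holtwinters import SimpleExpSmoothing

rng = np.random.default_rng(1)
prices = 30000 + np.cumsum(rng.normal(0, 200, size=500))   # synthetic price path

h = 12                                                      # forecast horizon
arima_fc = ARIMA(prices, order=(1, 1, 1)).fit().forecast(steps=h)
es_fc = SimpleExpSmoothing(prices).fit().forecast(h)

combined_fc = 0.5 * arima_fc + 0.5 * es_fc                  # equal-weight combination
print(combined_fc)
```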
Great scientific progress has led to widespread information, and as information accumulates in large databases it becomes important to revise and organize this vast amount of data, with the purpose of extracting hidden information or classifying the data according to the relations among them, so that they can be exploited for technical purposes.
Data mining (DM) is appropriate in this area, and the research focuses on the K-Means algorithm for clustering data; in a practical application, the effect on the variables can be observed by changing the sample size (n) and the number of clusters (K), as the sketch below illustrates.
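A minimal K-Means sketch, assuming synthetic data and scikit-learn rather than the study's own dataset and implementation, showing how n and K can be varied:

```python
# Sketch: run K-Means for different sample sizes n and cluster counts K
# on synthetic 2-D data (hypothetical; not the study's data).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)

for n in (100, 500):                       # sample size
    X = rng.normal(size=(n, 2))
    for K in (2, 3, 5):                    # number of clusters
        km = KMeans(n_clusters=K, n_init=10, random_state=0).fit(X)
        print(f"n={n}, K={K}, inertia={km.inertia_:.1f}")
```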
The objective of this study is to examine the properties of Bayes estimators of the shape parameter of the Power Function Distribution (PFD-I), using two different prior distributions for the parameter θ and different loss functions, and to compare them with the maximum likelihood estimators. In many practical applications we may have two different pieces of prior information about the prior distribution of the shape parameter of the Power Function Distribution, which influences the parameter estimation. We therefore used two different kinds of conjugate priors for the shape parameter θ of the Power Function Distribution …
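For context, the one-parameter Power Function Distribution and the maximum likelihood estimator of θ take the standard form below (the support on (0, 1) is an assumption; the paper may include a scale parameter):

```latex
% Power Function Distribution with shape parameter theta (support (0,1) assumed)
\[
  f(x;\theta) \;=\; \theta\, x^{\theta-1},
  \qquad 0 < x < 1,\ \theta > 0 ,
\]
% maximum likelihood estimator from a sample x_1, ..., x_n
\[
  \hat{\theta}_{\mathrm{MLE}} \;=\; \frac{n}{-\sum_{i=1}^{n} \ln x_i } .
\]
```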
This study extracted the spatial and morphological properties of the studied basins using the Soil and Water Assessment Tool (SWAT) model linked to GIS, to find the amount of sediment and the flow rates that discharge into the Haditha reservoir. The aim of this study is to determine the amount of sediment coming from the valleys and flowing into the Haditha Dam reservoir over the 25-year period (1985–2010), its impact on the design lifetime of the Haditha Dam reservoir, and the best ways to reduce the sediment transport. The results indicated that the total amount of sediment coming from all valleys is about 2.56 × 10⁶ tons. The maximum annual total sediment load was about 488.22 × 10³ tons, in the year 1988 …
Interval methods for the verified integration of initial value problems (IVPs) for ODEs have been used for more than 40 years. For many classes of IVPs, these methods can compute guaranteed error bounds for the flow of an ODE, where traditional methods provide only approximations to a solution. Overestimation, however, is a potential drawback of verified methods. For some problems the computed error bounds become overly pessimistic, or the integration even breaks down. The dependency problem and the wrapping effect are particular sources of overestimation in interval computations. Berz (see [1]) and his co-workers have developed Taylor model methods, which extend interval arithmetic with symbolic computations. The latter is an ef…
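A tiny sketch of the dependency problem mentioned above, using a hand-rolled interval type rather than any particular verified-integration library (all names here are hypothetical):

```python
# Sketch: the dependency problem in naive interval arithmetic.
# Evaluating x - x without recognizing that both operands are the same variable
# yields [-1, 1] instead of the exact range {0}.
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float

    def __sub__(self, other):
        # naive rule: [a, b] - [c, d] = [a - d, b - c]
        return Interval(self.lo - other.hi, self.hi - other.lo)

x = Interval(2.0, 3.0)
print(x - x)   # Interval(lo=-1.0, hi=1.0): an overestimation of the true range {0}
```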
Non-additive measures and the corresponding integrals were originally introduced by Choquet in 1953 (1) and independently defined by Sugeno in 1974 (2) in order to extend the classical measure by replacing the additivity property with non-additivity. An important feature of non-additive measures and fuzzy integrals is that they can represent the importance of individual information sources and the interactions among them. There are many applications of non-additive measures and fuzzy integrals, such as image processing, multi-criteria decision making, information fusion, classification, and pattern recognition. This paper presents a mathematical model for discussing an application of non-additive measures and corresponding …
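A small sketch of a discrete Choquet integral with respect to a non-additive (fuzzy) measure; the three-source measure values below are made-up numbers for illustration only:

```python
# Sketch: discrete Choquet integral of values f over a non-additive measure mu,
# where mu maps frozensets of sources to [0, 1] (illustrative numbers only).
def choquet(f, mu):
    order = sorted(f, key=f.get)                  # sources sorted by value, ascending
    total, prev = 0.0, 0.0
    for i, s in enumerate(order):
        coalition = frozenset(order[i:])          # sources whose value is >= f[s]
        total += (f[s] - prev) * mu[coalition]
        prev = f[s]
    return total

# three sources with interaction: mu({a,b}) != mu({a}) + mu({b}), so mu is non-additive
mu = {
    frozenset(): 0.0,
    frozenset({"a"}): 0.3, frozenset({"b"}): 0.4, frozenset({"c"}): 0.2,
    frozenset({"a", "b"}): 0.6, frozenset({"a", "c"}): 0.5, frozenset({"b", "c"}): 0.5,
    frozenset({"a", "b", "c"}): 1.0,
}
print(choquet({"a": 0.7, "b": 0.5, "c": 0.9}, mu))   # 0.64
```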
The unstable and uncertain nature of natural rubber prices makes them highly volatile and prone to outliers, which can have a significant impact on both modelling and forecasting. To tackle this issue, the author recommends a hybrid model that combines the autoregressive (AR) and Generalized Autoregressive Conditional Heteroscedasticity (GARCH) models. The model uses the Huber weighting function to ensure that the forecast values of rubber prices remain sustainable even in the presence of outliers. The study aims to develop a sustainable model and to forecast daily prices for a 12-day period by analysing 2683 daily price observations of Standard Malaysian Rubber Grade 20 (SMR 20) in Malaysia. The analysis incorporates two dispersion measurements (I…
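The Huber weighting function referred to above has the standard form below (k is the tuning constant; the value of k chosen in the paper is not given here):

```latex
% Huber weight as a function of the (standardized) residual e,
% with tuning constant k (commonly k ≈ 1.345 for normal errors)
\[
  w(e) \;=\;
  \begin{cases}
    1, & |e| \le k,\\[4pt]
    \dfrac{k}{|e|}, & |e| > k .
  \end{cases}
\]
```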