Phenomena often suffer from disturbances in their data as well as from difficulty of formulation, especially when the response is unclear or when large essential differences plague the experimental units from which the data were taken. Hence the need arose for an estimation method with implicit classification of these experimental units, either by discrimination or by creating blocks for each category of experimental units, in the hope of controlling their responses and making them more homogeneous. With the development of computing, and following the principle of the integration of sciences, it has been found that modern algorithms from computer science (the genetic algorithm, the ant colony algorithm, the bee algorithm, the particle swarm algorithm, and others), originally used for technological purposes such as distinguishing between images or signals, can be adapted to serve statistics successfully. The choice therefore fell on the genetic algorithm, which is often applied in biology to DNA analysis and genetic engineering within modern trends of medical science. A proposed genetic algorithm was developed, along with the C4.5 algorithm, and in this research the work of these algorithms was integrated into a Generalized Additive Model to estimate some nonparametric functions. Simulation was used to demonstrate classification optimality via the misclassification error and estimation optimality via the root mean squared error (RMSE). The simulation experiments used sample sizes of (200, 400, 600) with (1000) replications.
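The two evaluation criteria named above can be sketched in a few lines. This is a minimal illustration of the misclassification error and RMSE; the input values are made up for the example and are not taken from the paper's simulation.

```python
import numpy as np

def misclassification_error(y_true, y_pred):
    """Fraction of units assigned to the wrong class."""
    return float(np.mean(np.asarray(y_true) != np.asarray(y_pred)))

def rmse(y_true, y_hat):
    """Root mean squared error of an estimate."""
    d = np.asarray(y_true, float) - np.asarray(y_hat, float)
    return float(np.sqrt(np.mean(d ** 2)))

# Illustrative values only (not from the paper's simulation):
mis = misclassification_error([0, 1, 1, 0], [0, 1, 0, 0])  # 1 of 4 wrong
err = rmse([1.0, 2.0, 3.0], [1.0, 2.0, 5.0])
```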
This research reviews the importance of estimating the nonparametric regression function using the so-called canonical kernel, which depends on re-scaling the smoothing parameter; this parameter plays a large and important role in kernel estimation and gives the appropriate amount of smoothing.
The importance of this method is shown through the application of these concepts to real data on the exchange rate of the U.S. dollar against the Japanese yen for the period from January 2007 to March 2010. The results demonstrated the preference of the nonparametric estimator with the Gaussian kernel over the other nonparametric and parametric regression estimators.
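The canonical-kernel idea can be sketched as follows: a bandwidth chosen on the kernel-free "canonical" scale is re-scaled by the kernel's canonical bandwidth before being used in a standard Nadaraya-Watson estimate. This is a sketch on simulated data, not the paper's exchange-rate application; the sample size, noise level, and bandwidth value are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(u):
    """Standard Gaussian kernel."""
    return np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)

def nadaraya_watson(x_eval, x, y, h):
    """Nadaraya-Watson kernel regression estimate at the points x_eval."""
    w = gaussian_kernel((x_eval[:, None] - x[None, :]) / h)
    return (w @ y) / w.sum(axis=1)

# Canonical bandwidth of the Gaussian kernel:
# delta_K = (R(K) / mu_2(K)^2)^(1/5), with R(K) = 1/(2*sqrt(pi)), mu_2(K) = 1.
delta_gauss = (1.0 / (2.0 * np.sqrt(np.pi))) ** 0.2

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 200))
y = np.sin(2.0 * np.pi * x) + 0.2 * rng.standard_normal(200)

h = 0.1 * delta_gauss            # canonical-scale bandwidth 0.1, re-scaled
fit = nadaraya_watson(x, x, y, h)
```

Re-scaling by `delta_gauss` makes the same canonical-scale bandwidth comparable across different kernels, which is the practical point of the canonical-kernel formulation.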
The aim of the thesis is to estimate hidden and inaccessible population groups; it is a field study to estimate the number of drug users in the Baghdad governorate among males aged (15-60) years.
Because of the absence of data approved by government institutions, as well as the difficulty of estimating the numbers of these people through traditional surveys, in which the respondent reports on himself or, in some cases, on his family members, the Network Scale-Up Method (NSUM) was adopted. It is mainly based on asking respondents about the number of drug users they know within their personal networks.
Based on this principle, a statistical questionnaire was designed to
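The basic scale-up principle described above can be sketched in a few lines: the hidden-population size is estimated from the ratio of the hidden-population contacts respondents report to their total network sizes. The survey numbers and the population total below are hypothetical, chosen only to illustrate the formula.

```python
import numpy as np

def nsum_estimate(known_hidden, network_size, population_total):
    """Basic network scale-up estimator:
    N_hidden ~= N_total * sum(y_i) / sum(c_i),
    where y_i is how many members of the hidden group respondent i knows
    and c_i is respondent i's total personal network size."""
    return population_total * np.sum(known_hidden) / np.sum(network_size)

# Hypothetical mini-survey (illustrative values only):
y = np.array([2, 0, 1, 3, 0, 1])        # drug users known per respondent
c = np.array([150, 120, 200, 180, 90, 160])  # personal network sizes
N = 4_000_000                            # assumed target population size

est = nsum_estimate(y, c, N)
```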
Some chaotic maps of the chaotic firefly algorithm were selected for variable selection on data concerning blood and vascular diseases obtained from Nasiriyah General Hospital. The data were tested and found to follow a Gamma distribution, and it was concluded that the Chebyshev map method is more efficient than the Sinusoidal map method according to the mean squared error criterion.
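The two chaotic maps compared above have standard closed forms, which can be sketched as sequence generators of the kind used to drive a chaotic firefly algorithm. The control parameters (`k = 4`, `a = 2.3`) and the starting point `0.7` are common textbook choices, not values taken from the paper.

```python
import math

def chebyshev_map(x0, k=4, n=100):
    """Chebyshev chaotic map: x_{t+1} = cos(k * arccos(x_t)), x in [-1, 1]."""
    seq, x = [], x0
    for _ in range(n):
        x = math.cos(k * math.acos(x))
        seq.append(x)
    return seq

def sinusoidal_map(x0, a=2.3, n=100):
    """Sinusoidal chaotic map: x_{t+1} = a * x_t^2 * sin(pi * x_t)."""
    seq, x = [], x0
    for _ in range(n):
        x = a * x * x * math.sin(math.pi * x)
        seq.append(x)
    return seq

cheb = chebyshev_map(0.7)
sinu = sinusoidal_map(0.7)
```

In a chaotic firefly algorithm, sequences like these typically replace the uniform random numbers used for initialization or for the randomization step.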
This research studies dimension-reduction methods that overcome the curse of dimensionality when traditional methods fail to provide a good estimation of the parameters, so this problem must be dealt with directly. Two approaches were used to handle high-dimensional data: the non-classical Sliced Inverse Regression (SIR) method together with the proposed weighted standard SIR (WSIR) method, and Principal Component Analysis (PCA), the general method used for reducing dimensions. Both SIR and PCA are based on forming linear combinations of a subset of the original explanatory variables, which may suffer from the problem of heterogeneity and the problem of linear
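The standard SIR procedure mentioned above can be sketched as follows: standardize the predictors, slice the observations on sorted responses, average the standardized predictors within each slice, and take the leading eigenvectors of the weighted covariance of those slice means. This is a generic sketch of classical SIR, not the paper's proposed WSIR; the toy data and slice count are illustrative assumptions.

```python
import numpy as np

def sir_directions(X, y, n_slices=5, n_dirs=1):
    """Sliced Inverse Regression (sketch): estimate e.d.r. directions."""
    n, p = X.shape
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    # Whitening matrix: inverse square root of the covariance.
    vals, vecs = np.linalg.eigh(cov)
    inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    Z = (X - mu) @ inv_sqrt

    # Slice on sorted y and form the between-slice covariance of Z-means.
    order = np.argsort(y)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)

    # Leading eigenvectors, mapped back to the original X scale.
    w, v = np.linalg.eigh(M)
    dirs = inv_sqrt @ v[:, ::-1][:, :n_dirs]
    return dirs / np.linalg.norm(dirs, axis=0)

# Toy check: y depends on X only through the direction (1, 1, 0)/sqrt(2).
rng = np.random.default_rng(1)
X = rng.standard_normal((2000, 3))
y = X[:, 0] + X[:, 1] + 0.1 * rng.standard_normal(2000)
b = sir_directions(X, y, n_slices=10)[:, 0]
```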
This paper constructs a mixed probability distribution from an Exponential distribution with scale parameter (β) and a Gamma distribution with parameters (2, β), where the mixing proportions are ( . First, the probability density function (p.d.f.), the cumulative distribution function (c.d.f.), and the reliability function are obtained. The parameters of the mixed distribution, ( , β), are estimated by three different methods: maximum likelihood, the method of moments, and a proposed method, the Differential Least Squares Method (DLSM). The comparison is done using a simulation procedure, and all results are reported in tables.
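A mixture of this form can be sketched directly from the two component densities. Here β is treated as a scale parameter (the paper's parameterization may differ), and the mixing weight `p = 0.4` and `beta = 2.0` are illustrative values only; the block also checks numerically that the density integrates to one.

```python
import numpy as np

def mixture_pdf(x, p, beta):
    """Density of the mixture  p*Exponential(beta) + (1-p)*Gamma(2, beta),
    with beta as a scale parameter (an assumption; the paper may use rate):
        f(x) = p (1/beta) e^{-x/beta} + (1-p) (x/beta^2) e^{-x/beta}.
    """
    expo = np.exp(-x / beta) / beta
    gam2 = x * np.exp(-x / beta) / beta ** 2
    return p * expo + (1.0 - p) * gam2

# Numerical sanity check: the density should integrate to ~1.
x = np.linspace(0.0, 100.0, 200_001)
f = mixture_pdf(x, p=0.4, beta=2.0)
area = float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(x)))  # trapezoid rule
```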
Honeywords are fake passwords that serve as an accompaniment to the real password, which is called a “sugarword.” The honeyword system is an effective password cracking detection system designed to easily detect password cracking in order to improve the security of hashed passwords. For every user, the password file of the honeyword system will have one real hashed password accompanied by numerous fake hashed passwords. If an intruder steals the password file from the system and successfully cracks the passwords while attempting to log in to users’ accounts, the honeyword system will detect this attempt through the honeychecker. A honeychecker is an auxiliary server that distinguishes the real password from the fake passwords and t
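The split of responsibilities described above (hashed sweetwords on the main server, only the real password's index on the honeychecker) can be sketched as follows. The passwords, user name, and the fast unsalted hash are illustrative simplifications; a real deployment would use a slow, salted password hash.

```python
import hashlib

def h(pw):
    """Hash a sweetword (illustrative; real systems use slow, salted hashes)."""
    return hashlib.sha256(pw.encode()).hexdigest()

class Honeychecker:
    """Auxiliary server that stores only the index of the real password."""
    def __init__(self):
        self._index = {}

    def register(self, user, real_index):
        self._index[user] = real_index

    def check(self, user, index):
        """True for the real password's index; False signals a honeyword."""
        return self._index[user] == index

# Main server's password file: one real hash among fake ones (hypothetical).
sweetwords = ["tulip7", "rose42", "daisy9"]   # index 1 is the real password
hashes = [h(w) for w in sweetwords]

checker = Honeychecker()
checker.register("alice", 1)

def login(user, attempt):
    """Returns 'ok', 'alarm' (a honeyword was used), or 'fail'."""
    digest = h(attempt)
    if digest not in hashes:
        return "fail"
    return "ok" if checker.check(user, hashes.index(digest)) else "alarm"
```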
This research aims to predict the value of the maximum daily loss that the fixed-income securities portfolio may suffer at Qatar National Bank - Syria. For this purpose, data were collected on the risk factors that affect the value of the portfolio, represented by the term structure of interest rates in the United States over the period between 2017 and 2018, in addition to data on the composition of the bond portfolio of Qatar National Bank of Syria in 2017. Monte Carlo simulation models were then employed to predict the maximum loss to which this portfolio may be exposed in the future. The results of the Monte Carlo simulation showed the possibility of a decrease in the value at risk in the future due to the dec
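A Monte Carlo value-at-risk calculation of the kind described above can be sketched with a duration approximation: simulate interest-rate shocks, convert them to portfolio losses, and read off a quantile of the loss distribution. The portfolio value, duration, and rate volatility below are hypothetical, not the bank's figures.

```python
import numpy as np

def monte_carlo_var(pv, duration, rate_sigma, alpha=0.99, n=100_000, seed=0):
    """Monte Carlo Value-at-Risk for a bond portfolio, using the duration
    approximation dV ~= -V * D * dr with normal interest-rate shocks."""
    rng = np.random.default_rng(seed)
    dr = rng.normal(0.0, rate_sigma, n)   # simulated daily rate changes
    losses = pv * duration * dr           # loss when rates rise
    return float(np.quantile(losses, alpha))

# Hypothetical portfolio: $10m value, modified duration 4, and a daily
# rate volatility of 5 basis points (all illustrative assumptions).
var_99 = monte_carlo_var(10_000_000, 4.0, 0.0005)
```

With these inputs the 99% VaR should sit near the analytic value pv * D * sigma * z_0.99, which is a quick cross-check on the simulation.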
The most popular medium used by people on the internet nowadays is video streaming. Nevertheless, streaming a video consumes much of the internet's traffic: nearly 70% of internet usage goes to video streaming. Some constraints of interactive media, such as increased bandwidth usage and latency, might be addressed. The need for real-time transmission of live video streams leads to employing fog computing technologies, an intermediary layer between the cloud and the end user. The latter technology has been introduced to alleviate those problems by providing fast real-time response and computational resources near to the
Artificial intelligence algorithms have been used in recent years in many scientific fields. We suggest employing the TABU search algorithm to find the best estimate of the semiparametric regression function with measurement errors in the explanatory variables and the dependent variable. Measurement errors appear frequently in fields such as sport, chemistry, the biological sciences, medicine, and epidemiological studies, where variables are often observed imprecisely rather than exactly.
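The core tabu search mechanism is generic: move to the best non-tabu neighbor at each step, keeping a short-term memory of recent solutions so the search can escape local optima. This is a sketch on a toy integer problem, not the paper's semiparametric estimation; the objective, neighborhood, and tabu-list size are illustrative choices.

```python
from collections import deque

def tabu_search(f, x0, neighbors, n_iter=100, tabu_size=5):
    """Generic tabu search (sketch): best admissible move at each step,
    with a bounded short-term memory of recently visited solutions."""
    best = current = x0
    tabu = deque([x0], maxlen=tabu_size)
    for _ in range(n_iter):
        candidates = [x for x in neighbors(current) if x not in tabu]
        if not candidates:
            break
        current = min(candidates, key=f)  # may worsen; that is intended
        tabu.append(current)
        if f(current) < f(best):
            best = current
    return best

# Toy problem: minimize (x - 7)^2 over the integers 0..20.
f = lambda x: (x - 7) ** 2
neighbors = lambda x: [max(0, x - 1), min(20, x + 1)]
best = tabu_search(f, 0, neighbors)
```

In the estimation setting, a solution would instead encode the smoothing or model parameters and `f` would be a fit criterion such as the residual sum of squares.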