Parameter estimation in linear regression is usually based on the ordinary least squares method, which rests on several basic assumptions; the accuracy of the estimated model parameters therefore depends on the validity of these assumptions. Among them are homogeneity of the error variance and normality of the error distribution; when these fail, use of the model becomes unrealistic. One of the most successful remedies has been the robust MM-estimation method, which has proved efficient for this purpose. These assumptions are also unattainable when the problem under study involves complex data generated by more than one model; a mixture of linear regressions is then used to model such data. In this article, we propose a genetic algorithm-based method combined with the MM-estimator, called here RobGA, to improve estimation accuracy in the final stage. We compare the suggested method with the robust bisquare method (MixBi) on real data representing blood samples. The results show that RobGA estimates the model parameters more efficiently than MixBi with respect to mean square error (MSE) and classification error (CE).
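The kind of two-component mixture of linear regressions described above can be illustrated with a small hard-assignment fitting loop. This is a sketch only: the toy data, the first/last-pair initialization, and the alternating assign-and-refit scheme are illustrative assumptions, not the paper's RobGA procedure.

```python
def fit_line(pts):
    # ordinary least squares for y = a + b*x over a list of (x, y) pairs
    n = len(pts)
    mx = sum(x for x, _ in pts) / n
    my = sum(y for _, y in pts) / n
    sxx = sum((x - mx) ** 2 for x, _ in pts)
    sxy = sum((x - mx) * (y - my) for x, y in pts)
    b = sxy / sxx
    return my - b * mx, b

def mixture_regression(pts, iters=20):
    # hard-assignment EM for a two-component mixture of regressions:
    # assign each point to the line with the smaller residual, then
    # refit each line by OLS on its own group
    lines = [fit_line(pts[:2]), fit_line(pts[-2:])]  # crude initialization
    for _ in range(iters):
        groups = [[], []]
        for x, y in pts:
            res = [abs(y - (a + b * x)) for a, b in lines]
            groups[res.index(min(res))].append((x, y))
        lines = [fit_line(g) if len(g) > 1 else line
                 for g, line in zip(groups, lines)]
    return lines
```

On data drawn from two crossing lines, the loop recovers both slopes; a robust fit such as MM-estimation would replace the OLS step inside the loop.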
Wireless sensor networks (WSNs) represent one of the key technologies in Internet of Things (IoT) networks. Since WSNs have finite energy sources, there is ongoing research to develop new strategies, or enhance traditional techniques, for minimizing power consumption. In this paper, a novel Gaussian mixture model (GMM) algorithm is proposed for energy saving in mobile wireless sensor networks (MWSNs). Performance evaluation of the clustering process with the GMM algorithm shows a remarkable energy saving in the network of up to 92%. In addition, a comparison with another clustering strategy that uses the K-means algorithm has been made, and the developed method outperformed K-means, saving more energy.
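As a toy illustration of the clustering step, a two-component one-dimensional Gaussian mixture can be fit by EM in a few lines. The 1-D "sensor positions", two clusters, and min/max initialization are simplifying assumptions for illustration, not the paper's MWSN setup.

```python
import math

def gmm_em_1d(data, iters=50):
    # EM for a two-component 1-D Gaussian mixture model
    mu = [min(data), max(data)]          # crude initialization
    var = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point
        resp = []
        for x in data:
            dens = [w[k] / math.sqrt(2 * math.pi * var[k])
                    * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                    for k in range(2)]
            s = dens[0] + dens[1]
            resp.append([d / s for d in dens])
        # M-step: reestimate weights, means, and variances
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2
                         for r, x in zip(resp, data)) / nk + 1e-9
    return mu, var, w
```

In the WSN setting, each sensor would be assigned to the cluster whose component gives it the higher responsibility, and a cluster head chosen per group.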
Recently, genetic algorithms (GAs) have frequently been used to optimize the solution of estimation problems. One of the main advantages of these techniques is that they require no knowledge or gradient information about the response surface. The poor behavior of genetic algorithms on some problems, sometimes attributed to their design operators, has led to the development of other types of algorithms. One such class is the compact genetic algorithm (cGA), which dramatically reduces the number of bits required to store the population and has a faster convergence speed. In this paper, the compact genetic algorithm is used to optimize the maximum likelihood estimator of the first-order moving average model MA(1). Simulation results
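The core of the compact GA, a probability vector that stands in for an explicit population, can be sketched on the classic OneMax objective. This is an illustrative toy, not the MA(1) likelihood from the paper; the bit count and virtual population size are arbitrary choices.

```python
import random

def compact_ga(n_bits=16, virtual_pop=100, seed=3):
    # compact GA: each bit is represented only by its probability of
    # being 1, so memory is O(n_bits) instead of O(pop_size * n_bits)
    rng = random.Random(seed)
    p = [0.5] * n_bits

    def sample():
        return [1 if rng.random() < pi else 0 for pi in p]

    def fitness(ind):          # OneMax: count the ones
        return sum(ind)

    while any(0.0 < pi < 1.0 for pi in p):
        a, b = sample(), sample()
        if fitness(a) < fitness(b):
            a, b = b, a        # a is now the tournament winner
        for i in range(n_bits):
            if a[i] != b[i]:   # shift probability toward the winner's bit
                step = 1.0 / virtual_pop
                p[i] = min(1.0, max(0.0, p[i] + (step if a[i] else -step)))
    return [int(pi) for pi in p]
```

Replacing the OneMax fitness with the negative MA(1) log-likelihood of a candidate (suitably encoded) parameter would give the estimation variant discussed above.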
In this paper we used frequentist and Bayesian approaches to the linear regression model to predict future observations of unemployment rates in Iraq. Parameters are estimated using the ordinary least squares method and, for the Bayesian approach, using the Markov chain Monte Carlo (MCMC) method. Calculations are done using the R program. The analysis showed that the linear regression model under the Bayesian approach is better and can be used as an alternative to the frequentist approach. Two criteria, the root mean square error (RMSE) and the median absolute deviation (MAD), were used to compare the performance of the estimates. The results obtained showed that the unemployment rates will continue to increase over the next two decades.
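A minimal MCMC step of the kind used in such a Bayesian fit can be sketched with a random-walk Metropolis sampler for a single slope parameter. This is a sketch only: one parameter, known unit error variance, a flat prior, and synthetic data are simplifying assumptions, not the paper's model of Iraqi unemployment rates.

```python
import math
import random

def metropolis_slope(xs, ys, iters=20000, step=0.05, seed=0):
    # random-walk Metropolis for the slope b in y = b*x + e, e ~ N(0, 1),
    # under a flat prior; returns the posterior mean after burn-in
    rng = random.Random(seed)

    def log_post(b):                      # log-likelihood up to a constant
        return -0.5 * sum((y - b * x) ** 2 for x, y in zip(xs, ys))

    b, lp = 0.0, log_post(0.0)
    kept = []
    for i in range(iters):
        cand = b + rng.gauss(0.0, step)
        lp_cand = log_post(cand)
        if math.log(rng.random()) < lp_cand - lp:   # accept/reject
            b, lp = cand, lp_cand
        if i >= iters // 4:               # discard the first quarter as burn-in
            kept.append(b)
    return sum(kept) / len(kept)
```

On noiseless data with true slope 2, the posterior mean lands near the OLS estimate, which is the kind of agreement a frequentist/Bayesian comparison checks first.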
The financial markets are one of the sectors whose data are characterized by nearly continuous movement and constant change, which makes their trends difficult to predict. This creates a need for decision-making methods and techniques, and pushes investors and analysts in the financial markets to use a variety of methods to predict the direction of market movement. To support decision making across different investments, the support vector machine algorithm and the CART regression tree algorithm are used to classify the stock data in order to determine
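The root split of a CART-style tree, an exhaustive search over one feature for the threshold that minimizes misclassification, can be sketched as follows. The single feature and toy labels are illustrative, not the paper's stock-classification pipeline.

```python
def best_stump(xs, labels):
    # try every threshold and both orientations; return the split with
    # the fewest misclassified points as (threshold, sign, errors)
    best = (None, 0, len(labels) + 1)
    for t in sorted(set(xs)):
        for sign in (1, -1):
            preds = [sign if x > t else -sign for x in xs]
            errors = sum(p != l for p, l in zip(preds, labels))
            if errors < best[2]:
                best = (t, sign, errors)
    return best
```

A full CART tree applies this search recursively to each resulting group; an SVM instead separates the classes with a maximum-margin hyperplane.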
Often, especially in practical applications, it is difficult to obtain data untainted by problems such as instability of the error variance or other issues that impede the use of the usual methods, represented by ordinary least squares (OLS), for estimating the parameters of multiple linear models. This is why many statisticians resort to robust estimation methods, especially in the presence of outliers as well as the problem of error variance instability. Two robust methods were adopted, the robust weighted least squares (RWLS) and the two-step robust weighted least squares (TSRWLS), and their performance was verified
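One common route to such robust weighted least-squares estimates is iteratively reweighted least squares with Huber weights. The sketch below is a minimal single-predictor version; the MAD scale estimate, the tuning constant c = 1.345, and the toy data with one outlier are standard textbook choices, not necessarily the paper's exact RWLS/TSRWLS procedures.

```python
def huber_irls(xs, ys, iters=20, c=1.345):
    # iteratively reweighted least squares for y = a + b*x with Huber
    # weights: full weight for small residuals, outliers downweighted
    a, b = 0.0, 0.0
    for _ in range(iters):
        r = [y - (a + b * x) for x, y in zip(xs, ys)]
        mad = sorted(abs(ri) for ri in r)[len(r) // 2]
        s = mad / 0.6745 if mad > 0 else 1.0      # robust scale estimate
        w = [1.0 if abs(ri) <= c * s else c * s / abs(ri) for ri in r]
        # closed-form weighted least squares with the current weights
        sw = sum(w)
        swx = sum(wi * x for wi, x in zip(w, xs))
        swy = sum(wi * y for wi, y in zip(w, ys))
        swxx = sum(wi * x * x for wi, x in zip(w, xs))
        swxy = sum(wi * x * y for wi, x, y in zip(w, xs, ys))
        b = (sw * swxy - swx * swy) / (sw * swxx - swx ** 2)
        a = (swy - b * swx) / sw
    return a, b
```

With a gross outlier added to a clean line, the downweighting keeps the fitted slope near the true value, which plain OLS would not.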
The objective of this study is to examine the properties of Bayes estimators of the shape parameter of the power function distribution (PFD-I), using two different prior distributions for the parameter θ and different loss functions, compared with the maximum likelihood estimators. In many practical applications, we may have two different pieces of prior information about the distribution of the shape parameter of the power function distribution, which influences the parameter estimation. So, we used two different conjugate priors for the shape parameter θ of the
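For the power function distribution f(x; θ) = θ x^(θ−1) on (0, 1), the Gamma prior is conjugate, so both the ML estimator and the posterior-mean Bayes estimator of the shape parameter have closed forms. The sketch below shows the standard conjugate analysis under squared-error loss; the Gamma(a, b) hyperparameters are illustrative, not the paper's specific priors.

```python
import math

def pfd_shape_estimates(xs, a=1.0, b=1.0):
    # sufficient statistic: t = -sum(log x_i); the likelihood is
    # proportional to theta^n * exp(-theta * t), so a Gamma(a, b)
    # prior yields a Gamma(a + n, b + t) posterior
    n = len(xs)
    t = -sum(math.log(x) for x in xs)
    mle = n / t                       # maximum likelihood estimator
    bayes = (a + n) / (b + t)         # posterior mean (squared-error loss)
    return mle, bayes
```

Other loss functions change only the final summary of the same Gamma posterior (e.g. the posterior median or mode instead of the mean).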
Nowadays, information systems constitute a crucial part of organizations; by losing security, these organizations will also lose plenty of competitive advantages. The core of information security (InfoSecu) is risk management. There is a great deal of research work, and there are many standards, in information security risk management (ISRM), including NIST 800-30 and ISO/IEC 27005. However, only a few research works focus on InfoSecu risk reduction, while the standards explain general principles and guidelines; they do not provide implementation details regarding ISRM, so reducing InfoSecu risks in uncertain environments is painstaking. Thus, this paper applied a genetic algorithm (GA) for InfoSecu risk reduction under uncertainty. Finally, the ef
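A GA for risk reduction of this kind can be sketched as a constrained subset-selection problem: choose security controls that maximize risk reduction under a budget. The control costs, reduction values, budget, and GA settings below are invented for illustration, not taken from the paper.

```python
import random

COSTS = [3, 4, 2, 5, 3, 2]        # hypothetical cost of each control
REDUCTION = [8, 10, 4, 12, 6, 3]  # hypothetical risk reduction per control
BUDGET = 10

def fitness(bits):
    # total risk reduction, zeroed out when the budget is violated
    cost = sum(c for c, s in zip(COSTS, bits) if s)
    gain = sum(g for g, s in zip(REDUCTION, bits) if s)
    return gain if cost <= BUDGET else 0

def ga(pop_size=30, gens=60, seed=7):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in COSTS] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(gens):
        nxt = []
        for _ in range(pop_size):
            # tournament selection of two parents
            p1 = max(rng.sample(pop, 3), key=fitness)
            p2 = max(rng.sample(pop, 3), key=fitness)
            # uniform crossover followed by bit-flip mutation
            child = [x if rng.random() < 0.5 else y for x, y in zip(p1, p2)]
            child = [1 - g if rng.random() < 0.1 else g for g in child]
            nxt.append(child)
        pop = nxt
        best = max(pop + [best], key=fitness)
    return best, fitness(best)
```

Under uncertainty, the deterministic reduction values would be replaced by expected reductions over sampled threat scenarios, with the same GA machinery on top.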
In this paper, we study a nonparametric model where the response variable has missing data (nonresponse) in some observations, under the MCAR missing mechanism. We then suggest kernel-based nonparametric single imputation in place of the missing values and compare it with nearest neighbor imputation via simulation, across several different models and cases such as sample size, variance, and rate of missing data.
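The suggested kernel-based single imputation can be sketched as a Nadaraya–Watson weighted average of the observed responses. The Gaussian kernel, the bandwidth h = 1, and the toy data are illustrative assumptions, not the paper's simulation design.

```python
import math

def kernel_impute(xs, ys, h=1.0):
    # replace each missing response (None) by a kernel-weighted average
    # of the observed responses, weighted by closeness in x
    obs = [(x, y) for x, y in zip(xs, ys) if y is not None]
    filled = []
    for x, y in zip(xs, ys):
        if y is not None:
            filled.append(y)
            continue
        w = [math.exp(-0.5 * ((x - xo) / h) ** 2) for xo, _ in obs]
        filled.append(sum(wi * yo for wi, (_, yo) in zip(w, obs)) / sum(w))
    return filled
```

Nearest neighbor imputation is the limiting case where all weight goes to the single closest observed point; the kernel version smooths over several neighbors instead.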
Phenomena often suffer from disturbances in their data, as well as difficulty of formulation, especially when the response is unclear or when many essential differences plague the experimental units from which the data were taken. Thus emerged the need for an estimation method with implicit rating of these experimental units, using discrimination or creating blocks for each item of these experimental units, in the hope of controlling their responses and making them more homogeneous. With the development of computing, and taking the principle of the integration of sciences, it has been found that modern algorithms used in the field of computer science, such as the genetic algorithm or ant colony
In this study, an analysis of re-using the JPEG lossy algorithm on the quality of satellite imagery is presented. The standard JPEG compression algorithm is adopted and applied using the IrfanView program; the range of JPEG quality factors used is 50-100. Based on the calculated variation in satellite image quality, the maximum number of re-uses of the JPEG lossy algorithm adopted in this study is 50. The degradation of image quality with respect to the JPEG quality factor and the number of re-uses of the JPEG algorithm to store the satellite image is analyzed.
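Quality degradation from repeated lossy compression is typically quantified with a metric such as PSNR; a minimal implementation follows. The abstract does not state which quality measure was used, so this is an illustrative choice.

```python
import math

def psnr(original, compressed, peak=255):
    # peak signal-to-noise ratio (dB) between two equal-length pixel
    # lists; higher is better, infinite for identical images
    mse = sum((a - b) ** 2
              for a, b in zip(original, compressed)) / len(original)
    return float("inf") if mse == 0 else 10 * math.log10(peak ** 2 / mse)
```

Plotting PSNR against the number of re-compressions, for each quality factor, gives exactly the kind of degradation curve the study analyzes.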