Excessive skewness in data is an obstacle to assuming a normal distribution. Recent studies have therefore focused on the skew-normal distribution (SND), which fits skewed data: it contains the normal distribution as a special case and adds a skewness parameter (α), giving the normal distribution more flexibility. When estimating the parameters of the SND by maximum likelihood (ML), the likelihood equations are non-linear, and solving them directly can yield inaccurate and unreliable estimates. To address this problem, two methods are used: the genetic algorithm (GA) and the iterative reweighting (IR) algorithm, both based on the maximum likelihood method. A Monte Carlo simulation was carried out with different skewness levels and sample sizes, and the results were compared. It was concluded that estimating the SND model with the GA is best for small and medium sample sizes, while for large samples the IR algorithm is best. The study was also applied to real data to estimate the parameters and to compare the methods using the AIC, BIC, MSE and Def criteria.
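As a rough illustration of the likelihood-based evolutionary estimation described above, the sketch below maximizes the skew-normal log-likelihood with an evolutionary optimizer. It is not the paper's implementation: scipy's differential_evolution stands in for the GA, and the simulated data, parameter bounds and seeds are assumptions made for the example.

```python
# A minimal sketch (not the paper's implementation) of maximum-likelihood
# estimation for the skew-normal distribution using an evolutionary search.
# scipy's differential_evolution stands in for the genetic algorithm (GA);
# the data, bounds and seeds below are illustrative assumptions.
import numpy as np
from scipy.stats import skewnorm
from scipy.optimize import differential_evolution

rng = np.random.default_rng(0)
data = skewnorm.rvs(a=4.0, loc=1.0, scale=2.0, size=200, random_state=rng)

def neg_log_likelihood(theta, x):
    """Negative log-likelihood of the skew-normal: theta = (alpha, xi, omega)."""
    alpha, xi, omega = theta
    if omega <= 0:
        return np.inf
    return -np.sum(skewnorm.logpdf(x, a=alpha, loc=xi, scale=omega))

# Evolutionary (GA-style) search over plausible parameter bounds.
bounds = [(-10, 10), (data.min(), data.max()), (1e-3, 10)]
result = differential_evolution(neg_log_likelihood, bounds, args=(data,), seed=1)
alpha_hat, xi_hat, omega_hat = result.x
print(f"alpha={alpha_hat:.3f}, xi={xi_hat:.3f}, omega={omega_hat:.3f}")
```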
Much attention has been paid to the use of robot arms in various applications. Optimal path finding therefore plays a significant role in guiding and improving arm movement. The essential function of path planning is to create a path that satisfies the aims of motion, including avoiding collisions with obstacles, reducing travel time, decreasing the path cost and satisfying the kinematic constraints. In this paper, the free Cartesian-space map of a 2-DOF arm is constructed to obtain the joint variables at each collision-free point. The D* algorithm and the Euclidean distance are applied to obtain the exact and estimated distances to the goal, respectively. The modified Particle Swarm Optimization algorithm …
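The sketch below illustrates, in simplified form, two ingredients the abstract mentions: building a collision-free map for a 2-DOF planar arm and the Euclidean distance-to-goal estimate. Link lengths, the single circular obstacle and the goal point are assumptions, and only the end-effector is checked for collision; it does not reproduce the paper's D* or modified PSO planner.

```python
# A minimal sketch, under assumed link lengths and a single circular obstacle,
# of a collision-free joint-space map and a Euclidean distance-to-goal
# heuristic for a 2-DOF planar arm. Illustrative only; not the paper's planner.
import numpy as np

L1, L2 = 1.0, 0.8                                     # assumed link lengths
obstacle_c, obstacle_r = np.array([1.2, 0.6]), 0.3    # assumed circular obstacle
goal = np.array([0.5, 1.4])                           # assumed goal point

def forward_kinematics(theta1, theta2):
    """End-effector position of a planar 2-DOF arm."""
    x = L1 * np.cos(theta1) + L2 * np.cos(theta1 + theta2)
    y = L1 * np.sin(theta1) + L2 * np.sin(theta1 + theta2)
    return np.array([x, y])

def collision_free(p):
    """True when the end-effector point lies outside the obstacle."""
    return np.linalg.norm(p - obstacle_c) > obstacle_r

def euclidean_heuristic(p):
    """Estimated (straight-line) distance from a point to the goal."""
    return np.linalg.norm(p - goal)

# Sample the joint space and keep the collision-free configurations.
thetas = np.linspace(-np.pi, np.pi, 90)
free_map = [(t1, t2) for t1 in thetas for t2 in thetas
            if collision_free(forward_kinematics(t1, t2))]
print(len(free_map), "collision-free joint configurations sampled")
```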
Implementation of the TSFS (Transposition, Substitution, Folding, and Shifting) algorithm as an encryption algorithm for database security had limitations in its character set and in the number of keys used. The proposed cryptosystem enhances the phases of the TSFS encryption algorithm by computing the determinant of the key matrices, which affects how the algorithm's phases are carried out. These changes provide high security for the database against different types of attacks by achieving both confusion and diffusion.
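The abstract mentions computing the determinant of the key matrices; a common reason to do this in matrix-based substitution ciphers is to verify that a key is invertible modulo the character-set size. The sketch below shows that kind of check only; the modulus, matrix and function names are hypothetical and are not taken from the TSFS specification.

```python
# A minimal, hypothetical sketch of a key-matrix determinant check: before a
# key matrix is used, verify that its determinant is invertible modulo the
# character-set size (Hill-cipher style). Modulus and matrix are illustrative.
import math
import numpy as np

CHARSET_SIZE = 95          # assumed printable-ASCII character set

def determinant_mod(key: np.ndarray, modulus: int) -> int:
    """Integer determinant of the key matrix reduced modulo the charset size."""
    det = int(round(np.linalg.det(key)))
    return det % modulus

def key_is_valid(key: np.ndarray, modulus: int = CHARSET_SIZE) -> bool:
    """A key matrix is usable only if gcd(det(key) mod m, m) == 1."""
    return math.gcd(determinant_mod(key, modulus), modulus) == 1

key = np.array([[3, 3], [2, 5]])       # example 2x2 key matrix
print(key_is_valid(key))               # True: det = 9 and gcd(9, 95) = 1
```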
In this paper, the botnet detection problem is defined as a feature selection problem, and the genetic algorithm (GA) is used to search for the most significant combination of features in the entire search space of the feature set. Furthermore, the Decision Tree (DT) classifier is used as an objective function to guide the proposed GA towards combinations of features that correctly classify activities as normal traffic or botnet attacks. Two datasets, UNSW-NB15 and the Canadian Institute for Cybersecurity Intrusion Detection System 2017 (CICIDS2017), are used for evaluation. The results reveal that the proposed DT-aware GA can effectively find the relevant features from …
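A compact sketch of the DT-aware GA idea follows: each individual is a binary feature mask, and its fitness is the cross-validated accuracy of a decision tree trained on the selected features. A synthetic dataset stands in for UNSW-NB15 / CICIDS2017, and the GA operators and rates are illustrative choices, not the paper's settings.

```python
# A minimal sketch of a decision-tree-aware GA for feature selection. A
# synthetic dataset stands in for UNSW-NB15 / CICIDS2017; the GA operators
# (tournament selection, uniform crossover, bit-flip mutation) are assumed.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=500, n_features=20, n_informative=6, random_state=0)

def fitness(mask):
    """Cross-validated accuracy of a decision tree on the selected features."""
    if mask.sum() == 0:
        return 0.0
    clf = DecisionTreeClassifier(random_state=0)
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

pop = rng.integers(0, 2, size=(30, X.shape[1]))          # random binary masks
for generation in range(20):
    scores = np.array([fitness(ind) for ind in pop])
    # Tournament selection: keep the better of two randomly chosen individuals.
    idx = rng.integers(0, len(pop), size=(len(pop), 2))
    parents = pop[np.where(scores[idx[:, 0]] > scores[idx[:, 1]], idx[:, 0], idx[:, 1])]
    # Uniform crossover with a shifted copy of the parents, then bit-flip mutation.
    cross = rng.integers(0, 2, size=pop.shape).astype(bool)
    children = np.where(cross, parents, np.roll(parents, 1, axis=0))
    mutate = rng.random(pop.shape) < 0.02
    pop = np.where(mutate, 1 - children, children)

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("selected features:", np.flatnonzero(best))
```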
The term "tight reservoir" commonly refers to reservoirs with low permeability. Tight oil reservoirs have raised concern owing to their considerable influence on oil output throughout the petroleum sector, and their low permeability makes producing from them face a variety of difficulties. The aim of this research is to perform a hydraulic fracturing treatment in a single vertical well in order to study the feasibility of fracking the Saady reservoir. The Saady B reservoir in Iraq's Halfaya oil field is its most important tight reservoir. The diagnostic fracture injection test is determined for HF55 using GOHFER software …
Quality control is an effective statistical tool for monitoring production and confirming that manufactured products conform to standard qualities and certified criteria for products and services; its main purpose is to keep pace with production and industrial development in a competitive market. Quality control charts are used to monitor the qualitative properties of production processes and to detect abnormal deviations in the process. The multivariate kernel density estimator control chart method was used, which is a nonparametric method that does not require any assumptions regarding the distribution of …
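The sketch below illustrates the general idea behind a kernel-density control chart: estimate the in-control (Phase I) density nonparametrically and signal new observations whose estimated density falls below a low quantile of the Phase I densities. The simulated data and the 1% quantile used as the control limit are assumptions for the example, not the paper's choices.

```python
# A minimal sketch of a multivariate kernel-density control chart: flag points
# whose estimated density is below a low quantile of the in-control densities.
# Data and the 1% control-limit quantile are illustrative assumptions.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
phase1 = rng.multivariate_normal([0, 0], [[1, 0.5], [0.5, 1]], size=300)  # in-control data
kde = gaussian_kde(phase1.T)                      # multivariate KDE (plug-in bandwidth)

control_limit = np.quantile(kde(phase1.T), 0.01)  # lower density limit (assumed 1%)

new_obs = np.array([[0.2, -0.1],                  # typical point
                    [4.0, -4.0]])                 # shifted (out-of-control) point
signals = kde(new_obs.T) < control_limit
print(signals)                                    # expected: [False  True]
```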
In this study, we briefly review the ARIMA(p, d, q), EWMA and DLM (dynamic linear modelling) procedures in order to accommodate the autocorrelation structure of the data. We consider recursive estimation and prediction algorithms based on Bayesian and Kalman filtering (KF) techniques for correlated observations. We investigate the effect on the MSE of these procedures and compare them using generated data.
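As a concrete example of the recursive estimation and prediction discussed above, the sketch below runs the Kalman-filter recursions for a simple local-level dynamic linear model. The state and observation variances (W and V) and the simulated series are assumed values for illustration, not estimates from any particular dataset.

```python
# A minimal sketch of the Kalman-filter recursions for a local-level DLM,
# illustrating recursive estimation and one-step prediction. W and V are
# assumed variances chosen only for this example.
import numpy as np

rng = np.random.default_rng(0)
W, V = 0.1, 1.0                     # assumed state (W) and observation (V) variances
true_level = np.cumsum(rng.normal(0, np.sqrt(W), 100))
y = true_level + rng.normal(0, np.sqrt(V), 100)

m, C = 0.0, 10.0                    # prior mean and variance of the level
filtered, forecasts = [], []
for obs in y:
    a, R = m, C + W                 # one-step prediction of the state
    f, Q = a, R + V                 # one-step forecast of the observation
    K = R / Q                       # Kalman gain
    m, C = a + K * (obs - f), R - K * Q * K   # posterior update of the level
    forecasts.append(f)
    filtered.append(m)

mse = np.mean((np.array(forecasts) - y) ** 2)
print(f"one-step forecast MSE: {mse:.3f}")
```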
The research deals with the crisis of the global recession from its different facets and calls for the need to think beyond conventional theory and to find theoretical arguments that accommodate the evolution of life, globalization, technological change, individuals' standard of living, and the size of the disparity in income distribution, not only at the national level but at the global level as well, without heeding the expected resistance from conventional classical thought. As the returns to the factors of production increase, consumption will increase, and the marginal propensity to consume may rise, and rise at greater rates, among low-income groups (the mouths of the poor) …
Among metaheuristic algorithms, population-based algorithms are explorative search algorithms superior to local search algorithms in exploring the search space for globally optimal solutions. However, their primary downside is low exploitative capability, which prevents the search from expanding into the neighbourhood of promising solutions. The firefly algorithm (FA) is a population-based algorithm that has been widely used in clustering problems. However, FA suffers from premature convergence when no neighbourhood search strategies are employed to improve the quality of clustering solutions in the neighbourhood region and to explore the global regions of the search space. On the …
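For reference, the sketch below shows the standard firefly movement rule that the abstract builds on, applied to a simple stand-in objective. In the clustering case each firefly would encode a set of centroids and the fitness would be a clustering criterion such as the within-cluster sum of squares; the parameter values here are common textbook defaults, not the paper's settings.

```python
# A minimal sketch of the standard firefly algorithm (FA) update rule. The
# sphere function stands in for a clustering objective; alpha, beta0 and gamma
# are common default values chosen for illustration.
import numpy as np

rng = np.random.default_rng(0)

def objective(x):
    return np.sum(x ** 2)            # stand-in fitness (lower is better / brighter)

n, dim, iters = 20, 5, 100
alpha, beta0, gamma = 0.2, 1.0, 1.0  # randomness, base attractiveness, light absorption
pop = rng.uniform(-5, 5, size=(n, dim))

for _ in range(iters):
    fit = np.array([objective(x) for x in pop])
    for i in range(n):
        for j in range(n):
            if fit[j] < fit[i]:      # firefly i moves toward the brighter firefly j
                r2 = np.sum((pop[i] - pop[j]) ** 2)
                beta = beta0 * np.exp(-gamma * r2)
                pop[i] += beta * (pop[j] - pop[i]) + alpha * (rng.random(dim) - 0.5)
    alpha *= 0.97                    # gradually reduce the random step size

best = pop[np.argmin([objective(x) for x in pop])]
print("best fitness:", objective(best))
```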
The gas-lift method is crucial for maintaining oil production, particularly from a mature field whose reservoirs' natural energy is depleted. To maximize oil production, a major field's gas injection rate must be distributed as efficiently as possible across its gas-lift network system. Common gas-lift optimization techniques may lose their effectiveness and become unable to reproduce the gas-lift optimum in a large network system because the problem is multi-objective and multi-constrained, with a restricted gas injection rate to distribute. The main objective of the research is to determine the feasibility of using the genetic algorithm (GA) technique to achieve the optimal distribution of the continuous gas-lift injection …
The gas-lift technique plays an important role in sustaining oil production, especially from a mature field when the reservoirs' natural energy becomes insufficient. However, optimally allocating the gas injection rate across a large field's gas-lift network system to maximize the oil production rate is a challenging task. Conventional gas-lift optimization approaches may become inefficient and incapable of modelling gas-lift optimization in a large network system with multi-objective, multi-constrained problems and a limited gas injection rate. The key objective of this study is to assess the feasibility of utilizing the Genetic Algorithm (GA) technique to optimize the …
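To make the gas-lift allocation idea in the two abstracts above concrete, the sketch below splits a limited total gas-injection rate across a few wells with an evolutionary search. Everything in it is an assumption for illustration: each well's response is modelled by a simple concave performance curve, scipy's differential_evolution stands in for the GA, and the coefficients and total gas rate are made up.

```python
# A minimal, hypothetical sketch of GA-style allocation of a limited total
# gas-injection rate across wells. Each well's response is an assumed concave
# curve q_i(g) = a_i*g - b_i*g**2; differential_evolution stands in for the GA,
# and the allocation is rescaled so the total-gas constraint always holds.
import numpy as np
from scipy.optimize import differential_evolution

a = np.array([120.0, 90.0, 150.0, 60.0])    # assumed per-well performance coefficients
b = np.array([8.0, 5.0, 12.0, 3.0])
TOTAL_GAS = 10.0                             # assumed available gas rate (MMscf/d)

def total_oil(weights):
    """Negative field oil rate for a gas split given by normalized weights."""
    g = TOTAL_GAS * weights / weights.sum()  # enforce sum(g) == TOTAL_GAS
    return -np.sum(a * g - b * g ** 2)

bounds = [(1e-6, 1.0)] * len(a)
res = differential_evolution(total_oil, bounds, seed=1)
g_opt = TOTAL_GAS * res.x / res.x.sum()
print("gas allocation per well:", np.round(g_opt, 2))
print("field oil rate:", round(-res.fun, 1))
```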