Excessive skewness, which sometimes occurs in data, is an obstacle to assuming a normal distribution. Recent studies have therefore been active in studying the skew-normal distribution (SND), which fits skewed data: it extends the normal distribution with an additional skewness parameter (α), giving it more flexibility, and the normal distribution is recovered as the special case α = 0. When estimating the parameters of the SND by the maximum likelihood (ML) method, we face non-linear likelihood equations whose direct solutions are inaccurate and unreliable. To solve this problem, two methods can be used: the genetic algorithm (GA) and the iterative reweighting (IR) algorithm, both based on the ML method. Monte Carlo simulation was run with different skewness levels and sample sizes, and the results were compared. It was concluded that estimating the SND model with the GA is best for small and medium sample sizes, while for large samples the IR algorithm is best. The study was also applied to real data to estimate the parameters and compare the methods using the AIC, BIC, MSE, and Def criteria.
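For reference, a standard parameterization of the SND (a textbook result, not taken from this abstract) with location ξ, scale ω > 0, and skewness α has density and log-likelihood

\[
f(x;\xi,\omega,\alpha)=\frac{2}{\omega}\,\phi\!\left(\frac{x-\xi}{\omega}\right)\Phi\!\left(\alpha\,\frac{x-\xi}{\omega}\right),
\qquad
\ell(\xi,\omega,\alpha)=\sum_{i=1}^{n}\log f(x_i;\xi,\omega,\alpha),
\]

where φ and Φ are the standard normal pdf and cdf, and α = 0 recovers N(ξ, ω²). The score equations ∂ℓ/∂θ = 0 have no closed-form solution, which is why iterative or heuristic solvers such as the GA and IR algorithms are brought in.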
The research compared two methods for estimating the four parameters of the compound exponential Weibull–Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed contaminated data. Simulation experiments were conducted for different sample sizes, different initial parameter values, and different contamination levels. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both natural and contaminated data.
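To illustrate how the Downhill Simplex (Nelder–Mead) method maximizes a likelihood without derivatives, here is a minimal Python sketch using scipy.optimize.minimize; the neg_log_lik shown uses a plain Weibull density as a stand-in, since the abstract does not give the compound exponential Weibull–Poisson density.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

# Stand-in negative log-likelihood (plain Weibull); the compound
# exponential Weibull-Poisson density would be substituted here.
def neg_log_lik(theta, data):
    shape, scale = theta
    if shape <= 0 or scale <= 0:          # keep the simplex in the valid region
        return np.inf
    return -np.sum(weibull_min.logpdf(data, c=shape, scale=scale))

rng = np.random.default_rng(0)
data = weibull_min.rvs(c=1.5, scale=2.0, size=200, random_state=rng)

# Nelder-Mead is the Downhill Simplex algorithm: derivative-free, so the
# non-linear likelihood equations never have to be solved directly.
res = minimize(neg_log_lik, x0=[1.0, 1.0], args=(data,), method="Nelder-Mead")
print(res.x)   # estimated (shape, scale)
```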
Data-Driven Requirements Engineering (DDRE) represents a vision for a shift from static, traditional methods of doing requirements engineering to dynamic, data-driven, user-centered methods. Given the data available and the increasingly complex requirements of software systems, whose functions must adapt to changing needs to gain users' trust, an approach embedded in a continuous software engineering process is needed. This need drives new challenges in the requirements engineering discipline. The problem in this study was that data discrepancies hampered the requirements elicitation process, so that in the end the developed software could not meet the need …
The lossy-FDNR based active filter has an important property among many design realizations: a significant reduction in component count, particularly in the number of op-amps, which consume power. However, the problem with this type is the large component spreads, which affect the filter performance.
In this paper a genetic algorithm is applied to minimize the component spread (capacitance and resistance spread). The minimization of these spreads allows the filter …
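As a hedged illustration of the idea (none of the numbers or constraints below are from the paper), a GA can evolve positive component values to minimize the spread, measured as the max/min ratio, under a hypothetical design constraint expressed as a penalty:

```python
import numpy as np

rng = np.random.default_rng(1)

TARGET = 1e-3          # hypothetical design constraint: product of values fixed
N_COMP, POP, GENS = 6, 60, 200

def fitness(x):
    # component spread = max/min ratio; penalize violating the constraint
    spread = x.max() / x.min()
    penalty = abs(np.log(np.prod(x) / TARGET))
    return spread + 10.0 * penalty

# population of positive component values, handled in log-space
pop = np.exp(rng.uniform(np.log(1e-9), np.log(1e3), size=(POP, N_COMP)))

for _ in range(GENS):
    scores = np.array([fitness(ind) for ind in pop])
    # tournament selection: pick the better of two random individuals
    idx = rng.integers(0, POP, size=(POP, 2))
    parents = pop[np.where(scores[idx[:, 0]] < scores[idx[:, 1]],
                           idx[:, 0], idx[:, 1])]
    # uniform crossover + multiplicative (log-normal) mutation
    mask = rng.random((POP, N_COMP)) < 0.5
    children = np.where(mask, parents, parents[rng.permutation(POP)])
    children *= np.exp(rng.normal(0, 0.05, size=children.shape))
    pop = children

best = pop[np.argmin([fitness(ind) for ind in pop])]
print("spread:", best.max() / best.min())
```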
Most heuristic search methods' performance depends on parameter choices. These parameter settings govern how new candidate solutions are generated and then applied by the algorithm, and they play a key role in determining both the quality of the solution obtained and the efficiency of the search. Techniques for fine-tuning them are still an ongoing research area. The Differential Evolution (DE) algorithm is a very powerful optimization method that has become popular in many fields. Thanks to prolonged research work on DE, it is now arguably one of the most outstanding stochastic optimization algorithms for real-parameter optimization. One reason for its popularity is its widely appreciated property of having only a small number of parameters …
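DE's small parameter set is essentially the population size NP, the mutation factor F, and the crossover rate CR. Below is a minimal sketch of the classic DE/rand/1/bin scheme (the standard algorithm, not tied to this abstract's experiments):

```python
import numpy as np

def de_rand_1_bin(f, bounds, NP=40, F=0.8, CR=0.9, gens=300, seed=0):
    """Classic DE/rand/1/bin; NP, F and CR are its only control parameters."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    dim = len(lo)
    pop = rng.uniform(lo, hi, size=(NP, dim))
    cost = np.array([f(x) for x in pop])
    for _ in range(gens):
        for i in range(NP):
            a, b, c = pop[rng.choice([j for j in range(NP) if j != i],
                                     size=3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)      # mutation
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True                # keep >= 1 gene
            trial = np.where(cross, mutant, pop[i])        # binomial crossover
            fc = f(trial)
            if fc <= cost[i]:                              # greedy selection
                pop[i], cost[i] = trial, fc
    return pop[cost.argmin()], cost.min()

# usage: minimize the sphere function
best, val = de_rand_1_bin(lambda x: np.sum(x**2), bounds=[(-5, 5)] * 3)
print(best, val)
```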
A novel median filter based on the crow search optimization algorithm (OMF) is suggested to reduce random salt-and-pepper noise and improve the quality of RGB-colored and gray images. The fundamental idea of the approach is that the crow optimization algorithm first detects the noise pixels and then replaces them with an optimal median value chosen by maximizing a fitness function. Finally, the standard measures peak signal-to-noise ratio (PSNR), structural similarity, absolute square error, and mean square error were used to test the performance of the suggested filters (the original and the improved median filter) in removing noise from images. The simulation was carried out in MATLAB R2019b, and the results …
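The optimization step itself is not reproduced here; the sketch below shows only the detect-then-replace idea for salt-and-pepper noise (flag extreme-valued pixels, replace each with a window median), which is the baseline that the crow-search step would then tune.

```python
import numpy as np

def detect_and_median(img, win=3):
    """Replace suspected salt-and-pepper pixels (0 or 255) with the median
    of their local window; a simplified stand-in for OMF, without the
    crow-search choice of the optimal replacement value."""
    out = img.copy()
    pad = win // 2
    padded = np.pad(img, pad, mode="edge")
    noisy = (img == 0) | (img == 255)          # detection step
    for r, c in zip(*np.nonzero(noisy)):
        window = padded[r:r + win, c:c + win]  # window centered on (r, c)
        out[r, c] = np.median(window)          # replacement step
    return out

# usage on a toy grayscale image with 5% salt-and-pepper noise
rng = np.random.default_rng(2)
img = rng.integers(50, 200, size=(64, 64)).astype(np.uint8)
mask = rng.random(img.shape) < 0.05
img[mask] = rng.choice([0, 255], size=mask.sum())
# mean absolute change the filter introduced
print(np.abs(detect_and_median(img).astype(int) - img.astype(int)).mean())
```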
Nowadays, information systems constitute a crucial part of organizations; by losing security, these organizations also lose plenty of competitive advantages. The core of information security (InfoSecu) is risk management. There is a great deal of research work and many standards on information security risk management (ISRM), including NIST 800-30 and ISO/IEC 27005. However, only a few research works focus on InfoSecu risk reduction, while the standards explain general principles and guidelines and do not provide implementation details for ISRM; as such, reducing InfoSecu risks in uncertain environments is painstaking. Thus, this paper applied a genetic algorithm (GA) to InfoSecu risk reduction under uncertainty. Finally, the ef…
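The abstract gives no implementation details, but one common way to cast risk reduction for a GA (purely illustrative; every number below is made up) is as selecting a subset of security controls that minimizes residual risk within a budget, encoded as bit-strings:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical data: per-control cost, and the risk each control removes.
cost = rng.uniform(1, 10, size=12)
risk_cut = rng.uniform(0, 5, size=12)
BUDGET, BASE_RISK = 30.0, 40.0

def residual_risk(bits):
    if cost[bits].sum() > BUDGET:            # infeasible: heavy penalty
        return BASE_RISK + cost[bits].sum()
    return BASE_RISK - risk_cut[bits].sum()

pop = rng.random((50, 12)) < 0.5             # bit-string chromosomes
for _ in range(150):
    fit = np.array([residual_risk(ind) for ind in pop])
    elite = pop[np.argsort(fit)[:10]]        # keep the 10 best
    parents = elite[rng.integers(0, 10, size=(40, 2))]
    cut = rng.random((40, 12)) < 0.5         # uniform crossover
    kids = np.where(cut, parents[:, 0], parents[:, 1])
    kids ^= rng.random((40, 12)) < 0.02      # bit-flip mutation
    pop = np.vstack([elite, kids])

best = pop[np.argmin([residual_risk(ind) for ind in pop])]
print("controls:", np.nonzero(best)[0], "residual risk:", residual_risk(best))
```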
For businesses that provide delivery services, the punctuality of the delivery process is very important. In addition to increasing customer trust, efficient route management and selection are required to reduce vehicle fuel costs and expedite delivery. Some small and medium businesses still use conventional methods to manage delivery routes: decisions about delivery schedules and routes follow no specific method that would expedite the settlement process. This process is inefficient, takes a long time, increases costs, and is prone to errors. Therefore, Dijkstra's algorithm has been used to improve the delivery management process. A delivery management system was developed to help managers and drivers …
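For reference, here is a compact textbook Dijkstra over an adjacency list with a binary heap; the graph, weights, and node names are illustrative, not from the study.

```python
import heapq

def dijkstra(graph, source):
    """Shortest distances from source in a graph with non-negative weights.
    graph: {node: [(neighbor, weight), ...]}"""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                       # stale heap entry, skip
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# illustrative delivery network: travel times between depot and customers
roads = {
    "depot": [("A", 4), ("B", 1)],
    "B":     [("A", 2), ("C", 5)],
    "A":     [("C", 1)],
    "C":     [],
}
print(dijkstra(roads, "depot"))  # {'depot': 0.0, 'A': 3.0, 'B': 1.0, 'C': 4.0}
```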
The research presents reliability, defined as the probability of any part of the system accomplishing its function within a specified time and under the same circumstances. On the theoretical side, the reliability, the reliability function, and the cumulative failure function are studied for the one-parameter Rayleigh distribution. This research aims to uncover the many factors missed in reliability evaluation that cause constant interruptions of the machines, in addition to the problems of the data. The problem of the research is that although there are many methods for estimating the reliability function, none of them is suitably qualified for data such …
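For the one-parameter Rayleigh distribution with scale σ (standard results, stated here for reference), the density, cumulative failure function, and reliability function are

\[
f(t;\sigma)=\frac{t}{\sigma^{2}}\exp\!\left(-\frac{t^{2}}{2\sigma^{2}}\right),\qquad
F(t)=1-\exp\!\left(-\frac{t^{2}}{2\sigma^{2}}\right),\qquad
R(t)=1-F(t)=\exp\!\left(-\frac{t^{2}}{2\sigma^{2}}\right),\quad t\ge 0 .
\]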
The Rivest–Shamir–Adleman (RSA) algorithm and the Diffie–Hellman (DH) key exchange are famous methods for encryption. These methods depend on selecting primes p and q that are secure enough. This paper shows that the named methods can use primes found by an arithmetical function. In other words, there is no need to think about how to get the primes p and q and whether they are secure enough, since the arithmetical function builds the primes in such a complicated way that they are secure. Moreover, this article gives a new construction of the RSA algorithm and the DH key exchange using the primes p, q obtained from a real number x.
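The paper's arithmetical function is not given in this excerpt, so the sketch below only illustrates the surrounding RSA construction; derive_primes is a hypothetical stand-in that maps digit blocks of a real number x to nearby primes, not the authors' function.

```python
from sympy import nextprime

def derive_primes(x: float, digits: int = 12):
    """Hypothetical stand-in for the paper's arithmetical function:
    take two blocks of digits of x and move each up to the next prime."""
    s = str(x).replace(".", "")[:2 * digits].ljust(2 * digits, "7")
    p = nextprime(int(s[:digits]))
    q = nextprime(int(s[digits:]))
    return p, q

def rsa_keypair(x: float, e: int = 65537):
    p, q = derive_primes(x)
    n, phi = p * q, (p - 1) * (q - 1)
    d = pow(e, -1, phi)     # modular inverse; assumes gcd(e, phi) == 1
    return (n, e), (n, d)

# usage: build a keypair from x and round-trip a small message
(n, e), (_, d) = rsa_keypair(3.141592653589793)
m = 42
c = pow(m, e, n)            # encrypt
print(pow(c, d, n) == m)    # decrypt: True
```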