The increasing availability of computing power over the past two decades has been used to develop new techniques for optimizing the solution of estimation problems. Today's computational capacity and the widespread availability of computers have enabled the development of a new generation of intelligent computing techniques, among them the algorithm of interest here. This paper presents one of a new class of stochastic search algorithms, known as the Canonical Genetic Algorithm (CGA), for optimizing the maximum likelihood function. The strategy is composed of three main steps: recombination, mutation, and selection. The experimental design is based on simulating the CGA with different parameter values, and its results are compared with those of the method of moments based on the MSE values obtained from both methods.
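The three CGA steps named above can be sketched as follows. This is a minimal illustrative implementation, not the authors' code; the tournament selection, arithmetic crossover, Gaussian mutation, and the exponential log-likelihood used in the example are all assumptions chosen for brevity.

```python
import math
import random

def canonical_ga(fitness, bounds, pop_size=20, generations=100,
                 crossover_rate=0.8, mutation_rate=0.1):
    """Minimal Canonical GA over a scalar parameter: selection,
    recombination (crossover), and mutation, repeated each generation."""
    lo, hi = bounds
    pop = [random.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: binary tournament on fitness (here, the log-likelihood).
        parents = [max(random.sample(pop, 2), key=fitness)
                   for _ in range(pop_size)]
        # Recombination: arithmetic crossover between consecutive parents.
        children = []
        for a, b in zip(parents[::2], parents[1::2]):
            if random.random() < crossover_rate:
                w = random.random()
                a, b = w * a + (1 - w) * b, w * b + (1 - w) * a
            children += [a, b]
        # Mutation: small Gaussian perturbation, clipped to the bounds.
        pop = [min(hi, max(lo, c + random.gauss(0, 0.1)))
               if random.random() < mutation_rate else c
               for c in children]
    return max(pop, key=fitness)

# Example: maximize the log-likelihood of an exponential sample in its rate.
random.seed(0)  # fixed seed for reproducibility of this sketch
data = [0.8, 1.2, 0.5, 2.0, 1.1]
loglik = lambda lam: (len(data) * math.log(lam) - lam * sum(data)
                      if lam > 0 else -1e18)
best = canonical_ga(loglik, bounds=(0.01, 10.0))
```

For the exponential example the analytical MLE is n divided by the sample sum, so the GA estimate can be checked directly against it.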
In this paper, some estimators of the unknown shape parameter and reliability function of the Basic Gompertz distribution (BGD) have been obtained, such as the MLE, UMVUE, and minimum-MSE estimators, in addition to Bayesian estimators under the scale-invariant squared error loss function, assuming an informative prior represented by the Gamma distribution and a non-informative prior obtained from Jeffreys' prior. Using the Monte Carlo simulation method, these estimators of the shape parameter and R(t) have been compared based on mean squared errors and integrated mean squared errors, respectively.
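Of the estimators listed, the MLE has a simple closed form under the common one-parameter Gompertz reduction. A minimal sketch, assuming the parameterization R(t) = exp(-theta*(e^t - 1)) for the Basic Gompertz distribution (this parameterization is an assumption, not taken from the abstract):

```python
import math

def bgd_mle_shape(sample):
    """MLE of the shape parameter theta of the Basic Gompertz distribution,
    assuming reliability R(t) = exp(-theta * (e^t - 1)).
    Under that form the log-likelihood is maximized at n / sum(e^t_i - 1)."""
    n = len(sample)
    return n / sum(math.exp(t) - 1.0 for t in sample)

def bgd_reliability(t, theta):
    """Plug-in estimate of the reliability function R(t)."""
    return math.exp(-theta * (math.exp(t) - 1.0))
```

The plug-in reliability estimate is then obtained by substituting the estimated shape parameter into R(t).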
In this paper, we present a comparison of double informative priors assumed for the parameter of the inverted exponential distribution. To estimate the parameter by Bayes estimation, two different kinds of information are used; two different priors are selected for the parameter of the inverted exponential distribution. The Chi-squared–Gamma, Chi-squared–Erlang, and Gamma–Erlang combinations are assumed as double priors. The results are the derivations of these estimators under the squared error loss function with the three different double priors.
Additionally, the maximum likelihood estimation method
The study of the validity and probability of failure in solids and structures is considered one of the most highlighted study fields in many science and engineering applications. Design analysts must therefore seek to investigate the points where failing strains may occur, the probabilities with which these strains can cause existing cracks to propagate through the fractured medium considered, and thereafter the techniques the analysts can adopt to reduce or arrest these propagating cracks. In the present study, a theoretical investigation of simply supported thin plates having surface cracks within their structure is carried out, and the applied impact load to the
Quality control is an effective statistical tool for controlling productivity, monitoring manufactured products, and confirming their conformity to the standard qualities and certified criteria for products and services; its main purpose is to keep pace with production and industrial development in a competitive business market. Quality control charts are used to monitor the qualitative properties of production procedures and to detect abnormal deviations in the production process. The multivariate kernel density estimator control chart method was used, which is one of the nonparametric methods that does not require any assumptions regarding the distribution o
Regression testing, being expensive, requires optimization. Typically, the optimization of test cases results in selecting a reduced subset of test cases or prioritizing them to detect potential faults at an earlier phase. Many former studies revealed heuristic-dependent mechanisms to attain optimality while reducing or prioritizing test cases. Nevertheless, those studies lacked systematic procedures to manage the tied test cases issue. Moreover, evolutionary algorithms such as the genetic algorithm often help in reducing test cases, together with a concurrent decrease in computational runtime. However, when examining the fault detection capacity along with other parameters is required, the method falls sh
An Optimal Algorithm for HTML Page Building Process
This paper proposes a novel meta-heuristic optimization algorithm called the fine-tuning meta-heuristic algorithm (FTMA) for solving global optimization problems. In this algorithm, the solutions are fine-tuned using the fundamental steps in meta-heuristic optimization, namely, exploration, exploitation, and randomization, in such a way that if one step improves the solution, then it is unnecessary to execute the remaining steps. The performance of the proposed FTMA has been compared with that of five other optimization algorithms over ten benchmark test functions. Nine of them are well-known and already exist in the literature, while the tenth one is proposed by the authors and introduced in this article. One test trial was shown t
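The distinguishing feature of FTMA described above, skipping the remaining steps as soon as one step improves the solution, can be sketched as follows. This is an illustrative reconstruction from the abstract, not the authors' implementation; the specific move operators (random walk, pull toward the best solution, uniform restart) are assumptions.

```python
import random

def ftma_step(x, best, fitness, lo, hi):
    """One FTMA-style update of a scalar candidate x (minimization):
    try exploration, exploitation, then randomization in order, and
    stop at the first step that improves the solution."""
    moves = [
        lambda v: v + random.uniform(-1, 1),          # exploration: random walk
        lambda v: v + random.random() * (best - v),   # exploitation: move toward best
        lambda v: random.uniform(lo, hi),             # randomization: restart
    ]
    for move in moves:
        cand = min(hi, max(lo, move(x)))
        if fitness(cand) < fitness(x):  # improved: skip the remaining steps
            return cand
    return x
```

Because a candidate is replaced only when a step improves it, the fitness of each solution is non-increasing across iterations.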
In this research, a new system identification algorithm is presented for obtaining an optimal set of mathematical models for a system with perturbed coefficients. The algorithm is then applied practically via an "On-Line System Identification Circuit", based on real-time speed response data of a permanent magnet DC motor. Such a set of mathematical models represents the physical plant against all variations that may exist in its parameters, and forms a strong mathematical foundation for stability and performance analysis in control theory problems.
In many video and image processing applications, the frames are partitioned into blocks, which are extracted and processed sequentially. In this paper, we propose a fast algorithm for calculation of features of overlapping image blocks. We assume the features are projections of the block on separable 2D basis functions (usually orthogonal polynomials) where we benefit from the symmetry with respect to spatial variables. The main idea is based on a construction of auxiliary matrices that virtually extends the original image and makes it possible to avoid a time-consuming computation in loops. These matrices can be pre-calculated, stored and used repeatedly since they are independent of the image itself. We validated experimentally th
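The separability exploited above can be illustrated as follows. This sketch does not reproduce the authors' auxiliary-matrix construction; it only shows the underlying idea that a projection of every overlapping block onto a separable 2-D basis function u(x)v(y) factors into 1-D correlations along rows and then columns. The function name and the choice of basis vectors are assumptions.

```python
import numpy as np

def block_features_separable(img, basis_1d):
    """Project every overlapping n-by-n block of img onto each separable
    basis function u(x)v(y), where u, v are rows of basis_1d.
    Separability lets us correlate along rows, then along columns,
    instead of looping over blocks explicitly."""
    n = basis_1d.shape[1]
    h, w = img.shape
    out_h, out_w = h - n + 1, w - n + 1
    # Step 1: correlate each image row with every 1-D basis vector.
    rows = np.empty((len(basis_1d), h, out_w))
    for k, u in enumerate(basis_1d):
        for i in range(h):
            rows[k, i] = np.convolve(img[i], u[::-1], mode='valid')
    # Step 2: correlate the intermediate result along columns.
    feats = {}
    for ky, v in enumerate(basis_1d):
        for kx in range(len(basis_1d)):
            cols = np.empty((out_h, out_w))
            for j in range(out_w):
                cols[:, j] = np.convolve(rows[kx, :, j], v[::-1], mode='valid')
            feats[(ky, kx)] = cols
    return feats
```

Each entry feats[(ky, kx)][i, j] equals the inner product of the block at (i, j) with the outer product of the two 1-D basis vectors, so the result matches a direct per-block computation while reusing row correlations across all vertically overlapping blocks.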