A Strength Pareto Evolutionary Algorithm 2 (SPEA2) approach for solving the multi-objective Environmental/Economic Power Dispatch (EEPD) problem is presented in this paper. In the past, minimizing fuel cost was the single objective of the economic power dispatch (EPD) problem. Since the Clean Air Act Amendments have been applied to reduce SO2 and NOx emissions from power plants, utilities have changed their strategies to reduce pollution and atmospheric emissions as well; adding emission minimization as a second objective function made EPD a multi-objective problem with conflicting objectives. SPEA2 is an improved version of SPEA with better fitness assignment, density estimation, and a modified archive truncation method. In addition, fuzzy set theory is employed to extract the best compromise solution. Several optimization runs of the proposed method are carried out on a 3-unit system and the 6-unit standard IEEE 30-bus test system. The results demonstrate the capability of the proposed method to generate well-distributed, Pareto-optimal, non-dominated feasible solutions in a single run. Comparison with other multi-objective methods demonstrates the superiority of the proposed method.
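As an illustration of the fuzzy decision step, the sketch below scores each non-dominated solution with the commonly used linear membership function and returns the best compromise; the function name and the sample (cost, emission) values are hypothetical, not taken from the paper.

```python
import numpy as np

def best_compromise(pareto_front):
    """Return the index of the best compromise solution on a Pareto
    front (rows = solutions, columns = objectives to minimize),
    using linear fuzzy membership functions."""
    F = np.asarray(pareto_front, dtype=float)
    f_min, f_max = F.min(axis=0), F.max(axis=0)
    # Membership is 1 at the best value of an objective, 0 at the worst.
    span = np.where(f_max > f_min, f_max - f_min, 1.0)
    mu = np.clip((f_max - F) / span, 0.0, 1.0)
    # Normalized membership of each solution over the whole front.
    return int(np.argmax(mu.sum(axis=1) / mu.sum()))

# Hypothetical (fuel cost, emission) front, purely illustrative.
front = [(600.1, 0.222), (608.0, 0.202), (620.5, 0.194)]
print(best_compromise(front))
```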
The main aim of image compression is to reduce an image's size for transmission and storage, and many methods have appeared to compress images; one of these is the Multilayer Perceptron (MLP), an artificial neural network trained with the back-propagation algorithm. If the algorithm depended only on the number of neurons in the hidden layer, that alone would not be enough to reach the desired results, so the standards on which the compression process depends must also be taken into consideration to get the best results. In our research we trained a group of TIFF images of size 256*256 and compressed them by using an MLP for each
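As a rough sketch of the idea (not the paper's exact configuration: the 8x8 block size, 16 hidden neurons, learning rate, and synthetic image are all assumptions), an MLP autoencoder trained with back-propagation compresses each image block into its hidden-layer activations:

```python
import numpy as np

rng = np.random.default_rng(0)

def to_blocks(img, b=8):
    """Split a (H, W) grayscale image into flattened b*b blocks in [0, 1]."""
    h, w = img.shape
    blocks = img[:h - h % b, :w - w % b].reshape(h // b, b, w // b, b)
    return blocks.transpose(0, 2, 1, 3).reshape(-1, b * b) / 255.0

def train_autoencoder(X, hidden=16, lr=0.5, epochs=200):
    """Back-propagation on a 64-16-64 sigmoid autoencoder (MSE loss)."""
    n_in = X.shape[1]
    W1 = rng.normal(0, 0.1, (n_in, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.1, (hidden, n_in)); b2 = np.zeros(n_in)
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    for _ in range(epochs):
        H = sig(X @ W1 + b1)             # hidden code = compressed block
        Y = sig(H @ W2 + b2)             # reconstructed block
        dY = (Y - X) * Y * (1 - Y)       # output-layer delta
        dH = (dY @ W2.T) * H * (1 - H)   # hidden-layer delta
        W2 -= lr * H.T @ dY / len(X); b2 -= lr * dY.mean(axis=0)
        W1 -= lr * X.T @ dH / len(X); b1 -= lr * dH.mean(axis=0)
    return W1, b1, W2, b2

# Synthetic stand-in for a 256*256 TIFF image.
img = rng.integers(0, 256, (256, 256)).astype(float)
weights = train_autoencoder(to_blocks(img))
```

Storing the 16 hidden activations instead of the 64 pixels of each block gives a 4:1 compression ratio; the decoder half (W2, b2) reconstructs the block.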
The objective of this research was to test the impact of internal corporate governance instruments on working capital management and the reflection of each of them on firm performance. For this purpose, four main hypotheses were formulated. The results of the first pointed to a significant effect of both major shareholders' ownership and board of directors' size on net working capital, with a positive association between them. The second showed a significant effect of net working capital on economic value added, with an inverse relationship, while the third explored a significant effect of major shareholders' ownership
Monetary policy is a vital method for achieving monetary stability through: the management of income and the adjustment of prices (monetary targets) in order to promote stability and growth of real output (non-monetary goals); the interest rate tool and direct investment guidance, moving the economy toward the desired destination; and the supervisory instruments of monetary policy, both quantitative and qualitative. The latter are very important as a standard compass for investigating the direction of monetary policy in the economy; the public and businesses receive monetary policy signals through these tools. In fiscal policy, there are specific techniques to follow for spending and revenue collection. This is done in order to
In this paper, estimation of the system reliability of a multi-component stress-strength model R(s,k) is considered, where the stress and strength are independent random variables following the Exponentiated Weibull Distribution (EWD) with known first shape parameter θ, while the second shape parameter α is unknown and is estimated using different methods. Comparisons among the proposed estimators were made through a Monte Carlo simulation study based on the mean squared error (MSE) criterion.
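A minimal sketch of this setting, assuming the unit-scale EWD with CDF F(x) = (1 - exp(-x^θ))^α; the sample sizes, parameter values, and function names are illustrative. It draws EWD samples by inversion, computes the closed-form MLE of α for known θ, and estimates R(s,k) (at least s of k strengths exceeding a common stress) by Monte Carlo:

```python
import numpy as np

rng = np.random.default_rng(1)

def rvs_ew(theta, alpha, size):
    """Sample the unit-scale Exponentiated Weibull distribution,
    CDF F(x) = (1 - exp(-x**theta))**alpha, by inversion."""
    u = rng.uniform(size=size)
    return (-np.log(1.0 - u ** (1.0 / alpha))) ** (1.0 / theta)

def mle_alpha(x, theta):
    """Closed-form MLE of alpha when theta is known:
    alpha_hat = -n / sum(ln(1 - exp(-x**theta)))."""
    return -len(x) / np.log1p(-np.exp(-x ** theta)).sum()

def r_sk_mc(s, k, theta, a_strength, a_stress, n=200_000):
    """Monte Carlo estimate of R(s,k): probability that at least
    s of k independent strengths exceed a common stress."""
    stress = rvs_ew(theta, a_stress, (n, 1))
    strength = rvs_ew(theta, a_strength, (n, k))
    return np.mean((strength > stress).sum(axis=1) >= s)

theta = 2.0                               # known first shape parameter
x = rvs_ew(theta, alpha=1.5, size=100)    # observed strength sample
print(mle_alpha(x, theta))
print(r_sk_mc(s=2, k=4, theta=theta, a_strength=1.5, a_stress=0.8))
```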
Longitudinal data is becoming increasingly common, especially in the medical and economic fields, and various methods have been developed to analyze this type of data.
In this research, the focus was on grouping and analyzing such data, as cluster analysis plays an important role in identifying and grouping co-expressed profiles over time, applying the nonparametric cubic smoothing B-spline model, which provides continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope. It is also more flexible and can capture more complex patterns and fluctuations in the data.
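As a sketch of this pipeline (synthetic data; the smoothing factor, cluster count, and the use of SciPy's UnivariateSpline and kmeans2 are illustrative choices, not the paper's exact procedure), each profile is smoothed with a cubic smoothing spline and the smoothed curves are then clustered:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(2)

# Two groups of noisy synthetic profiles observed on a common time grid.
t = np.linspace(0.0, 1.0, 20)
profiles = np.vstack(
    [np.sin(2 * np.pi * t) + rng.normal(0, 0.2, t.size) for _ in range(15)]
    + [np.cos(2 * np.pi * t) + rng.normal(0, 0.2, t.size) for _ in range(15)])

# Cubic (k=3) smoothing spline per subject; s trades smoothness for fit.
smoothed = np.vstack([UnivariateSpline(t, y, k=3, s=0.5)(t)
                      for y in profiles])

# Group co-expressed profiles by k-means on the smoothed curves.
_, labels = kmeans2(smoothed, 2, minit='++', seed=3)
print(labels)
```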
The balanced longitudinal data profiles were compiled into subgroup
In this paper, we used the maximum likelihood method and the Bayesian method to estimate the shape parameter θ and the reliability function R(t) of the two-parameter (l, θ) Kumaraswamy distribution, assuming the exponential distribution, the Chi-squared distribution, and the Erlang-2 distribution as prior distributions; in addition, we used the method of moments to estimate the parameters of the prior distributions. Bayes
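To make the setup concrete, the sketch below (the sample size, parameter values, and the prior's rate are assumptions) computes the MLE of θ with l known, and the Bayes estimators of θ and R(t) = (1 - t^l)^θ under squared-error loss for the exponential prior, one of the priors considered:

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulate a Kumaraswamy(l, theta) sample by inverting
# F(t) = 1 - (1 - t**l)**theta on (0, 1).
l, theta = 2.0, 3.0
u = rng.uniform(size=50)
t = (1.0 - (1.0 - u) ** (1.0 / theta)) ** (1.0 / l)

n = t.size
T = -np.log1p(-(t ** l)).sum()      # sufficient statistic -sum(ln(1 - t^l))

theta_ml = n / T                    # MLE of theta with l known
lam = 1.0                           # assumed rate of the exponential prior
theta_bayes = (n + 1) / (T + lam)   # posterior Gamma(n+1, T+lam) mean

def r_hat(t0):
    """ML plug-in and Bayes (posterior-mean) estimates of
    R(t0) = (1 - t0**l)**theta."""
    s = np.log1p(-(t0 ** l))                        # ln(1 - t0^l) < 0
    r_ml = np.exp(theta_ml * s)
    r_bayes = (1.0 - s / (T + lam)) ** (-(n + 1))   # Gamma MGF at s
    return r_ml, r_bayes

print(theta_ml, theta_bayes, r_hat(0.5))
```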
The research aims to improve the effectiveness of the internal control system according to the COSO model, by identifying the availability of the system's components according to the model and then improving the effectiveness of each component by focusing on areas for improvement in each one. The COSO model was addressed and then adapted to the current Iraqi environment by introducing some improvements in the form of certain corporate governance mechanisms concerning the board of directors, senior management, the audit committee, and the nominations committee, especially since the requirements for application are available in current Iraqi laws and legislation, taking into consideration making some
In this research, kernel (nonparametric density) estimation methods were relied upon to estimate the two-response logistic regression, comparing the Nadaraya-Watson method with the local scoring algorithm. The optimal smoothing parameter (bandwidth) λ was estimated by cross-validation and generalized cross-validation; the optimal bandwidth λ has a clear effect on the estimation process and plays a key role in smoothing the curve so that it approaches the real curve. The goal of using the kernel estimator is to modify the observations so that we can obtain estimators with properties close to those of the real parameters, based on medical data for patients with chronic
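A minimal sketch of the Nadaraya-Watson estimator with leave-one-out cross-validated bandwidth selection on synthetic binary-response data (the bandwidth grid and the data-generating model are illustrative assumptions; generalized cross-validation and the local scoring algorithm are omitted for brevity):

```python
import numpy as np

rng = np.random.default_rng(5)

def nw(x0, x, y, h):
    """Nadaraya-Watson estimator with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x0[:, None] - x[None, :]) / h) ** 2)
    return (w * y).sum(axis=1) / w.sum(axis=1)

def loo_cv(x, y, bandwidths):
    """Leave-one-out cross-validation score for each bandwidth."""
    scores = []
    for h in bandwidths:
        err = 0.0
        for i in range(len(x)):
            mask = np.arange(len(x)) != i
            err += (y[i] - nw(x[i:i + 1], x[mask], y[mask], h)[0]) ** 2
        scores.append(err / len(x))
    return np.array(scores)

# Synthetic binary-response data (illustrative only).
x = rng.uniform(0, 1, 120)
p = 1 / (1 + np.exp(-(4 * x - 2)))           # true success probability
y = rng.binomial(1, p).astype(float)

grid = np.linspace(0.02, 0.5, 25)
h_opt = grid[np.argmin(loo_cv(x, y, grid))]  # CV-optimal bandwidth
print(h_opt, nw(np.array([0.5]), x, y, h_opt))
```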
We present a study of reliability estimation for the exponential distribution based on the Bayesian approach. In the Bayesian approach, the parameter of the exponential distribution is assumed to be a random variable. We derived Bayes estimators of reliability under four types of prior distribution for the scale parameter of the exponential distribution: the inverse Chi-square
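For the inverse Chi-square prior case, a minimal sketch under assumed hyperparameters (the degrees of freedom nu below is illustrative): the inverse Chi-square(nu) prior is Inverse-Gamma(nu/2, 1/2), the posterior is conjugate Inverse-Gamma, and the Bayes estimator of R(t) = exp(-t/θ) under squared-error loss has the closed form shown in the code.

```python
import numpy as np

rng = np.random.default_rng(6)

# Exponential lifetimes with scale theta; R(t) = exp(-t/theta).
theta_true = 2.0
t_obs = rng.exponential(theta_true, size=40)
n, S = t_obs.size, t_obs.sum()

# Inverse chi-square(nu) prior on theta = Inverse-Gamma(a=nu/2, b=1/2);
# with the exponential likelihood the posterior is Inverse-Gamma(a+n, b+S).
nu = 4.0                     # assumed prior degrees of freedom
a, b = nu / 2.0, 0.5

def r_bayes(t):
    """Posterior mean of R(t) = exp(-t/theta): ((b+S)/(b+S+t))**(a+n)."""
    return ((b + S) / (b + S + t)) ** (a + n)

def r_mle(t):
    """ML plug-in estimate with theta_hat = S/n, for comparison."""
    return np.exp(-t * n / S)

print(r_bayes(1.0), r_mle(1.0))
```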