A Strength Pareto Evolutionary Algorithm 2 (SPEA2) approach for solving the multi-objective Environmental/Economic Power Dispatch (EEPD) problem is presented in this paper. In the past, minimization of fuel cost was the sole objective of the economic power dispatch problem. Since the Clean Air Act amendments were applied to reduce SO2 and NOx emissions from power plants, utilities have changed their strategies to reduce pollution and atmospheric emissions as well; adding emission minimization as another objective turned economic power dispatch (EPD) into a multi-objective problem with conflicting objectives. SPEA2 is the improved version of SPEA, with better fitness assignment, density estimation, and a modified archive truncation method. In addition, fuzzy set theory is employed to extract the best compromise solution. Several optimization runs of the proposed method are carried out on a three-unit system and the six-unit standard IEEE 30-bus test system. The results demonstrate the capability of the proposed method to generate well-distributed, Pareto-optimal, non-dominated feasible solutions in a single run. Comparison with other multi-objective methods demonstrates the superiority of the proposed approach.
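The fuzzy-membership step for extracting a best compromise solution from a Pareto front can be sketched as follows. This is a common formulation (linear membership per objective, normalized across the front), not necessarily the paper's exact one, and the toy cost/emission front below is hypothetical:

```python
import numpy as np

def best_compromise(pareto_front):
    """Pick the best compromise solution from a Pareto front via fuzzy
    membership (illustrative sketch; the paper's exact formulation may differ)."""
    F = np.asarray(pareto_front, dtype=float)   # shape: (solutions, objectives)
    f_min, f_max = F.min(axis=0), F.max(axis=0)
    # Linear membership: 1 at the per-objective minimum, 0 at the maximum.
    mu = (f_max - F) / np.where(f_max > f_min, f_max - f_min, 1.0)
    mu = np.clip(mu, 0.0, 1.0)
    # Normalized membership per solution; the maximum is the compromise.
    mu_k = mu.sum(axis=1) / mu.sum()
    return int(np.argmax(mu_k))

# Hypothetical front: (fuel cost $/h, emission ton/h) of three non-dominated points.
front = [(600.0, 0.22), (620.0, 0.20), (700.0, 0.19)]
idx = best_compromise(front)
```

The middle solution wins here because it trades off both objectives reasonably, which is exactly the behavior the fuzzy decision maker is meant to capture.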
Concentrations of sixteen polycyclic aromatic hydrocarbons (PAHs) were measured in aerosol samples collected from April 2012 to February 2013 at the south thermal power station of Baghdad. Forty-one aerosol samples were extracted with (1:1) dichloromethane and methanol using a Soxhlet apparatus for seventeen hours. The extracts were analyzed by GC/MS. The PAH concentrations outside the station were higher than those inside it, and higher in summer than in winter. Naphthalene, pyrene, anthracene, indeno[1,2,3-cd]pyrene, and phenanthrene were the most abundant PAHs detected at all sampling points. The total polycyclic aromatic hydrocarbon (TPAH) and total suspended particle (TSP) concentrations …
This work presents an approach for applying Triple DES based on a genetic algorithm, adding an intelligent feature to Triple DES with N rounds of the genetic algorithm. The cipher file is encapsulated with a special program that sends an acknowledgment to the sender, identifying who deciphered or attempted to break it; thus it is considered an initial step toward improved privacy. The results of the proposed system give a good indication that it is a promising system compared with other types of cipher systems.
By definition, the detection of protein complexes that form protein-protein interaction networks (PPINs) is an NP-hard problem. Evolutionary algorithms (EAs), as global search methods, are proven in the literature to be more successful than greedy methods at detecting protein complexes. However, the design of most of these EA-based approaches relies on the topological information of the proteins in the PPIN. Biological information, as a key resource for molecular profiles, has on the other hand attracted little interest in the design of the components of these EA-based methods. The main aim of this paper is to redesign two operators in the EA based on the functional domain rather than the graph-topological domain. The perturb…
Traffic management at road intersections is a complex requirement that has been an important topic of research and discussion. Solutions have primarily focused on using vehicular ad hoc networks (VANETs). Key issues in VANETs are high mobility, restriction of road setup, frequent topology variations, failed network links, and timely communication of data, which make the routing of packets to a particular destination problematic. To address these issues, a new dependable routing algorithm is proposed, which utilizes a wireless communication system between vehicles in urban vehicular networks. This routing is position-based, known as the maximum distance on-demand routing algorithm (MDORA). It aims to find an optimal route on a hop-by-hop …
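A greedy position-based forwarding step in the spirit of MDORA might look like the following sketch. The function name, planar coordinate model, and neighbor list are illustrative assumptions, not the paper's actual algorithm:

```python
import math

def next_hop(current, destination, neighbors, comm_range):
    """Greedy position-based forwarding sketch (hypothetical simplification):
    among neighbors within radio range, pick the one that makes the most
    progress toward the destination."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    in_range = [n for n in neighbors if dist(current, n) <= comm_range]
    # Only consider neighbors strictly closer to the destination than we are.
    progress = [n for n in in_range
                if dist(n, destination) < dist(current, destination)]
    if not progress:
        return None   # local maximum: no neighbor offers forward progress
    return min(progress, key=lambda n: dist(n, destination))

# A vehicle at the origin forwarding toward (10, 0) with 7-unit radio range.
hop = next_hop((0, 0), (10, 0), [(3, 1), (6, 2), (-2, 0)], comm_range=7.0)
```

Returning `None` at a local maximum is where a real protocol would fall back to a recovery strategy; a hop-by-hop scheme like the one described above repeats this selection at every intermediate vehicle.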
In combinatorial testing development, the construction of covering arrays is the key challenge, owing to the multiple aspects that influence it. A wide range of combinatorial problems can be solved using metaheuristic and greedy techniques. Combining a greedy technique with a metaheuristic search technique such as hill climbing (HC) can produce feasible results for combinatorial tests. Metaheuristic-based methods are used to deal with tuples that may be left uncovered after applying greedy strategies; the result is then assured to be near-optimal by the metaheuristic algorithm. As a result, the use of both greedy and HC algorithms in a single test-generation system is a good candidate if constructed correctly. T…
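As a sketch of the hill-climbing half of such a hybrid (a generic illustration, not the system described above), one can flip array cells whenever the flip does not increase the number of missing t-way tuples; the binary alphabet, array sizes, and step budget here are hypothetical:

```python
import itertools
import random

def uncovered(array, t=2, v=2):
    """Count t-way value tuples still missing from the array (v symbols)."""
    k = len(array[0])
    missing = 0
    for cols in itertools.combinations(range(k), t):
        seen = {tuple(row[c] for c in cols) for row in array}
        missing += v ** t - len(seen)
    return missing

def hill_climb(array, steps=2000, seed=0):
    """Flip random cells, keeping flips that do not worsen coverage."""
    rng = random.Random(seed)
    best = uncovered(array)
    while steps > 0 and best > 0:
        steps -= 1
        r, c = rng.randrange(len(array)), rng.randrange(len(array[0]))
        array[r][c] = 1 - array[r][c]          # binary (v = 2) flip
        new = uncovered(array)
        if new <= best:
            best = new                          # accept sideways/improving moves
        else:
            array[r][c] = 1 - array[r][c]       # revert worsening move
    return array, best

# A known pairwise covering array CA(4; 2, 3, 2): every pair of columns
# exhibits all four binary combinations, so nothing is uncovered.
ca = [[0, 0, 0], [0, 1, 1], [1, 0, 1], [1, 1, 0]]
# A poor starting array that hill climbing should improve.
start = [[0, 0, 0], [0, 0, 0], [0, 0, 0], [0, 0, 0], [1, 1, 1]]
improved, missing = hill_climb([row[:] for row in start])
```

In a greedy/HC hybrid of the kind the abstract describes, the greedy phase would supply `start` with most tuples already covered, leaving HC only the residual tuples to repair.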
Because of the rapid development and use of the Internet as a communication medium, a need has emerged for a high level of security during data transmission, and one such technique is steganography. This paper reviews Least Significant Bit (LSB) steganography used for embedding a text file, together with a related image, in a gray-scale image. We also discuss the bit planes into which an image is divided: eight different images that, when combined, yield the actual image. The findings of the research are that the stego-image is indistinguishable to the naked eye from the original cover image when the bit value is less than four; thus the goal of concealing the existence of a connection or of hidden data is achieved. The Peak Signal-to-Noise Ratio (PSNR) and Mean Square Error (MSE) …
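A one-bit LSB embedding step and the PSNR fidelity measure mentioned above can be sketched as follows. This is an illustrative simplification; the paper's embedding scheme, bit-plane choice, and parameters may differ:

```python
import numpy as np

def embed_lsb(cover, bits):
    """Embed a bit sequence into the least significant bit of each pixel
    (illustrative one-bit LSB steganography sketch)."""
    stego = cover.flatten().copy()
    for i, b in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | b   # clear the LSB, then set payload bit
    return stego.reshape(cover.shape)

def psnr(cover, stego, peak=255.0):
    """Peak Signal-to-Noise Ratio between cover and stego images (dB)."""
    mse = np.mean((cover.astype(float) - stego.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(peak ** 2 / mse)

# Tiny hypothetical 2x2 gray-scale cover image and a 4-bit payload.
cover = np.array([[100, 101], [102, 103]], dtype=np.uint8)
stego = embed_lsb(cover, [1, 0, 1, 1])
distortion = psnr(cover, stego)
```

Because only the lowest bit plane changes, each pixel moves by at most one gray level, which is why the PSNR stays high and the stego-image is visually indistinguishable from the cover.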
The objective of this work is to combine human biometric characteristics with unique attributes of the computer in order to protect computer networks and resource environments through the development of authentication and authorization techniques. On the human biometric side, the best methods and algorithms have been studied, and the conclusion is that the fingerprint is the best, although it has some flaws. The fingerprint algorithm has been improved so that its performance adapts to enhance the clarity of the ridge and valley structures of fingerprint images, taking into account the estimated orientation and frequency of nearby ridges. On the side of computer features, the computer and its components, like a human, have uniqu…
Variable selection is an essential and necessary task in the statistical modeling field. Several studies have tried to develop and standardize the process of variable selection, but it is difficult to do so. The first question researchers need to ask themselves is which variables are the most significant for describing a given dataset's response. In this paper, a new method for variable selection using Gibbs sampler techniques has been developed. First, the model is defined, and the posterior distributions for all the parameters are derived. The new variable selection method is tested using four simulated datasets. The new approach is compared with some existing techniques: Ordinary Least Squares (OLS), Least Absolute Shrinkage …
In this paper, for the first time, we introduce a new four-parameter model called the Gumbel-Pareto distribution, constructed using the T-X method. We obtain some of its mathematical properties, and several structural properties of the new distribution are studied. The method of maximum likelihood is used for estimating the model parameters. A numerical illustration and an application to a real data set are given to show the flexibility and potential of the new model.
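For reference, the generic T-X construction combines a transformer density $r(t)$ with a baseline CDF $G(x)$ through a link function $W$; the logit link shown here is one common choice for a transformer supported on the whole real line and may differ from the paper's exact formulation:

```latex
% T-X family: F is a CDF whenever W(G(x)) is monotone into the support of T.
F(x) \;=\; \int_{-\infty}^{\,W(G(x))} r(t)\,dt,
\qquad
W(G(x)) \;=\; \log\frac{G(x)}{1-G(x)} .

% Taking T Gumbel and X Pareto (an assumed parameterization):
r(t) \;=\; \frac{1}{\sigma}\, e^{-(t-\mu)/\sigma}\,
          \exp\!\bigl\{-e^{-(t-\mu)/\sigma}\bigr\},
\qquad
G(x) \;=\; 1-\Bigl(\frac{\theta}{x}\Bigr)^{k}, \quad x \ge \theta,

% so the integral is the Gumbel CDF evaluated at the link:
F(x) \;=\; \exp\!\Bigl\{-e^{-\bigl(W(G(x))-\mu\bigr)/\sigma}\Bigr\}.
```

The four parameters $(\mu, \sigma, \theta, k)$ of such a construction match the four-parameter count stated in the abstract.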