In this research, a new system identification algorithm is presented for obtaining an optimal set of mathematical models for systems with perturbed coefficients; the algorithm is then applied practically through an “On-Line System Identification Circuit”, based on real-time speed response data of a permanent magnet DC motor. Such a set of mathematical models represents the physical plant against all variations that may exist in its parameters, and forms a strong mathematical foundation for stability and performance analysis in control theory problems.
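As a rough illustration of fitting a set of models to perturbed data, the following Python sketch identifies a first-order discrete-time speed model from several synthetic runs. The ARX structure, signal names, and noise levels are assumptions for demonstration only, not the paper's on-line identification circuit.

```python
import numpy as np

rng = np.random.default_rng(0)

def identify_arx1(omega, u):
    """Least-squares fit of w[k+1] = a*w[k] + b*u[k]; len(omega) == len(u) + 1."""
    Phi = np.column_stack([omega[:-1], u])        # regressors [w[k], u[k]]
    theta, *_ = np.linalg.lstsq(Phi, omega[1:], rcond=None)
    return theta                                  # estimated [a, b]

# Synthetic runs with perturbed coefficients stand in for repeated
# real-time speed measurements of a DC motor.
models = []
for _ in range(5):
    a_true = 0.90 + 0.02 * rng.standard_normal() # perturbed pole
    b_true = 0.10 + 0.01 * rng.standard_normal() # perturbed gain
    u = rng.uniform(0.0, 1.0, 200)               # input voltage sequence
    w = np.zeros(201)
    for k in range(200):
        w[k + 1] = a_true * w[k] + b_true * u[k] + 1e-3 * rng.standard_normal()
    models.append(identify_arx1(w, u))           # one (a, b) model per run

print(np.array(models))  # the spread of this set reflects the perturbation
```

Fitting one model per run, as here, is one simple way to obtain the kind of model set the abstract describes; the spread of the estimates bounds the parameter variation.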
In this study, a genetic algorithm was used to predict the reaction kinetics of the Iraqi heavy naphtha catalytic reforming process located in the Al-Doura refinery in Baghdad. A one-dimensional steady-state model was derived to describe a commercial catalytic reforming unit consisting of four catalytic reforming reactors in series.
The experimental information (reformate composition and output temperature) for each of the four reactors, collected at different operating conditions, was used to predict the parameters of the proposed kinetic model. The kinetic model involves 24 components, with 1 to 11 carbon atoms for paraffins and 6 to 11 carbon atoms for naphthenes and aromatics, across 71 reactions. The pre-exponential Arrhenius constants and activation energies are among the estimated parameters.
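To make the parameter-estimation step concrete, here is a minimal genetic-algorithm sketch that fits a single reaction's pre-exponential constant and activation energy to synthetic rate data. The one-reaction setup, population sizes, and operators are illustrative assumptions, far simpler than the 71-reaction model above.

```python
import numpy as np

rng = np.random.default_rng(1)
R = 8.314  # gas constant, J/(mol*K)

# Hypothetical "plant" data: rate constants observed at several temperatures.
T = np.array([700.0, 720.0, 740.0, 760.0, 780.0])      # K
k_obs = 1.0e5 * np.exp(-9.0e4 / (R * T))                # A=1e5, Ea=90 kJ/mol

def fitness(individual):
    """Sum-of-squares error between Arrhenius prediction and observations."""
    logA, Ea = individual
    log_k_pred = logA - Ea / (R * T)
    return np.sum((log_k_pred - np.log(k_obs)) ** 2)

# Population over (log A, Ea); working with log A keeps the search well scaled.
pop = np.column_stack([rng.uniform(5, 20, 60), rng.uniform(5e4, 2e5, 60)])

for gen in range(200):
    errs = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(errs)[:20]]                 # truncation selection
    children = []
    for _ in range(40):
        pa, pb = parents[rng.integers(0, 20, 2)]
        child = 0.5 * (pa + pb)                          # arithmetic crossover
        child += rng.standard_normal(2) * [0.1, 1e3]     # Gaussian mutation
        children.append(child)
    pop = np.vstack([parents, np.array(children)])

best = pop[np.argmin([fitness(p) for p in pop])]
print("A =", np.exp(best[0]), "Ea =", best[1])
```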
Because of the rapid development and use of the Internet as a communication medium, a need has emerged for a high level of security during data transmission, and one of the ways to achieve it is steganography. This paper reviews Least Significant Bit (LSB) steganography used for embedding a text file, along with a related image, in a gray-scale image. We also discuss the bit planes: the image is divided into eight bit-plane images which, when combined, yield the actual image. The findings of the research were that the stego-image is indistinguishable to the naked eye from the original cover image when the bit plane used is lower than the fourth, thus achieving the goal of concealing the existence of a connection or of hidden data. The Peak Signal-to-Noise Ratio (PSNR) and Mean Square Error (MSE) were used to assess the quality of the stego-image.
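A minimal sketch of LSB embedding and the PSNR/MSE check described above, assuming an 8-bit gray-scale image held in a NumPy array; the function names and random payload are illustrative.

```python
import numpy as np

def embed_lsb(cover, bits, plane=0):
    """Hide a bit stream in the given bit plane (0 = least significant)."""
    stego = cover.copy().ravel()
    stego[:len(bits)] &= ~np.uint8(1 << plane)           # clear the target bit
    stego[:len(bits)] |= (bits << plane).astype(np.uint8)
    return stego.reshape(cover.shape)

def psnr(cover, stego):
    """Peak Signal-to-Noise Ratio from the Mean Square Error (8-bit images)."""
    mse = np.mean((cover.astype(float) - stego.astype(float)) ** 2)
    return np.inf if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

rng = np.random.default_rng(2)
cover = rng.integers(0, 256, (64, 64), dtype=np.uint8)   # stand-in gray image
payload = rng.integers(0, 2, 500, dtype=np.uint8)        # message bits

stego = embed_lsb(cover, payload, plane=0)
print("PSNR:", psnr(cover, stego))  # high PSNR -> visually indistinguishable
```

Raising `plane` toward the fourth bit and beyond increases the per-pixel change and lowers PSNR, which matches the abstract's finding that distortion becomes visible from the fourth plane upward.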
Cryptography can be thought of as a toolbox, where potential attackers gain access to various computing resources and technologies to try to compute key values. In modern cryptography, the strength of the encryption algorithm is determined mainly by the size of the key. Therefore, our goal is to create a strong key value of minimal bit length that will be useful in lightweight encryption. Using elliptic curve cryptography (ECC) with a Rubik's cube and image density, the image colors are combined and distorted, and by using the chaotic logistic map and image density with a secret key, the Rubik's cubes for the image are encrypted, yielding an image secure against attacks. ECC itself is a powerful algorithm that generates a pair of public and private keys.
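The ECC and Rubik's-cube stages are beyond a short sketch, but the chaotic logistic map component can be illustrated as follows. Here the map seed `x0` stands in for the secret key (in the scheme above it would presumably be derived from the ECC key agreement), and all parameter values are assumptions.

```python
import numpy as np

def logistic_keystream(x0, r, n):
    """Generate n pseudo-random bytes from the chaotic logistic map
    x_{k+1} = r * x_k * (1 - x_k); x0 acts as the shared secret key."""
    x, out = x0, np.empty(n, dtype=np.uint8)
    for i in range(n):
        x = r * x * (1 - x)
        out[i] = int(x * 256) & 0xFF
    return out

def xor_image(img, x0=0.3571, r=3.99):
    """Mask pixel values with the keystream; applying it twice decrypts."""
    ks = logistic_keystream(x0, r, img.size).reshape(img.shape)
    return img ^ ks

rng = np.random.default_rng(3)
img = rng.integers(0, 256, (32, 32), dtype=np.uint8)     # stand-in image
enc = xor_image(img)
assert np.array_equal(xor_image(enc), img)  # XOR masking is its own inverse
```

With r near 4 the map is chaotic, so a tiny change in `x0` produces a completely different keystream; this sensitivity is what makes the seed usable as a key.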
The objective of this work is to combine human biometric characteristics with unique attributes of the computer in order to protect computer networks and resource environments through the development of authentication and authorization techniques. On the human biometric side, the best methods and algorithms in use were studied, and the conclusion is that the fingerprint is the best, although it has some flaws. The fingerprint algorithm was therefore improved so that its performance enhances the clarity of the ridge and valley structures of fingerprint images, taking into account the estimated orientation and frequency of the neighboring ridges. On the computer side, a computer and its components, like a human, have unique attributes.
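The excerpt does not give the exact enhancement, but fingerprint enhancement of this kind typically starts from a blockwise ridge-orientation estimate. The following sketch shows that standard gradient-based estimate; the block size and test input are arbitrary choices, not the paper's settings.

```python
import numpy as np

def ridge_orientation(img, block=16):
    """Blockwise ridge-orientation field via image gradients.

    theta = 0.5 * atan2(2*sum(Gx*Gy), sum(Gx^2 - Gy^2)) per block, the
    standard least-squares estimate of local ridge direction.
    """
    Gy, Gx = np.gradient(img.astype(float))
    h, w = img.shape
    theta = np.zeros((h // block, w // block))
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            gx = Gx[i:i + block, j:j + block]
            gy = Gy[i:i + block, j:j + block]
            theta[i // block, j // block] = 0.5 * np.arctan2(
                2 * np.sum(gx * gy), np.sum(gx ** 2 - gy ** 2))
    return theta

rng = np.random.default_rng(8)
theta = ridge_orientation(rng.random((128, 128)))  # toy stand-in for a print
print(theta.shape)
```

Enhancement filters (for example, Gabor filters) are then tuned to this orientation field, which is what sharpens the ridge and valley structure.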
Solving real-world combinatorial problems is a major hurdle for today's academics. It is nevertheless possible to use optimization techniques to find, design, and solve a genuinely optimal solution to a particular problem, despite the limitations of the applied approach. A surge in interest in population-based optimization methodologies has spawned a plethora of new and improved approaches to a wide range of engineering problems. Optimizing test suites is a combinatorial testing challenge that has been demonstrated to be an extremely difficult combinatorial optimization problem. The authors have proposed an almost infallible method for selecting combinatorial test cases. It uses a hybrid whale–gray wolf optimization algorithm.
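The hybrid itself is not specified in this excerpt, so the sketch below shows only the canonical grey wolf optimizer update that such a hybrid builds on, applied to a toy continuous objective standing in for test-suite cost. Every parameter value here is an assumption.

```python
import numpy as np

rng = np.random.default_rng(4)

def gwo_step(wolves, fitness, a):
    """One grey-wolf-optimizer update: each wolf moves toward the three best
    wolves (alpha, beta, delta). The whale-grey-wolf hybrid in the paper
    augments this kind of update; only the canonical GWO core is shown."""
    order = np.argsort([fitness(w) for w in wolves])
    leaders = wolves[order[:3]]                    # alpha, beta, delta
    new = np.empty_like(wolves)
    for i, x in enumerate(wolves):
        estimates = []
        for leader in leaders:
            A = a * (2 * rng.random(x.shape) - 1)  # exploration/exploitation
            C = 2 * rng.random(x.shape)
            estimates.append(leader - A * np.abs(C * leader - x))
        new[i] = np.mean(estimates, axis=0)
    return new

# Toy objective standing in for "number of uncovered interactions".
sphere = lambda x: float(np.sum(x ** 2))
wolves = rng.uniform(-5, 5, (20, 6))
for t in range(100):
    wolves = gwo_step(wolves, sphere, a=2 * (1 - t / 100))  # a decays 2 -> 0
print(min(sphere(w) for w in wolves))
```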
... Show MoreArtificial fish swarm algorithm (AFSA) is one of the critical swarm intelligent algorithms. In this
paper, the authors decide to enhance AFSA via diversity operators (AFSA-DO). The diversity operators will
be producing more diverse solutions for AFSA to obtain reasonable resolutions. AFSA-DO has been used to
solve flexible job shop scheduling problems (FJSSP). However, the FJSSP is a significant problem in the
domain of optimization and operation research. Several research papers dealt with methods of solving this
issue, including forms of intelligence of the swarms. In this paper, a set of FJSSP target samples are tested
employing the improved algorithm to confirm its effectiveness and evaluate its ex
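The exact diversity operators of AFSA-DO are not given in this excerpt; the following sketch shows one generic possibility, re-diversifying permutation-encoded schedules whenever two fish become nearly identical. The encoding, distance threshold, and swap move are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

def diversity_operator(swarm, min_dist=2):
    """Re-diversify a swarm of permutation-encoded schedules.

    If two fish encode nearly the same operation order (Hamming distance
    below min_dist), one of them is perturbed by a random swap. This is a
    generic diversity operator in the spirit of AFSA-DO, not the paper's
    exact design.
    """
    n = len(swarm)
    for i in range(n):
        for j in range(i + 1, n):
            if np.sum(swarm[i] != swarm[j]) < min_dist:
                a, b = rng.choice(len(swarm[j]), 2, replace=False)
                swarm[j][a], swarm[j][b] = swarm[j][b], swarm[j][a]
    return swarm

# Each fish is an operation sequence for a toy 6-operation FJSSP instance.
swarm = [rng.permutation(6) for _ in range(8)]
swarm = diversity_operator(swarm)
print(swarm)
```

An operator like this would run between AFSA's standard prey/swarm/follow behaviors, keeping the population from collapsing onto a single schedule.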
In combinatorial testing development, the construction of covering arrays is the key challenge, owing to the multiple aspects that influence it. A wide range of combinatorial problems can be solved using metaheuristic and greedy techniques. Combining the greedy technique with a metaheuristic search technique such as hill climbing (HC) can produce feasible results for combinatorial tests. Metaheuristic-based methods are used to deal with tuples that may be left after redundancy elimination by the greedy strategy; the metaheuristic algorithm then ensures that the resulting test suite is near-optimal. As a result, the use of both greedy and HC algorithms in a single test-generation system is a good candidate if constructed correctly.
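A toy rendering of the greedy-plus-HC idea for pairwise (2-way) coverage, assuming small parameter counts; the candidate pool size, mutation scheme, and final redundancy pass are illustrative choices, not the paper's construction.

```python
import itertools
import random

random.seed(6)

def uncovered(suite, k, v):
    """Pairs of parameter-value assignments not yet covered by the suite."""
    pairs = set()
    for (p1, p2) in itertools.combinations(range(k), 2):
        for (a, b) in itertools.product(range(v), repeat=2):
            pairs.add((p1, a, p2, b))
    for row in suite:
        for (p1, p2) in itertools.combinations(range(k), 2):
            pairs.discard((p1, row[p1], p2, row[p2]))
    return pairs

def greedy_then_hc(k=4, v=3, hc_iters=300):
    """Greedy row generation followed by hill-climbing refinement."""
    suite = []
    while uncovered(suite, k, v):
        # Greedy: among random candidates, keep the row covering most new pairs.
        before = uncovered(suite, k, v)
        cands = [[random.randrange(v) for _ in range(k)] for _ in range(30)]
        suite.append(max(cands,
            key=lambda r: len(before) - len(uncovered(suite + [r], k, v))))
    # Hill climbing: random cell mutations that preserve full coverage can
    # make some rows redundant, which are then dropped.
    for _ in range(hc_iters):
        i, p = random.randrange(len(suite)), random.randrange(k)
        old = suite[i][p]
        suite[i][p] = random.randrange(v)
        if uncovered(suite, k, v):
            suite[i][p] = old                    # revert harmful move
    for i in reversed(range(len(suite))):
        if not uncovered(suite[:i] + suite[i + 1:], k, v):
            del suite[i]
    return suite

print(len(greedy_then_hc()), "rows cover all 2-way interactions")
```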
In lifetime processes in some systems, most data cannot belong to one single population; in fact, they can represent several subpopulations. In such a case, a single known distribution cannot be used to model the data. Instead, a mixture distribution is used to model the data and classify them into several subgroups. A mixture of Rayleigh distributions is well suited to lifetime processes. This paper aims to infer the model parameters with the expectation-maximization (EM) algorithm through the maximum likelihood function. The technique is applied to simulated data following several scenarios. The accuracy of estimation is examined via the average mean square error (AMSE) and the average classification success rate (ACSR).
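The EM updates for a Rayleigh mixture have closed forms, which the following sketch implements on simulated data; the component count, initialization, and sample sizes are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

def rayleigh_pdf(x, sigma):
    return (x / sigma ** 2) * np.exp(-x ** 2 / (2 * sigma ** 2))

def em_rayleigh_mixture(x, K=2, iters=200):
    """EM for a K-component Rayleigh mixture via maximum likelihood.

    E-step: responsibilities gamma[i, k]. M-step closed forms:
    pi_k = mean_i gamma[i, k],
    sigma_k^2 = sum_i gamma[i, k] * x_i^2 / (2 * sum_i gamma[i, k]).
    """
    pi = np.full(K, 1.0 / K)
    sigma = np.quantile(x, np.linspace(0.25, 0.75, K))  # rough spread of starts
    for _ in range(iters):
        dens = np.stack([p * rayleigh_pdf(x, s) for p, s in zip(pi, sigma)],
                        axis=1)
        gamma = dens / dens.sum(axis=1, keepdims=True)  # E-step
        Nk = gamma.sum(axis=0)
        pi = Nk / len(x)                                 # M-step: weights
        sigma = np.sqrt((gamma * x[:, None] ** 2).sum(axis=0) / (2 * Nk))
    return pi, sigma, gamma.argmax(axis=1)              # weights, scales, labels

# Simulated lifetimes from two subpopulations, as in the simulation study.
x = np.concatenate([rng.rayleigh(1.0, 300), rng.rayleigh(4.0, 200)])
pi, sigma, labels = em_rayleigh_mixture(x)
print(pi, sigma)
```

The `labels` output is the classification into subgroups that a success-rate measure like ACSR would be computed from.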
This paper addresses the problem of segmenting an image into regions representing objects, where each object is segmented by defining the boundary between two regions using connected component labeling (CCL). An efficient segmentation algorithm based on this method is then developed and applied to different kinds of images. The algorithm consists of four steps: first, the image is converted to gray level; second, the gray-level image is converted to a binary image; third, edges are detected using Canny edge detection; and finally, connected component labeling is applied to segment the images. The best segmentation rates obtained are 90% using the developed algorithm, compared with 77% obtained using CCL before enhancement.
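Read literally, the four steps can be chained as below; this sketch uses scikit-image and SciPy as stand-ins for whatever implementation the paper used, and the threshold value and toy image are assumptions.

```python
import numpy as np
from scipy import ndimage
from skimage import color, feature

def segment(image_rgb, threshold=0.5):
    """Four-step pipeline following the algorithm described above:
    gray level -> binary -> Canny edges -> connected component labeling."""
    gray = color.rgb2gray(image_rgb)        # step 1: gray-level conversion
    binary = gray > threshold               # step 2: binary image
    edges = feature.canny(gray)             # step 3: Canny edge detection
    regions = binary & ~edges               # edge pixels split touching objects
    labels, n = ndimage.label(regions)      # step 4: CCL segmentation
    return labels, n

# A toy image with two bright squares; each should receive its own label.
img = np.zeros((40, 40, 3))
img[5:15, 5:15] = 1.0
img[22:32, 20:30] = 1.0
labels, n = segment(img)
print("objects found:", n)
```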