Generally, sending secret information over a transmission channel or any other carrier medium is not secure, which is why information-hiding techniques are needed and steganography must take place before transmission. In this paper, a developed particle swarm optimization algorithm (Dev.-PSO) is used to embed a secret message at optimal positions of the cover image in the spatial domain using Least Significant Bit (LSB) substitution. The main aim of the Dev.-PSO algorithm is to determine optimal paths to the required goals in the specified search space; the algorithm produces these paths efficiently and quickly. A population of agents is used to locate the required goals in the search space. The Dev.-PSO algorithm is applied to three different images, and the Peak Signal to Noise Ratio (PSNR) is computed for each. The PSNR of stego-A, obtained from the blue color sub-band, is 44.87 dB, that of stego-B is 44.45 dB, and that of stego-C is 43.97 dB, while the MSE values obtained from the same color sub-band are 0.00989 for stego-A, 0.01869 for stego-B, and 0.02041 for stego-C. Furthermore, the proposed method preserves the quality of the stego image before and after the hiding stage and under the attacks considered in this paper, such as Gaussian noise and salt & pepper noise.
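The Dev.-PSO search itself is not reproduced here; the following minimal Python sketch only illustrates the LSB-substitution step at a given list of embedding positions in the blue sub-band, together with the PSNR measure reported above. The function names and the way positions are supplied are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def embed_lsb(cover: np.ndarray, message_bits, positions) -> np.ndarray:
    """Embed one message bit per (row, col) position in the blue-channel LSB of a color image."""
    stego = cover.copy()
    for bit, (r, c) in zip(message_bits, positions):
        pixel = int(stego[r, c, 2])           # blue sub-band, as reported in the abstract
        stego[r, c, 2] = (pixel & ~1) | bit   # clear the LSB, then set it to the message bit
    return stego

def psnr(cover: np.ndarray, stego: np.ndarray) -> float:
    """Peak Signal to Noise Ratio between cover and stego images (8-bit pixels)."""
    mse = np.mean((cover.astype(np.float64) - stego.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)
```

In the paper the embedding positions would come from the Dev.-PSO search; here they are supplied directly, e.g. `embed_lsb(img, [1, 0, 1], [(10, 12), (40, 7), (5, 99)])`.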
The assessment of data quality from different sources is a key challenge in supporting effective geospatial data integration and promoting collaboration in mapping projects. This paper presents a methodology for assessing the positional and shape quality of authoritative large-scale data, such as Ordnance Survey (OS) UK data and General Directorate for Survey (GDS) Iraq data, and of Volunteered Geographic Information (VGI), such as OpenStreetMap (OSM) data, with the intention of assessing possible integration. It is based on the measurement of discrepancies among the datasets, addressing positional accuracy and shape fidelity using standard procedures and also directional statistics. Line feature comparison has been und…
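The abstract does not spell out the directional-statistics procedure; below is a minimal sketch of two common ingredients, a positional RMSE between matched points and the circular mean of bearing differences between corresponding line segments. The function names and the assumption that features are already matched one-to-one are illustrative, not the paper's method.

```python
import numpy as np

def positional_rmse(points_a: np.ndarray, points_b: np.ndarray) -> float:
    """Root-mean-square distance between corresponding points of two datasets, shape (n, 2)."""
    return float(np.sqrt(np.mean(np.sum((points_a - points_b) ** 2, axis=1))))

def segment_bearings(coords: np.ndarray) -> np.ndarray:
    """Bearings (radians) of consecutive segments of a polyline given as an (n, 2) array of x, y."""
    d = np.diff(coords, axis=0)
    return np.arctan2(d[:, 1], d[:, 0])

def circular_mean_difference(bearings_a: np.ndarray, bearings_b: np.ndarray) -> float:
    """Mean angular discrepancy between matched segment bearings, via the circular mean."""
    diff = bearings_a - bearings_b
    return float(np.arctan2(np.mean(np.sin(diff)), np.mean(np.cos(diff))))
```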
The study aims to examine the geographical distribution of electric power plants in Iraq, excluding the governorates of the Kurdistan Region (Dohuk, Erbil, Sulaymaniyah) due to lack of data.
To reach the goal of the research, a number of mathematical equations and statistical methods were used to determine the geographical distribution of these stations (gas, hydropower, steam, diesel) within the provinces and their degree of concentration, as well as the possibility of classifying power plants in Iraq, so that the distribution can be understood in an objective, scientific manner.
The most important result of the research is that a number of factors led to the irregular distribution…
In this study, a linear synchronous machine is compared with a linear transverse flux machine. Both machines have been designed and built with the intention of being used as the power take off in a free piston engine. As both topologies are cylindrical, it is not possible to construct either using just flat laminations and so alternative methods are described and demonstrated. Despite the difference in topology and specification, the machines are compared on a common base in terms of rated force and suitability for use as a generator. Experience gained during the manufacture of two prototypes is described.
Background: Alterations in microhardness and roughness are commonly used to analyze the possible negative effects of bleaching products on restorative materials. This in vitro study evaluated the effect of in-office bleaching (SDI Pola Office+) on the surface roughness and microhardness of four newly developed composite materials (Z350XT, nano-filled; Z250XT, nano-hybrid; Z250, micro-hybrid; and Silorane, silorane-based). Materials and methods: Eighty circular samples with A3 shading were prepared using a Teflon mold 2 mm in thickness and 10 mm in diameter: 20 samples for each material, with 10 samples for baseline measurement (surface roughness using a portable profilometer, and microhardness using a Digital Micro Vickers Hardness Test…
This research paper includes the incorporation of Alliin with Metformin at various energy levels and angles using Gaussian 09 and GaussView 6. Two computers were used in this work. Samples were generated to draw, integrate, simulate, and measure the value of the potential energy surface, by means of which the lowest energy value was found to be -1227.408 au. The best correlation between Alliin and Metformin was achieved through the low energy values, where the best place for Metformin to b…
Digital image manipulation has become increasingly prevalent due to the widespread availability of sophisticated image editing tools. In copy-move forgery, a portion of an image is copied and pasted into another area within the same image. The proposed methodology begins by extracting features from the image with the Local Binary Pattern (LBP) algorithm. Two main statistical functions, Standard Deviation (STD) and Angular Second Moment (ASM), are computed for each LBP feature, capturing additional statistical information about the local textures. Next, a multi-level LBP feature selection is applied to select the most relevant features. This process involves performing the LBP computation at multiple scales or levels, capturing textures at different scales…
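A minimal sketch of the per-block statistics described above, assuming scikit-image's `local_binary_pattern` and NumPy, is given below; the block size, radius, and number of neighbors are hypothetical choices, not the paper's parameters.

```python
import numpy as np
from skimage.feature import local_binary_pattern

def block_lbp_stats(gray: np.ndarray, block: int = 16, P: int = 8, R: float = 1.0) -> np.ndarray:
    """Return an array of (STD, ASM) pairs, one per non-overlapping block of a grayscale image."""
    lbp = local_binary_pattern(gray, P, R, method="uniform")
    n_bins = P + 2                                   # number of 'uniform' LBP codes
    feats = []
    for r in range(0, gray.shape[0] - block + 1, block):
        for c in range(0, gray.shape[1] - block + 1, block):
            patch = lbp[r:r + block, c:c + block]
            hist, _ = np.histogram(patch, bins=n_bins, range=(0, n_bins), density=True)
            std = patch.std()                        # Standard Deviation of the LBP codes
            asm = float(np.sum(hist ** 2))           # Angular Second Moment of the LBP histogram
            feats.append((std, asm))
    return np.array(feats)
```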
Data-centric techniques, such as data aggregation via a modified fuzzy clustering algorithm combined with a Voronoi diagram, called the modified Voronoi Fuzzy Clustering Algorithm (VFCA), are presented in this paper. In the modified algorithm, the sensed area is divided into a number of Voronoi cells by applying the Voronoi diagram, and these cells are clustered by the fuzzy C-means (FCM) method to reduce the transmission distance. An appropriate cluster head (CH) is then elected for each cluster. Three parameters are used in this election process: the energy, the distance between the CH and its neighboring sensors, and the packet loss value. Furthermore, data aggregation is employed in each CH to reduce the amount of data transmission, which le…
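The abstract lists the three election criteria but not how they are combined; a minimal sketch of one plausible weighted scoring of candidates within a cluster is shown below. The weights, normalization, and function name are hypothetical illustrations, not the paper's election rule.

```python
import numpy as np

def elect_cluster_head(energy, dist_to_members, packet_loss, w=(0.5, 0.3, 0.2)) -> int:
    """Pick the member index with the best weighted score: high residual energy,
    low average distance to the other cluster members, and low packet loss.
    The weights w are a hypothetical choice, not values from the paper."""
    energy = np.asarray(energy, dtype=float)
    dist = np.asarray(dist_to_members, dtype=float)
    loss = np.asarray(packet_loss, dtype=float)
    # normalize each criterion to [0, 1] so they are comparable
    norm = lambda x: (x - x.min()) / (x.max() - x.min() + 1e-12)
    score = w[0] * norm(energy) - w[1] * norm(dist) - w[2] * norm(loss)
    return int(np.argmax(score))
```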
String matching is seen as one of the essential problems in computer science, and a variety of computer applications provide a string matching service to their end users. The remarkable growth in the amount of data created and stored by modern computational devices motivates researchers to develop ever more powerful methods for coping with this problem. In this research, the Quick Search string matching algorithm is adapted to run in a multi-core environment using OpenMP directives, which can be employed to reduce the overall execution time of the program. English text, protein, and DNA data types are utilized to examine the effect of parallelization and implementation of the Quick Search string matching algorithm on multi-core…
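The paper parallelizes Quick Search with OpenMP in a multi-core environment; the sketch below shows the Quick Search bad-character shift itself, with Python's `multiprocessing` standing in for OpenMP by splitting the text into overlapping chunks. The chunking scheme and worker count are illustrative assumptions, not the paper's implementation.

```python
from multiprocessing import Pool

def quick_search(text: str, pattern: str, offset: int = 0) -> list:
    """Quick Search: after each attempt, shift by the bad-character rule of the
    character just past the current window."""
    m, n = len(pattern), len(text)
    shift = {c: m - i for i, c in enumerate(pattern)}    # rightmost occurrence wins
    matches, j = [], 0
    while j <= n - m:
        if text[j:j + m] == pattern:
            matches.append(offset + j)
        if j + m >= n:
            break
        j += shift.get(text[j + m], m + 1)               # default shift is m + 1
    return matches

def parallel_search(text: str, pattern: str, workers: int = 4) -> list:
    """Split the text into overlapping chunks and search them in parallel."""
    m, n = len(pattern), len(text)
    step = max(1, n // workers)
    tasks = [(text[s:min(n, s + step + m - 1)], pattern, s) for s in range(0, n, step)]
    with Pool(workers) as pool:
        results = pool.starmap(quick_search, tasks)
    return sorted(set(i for part in results for i in part))

if __name__ == "__main__":
    print(parallel_search("GATTACAGATTACA", "ATTAC"))    # -> [1, 8]
```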
Construction contractors usually undertake multiple construction projects simultaneously. Such a situation involves sharing different types of resources, including monetary, equipment, and manpower resources, which may become a major challenge in many cases. In this study, the financial aspects of working on multiple projects at a time are addressed and investigated. The study considers dealing with financial shortages by proposing a multi-project scheduling optimization model that maximizes profit while minimizing the total project duration. A genetic algorithm and finance-based scheduling are used to produce feasible schedules that balance the finance of activities at any time w…
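The abstract does not give the objective function; the sketch below only illustrates how a genetic-algorithm fitness could combine total profit, makespan, and a finance (credit-limit) constraint. All names, weights, and the simple even-spread cash model are hypothetical, not the paper's formulation.

```python
def fitness(schedule, activities, credit_limit,
            w_profit=1.0, w_duration=0.1, w_penalty=10.0) -> float:
    """Fitness of one candidate schedule: reward total profit, discourage a long makespan,
    and penalize cumulative cash outflow beyond the available credit limit.
    schedule: {activity_id: start_period}; activities: {activity_id: (duration, cost, revenue)}."""
    makespan = max(start + activities[a][0] for a, start in schedule.items())
    profit = sum(rev - cost for _, cost, rev in activities.values())
    outflow = [0.0] * makespan                        # cost spread evenly over each activity
    for a, start in schedule.items():
        dur, cost, _ = activities[a]
        for t in range(start, start + dur):
            outflow[t] += cost / dur
    cumulative, overdraft = 0.0, 0.0
    for spend in outflow:
        cumulative += spend
        overdraft = max(overdraft, cumulative - credit_limit)
    return w_profit * profit - w_duration * makespan - w_penalty * max(0.0, overdraft)

# tiny illustrative call with two activities and a 250-unit credit limit
acts = {"A": (2, 100.0, 150.0), "B": (3, 200.0, 260.0)}
print(fitness({"A": 0, "B": 1}, acts, credit_limit=250.0))
```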