One of the most difficult issues in the history of communication technology is the secure transmission of images. Photos are shared over the internet by millions of individuals for both private and business purposes. One way to achieve safe image transfer over the network is to use encryption methods that transform the original image into an unintelligible, scrambled version. Cryptographic approaches based on chaotic logistic theory offer several new and promising options for developing secure image encryption methods. The main aim of this paper is to build a secure system for encrypting gray and color images. The proposed system consists of two stages: an encryption stage, in which the keys are generated from the chaotic logistic map together with the image density to encrypt gray and color images, and a decryption stage, which reverses the encryption process to recover the original image. The proposed method has been tested on publicly available standard gray and color images. The test results show that the highest values of peak signal-to-noise ratio (PSNR), unified average changing intensity (UACI), and number of pixel change rate (NPCR) are 7.7268, 50.2011, and 100, respectively, while the encryption and decryption times are up to 0.6319 and 0.5305 seconds, respectively.
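The abstract does not give the exact key-generation scheme, so the following Python sketch only illustrates logistic-map keystream encryption in general, assuming the standard map x(n+1) = r * x(n) * (1 - x(n)) and a simple XOR of the keystream with the pixel bytes; the density-dependent seeding described in the paper is not reproduced, and the seed and r values are placeholders. The NPCR and UACI helpers follow their usual textbook definitions.

import numpy as np

def logistic_keystream(seed, r, length):
    """Byte keystream from the logistic map x(n+1) = r * x * (1 - x); seed in (0, 1), r near 4 for chaotic behaviour."""
    x = seed
    ks = np.empty(length, dtype=np.uint8)
    for i in range(length):
        x = r * x * (1.0 - x)
        ks[i] = int(x * 256) % 256      # quantize the chaotic value to one byte
    return ks

def xor_encrypt(image, seed=0.3141, r=3.9999):
    """Encrypt (or decrypt) a grayscale image by XOR with the keystream; XOR is its own inverse."""
    flat = image.astype(np.uint8).ravel()
    ks = logistic_keystream(seed, r, flat.size)
    return (flat ^ ks).reshape(image.shape)

def npcr(c1, c2):
    """Number of pixel change rate between two cipher images, in percent."""
    return 100.0 * np.mean(c1 != c2)

def uaci(c1, c2):
    """Unified average changing intensity between two cipher images, in percent."""
    return 100.0 * np.mean(np.abs(c1.astype(int) - c2.astype(int)) / 255.0)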
Compaction of triticale grain with three moisture contents (8%, 12%, and 16% wet basis) was measured at five applied pressures (0, 7, 14, 34, and 55 kPa). Bulk density increased with increasing pressure for all moisture contents and was significantly (p < 0.0001) dependent on both moisture content and applied pressure. A Verhulst logistic equation was found to model the changes in bulk density of triticale grain with an R² of 0.986. The model showed similar beha…
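The abstract does not state the fitted equation explicitly; one common Verhulst-type parameterization of bulk density \rho as a function of applied pressure P, written purely for illustration (\rho_0, \rho_{\max}, and k are assumed symbols, not the paper's fitted coefficients), is

\rho(P) = \frac{\rho_{\max}}{1 + \left(\rho_{\max}/\rho_0 - 1\right) e^{-kP}},

where \rho_0 is the zero-pressure (loose) bulk density, \rho_{\max} the asymptotic compacted density, and k a pressure rate constant.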
Earth’s climate is changing rapidly due to increasing human demands and rapid economic growth. These changes will affect the entire biosphere, mostly in negative ways. Predicting future changes puts us in a better position to minimize their catastrophic effects and to understand beforehand how humans can cope with them. In this research, global climate observations from 1961-1990 were used to predict the future climate change scenario for 2010-2039. The data were processed with Idrisi Andes software, and the final Köppen-Geiger map was created with ArcGIS software. Based on the Köppen climate classification, it was found that the areas of the Equatorial, Arid Steppe, and Snow classes will decrease by 3.9%, 2.96%, an…
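As a rough illustration of how a Köppen-Geiger class can be assigned to a grid cell, the Python sketch below maps twelve monthly mean temperatures to the main Köppen groups; it is a deliberately simplified assumption (the arid group B, which also requires precipitation and an aridity threshold, is omitted) and not the exact rule set the authors applied in Idrisi Andes and ArcGIS.

def koppen_main_group(monthly_temp_c):
    """Simplified main-group assignment from 12 monthly mean temperatures in deg C; the arid group (B) is omitted because it also needs precipitation."""
    t_min, t_max = min(monthly_temp_c), max(monthly_temp_c)
    if t_max < 10:
        return "E"      # polar
    if t_min >= 18:
        return "A"      # equatorial / tropical
    if t_min > -3:      # some variants use 0 deg C instead of -3 deg C
        return "C"      # warm temperate
    return "D"          # snow / continental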
The aim of this study was to evaluate the effect of the bone density value in Hounsfield units derived from cone beam computed tomography (CBCT) and of implant dimensions on implant stability parameters, namely the resonance frequency analysis and the insertion torque (IT) value. It included 24 patients who received 42 dental implants (DI). The bone density of the planned implant site was measured preoperatively using cone beam computed tomography. Implant stability was measured using the Osstell implant stability quotient (ISQ). The ISQ values were recorded immediately postoperatively and after 16 weeks. The IT value was categorized as ≤ 35 N/cm or > 35 N/cm. The mean (standard deviation) primary stability was 79.58 (5.27) ISQ, …
The main intention of this study was to develop a new optimization technique based on the differential evolution (DE) algorithm for linear frequency modulation radar signal de-noising. As the standard DE algorithm is a fixed-length optimizer, it is not suitable for solving signal de-noising problems that call for variability. A modified crossover scheme called rand-length crossover was designed to fit the proposed variable-length DE, and the new algorithm is referred to as the random variable-length crossover differential evolution (rvlx-DE) algorithm. The measurement results demonstrate a highly efficient capability for target detection in terms of frequency response and peak forming that was isola…
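The rand-length crossover operator itself is not specified in the abstract, so the Python sketch below only illustrates the general idea: a standard DE/rand/1 mutation plus a hypothetical variable-length binomial crossover that draws a random trial length. The function names, length limits, and the way genes are recycled from shorter vectors are assumptions for illustration, not the authors' rvlx-DE formulation.

import numpy as np

rng = np.random.default_rng(0)

def de_rand1_mutation(pop, i, F=0.5):
    """Standard DE/rand/1 mutation: v = x_r1 + F * (x_r2 - x_r3), truncated to the shortest donor."""
    idx = rng.choice([k for k in range(len(pop)) if k != i], 3, replace=False)
    r1, r2, r3 = (pop[k] for k in idx)
    m = min(len(r1), len(r2), len(r3))      # align lengths before arithmetic
    return r1[:m] + F * (r2[:m] - r3[:m])

def rand_length_crossover(target, mutant, CR=0.9, min_len=4, max_len=64):
    """Hypothetical variable-length crossover: pick a random trial length, then mix genes from target and mutant with probability CR."""
    L = int(rng.integers(min_len, max_len + 1))
    trial = np.empty(L)
    for j in range(L):
        t = target[j % len(target)]
        v = mutant[j % len(mutant)]
        trial[j] = v if rng.random() < CR else t
    return trial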
One wide-ranging category of open source data is that of geospatial information web sites. Despite the advantages of such open source data, including ease of access and zero cost, its quality is a potential issue. This article tests the horizontal positional accuracy and the possible integration of four web-derived geospatial datasets: OpenStreetMap (OSM), Google Map, Google Earth, and Wikimapia. The evaluation was carried out by comparing the tested data with reference field survey data for fifty road intersections in Baghdad, Iraq. The results indicate that the free geospatial data can be used to enhance authoritative maps, especially small-scale maps.
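The abstract does not list the specific accuracy statistics used; a common measure of horizontal positional accuracy is the root-mean-square error of the coordinate differences against the field-survey reference, sketched below in Python under the assumption that both point sets are in the same projected coordinate system (metres).

import numpy as np

def horizontal_rmse(test_xy, ref_xy):
    """Root-mean-square horizontal error between tested points and reference survey coordinates (N x 2 arrays of easting, northing)."""
    test_xy = np.asarray(test_xy, dtype=float)
    ref_xy = np.asarray(ref_xy, dtype=float)
    d = np.hypot(test_xy[:, 0] - ref_xy[:, 0], test_xy[:, 1] - ref_xy[:, 1])
    return np.sqrt(np.mean(d ** 2))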
The protection of the environment is a shared responsibility among several authorities and sectors, and it constitutes a main subject through which sustainable development can be achieved. Within the government sector, programs can be set up to steer government institutions toward a green environment, so that the implementati…
Geophysical data interpretation is crucial for characterizing subsurface structure. The analysis of the Bouguer gravity map of the W-NW region of Iraq serves as the basis for the current geophysical research. The Bouguer gravity data were processed using the Power Spectrum Analysis (PSA) method. Four depth slices were obtained after the PSA process, at depths of 390 m, 1300 m, 3040 m, and 12600 m. The gravity anomaly depth maps show that shallow-depth anomalies are mainly related to the sedimentary cover layers and structures, while the gravity anomaly of the deeper 12600 m depth slice corresponds more to the basement rocks and mantle uplift. The 2D modeling technique was used for…
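For context, a common power-spectrum depth-estimation formulation (after Spector and Grant) relates the radially averaged power spectrum P(f) of the gravity field to a mean ensemble source depth h; assuming f is spatial frequency in cycles per distance unit,

\ln P(f) \approx c - 4\pi h f, \qquad h \approx -\frac{\text{slope}}{4\pi},

so each linear segment of the log-spectrum yields one depth slice. Whether the paper uses this convention or the angular-wavenumber form (where h = -slope / 2) is not stated in the abstract.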
Gypseous soil covers approximately 30% of Iraqi land and is widely used as-is in geotechnical and construction engineering. The demand for residential complexes has increased, so one of the significant challenges in studying gypseous soil, given its unique behavior, is understanding its interaction with foundations such as strip and square footings. This is because there is a lack of experiments that provide total displacement diagrams or failure envelopes, which are well established for non-problematic soils. The aim is to develop a comprehensive understanding of the micromechanical properties of dry, saturated, and treated gypseous sandy soils and to analyze the interaction of a strip footing with this type of soil using particle image…
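The truncated sentence appears to refer to a particle-image-based displacement measurement; assuming a basic cross-correlation approach, the Python sketch below estimates the integer-pixel displacement of one interrogation window between two frames. It is a generic illustration, not the authors' processing chain.

import numpy as np
from scipy.signal import fftconvolve

def window_displacement(win_a, win_b):
    """Integer-pixel displacement of same-sized interrogation window win_b relative to win_a via FFT-based cross-correlation."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    corr = fftconvolve(b, a[::-1, ::-1], mode="full")   # cross-correlation of b with a
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    dy, dx = np.array(peak) - (np.array(a.shape) - 1)   # offset from the zero-lag position
    return dx, dy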