The marshlands environment in southern Iraq is unique: it is a habitat for thousands of migratory birds and a source of shelter and livelihood for the thousands of people living there. Its fragile ecosystem requires great care and effort to balance conservation with development, which in turn demands careful environmental planning that accurately regulates environmental resources and plans the best way to use them. The idea of this research is to create a spatial organization for the development of human settlements that takes the environmental aspect into account, producing an environmental vision for the present and the future that contributes to the organization of human settlements and to sustainable spatial development. To achieve this, the research uses Environmental Impact Assessment (EIA) and Geographic Information Systems (GIS) as techniques for analyzing and choosing appropriate zones for development and environmental planning. The research concludes by suggesting environmental sensitivity zones, together with the projects that can appropriately be established in them as evaluated with EIA approaches; the most important recommendation is the need for commitment to the environmental sensitivity zones map for appropriate development.
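The zoning analysis described above is typically implemented as a GIS weighted overlay: criterion rasters are combined into a per-cell sensitivity score, which is then classed into zones. The sketch below illustrates that general technique only; the criterion layers, weights, and class breaks are invented for illustration and are not the paper's actual inputs.

```python
# Hypothetical weighted-overlay sketch of environmental-sensitivity zoning.
# Each raster is a toy 4x4 grid of criterion values normalized to 0..1;
# the weights and zone thresholds below are illustrative assumptions.
import numpy as np

water_proximity = np.array([[0.9, 0.8, 0.2, 0.1],
                            [0.8, 0.7, 0.3, 0.1],
                            [0.3, 0.3, 0.5, 0.6],
                            [0.1, 0.2, 0.6, 0.9]])
habitat_sensitivity = np.array([[0.7, 0.9, 0.4, 0.2],
                                [0.6, 0.8, 0.3, 0.2],
                                [0.4, 0.2, 0.5, 0.7],
                                [0.2, 0.1, 0.7, 0.8]])

# Weighted overlay: combine criteria into one continuous score per cell
weights = {"water": 0.6, "habitat": 0.4}
score = weights["water"] * water_proximity + weights["habitat"] * habitat_sensitivity

# Class the score into three sensitivity zones: 0 = low, 1 = medium, 2 = high
zones = np.digitize(score, bins=[0.33, 0.66])
print(zones)
```

Projects would then be permitted or restricted per zone, with the EIA step applied to candidate projects within each zone.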
In modern technology, ownership of electronic data is the key to securing one's privacy and identity against tracing or interference. A new class of identity management systems, Digital Identity Management, has been implemented in recent years to hold identity data, maintain the holder's privacy, and prevent identity theft. An overwhelming number of users face two major problems: users who own data that third-party applications will handle, and users who have no ownership of their data at all; maintaining these identities is a present-day challenge. This paper proposes a system that solves the problem by applying blockchain technology to Digital Identity Management systems. Blockchain is a powerful technique …
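The core property blockchain brings to identity records is tamper evidence: each block commits to the previous block's hash, so altering an old record invalidates everything after it. The sketch below illustrates that general mechanism only; it is not the paper's actual system, and the record fields are invented.

```python
# A minimal, hypothetical sketch of hash-chained identity records.
# Each block stores an identity record plus the previous block's hash,
# so any later tampering with an old record is detectable.
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 hash of a block's JSON serialization."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_identity(chain, identity):
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"identity": identity, "prev_hash": prev})

def verify(chain):
    """Recompute every link; False if any block was altered after the fact."""
    for i in range(1, len(chain)):
        if chain[i]["prev_hash"] != block_hash(chain[i - 1]):
            return False
    return True

chain = []
add_identity(chain, {"holder": "alice", "doc_digest": "ab12"})
add_identity(chain, {"holder": "bob", "doc_digest": "cd34"})
ok_before = verify(chain)
print(ok_before)  # True

chain[0]["identity"]["holder"] = "mallory"  # tamper with an old record
ok_after = verify(chain)
print(ok_after)  # False
```

A real deployment would add signatures and distributed consensus on top of this chaining; the sketch shows only the integrity property.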
The esterification reaction of ethyl alcohol with acetic acid catalyzed by the ion-exchange resin Amberlyst 15 was investigated. The experimental study was carried out in an isothermal batch reactor. Catalyst loading, initial molar ratio, mixing time, and temperature, the most influential parameters, were studied and discussed extensively. A maximum final conversion of 75% was obtained at 70 °C, an acid-to-ethyl-alcohol mole ratio of 1:2, and a catalyst loading of 10 g. The kinetics of the reaction were correlated with the Langmuir–Hinshelwood model (LHM). The total rate constant and the adsorption equilibrium constant of water were calculated as functions of temperature. The activation energies were found to be 113,876.9 and −49,474.95 kJ per kmol of ac…
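Activation energies of this kind are conventionally extracted from rate constants at several temperatures via the Arrhenius equation, k = A·exp(−Ea/RT), fitted as a straight line in ln k versus 1/T. The sketch below shows that standard procedure with synthetic data; the temperatures and pre-exponential factor are assumptions, not the paper's measurements.

```python
# Hypothetical Arrhenius-fit sketch: ln k = ln A - (Ea/R)(1/T),
# so a linear fit of ln(k) against 1/T recovers Ea from the slope.
# The data below are synthetic, generated from the reported Ea value.
import numpy as np

R = 8.314  # kJ/(kmol*K), consistent with Ea expressed in kJ/kmol

def fit_arrhenius(T, k):
    """Least-squares fit of ln(k) vs 1/T; returns (Ea, A)."""
    slope, intercept = np.polyfit(1.0 / np.asarray(T), np.log(k), 1)
    return -slope * R, np.exp(intercept)

Ea_true, A_true = 113876.9, 1e12          # A is an illustrative assumption
T = np.array([323.0, 333.0, 343.0, 353.0])  # 50-80 degC in kelvin
k = A_true * np.exp(-Ea_true / (R * T))

Ea_fit, A_fit = fit_arrhenius(T, k)
print(round(Ea_fit, 1))  # recovers the Ea used to generate the data
```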
Data mining plays a major role in healthcare by discovering hidden relationships in large datasets, especially in breast cancer diagnostics, breast cancer being one of the most common causes of death worldwide. In this paper two algorithms, decision tree and K-Nearest Neighbour, are applied to diagnose breast cancer grade in order to reduce its risk to patients. For the decision tree with feature selection, the Gini index gives an accuracy of 87.83%, while entropy gives an accuracy of 86.77%. In both cases, Age emerged as the most effective parameter, particularly when Age < 49.5, with Ki67 as the second most effective parameter. Furthermore, K-Nearest Neighbour is based on the minimum …
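The two split criteria being compared, Gini index and entropy, can be made concrete on a single candidate split such as Age < 49.5. The sketch below computes both impurities for a toy set of (Age, grade) rows; the rows are invented for illustration and are not the paper's clinical data.

```python
# Hypothetical sketch of the two decision-tree split criteria:
# Gini impurity and Shannon entropy, evaluated for one candidate split.
import math

def gini(labels):
    """Gini impurity: 1 - sum(p_c^2) over class proportions p_c."""
    n = len(labels)
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def entropy(labels):
    """Shannon entropy: sum of -p_c * log2(p_c)."""
    n = len(labels)
    probs = [labels.count(c) / n for c in set(labels)]
    return sum(-p * math.log2(p) for p in probs)

def split_impurity(rows, threshold, impurity):
    """Weighted impurity of splitting rows on Age < threshold."""
    left = [grade for age, grade in rows if age < threshold]
    right = [grade for age, grade in rows if age >= threshold]
    n = len(rows)
    return (len(left) / n) * impurity(left) + (len(right) / n) * impurity(right)

# Toy (Age, grade) rows, chosen so Age < 49.5 separates the classes
rows = [(35, 1), (42, 1), (47, 1), (52, 0), (61, 0), (70, 0)]
g = split_impurity(rows, 49.5, gini)
e = split_impurity(rows, 49.5, entropy)
print(g, e)  # 0.0 0.0 - a perfectly pure split under both criteria
```

A decision tree learner evaluates many such thresholds and picks the one minimizing the chosen impurity; the two criteria usually, but not always, agree on the best split, which is consistent with the slightly different accuracies reported above.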
In digital images, protecting sensitive visual information against unauthorized access is a critical issue, and robust encryption methods are the best way to preserve such information. This paper introduces a model designed to enhance the performance of the Tiny Encryption Algorithm (TEA) in encrypting images. Two approaches are suggested as a preprocessing step for the image cipher before TEA is applied; this step aims to de-correlate and weaken the relationships between adjacent pixel values in preparation for encryption. The first approach applies an Affine transformation to the image in two layers, using a different key set for each layer. Th…
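A pixel-level affine transformation maps each byte p to (a·p + b) mod 256, which is invertible whenever a is odd (coprime with 256). The sketch below shows that general construction in two layers; the key pairs are illustrative assumptions, since the paper's actual key sets are not given here.

```python
# Hypothetical sketch of two-layer affine preprocessing on pixel bytes:
# p -> (a*p + b) mod 256, with a distinct (a, b) key pair per layer.
# The keys below are invented for illustration.

def affine_encrypt(pixels, a, b):
    assert a % 2 == 1, "a must be odd, i.e. coprime with 256"
    return [(a * p + b) % 256 for p in pixels]

def affine_decrypt(pixels, a, b):
    a_inv = pow(a, -1, 256)  # modular inverse of a mod 256 (Python 3.8+)
    return [(a_inv * (p - b)) % 256 for p in pixels]

plain = [10, 11, 12, 13]                      # a run of correlated pixels
layer1 = affine_encrypt(plain, a=7, b=45)     # first layer, first key pair
layer2 = affine_encrypt(layer1, a=11, b=200)  # second layer, second key pair

# Decryption reverses the layers in the opposite order
restored = affine_decrypt(affine_decrypt(layer2, 11, 200), 7, 45)
print(restored)  # [10, 11, 12, 13]
```

Note that with a ≠ 1 the transform also stretches differences between neighboring pixels modulo 256, which is the de-correlation effect the preprocessing step relies on before TEA is applied.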
Assessing the accuracy of classification algorithms is paramount, as it provides insight into their reliability and effectiveness on real-world problems. Accuracy assessment is essential in any remote-sensing-based classification exercise, since classification maps invariably contain misclassified pixels and classification errors. In this study, two satellite images of Duhok province, Iraq, captured at regular intervals, were analyzed using spatial analysis tools to produce supervised classifications. Several processes, such as smoothing, were applied to enhance the categorization. The classification results indicate that Duhok province divides into four classes: vegetation cover, buildings, …
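The standard accuracy-assessment tool for such a map is the confusion matrix built by comparing classified pixels against reference (ground-truth) samples, from which overall accuracy is read off the diagonal. The sketch below shows that computation; the class names beyond the two given above and the sample labels are invented for illustration.

```python
# Hypothetical accuracy-assessment sketch: confusion matrix and overall
# accuracy for a classified map versus reference pixels. The third and
# fourth class names here are placeholders, as the abstract is truncated.

CLASSES = ["vegetation", "buildings", "class_3", "class_4"]

def confusion_matrix(reference, predicted, classes):
    idx = {c: i for i, c in enumerate(classes)}
    m = [[0] * len(classes) for _ in classes]
    for r, p in zip(reference, predicted):
        m[idx[r]][idx[p]] += 1  # rows: reference, columns: predicted
    return m

def overall_accuracy(matrix):
    correct = sum(matrix[i][i] for i in range(len(matrix)))
    total = sum(sum(row) for row in matrix)
    return correct / total

reference = ["vegetation", "vegetation", "buildings", "class_3", "class_4", "class_3"]
predicted = ["vegetation", "buildings", "buildings", "class_3", "class_4", "class_3"]

m = confusion_matrix(reference, predicted, CLASSES)
acc = overall_accuracy(m)
print(acc)  # 5 of 6 reference pixels classified correctly
```

Per-class producer's and user's accuracies follow from the same matrix by normalizing rows and columns respectively.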
Text-based image clustering (TBIC) is an insufficient approach for clustering related web images, as it is challenging to abstract the visual features of images from textual information alone in a database. In content-based image clustering (CBIC), image data are clustered on the basis of specific features such as texture, color, boundaries, and shape. In this paper, an effective CBIC technique is presented that uses texture and statistical features of the images. The statistical features, or color moments (mean, standard deviation, skewness, kurtosis, and variance), are extracted from the images and collected into a one-dimensional array, and a genetic algorithm (GA) is then applied for image clustering.
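The color-moment feature vector named above can be computed directly from the pixel distribution of each channel. The sketch below extracts the five moments per channel and concatenates them into one 1-D array, as described; the image is random toy data, and the GA clustering stage is not shown.

```python
# Sketch of the statistical (color-moment) feature extraction described
# above: mean, standard deviation, skewness, kurtosis, and variance per
# color channel, flattened into a single 1-D feature vector.
import numpy as np

def color_moments(image):
    """image: H x W x C array; returns a flat vector of 5*C moments."""
    features = []
    for c in range(image.shape[2]):
        channel = image[:, :, c].astype(float).ravel()
        mean = channel.mean()
        std = channel.std()
        var = channel.var()
        centered = channel - mean
        skew = (centered ** 3).mean() / std ** 3 if std > 0 else 0.0
        kurt = (centered ** 4).mean() / std ** 4 if std > 0 else 0.0
        features.extend([mean, std, skew, kurt, var])
    return np.array(features)

rng = np.random.default_rng(1)
img = rng.integers(0, 256, size=(8, 8, 3))  # toy 8x8 RGB image
vec = color_moments(img)
print(vec.shape)  # (15,): 5 moments for each of 3 channels
```

These fixed-length vectors are what the GA then operates on, e.g. by evolving cluster assignments or centroids over the feature space.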
Computer vision seeks to mimic the human visual system and plays an essential role in artificial intelligence. It is built on various signal preprocessing techniques, so developing efficient techniques is essential to achieving fast and reliable processing. Various signal preprocessing operations are used in computer vision, including smoothing, signal analysis, resizing, sharpening, and enhancement, to reduce unwanted distortions and to improve segmentation and image features. For example, to reduce the noise in a disturbed signal, smoothing kernels can be used effectively; this is achieved by convolving the disturbed signal with a smoothing kernel. In addition, orthogonal moments (OMs) are a crucial …
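The smoothing example above can be made concrete with a 1-D signal: convolving a noisy waveform with a normalized box kernel averages out the noise. The signal and kernel width below are illustrative choices, not values from the paper.

```python
# Sketch of noise reduction by convolution with a smoothing kernel:
# a noisy sine wave is convolved with a box (moving-average) kernel.
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 2 * np.pi, 200)
clean = np.sin(t)
noisy = clean + rng.normal(0.0, 0.3, t.size)  # additive Gaussian noise

kernel = np.ones(9) / 9.0                     # box kernel, sums to 1
smoothed = np.convolve(noisy, kernel, mode="same")

# Smoothing should pull the signal back toward the clean waveform
err_noisy = np.mean((noisy - clean) ** 2)
err_smooth = np.mean((smoothed - clean) ** 2)
print(err_smooth < err_noisy)  # True: mean squared error decreases
```

Wider kernels suppress more noise but blur genuine signal features, which is the usual trade-off when choosing a smoothing kernel.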
Abstract
The objective of image fusion is to merge multiple source images in such a way that the final representation contains more useful information than any single input. In this paper, a weighted-average fusion method is proposed. It relies on weights extracted from the source images using the contourlet transform: the approximated (low-pass) transform coefficients are set to zero, and the inverse contourlet transform is then taken to recover the details of the images to be fused. The performance of the proposed algorithm has been verified on several grayscale and color test images and compared with existing methods.
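The weighted-average fusion scheme can be sketched independently of the transform used to derive the weights. Below, a simple high-pass residual (pixel minus local mean) stands in for the contourlet-domain detail measure, purely for illustration; the paper's actual weights come from contourlet coefficients, which have no standard-library implementation.

```python
# Hypothetical weighted-average fusion sketch. A local high-pass
# residual serves as a stand-in "detail" weight; the paper derives its
# weights from the contourlet transform instead.
import numpy as np

def detail_energy(img, k=3):
    """Absolute high-pass residual: |img - local k x k box-filtered img|."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    local = np.zeros_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            local[i, j] = padded[i:i + k, j:j + k].mean()
    return np.abs(img - local) + 1e-8  # epsilon avoids division by zero

def fuse(a, b):
    """Per-pixel weighted average, weights proportional to local detail."""
    wa, wb = detail_energy(a), detail_energy(b)
    return (wa * a + wb * b) / (wa + wb)

rng = np.random.default_rng(3)
a = rng.random((16, 16))
b = rng.random((16, 16))
fused = fuse(a, b)
print(fused.shape)
```

Because the weights are normalized per pixel, the fused value is always a convex combination of the two sources: detail-rich regions of either image dominate the result, which is the behavior the detail-based weighting is designed to produce.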
Abstract
Due to the continuing demand for greater bandwidth, optical transport is becoming common in the access network. With optical fiber technologies, the communications infrastructure becomes powerful, providing very high speeds for transferring large volumes of data. Existing telecommunications infrastructure already makes wide use of the Passive Optical Network; applying Wavelength Division Multiplexing (WDM) to it is expected to play an important role in the future Internet, supporting a wide diversity of services and next-generation networks. This paper presents the design of a WDM-PON network and the simulation and analysis of its transmission parameters in the Optisystem 7.0 environment for bidirectional traffic. The sim…
The Gaussian orthogonal ensemble (GOE) version of random matrix theory (RMT) has been used to study the level density following proton interaction with 44Ca, 48Ti, and 56Fe.
A promising analysis method has been implemented based on the available resonance-spacing data, with the resonance widths described by a Porter–Thomas distribution. The calculated level densities for the compound nuclei 45Sc, 49V, and 57Co show a parity and spin dependence; for 45Sc, this analysis reveals a discrepancy in the level density, probably due to spin misassignment. The present results show acceptable agreement with the combinatorial method of level density.
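The two GOE fingerprints invoked here are standard: nearest-neighbor level spacings follow the Wigner surmise, P(s) = (π/2)·s·exp(−πs²/4), and resonance widths follow the Porter–Thomas distribution, a chi-square distribution with one degree of freedom (widths are squared Gaussian amplitudes). The sketch below samples both numerically from random matrices and Gaussian amplitudes; the matrix sizes and sample counts are illustrative choices, not the paper's analysis.

```python
# Numerical sketch of the two GOE statistics referenced above.
import numpy as np

rng = np.random.default_rng(4)

# Porter-Thomas: widths are squared Gaussian amplitudes, so normalized
# widths x = Gamma / <Gamma> have mean 1 and variance 2 (chi-square, 1 dof)
amplitudes = rng.normal(0.0, 1.0, 100_000)
widths = amplitudes ** 2
x = widths / widths.mean()
print(round(x.mean(), 2), round(x.var(), 2))  # ~1.0, ~2.0

# Wigner surmise: spacings of eigenvalues near the center of the
# spectrum of real symmetric (GOE) matrices, unfolded to unit mean
def goe_matrix(n, rng):
    a = rng.normal(size=(n, n))
    return (a + a.T) / 2.0

spacings = []
for _ in range(200):
    ev = np.sort(np.linalg.eigvalsh(goe_matrix(50, rng)))
    mid = ev[20:30]                 # central part of the spectrum
    s = np.diff(mid)
    spacings.extend(s / s.mean())   # crude unfolding to unit mean spacing
spacings = np.array(spacings)
# The Wigner surmise has variance 4/pi - 1, roughly 0.27
print(round(spacings.var(), 2))
```

The small spacing variance relative to Poisson statistics (variance 1) reflects the level repulsion that underlies GOE-based level-density analyses of resonance data.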