Deep learning has recently received considerable attention as a feasible solution to a variety of artificial intelligence problems. Convolutional neural networks (CNNs) outperform other deep learning architectures, and other machine learning methods, in object detection and recognition applications. Speech recognition, pattern analysis, and image identification all benefit from deep neural networks. When performing operations on noisy images, such as fog removal or low-light enhancement, image processing methods such as filtering or image enhancement are required. This study shows the effect of using a multi-scale deep learning Context Aggregation Network (CAN) as a Bilateral Filtering Approximation (BFA) for de-noising noisy CCTV images. A datastore is used to manage our dataset, which is an object or collection of data too large to fit in memory; it allows reading, managing, and processing data located in multiple files as a single entity. The CAN architecture provides integral deep learning layers such as input, convolution, batch normalization, and Leaky ReLU layers to construct the multi-scale network. It is also possible to add custom layers, such as adaptive normalization with learnable parameters (µ and λ), to the network. The performance of the developed CAN approximation operator on bilateral filtering of noisy images is demonstrated by improving both a noisy reference image and a foggy CCTV image. Three image evaluation metrics (SSIM, NIQE, and PSNR) assess the developed CAN approximation visually and quantitatively by comparing the created de-noised image with the reference image. For the developed CAN de-noised image versus the input noisy image, these evaluation metrics were (0.92673/0.76253, 6.18105/12.1865, and 26.786/20.3254), respectively.
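A minimal sketch (assuming PyTorch; layer widths, depth, and parameter names are illustrative, not the paper's exact configuration) of a CAN-style denoiser: dilated convolutions for multi-scale context aggregation, adaptive normalization as a learnable mix of identity and batch normalization, and Leaky ReLU activations.

```python
import torch
import torch.nn as nn

class AdaptiveNorm(nn.Module):
    """AN(x) = lam * x + mu * BatchNorm(x), with lam and mu learned (hypothetical names)."""
    def __init__(self, channels):
        super().__init__()
        self.lam = nn.Parameter(torch.tensor(1.0))
        self.mu = nn.Parameter(torch.tensor(0.0))
        self.bn = nn.BatchNorm2d(channels)

    def forward(self, x):
        return self.lam * x + self.mu * self.bn(x)

class CANDenoiser(nn.Module):
    def __init__(self, channels=24, depth=5):
        super().__init__()
        layers = [nn.Conv2d(3, channels, 3, padding=1),
                  AdaptiveNorm(channels), nn.LeakyReLU(0.2)]
        for d in range(1, depth):
            dilation = 2 ** d  # exponentially growing receptive field (multi-scale context)
            layers += [nn.Conv2d(channels, channels, 3, padding=dilation, dilation=dilation),
                       AdaptiveNorm(channels), nn.LeakyReLU(0.2)]
        layers += [nn.Conv2d(channels, 3, 1)]  # 1x1 projection back to RGB
        self.net = nn.Sequential(*layers)

    def forward(self, noisy):
        return self.net(noisy)

# Usage sketch: train against bilateral-filtered targets so the network learns
# to approximate the bilateral filtering operator on noisy CCTV frames.
model = CANDenoiser()
denoised = model(torch.rand(1, 3, 256, 256))
```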
The basic objective of the research is to study the quality of the water flow service in the Karbala Sewage Directorate and how to improve it, after identifying deviations in the processes and the final product and then providing possible solutions to address the causes of the deviations and the associated quality gaps. A number of quality tools were used and applied to the data of all stations whose areas and activities relate to the drainage of rainwater. The research community comprises the rainwater lifting stations in the Sewage Directorate of holy Karbala, and the Western station was chosen to apply the purposive (non-random) sampling method after meeting a number of. It is one of the largest and m
The aim of the current research is to construct a scale of emotional adjustment for kindergarten children and to set a standard for its evaluation. To achieve this, a scale consisting of (19) items was prepared. The child's mother answered by adopting the self-report method, expressed in the form of reporting terms: each item represents a situation in the child's life, and each situation has three alternative answers representing various responses to the mentioned situation. One alternative represents the emotionally adaptive response, which is given a score of (3); the second response expresses partial emotional adjustment and is given a score of (2); and the third response expresses the weakness of emot
Wireless sensor applications are susceptible to energy constraints. Most of the energy is consumed in communication between wireless nodes. Clustering and data aggregation are the two widely used strategies for reducing energy usage and increasing the lifetime of wireless sensor networks. In target tracking applications, a large amount of redundant data is produced regularly. Hence, deployment of effective data aggregation schemes is vital to eliminate data redundancy. This work conducts a comparative study of research approaches that employ clustering techniques for efficiently aggregating data in target tracking applications, as selection of an appropriate clustering algorithm may reflect positive results in the data aggregati
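A minimal sketch (hypothetical node IDs and readings) of the cluster-based aggregation idea described above: member nodes report target estimates to their cluster head, which forwards a single averaged packet per cluster instead of one packet per node.

```python
from collections import defaultdict

readings = [  # (cluster_id, node_id, estimated target position) -- illustrative values
    (1, "n1", (10.2, 4.9)), (1, "n2", (10.4, 5.1)), (1, "n3", (10.3, 5.0)),
    (2, "n7", (22.0, 8.8)), (2, "n8", (21.8, 9.0)),
]

by_cluster = defaultdict(list)
for cluster_id, _node, pos in readings:
    by_cluster[cluster_id].append(pos)

# Each cluster head sends one aggregated estimate, eliminating redundant transmissions.
aggregated = {c: (sum(x for x, _ in ps) / len(ps), sum(y for _, y in ps) / len(ps))
              for c, ps in by_cluster.items()}
print(aggregated)  # e.g. {1: (10.3, 5.0), 2: (21.9, 8.9)}
```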
The population has been trying to use clean energy instead of combustion fuels. The choice was to use liquefied petroleum gas (LPG) for domestic use, especially for cooking, due to its advantages as a light gas, a lower cost, and a clean energy source. Residential complexes are supplied with liquefied petroleum gas for each housing unit, transported by pipes from LPG tanks to the equipment. This research aims to simulate the design and performance of the LPG system in a building, applied to a residential complex in Baghdad taken as a case study with eight buildings. The building has 11 floors, and each floor has four apartments. The design in this study has been done in two parts: part one is the design of an LPG system for one building, an
This paper presents the results of experimental investigations to predict the bearing capacity of a square footing on geogrid-reinforced loose sand by performing model tests. The effects of several parameters were studied in order to examine the general behavior of improving the soil using geogrid. These parameters include the eccentricity value, the depth of the first layer of reinforcement, and the vertical spacing of reinforcement layers. The results of the experimental work indicated that there was an optimum reinforcement embedment depth at which the bearing capacity was highest when single-layer reinforcement was used. Increasing (z/B) (vertical spacing of reinforcement layers / width of footing) above 1.5 has no effect on the re
Neural cryptography deals with the problem of “key exchange” between two neural networks using the mutual learning concept. The two networks exchange their outputs (in bits), and the key shared between the two communicating parties is eventually represented in the final learned weights, when the two networks are said to be synchronized. The security of neural synchronization is put at risk if an attacker is capable of synchronizing with either of the two parties during the training process.
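A minimal sketch of the mutual-learning key exchange described above, using the Tree Parity Machine model commonly associated with neural cryptography. Parameter values (K hidden units, N inputs per unit, weight bound L) are illustrative; the synchronized weights would serve as the shared key material.

```python
import numpy as np

K, N, L = 3, 4, 3          # hidden units, inputs per unit, weight bound (illustrative)
rng = np.random.default_rng(0)

def output(w, x):
    sigma = np.sign(np.sum(w * x, axis=1))
    sigma[sigma == 0] = -1                 # break ties
    return sigma, int(np.prod(sigma))

def hebbian(w, x, sigma, tau):
    for i in range(K):
        if sigma[i] == tau:                # only update hidden units agreeing with the output
            w[i] = np.clip(w[i] + x[i] * tau, -L, L)

wA = rng.integers(-L, L + 1, (K, N))       # party A's secret weights
wB = rng.integers(-L, L + 1, (K, N))       # party B's secret weights

while not np.array_equal(wA, wB):
    x = rng.choice([-1, 1], (K, N))        # common public input
    sA, tauA = output(wA, x)
    sB, tauB = output(wB, x)
    if tauA == tauB:                       # outputs exchanged publicly; learn only on agreement
        hebbian(wA, x, sA, tauA)
        hebbian(wB, x, sB, tauB)

print("synchronized key material:", wA.flatten())
```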
Many authors have investigated the problem of the early visibility of the new crescent moon after conjunction and have proposed many criteria addressing this issue in the literature. This article presents a proposed criterion for early crescent moon sighting based on the performance of a deep-learned pattern-recognizer artificial neural network (ANN). Moon sighting datasets were collected from various sources and used to train the ANN. The new criterion relies on the crescent width and the arc of vision from the edge of the crescent's bright limb. The result of the criterion is a control value indicating the moon's visibility condition, which separates the datasets into four regions: invisible, telescope only, probably visible, and certainly visible.
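A minimal sketch (assuming scikit-learn; all feature rows and labels are hypothetical placeholders, not the collected moon-sight data) of the kind of ANN pattern recognizer described above: crescent width and arc of vision as inputs, four visibility regions as output classes.

```python
from sklearn.neural_network import MLPClassifier

# [crescent width (arcmin), arc of vision (deg)] -- illustrative values only
X = [[0.1, 2.0], [0.2, 5.5], [0.4, 8.0], [0.6, 11.0],
     [0.3, 6.5], [0.8, 13.0], [0.15, 3.0], [0.5, 9.5]]
y = ["invisible", "telescope only", "probably visible", "certainly visible",
     "telescope only", "certainly visible", "invisible", "probably visible"]

clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=1)
clf.fit(X, y)

# Predicted visibility region for a new crescent observation
print(clf.predict([[0.45, 9.0]]))
```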
This study was carried out in Baghdad (Al-Jadiriya) in 2006 by investigating the ability of the aquatic reed plant to remove heavy metals (chromium) from wastewater via a batch adsorption process, taking an acidic solution as the best choice for such a process, with a constant initial chromium concentration (60 mg/l), shaking speed (300 rpm), temperature (30 °C), and constant contact time (4 h), but with different weights of adsorbent (reed) (0.5, 1, 2, 3, and 4 g) for each 100 ml sample volume. The results showed that the percentages of removed chromium were (8%, 17.5%, 31%, 40%, and 50%) respectively for each sample, according to the mass of adsorbent.
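A short sketch relating the reported removal percentages to the residual chromium concentration in each 100 ml batch, using the standard batch-adsorption relation removal % = (C0 - Ce) / C0 × 100 with C0 = 60 mg/l (the variable names are ours, not the study's).

```python
C0 = 60.0                                                  # initial Cr concentration, mg/l
batches = {0.5: 8.0, 1: 17.5, 2: 31.0, 3: 40.0, 4: 50.0}   # adsorbent mass (g): removal %

for mass, removal in batches.items():
    Ce = C0 * (1 - removal / 100)                          # residual concentration after 4 h
    print(f"{mass} g reed -> {removal}% removed, Ce = {Ce:.1f} mg/l")
```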
In this paper, a simple, fast, lossless image compression method is introduced for compressing medical images. It is based on integrating multiresolution coding with linear-based polynomial approximation to decompose the image signal, followed by efficient coding. The test results indicate that the suggested method can lead to promising performance due to its flexibility in overcoming the limitations or restrictions on model order length and the extra overhead information required by traditional predictive coding techniques.
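A minimal sketch (not the paper's exact scheme) of the general idea behind linear predictive lossless coding: each pixel is predicted from its causal neighbours and only the integer residual is stored, so the decoder can reverse the prediction and reconstruct the image exactly.

```python
import numpy as np

def encode(img):
    img = img.astype(np.int32)
    pred = np.zeros_like(img)
    pred[1:, 1:] = (img[:-1, 1:] + img[1:, :-1]) // 2   # mean of top and left neighbours
    pred[0, 1:] = img[0, :-1]                            # first row: left neighbour only
    pred[1:, 0] = img[:-1, 0]                            # first column: top neighbour only
    return img - pred                                    # small residuals entropy-code cheaply

def decode(residual):
    out = np.zeros_like(residual)
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            if r == 0 and c == 0:
                p = 0
            elif r == 0:
                p = out[r, c - 1]
            elif c == 0:
                p = out[r - 1, c]
            else:
                p = (out[r - 1, c] + out[r, c - 1]) // 2
            out[r, c] = residual[r, c] + p
    return out

img = np.random.randint(0, 256, (8, 8))
assert np.array_equal(decode(encode(img)), img)          # lossless round trip
```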
In this work, two metaheuristic algorithms were hybridized. The first is the Invasive Weed Optimization algorithm (IWO), a numerical stochastic optimization algorithm, and the second is the Whale Optimization Algorithm (WOA), an algorithm based on swarm and community intelligence. The Invasive Weed Optimization algorithm (IWO) is inspired by nature, specifically by the colonizing behavior of weeds, and was first proposed in 2006 by Mehrabian and Lucas. Due to their strength and adaptability, weeds pose a serious threat to cultivated plants, making them a threat to the cultivation process. The behavior of these weeds has been simulated and used in Invas
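A minimal sketch of the basic IWO loop described above (seed counts proportional to fitness, dispersal with a standard deviation that shrinks over iterations, competitive exclusion), applied to a simple sphere test function. Parameter values are illustrative, and the hybridization with WOA is not shown here.

```python
import numpy as np

def fitness(x):                        # minimisation target (lower is better)
    return np.sum(x ** 2)

rng = np.random.default_rng(1)
dim, pop_max, iters = 5, 20, 200
s_min, s_max = 1, 5                    # seeds per weed (worst .. best)
sigma_init, sigma_final, n = 1.0, 0.01, 3

weeds = rng.uniform(-5, 5, (10, dim))  # initial weed population
for it in range(iters):
    costs = np.array([fitness(w) for w in weeds])
    best, worst = costs.min(), costs.max()
    sigma = ((iters - it) / iters) ** n * (sigma_init - sigma_final) + sigma_final
    seeds = []
    for w, c in zip(weeds, costs):
        ratio = (worst - c) / (worst - best + 1e-12)       # fitter weeds produce more seeds
        n_seeds = int(s_min + ratio * (s_max - s_min))
        for _ in range(n_seeds):
            seeds.append(w + rng.normal(0, sigma, dim))    # spatial dispersal around the parent
    weeds = np.vstack([weeds, seeds])
    # competitive exclusion: only the best pop_max weeds survive to the next generation
    weeds = weeds[np.argsort([fitness(w) for w in weeds])[:pop_max]]

print("best solution:", weeds[0], "cost:", fitness(weeds[0]))
```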