Optimizing access point (AP) deployment plays an important role in wireless applications, given the need to provide efficient communication at low deployment cost. Quality of Service (QoS) is a key objective to be considered alongside AP placement and overall deployment cost. This study proposes and investigates a multi-level optimization algorithm, the Wireless Optimization Algorithm for Indoor Placement (WOAIP), based on Binary Particle Swarm Optimization (BPSO). WOAIP aims to find the optimal multi-floor AP placement with effective coverage, making the deployment better able to support QoS while remaining cost-effective. Five (coverage, AP-deployment) weight pairs, signal thresholds, and received signal strength (RSS) measurements simulated with the Wireless InSite (WI) software were considered in the case study, comparing the results collected from WI against the existing physical AP deployment of the target building, the Computer Science Department at the University of Baghdad. The performance evaluation shows that WOAIP improves AP placement, raising the wireless coverage ratio to 92.93% compared with 58.5% for the existing AP deployment (a 24.5% coverage enhancement on average).
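As a rough, hypothetical sketch of the BPSO machinery underlying such a placement optimizer (the candidate sites, coverage sets, weights, and swarm parameters below are invented for illustration, not the WOAIP configuration): each bit of a particle marks whether a candidate AP site is used, and the fitness trades coverage against deployment cost.

```python
import math
import random

random.seed(1)

N_SITES = 8                      # hypothetical candidate AP locations
# which grid points each candidate site would cover (toy coverage model)
COVERS = [{0, 1}, {1, 2}, {2, 3}, {3, 4}, {4, 5}, {5, 6}, {6, 7}, {7, 0}]
N_POINTS = 8                     # grid points that need coverage
W_COV, W_COST = 0.8, 0.2         # example (coverage, deployment) weight pair

def fitness(bits):
    """Weighted coverage ratio minus weighted deployment cost."""
    covered = set()
    for i, b in enumerate(bits):
        if b:
            covered |= COVERS[i]
    return W_COV * len(covered) / N_POINTS - W_COST * sum(bits) / N_SITES

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

def bpso(n_particles=10, iters=50, w=0.7, c1=1.4, c2=1.4):
    pos = [[random.randint(0, 1) for _ in range(N_SITES)] for _ in range(n_particles)]
    vel = [[0.0] * N_SITES for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = max(pbest, key=fitness)[:]
    for _ in range(iters):
        for p in range(n_particles):
            for d in range(N_SITES):
                r1, r2 = random.random(), random.random()
                vel[p][d] = (w * vel[p][d]
                             + c1 * r1 * (pbest[p][d] - pos[p][d])
                             + c2 * r2 * (gbest[d] - pos[p][d]))
                # binary PSO: the sigmoid of velocity gives the bit probability
                pos[p][d] = 1 if random.random() < sigmoid(vel[p][d]) else 0
            if fitness(pos[p]) > fitness(pbest[p]):
                pbest[p] = pos[p][:]
                if fitness(pbest[p]) > fitness(gbest):
                    gbest = pbest[p][:]
    return gbest

best = bpso()
print(best, round(fitness(best), 3))
```

The sigmoid transfer function is the standard way BPSO maps a real-valued velocity to a bit-flip probability; a real placement optimizer would replace the toy coverage sets with an RSS propagation model.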
This study analyzes the migration flows of individuals between Iraqi governorates using real anonymized data from the Korek Telecom company in Iraq. The purpose of the analysis is to understand the connection structure and attractiveness of these governorates by examining migration flows and population densities; the governorates are then classified according to human migration over a particular period. Mobile phone data in the form of Call Detail Records (CDRs) were observed, covering a six-month period during the COVID-19 pandemic in 2020-2021. Given the nature of the CDRs, two well-known spatiotemporal models, the radiation model and the gravity model, were applied to analyze these data, and they turned out to be comp…
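For illustration, the gravity model mentioned above predicts the flow between two regions as proportional to the product of their populations divided by a power of their distance. The populations, distance, and fitted constants below are made-up values, not figures from the study:

```python
# Toy gravity-model flow estimate: T_ij = k * P_i * P_j / d_ij**beta,
# where k and beta would normally be fitted to the observed CDR flows.

def gravity_flow(pop_i, pop_j, dist_km, k=1e-6, beta=2.0):
    """Estimated flow between two regions (k, beta assumed, not fitted)."""
    return k * pop_i * pop_j / dist_km ** beta

# Hypothetical governorate populations and pairwise distance.
pop_a, pop_b = 8_000_000, 1_800_000
flow = gravity_flow(pop_a, pop_b, 350.0)
print(round(flow))  # → 118
```

Larger populations and shorter distances yield larger predicted flows; the radiation model replaces the distance term with the population living within the origin-destination radius.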
The main limitations of wireless sensor nodes are power, computational capability, and memory. This paper suggests a method to reduce the power consumed by a sensor node. The work is based on an analogy between the routing problem and the distribution of an electrical field in a physical medium with a given charge density. From this analogy a set of partial differential equations (Poisson's equation) is obtained, and a finite difference method is used to solve it numerically. A parallel implementation is then presented, based on domain decomposition: the original computational domain is decomposed into several blocks, each of which is assigned to a processing element. All nodes then execute computations in parallel…
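A minimal sketch of the serial numerical kernel described above: a Jacobi finite-difference solve of Poisson's equation on a small 2-D grid. The grid size, charge placement, and sweep count are illustrative; the parallel version would split the rows of this grid across processing elements:

```python
# Jacobi iteration for -laplacian(u) = rho with zero Dirichlet boundary.
# A single unit "charge" (e.g. the sink node in the routing analogy) sits
# at the centre; the resulting potential field guides packet routing.

N = 16                        # interior grid points per side (assumed)
h = 1.0 / (N + 1)             # grid spacing on the unit square
rho = [[0.0] * N for _ in range(N)]
rho[N // 2][N // 2] = 1.0     # charge density

u = [[0.0] * N for _ in range(N)]
for _ in range(500):          # fixed sweep count; use a tolerance in practice
    new = [[0.0] * N for _ in range(N)]
    for i in range(N):
        for j in range(N):
            up    = u[i - 1][j] if i > 0 else 0.0
            down  = u[i + 1][j] if i < N - 1 else 0.0
            left  = u[i][j - 1] if j > 0 else 0.0
            right = u[i][j + 1] if j < N - 1 else 0.0
            # five-point stencil update for the Poisson equation
            new[i][j] = 0.25 * (up + down + left + right + h * h * rho[i][j])
    u = new

centre = u[N // 2][N // 2]    # potential peaks at the charge location
print(round(centre, 6))
```

In a domain-decomposed implementation each block performs this update on its own rows and exchanges only its boundary rows with neighbouring processing elements per sweep.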
In this study, the activity concentrations of indoor radon, thoron, and their progeny were measured in air at 61 different locations in the city of Al-Maddan using twin-cup dosimeters. Furthermore, some useful parameters concerning health hazards were estimated: the working level month (WLM), the annual effective dose (Eff), and the excess lung cancer per million persons per year (ELC). The results show that radon gas levels in the investigated districts varied from 56.28 to 194.43 Bq/m³ with an overall average of 132.96 Bq/m³, while WLM ranged from 0.313 to 1.085 with an overall average of 0.740. Eff and ELC were found to vary from 1.420 to 4.918 mSv/y with an overall average value…
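As a back-of-the-envelope check, the standard UNSCEAR-style dose relation approximately reproduces the reported Eff range from the reported concentration range. The equilibrium factor and annual indoor occupancy below are typical assumed values, not constants taken from the study:

```python
# Eff = C * F * T * DCF, the conventional conversion from indoor radon
# concentration C (Bq/m^3) to annual effective dose (mSv/y).

F = 0.4        # assumed radon/progeny equilibrium factor
T = 7000.0     # assumed annual indoor occupancy (hours)
DCF = 9e-6     # UNSCEAR dose conversion factor, mSv per (Bq h m^-3)

def annual_effective_dose(c_rn):
    """Annual effective dose Eff (mSv/y) for a radon concentration in Bq/m^3."""
    return c_rn * F * T * DCF

# The reported concentration extremes map close to the reported Eff range:
for c in (56.28, 132.96, 194.43):
    print(c, round(annual_effective_dose(c), 3))
```

The WLM figures follow from the same concentration via the working-level conversion (EEC/3700 Bq m⁻³ per WL, accumulated over occupancy hours divided by 170), with the equilibrium factor chosen to match the survey's convention.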
Copper and zinc powders with different particle sizes were sieved in the range of 20–100 μm, and a He-Ne laser system was used to determine the particle size. 1 wt% of each powder was blended carefully with 99 wt% Iraqi oil. Microscopic examination was carried out on all samples to reveal the particle size distribution. XRF intensity measurements were conducted for all suspended samples, and the relation between XRF intensity and particle size was established.
This study was undertaken to diagnose recurring settling problems within a third-party oil and gas company's Mono-Ethylene Glycol (MEG) regeneration system. Two primary issues were identified: (a) small particle size (<40 μm), resulting in poor settlement within the high-viscosity MEG solution, and (b) exposure to hydrocarbon condensate, which modifies particle surface properties by oil-wetting the particle surface. Analysis of the settlement behavior of oil-wetted quartz and iron carbonate (FeCO₃) found a greater tendency for these particles to remain suspended in the solution and be removed in the rich-MEG effluent stream, or to float strongly and accumulate at the liquid-vapor interface, in comparison with naturally water-wetted particles. As su…
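Stokes' law, the standard relation behind the poor settlement of fine particles in a viscous fluid, makes the first issue quantitative: settling velocity scales with the square of particle diameter and inversely with viscosity. The property values below are illustrative, not measurements from the study:

```python
# Terminal settling velocity of a small sphere in the laminar regime:
# v = g * d^2 * (rho_p - rho_f) / (18 * mu)

def stokes_velocity(d_m, rho_p, rho_f, mu):
    """Settling velocity (m/s); d in m, densities in kg/m^3, mu in Pa*s."""
    g = 9.81
    return g * d_m ** 2 * (rho_p - rho_f) / (18.0 * mu)

# Illustrative values: FeCO3 (~3900 kg/m^3) in a MEG solution
# (~1100 kg/m^3, ~0.01 Pa*s viscosity).
v40 = stokes_velocity(40e-6, 3900, 1100, 0.01)   # 40 um particle
v10 = stokes_velocity(10e-6, 3900, 1100, 0.01)   # 10 um particle
print(v40, v10)   # the 10 um particle settles 16x more slowly
```

Halving the diameter quarters the settling velocity, which is why sub-40 μm solids barely settle in viscous MEG; oil-wetting adds interfacial forces that Stokes' law does not capture at all.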
The main intention of this study was to investigate the development of a new optimization technique, based on the differential evolution (DE) algorithm, for de-noising linear frequency modulation radar signals. Since the standard DE algorithm is a fixed-length optimizer, it is not suitable for signal de-noising problems that call for variable-length solutions. A modified crossover scheme called rand-length crossover was designed to fit the proposed variable-length DE, and the new algorithm is referred to as the random variable-length crossover differential evolution (rvlx-DE) algorithm. The measurement results demonstrate a highly efficient capability for target detection in terms of frequency response and peak forming that was isola…
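For context, one generation of the standard fixed-length DE/rand/1/bin baseline that the paper modifies can be sketched as follows. The rand-length crossover itself is not reproduced here, and a simple sphere function stands in for the de-noising objective:

```python
import random

random.seed(0)
DIM, NP, F, CR = 5, 20, 0.5, 0.9   # dimension, population, scale, crossover rate

def objective(x):
    """Placeholder fitness (sphere function); lower is better."""
    return sum(v * v for v in x)

pop = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(NP)]
for _ in range(200):
    for i in range(NP):
        # mutation: v = a + F * (b - c) over three distinct other members
        a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
        mutant = [a[d] + F * (b[d] - c[d]) for d in range(DIM)]
        # binomial crossover: take mutant genes with probability CR
        jrand = random.randrange(DIM)   # at least one gene from the mutant
        trial = [mutant[d] if (random.random() < CR or d == jrand) else pop[i][d]
                 for d in range(DIM)]
        # greedy selection between parent and trial
        if objective(trial) <= objective(pop[i]):
            pop[i] = trial

best = min(pop, key=objective)
print(round(objective(best), 6))
```

The fixed-length vectors above are exactly the limitation the rvlx-DE variant addresses: its rand-length crossover must recombine parents whose dimensionalities differ.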
Confocal microscope imaging has become popular in biotechnology labs. Confocal imaging technology utilizes fluorescence optics, where laser light is focused onto a specific spot at a defined depth in the sample. A considerable number of images are produced regularly during the course of research, and these images require unbiased quantification methods to support meaningful analysis. Increasing efforts to tie reimbursement to outcomes will likely increase the need for objective data in analyzing confocal microscope images in the coming years. Visual quantification of confocal images with the naked eye is an essential but often underreported outcome measure, due to the time required for manual counting and e…
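A toy example of the kind of unbiased quantification discussed above: counting connected bright regions ("cells") in a thresholded image via flood fill, using a hard-coded intensity grid in place of a real confocal stack:

```python
# Connected-component counting on a thresholded 2-D intensity grid.
# Real pipelines would threshold a fluorescence channel and use a
# library labeller, but the principle is the same.

def count_blobs(img, threshold):
    """Count 4-connected regions of pixels at or above threshold."""
    rows, cols = len(img), len(img[0])
    seen = [[False] * cols for _ in range(rows)]

    def flood(r, c):
        stack = [(r, c)]
        while stack:
            i, j = stack.pop()
            if 0 <= i < rows and 0 <= j < cols and not seen[i][j] \
                    and img[i][j] >= threshold:
                seen[i][j] = True
                stack += [(i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)]

    n = 0
    for r in range(rows):
        for c in range(cols):
            if img[r][c] >= threshold and not seen[r][c]:
                n += 1                  # new, previously unvisited region
                flood(r, c)
    return n

img = [[0, 9, 0, 0],
       [0, 9, 0, 8],
       [0, 0, 0, 8],
       [7, 0, 0, 0]]
print(count_blobs(img, 5))  # → 3
```

Because the threshold and connectivity rule are fixed in code, two analysts running the count get identical numbers, which is the point of replacing naked-eye counting.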
The aim of this work is to design an algorithm that combines steganography and cryptography to hide a text in an image in a way that prevents, as far as possible, any suspicion of the hidden text. The proposed system prepares the image data for the next step (DCT quantization) through a steganographic process and uses two levels of security, the RSA algorithm and a digital signature, before storing the image in JPEG format. In this case, the secret message is treated as plaintext with a digital signature, while the cover is a colour image. The results of the algorithm are then evaluated against several criteria that demonstrate its adequacy and effectiveness. Thus, the proposed algorit…
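As a drastically simplified, hypothetical stand-in for hiding bits among quantized DCT coefficients (the actual DCT step and the RSA and digital-signature layers described above are omitted), one can force each coefficient's parity to match a message bit:

```python
# Toy parity embedding in a list of pre-computed quantized coefficients.
# This is an illustration of the general idea only, not the paper's scheme.

def embed_bits(coeffs, bits):
    """Adjust each coefficient by at most 1 so its parity encodes a bit."""
    out = list(coeffs)
    for k, bit in enumerate(bits):
        c = out[k]
        if abs(c) % 2 != bit:
            c += 1 if c >= 0 else -1   # nudge away from zero, flip parity
        out[k] = c
    return out

def extract_bits(coeffs, n):
    """Read the hidden bits back from coefficient parities."""
    return [abs(coeffs[k]) % 2 for k in range(n)]

quantized = [12, -7, 5, 0, 3, -2, 1, 0]   # example quantized DCT block
msg = [1, 0, 1, 1]
stego = embed_bits(quantized, msg)
print(extract_bits(stego, 4))  # → [1, 0, 1, 1]
```

Each coefficient moves by at most one quantization step, which keeps the visual distortion small; in the full system the bits would come from the RSA-encrypted, signed message rather than raw plaintext.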