Optimizing Access Point (AP) deployment plays a key role in wireless applications because of the need to provide efficient communication at low deployment cost. Quality of Service (QoS) is a significant objective to be considered alongside AP placement and the overall deployment cost. This study proposes and investigates a multi-level optimization algorithm, the Wireless Optimization Algorithm for Indoor Placement (WOAIP), based on Binary Particle Swarm Optimization (BPSO). WOAIP aims to obtain the optimal multi-floor AP placement with effective coverage, making it better able to support QoS and cost-effectiveness. Five pairs of (coverage, AP deployment) weights, signal thresholds, and received signal strength (RSS) measurements simulated using Wireless InSite (WI) software were considered in the case study, comparing the results collected from WI with the present simulated physical AP deployment of the target building, the Computer Science Department at the University of Baghdad. The performance evaluation shows that WOAIP improves AP placement, raising the wireless coverage ratio to 92.93% compared with 58.5% for the present AP deployment (a 24.5% coverage enhancement on average).
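The abstract does not give the WOAIP update rules, but the BPSO core it builds on can be sketched as follows. This is a minimal illustration, not the paper's algorithm: candidate AP sites are a binary vector, and the fitness function (a weighted coverage-minus-cost score over a hypothetical site-coverage matrix) is an assumption standing in for the paper's RSS-based objective.

```python
import numpy as np

def bpso_ap_placement(coverage, cost_per_ap=1.0, w_cov=0.8, w_cost=0.2,
                      n_particles=20, n_iters=50, seed=0):
    """Minimal Binary PSO for choosing a subset of candidate AP sites.

    coverage: (n_sites, n_points) boolean matrix; coverage[i, j] is True when
    candidate site i covers test point j (a stand-in for RSS-threshold data).
    Fitness rewards the fraction of covered points and penalizes AP count.
    """
    rng = np.random.default_rng(seed)
    n_sites, _ = coverage.shape

    def fitness(x):
        covered = coverage[x.astype(bool)].any(axis=0).mean() if x.any() else 0.0
        return w_cov * covered - w_cost * (x.sum() * cost_per_ap / n_sites)

    X = rng.integers(0, 2, (n_particles, n_sites))   # binary positions
    V = rng.uniform(-1, 1, (n_particles, n_sites))   # real-valued velocities
    pbest = X.copy()
    pbest_f = np.array([fitness(x) for x in X])
    gbest = pbest[pbest_f.argmax()].copy()

    for _ in range(n_iters):
        r1, r2 = rng.random(V.shape), rng.random(V.shape)
        V = 0.7 * V + 1.5 * r1 * (pbest - X) + 1.5 * r2 * (gbest - X)
        # Sigmoid transfer: velocity sets the probability of a bit being 1.
        X = (rng.random(V.shape) < 1.0 / (1.0 + np.exp(-V))).astype(int)
        f = np.array([fitness(x) for x in X])
        better = f > pbest_f
        pbest[better], pbest_f[better] = X[better], f[better]
        gbest = pbest[pbest_f.argmax()].copy()

    return gbest, pbest_f.max()
```

In a toy case where only one candidate site covers all test points, the search converges on a placement that includes that site while keeping the AP count low.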
At the level of both individuals and companies, Wireless Sensor Networks (WSNs) have a wide range of applications and uses. Sensors are used in many industries, including agriculture, transportation, health, and more. Many technologies, such as wireless communication protocols, the Internet of Things, cloud computing, mobile computing, and other emerging technologies, are connected to the use of sensors. In many circumstances, this interaction requires the transmission of crucial data, necessitating the protection of that data from potential threats. However, as WSN components often have constrained computation and power capabilities, protecting communication in WSNs comes at a significant performance penalty.
Speech is the essential way for humans to interact with each other or with machines. However, it is always contaminated with different types of environmental noise. Therefore, speech enhancement algorithms (SEA) have emerged as a significant approach in the speech processing field to suppress background noise and recover the original speech signal. In this paper, a new efficient two-stage SEA with low distortion is proposed based on the minimum mean square error sense. The clean signal is estimated by exploiting Laplacian speech and noise modeling based on the coefficient distribution of an orthogonal transform, the Discrete Krawtchouk-Tchebichef transform.
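For readers unfamiliar with MMSE-sense enhancement, the general shape of such an estimator can be sketched as below. This is not the paper's Krawtchouk-Tchebichef-domain, Laplacian-prior estimator; it is a simpler Wiener-type MMSE gain applied per FFT bin, with the noise spectrum estimated from a few leading frames that are assumed speech-free. All frame sizes and the noise-estimation rule are illustrative assumptions.

```python
import numpy as np

def mmse_wiener_enhance(noisy, frame_len=256, hop=128, noise_frames=5):
    """Wiener-type MMSE spectral-gain enhancement (illustrative sketch).

    Assumes the first `noise_frames` frames contain noise only, which is
    how the noise power spectrum is estimated here.
    """
    win = np.hanning(frame_len)
    n = (len(noisy) - frame_len) // hop + 1
    frames = np.stack([noisy[i*hop:i*hop+frame_len] * win for i in range(n)])
    spec = np.fft.rfft(frames, axis=1)
    noise_psd = np.mean(np.abs(spec[:noise_frames])**2, axis=0)
    snr_post = np.abs(spec)**2 / (noise_psd + 1e-12)     # a-posteriori SNR
    xi = np.maximum(snr_post - 1.0, 1e-3)                # a-priori SNR (ML)
    gain = xi / (1.0 + xi)                               # Wiener/MMSE gain
    out_frames = np.fft.irfft(gain * spec, n=frame_len, axis=1)
    # Weighted overlap-add resynthesis.
    out = np.zeros(len(noisy))
    wsum = np.zeros(len(noisy))
    for i, f in enumerate(out_frames):
        out[i*hop:i*hop+frame_len] += f * win
        wsum[i*hop:i*hop+frame_len] += win**2
    return out / np.maximum(wsum, 1e-3)
```

On a synthetic tone buried in white noise, the enhanced signal is measurably closer to the clean signal than the noisy input is, away from the frame edges.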
DeepFake is a concern for celebrities and everyone else because it is simple to create. DeepFake images, especially high-quality ones, are difficult to detect by people, by local descriptors, and by current approaches. On the other hand, detecting manipulation in video is more tractable than in a single image, and many state-of-the-art systems address it; moreover, video manipulation detection ultimately relies on detecting the manipulation frame by frame, as images. Many have worked on DeepFake detection in images, but their methods involve complex mathematical calculations in the preprocessing steps and many limitations, including that the face must be frontal, the eyes must be open, and the mouth should be open with teeth visible.
The issue of image captioning, which comprises automatic text generation to describe an image's visual information, has become feasible with the developments in object recognition and image classification. Deep learning has received much interest from the scientific community and can be very useful in real-world applications. The proposed image captioning approach combines pre-trained Convolutional Neural Network (CNN) models with Long Short-Term Memory (LSTM) networks to generate image captions. The process includes two stages: the first entails training the CNN-LSTM models using baseline hyper-parameters, and the second encompasses training the CNN-LSTM models while optimizing and adjusting those hyper-parameters.
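The CNN-LSTM pairing described above can be sketched as a toy encoder-decoder. This is an illustrative architecture, not the paper's: a small untrained CNN stands in for the pre-trained backbone, its feature vector is fed to the LSTM as the first token, and the LSTM emits word logits over a hypothetical vocabulary.

```python
import torch
import torch.nn as nn

class CaptionNet(nn.Module):
    """Toy CNN encoder + LSTM decoder for image captioning (sketch)."""

    def __init__(self, vocab_size=1000, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.cnn = nn.Sequential(                 # stand-in image encoder
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, embed_dim),
        )
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, vocab_size)

    def forward(self, images, captions):
        feat = self.cnn(images).unsqueeze(1)      # (B, 1, E): image token
        words = self.embed(captions)              # (B, T, E): caption tokens
        seq = torch.cat([feat, words], dim=1)     # image feeds the decoder
        out, _ = self.lstm(seq)
        return self.head(out)                     # (B, T+1, vocab) logits
```

In the second training stage the abstract describes, the hyper-parameters here (embed_dim, hidden_dim, learning rate, etc.) would be the quantities being tuned.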
A spectrophotometric determination of azithromycin was optimized using the simplex model. The approach has proven to be accurate and sensitive. The analyte was reacted with bromothymol blue (BTB) to form a colored ion pair, which was extracted into chloroform in a potassium phthalate buffer medium at pH 4. The extracted colored product was assayed at 415 nm and exhibited a linear quantification range of 1-20 µg/ml. The excipients did not interfere with the proposed approach for assaying azithromycin in pharmaceutical formulations.
In this work, an analytical study is presented for simulating a Fabry-Perot bistable etalon (F-P cavity) filled with a dispersive, optimized nonlinear optical material of the Kerr type, such as the semiconductor indium antimonide (InSb). Because of the trade-off between the etalon finesse values and the driving terms, optimization procedures were carried out on the InSb etalon/CO laser parameters using the critical switching irradiance (Ic) in simulated optimization of the optical cavity. To achieve the minimum switching power and a faster switching time, the effect of the finesse values and driving terms on optical bistability and switching dynamics must be studied.
This paper aims to solve the problem of choosing the most appropriate project from several service projects for the Iraqi Martyrs Foundation, or of ranking them by preference against the targeted criteria. This is done using a Multi-Criteria Decision-Making (MCDM) method, Multi-Objective Optimization by Ratio Analysis (MOORA), to measure the composite performance score each alternative obtains and the maximum benefit accruing to the beneficiary, according to criteria and weights calculated by the Analytic Hierarchy Process (AHP). The most important finding of the research, relying on expert opinion, is to choose the second project as the best alternative and to rank the remaining projects accordingly.
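The AHP-then-MOORA pipeline named above can be sketched with standard formulations of both methods; the pairwise comparison matrix and decision data below are hypothetical, not the paper's.

```python
import numpy as np

def ahp_weights(pairwise):
    """AHP criterion weights: principal eigenvector of the pairwise
    comparison matrix, normalized to sum to 1."""
    vals, vecs = np.linalg.eig(pairwise)
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / w.sum()

def moora_rank(decision, weights, benefit):
    """MOORA ratio system: normalize each criterion column by its Euclidean
    norm, then score = weighted sum of benefit criteria minus cost criteria.
    Returns scores and alternative indices from best to worst."""
    norm = decision / np.linalg.norm(decision, axis=0)
    signs = np.where(benefit, 1.0, -1.0)
    scores = (norm * weights * signs).sum(axis=1)
    return scores, np.argsort(-scores)
```

For example, with two projects scored on a benefit criterion (service quality) and a cost criterion (budget), the project with higher benefit and lower cost ranks first; the AHP step supplies the `weights` vector from expert pairwise judgments.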
This paper describes a new finishing process in which newly made magnetic abrasives are used to effectively finish brass plate, a material that is very difficult to polish by conventional machining processes. The Taguchi experimental design method was adopted to evaluate the effect of the process parameters on the improvement of surface roughness and hardness by magnetic abrasive polishing. The process parameters are the applied current to the inductor, the working gap between the workpiece and the inductor, the rotational speed, and the volume of powder. Analysis of variance (ANOVA) was performed using statistical software to identify the optimal conditions for better surface roughness and hardness. Regression models based on statistical methods were also developed.
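The Taguchi analysis step can be illustrated with the standard signal-to-noise (S/N) ratio and a per-factor main-effects table; the level codes and responses below are hypothetical, not the paper's measurements. Surface roughness is a smaller-the-better response, so its S/N ratio is -10·log10(mean of y²).

```python
import numpy as np

def sn_smaller_better(y):
    """Taguchi S/N ratio for a smaller-the-better response
    (e.g. surface roughness Ra)."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y**2))

def main_effects(levels, sn):
    """Mean S/N ratio at each level of one factor; in a Taguchi analysis
    the level with the highest mean S/N is taken as optimal."""
    levels, sn = np.asarray(levels), np.asarray(sn)
    return {int(lv): sn[levels == lv].mean() for lv in np.unique(levels)}
```

A larger-the-better variant (for hardness) would use -10·log10(mean of 1/y²) instead; ANOVA then apportions the variation in S/N among the four factors.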