Optimizing access point (AP) deployment plays a key role in wireless applications, since efficient communication must be provided at low deployment cost. Quality of Service (QoS) is a significant objective to be considered alongside AP placement and the overall deployment cost. This study proposes and investigates a multi-level optimization algorithm, the Wireless Optimization Algorithm for Indoor Placement (WOAIP), based on Binary Particle Swarm Optimization (BPSO). WOAIP aims to obtain the optimal multi-floor AP placement with effective coverage, making it better able to support QoS and cost-effectiveness. Five pairs of (coverage, AP deployment) weights, signal thresholds and received signal strength (RSS) measurements simulated with Wireless InSite (WI) software were considered in the case study, comparing the results collected from WI with the existing simulated physical AP deployment of the target building, the Computer Science Department at the University of Baghdad. The performance evaluation shows that WOAIP improves AP placement, raising the wireless coverage ratio to 92.93% compared with 58.5% for the present AP deployment (a 24.5% coverage enhancement on average).
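As a concrete illustration of the BPSO idea behind WOAIP, the sketch below encodes each particle as a bit vector over hypothetical candidate AP sites and scores it by weighted coverage minus deployment cost. All coordinates, radii and weights are illustrative assumptions, not the paper's model or data.

```python
import math
import random

# Illustrative Binary PSO for AP placement (a sketch, not the paper's WOAIP):
# each particle is a bit vector over candidate AP sites; fitness rewards
# coverage of grid points and penalizes the number of deployed APs.

random.seed(1)

GRID = [(x, y) for x in range(10) for y in range(10)]     # points to cover
SITES = [(1, 1), (1, 8), (5, 5), (8, 2), (8, 8), (4, 9)]  # hypothetical AP spots
RADIUS = 4.0                   # assumed coverage radius of one AP
W_COVER, W_COST = 1.0, 0.05    # hypothetical (coverage, deployment-cost) weights

def fitness(bits):
    aps = [s for s, b in zip(SITES, bits) if b]
    covered = sum(1 for p in GRID
                  if any(math.dist(p, a) <= RADIUS for a in aps))
    return W_COVER * covered / len(GRID) - W_COST * len(aps)

def bpso(n_particles=20, iters=60):
    dim = len(SITES)
    X = [[random.randint(0, 1) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in X]
    gbest = max(pbest, key=fitness)[:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                V[i][d] = (0.7 * V[i][d]
                           + 1.5 * r1 * (pbest[i][d] - X[i][d])
                           + 1.5 * r2 * (gbest[d] - X[i][d]))
                # sigmoid transfer maps velocity to a bit-flip probability
                X[i][d] = 1 if random.random() < 1 / (1 + math.exp(-V[i][d])) else 0
            if fitness(X[i]) > fitness(pbest[i]):
                pbest[i] = X[i][:]
        gbest = max(pbest + [gbest], key=fitness)[:]
    return gbest

best = bpso()
print(best, round(fitness(best), 3))
```

The sigmoid transfer function is what distinguishes binary PSO from the continuous variant: velocities become bit probabilities rather than position increments.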
Four rapid, accurate and very simple derivative spectrophotometric techniques were developed for the quantitative determination of binary mixtures of estradiol (E2) and progesterone (PRG) formulated as a capsule. Method I is the first-derivative zero-crossing technique: derivative amplitudes were detected at the zero-crossing wavelengths of 239.27 and 292.51 nm for the quantification of estradiol and at 249.19 nm for progesterone. Method II is ratio subtraction: progesterone was determined at λmax 240 nm after subtraction of the interference exerted by estradiol. Method III is modified amplitude subtraction, established using derivative spectroscopy and mathematical manipulations. Method IV is the absorbance ratio technique, absorba…
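The zero-crossing principle behind Method I can be illustrated numerically with synthetic Gaussian bands; the band shapes and wavelengths below are assumptions for demonstration, not real E2/PRG spectra. At a wavelength where one component's derivative spectrum crosses zero, the mixture's derivative amplitude depends only on the other component.

```python
import numpy as np

# Synthetic demonstration of first-derivative zero-crossing quantification
# (illustrative Gaussian bands, not measured E2/PRG spectra).

wl = np.linspace(220, 320, 2001)                    # wavelength grid, nm
band = lambda a, mu, s: a * np.exp(-((wl - mu) / s) ** 2)
e2  = band(1.0, 260, 12)     # hypothetical estradiol absorption band
prg = band(0.6, 280, 15)     # hypothetical progesterone absorption band

d_e2  = np.gradient(e2, wl)          # first-derivative spectra
d_prg = np.gradient(prg, wl)
d_mix = np.gradient(e2 + prg, wl)

# the estradiol derivative crosses zero at its band maximum (260 nm here),
# so the mixture's derivative amplitude there reflects progesterone alone
i = np.argmin(np.abs(wl - 260.0))
print(abs(d_e2[i]) < 1e-9, np.isclose(d_mix[i], d_prg[i]))
```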
The flexible job-shop scheduling problem (FJSP) is one of the problem instances arising in flexible manufacturing systems. It is considered very complex to control, so generating a control system for this problem domain is difficult. FJSP inherits the characteristics of the job-shop scheduling problem, with an additional decision level beyond sequencing that allows each operation to be processed on any machine among a set of available machines at a facility. In this article, we present an Artificial Fish Swarm Algorithm with Harmony Search for solving the flexible job-shop scheduling problem. It is based on a new harmony improvised from the results obtained by the artificial fish swarm algorithm. This improvised solution is then compared with the overall best…
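A single harmony-improvisation step of the kind such a hybrid relies on might look as follows; the memory contents, HMCR and PAR values are illustrative assumptions, not the paper's settings.

```python
import random

# Illustrative harmony-improvisation step (a sketch, not the paper's exact
# AFSA-HS hybrid): solutions from a fish-swarm phase seed the harmony
# memory, and a new harmony is improvised from them dimension by dimension.

random.seed(0)
HMCR, PAR = 0.9, 0.3     # assumed memory-consideration / pitch-adjust rates
n_machines = 3
# hypothetical memory: machine choice per operation, one row per solution
memory = [[2, 0, 1], [1, 2, 0], [0, 1, 2]]

def improvise(memory):
    new = []
    for d in range(len(memory[0])):
        if random.random() < HMCR:            # take a value from memory
            v = random.choice(memory)[d]
            if random.random() < PAR:         # small pitch adjustment
                v = (v + random.choice([-1, 1])) % n_machines
        else:                                 # random machine assignment
            v = random.randrange(n_machines)
        new.append(v)
    return new

print(improvise(memory))
```

In the full hybrid, the improvised harmony would then be evaluated and compared against the best solution found so far, as the abstract describes.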
Data-Driven Requirements Engineering (DDRE) represents a vision of a shift from static, traditional methods of doing requirements engineering to dynamic, data-driven, user-centered methods. Given the data now available and the increasingly complex requirements of software systems, whose functions must adapt to changing needs to gain the trust of their users, such an approach is needed within a continuous software engineering process. This need drives the emergence of new challenges in the discipline of requirements engineering to meet the required changes. The problem addressed in this study was that discrepancies in the data hampered the requirements elicitation process, so that in the end the developed software contained discrepancies and could not meet the need…
One of the problems in biomedical imaging is the appearance of bubbles on a slide, which can occur when air passes through the slide during the preparation process. These bubbles can complicate the analysis of histopathological images. The objective of this study is to remove bubble noise from histopathology images and then predict the tissue that underlies it, using a fuzzy controller, for cases of remote pathological diagnosis. Fuzzy logic uses linguistic definitions to capture the relationship between the input and the action, rather than difficult numerical equations. The system mainly consists of five parts, starting with accepting the image, passing through removing the bubbles, and ending with predicting the tissue…
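A minimal sketch of the fuzzy-logic idea, with hypothetical membership functions and a single rule (this is not the paper's controller, only an illustration of linguistic rules replacing numerical equations):

```python
# Hypothetical fuzzy rule for bubble detection: bubbles tend to appear as
# bright, locally smooth regions, so "bright AND low-variance" maps to a
# high "bubble-like" degree. Membership shapes are illustrative assumptions.

def tri(x, a, b, c):
    """Triangular membership function with feet at a and c, peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def bubble_degree(brightness, variance):
    bright_high = tri(brightness, 150, 255, 360)   # bubbles appear bright
    var_low     = tri(variance, -40, 0, 40)        # ...and locally smooth
    return min(bright_high, var_low)               # rule: bright AND smooth

print(round(bubble_degree(230, 5), 2))   # bright, smooth pixel
print(bubble_degree(0, 500))             # dark, textured pixel
```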
Cooperative spectrum sensing in cognitive radio networks is analogous to distributed decision-making in wireless sensor networks, where each sensor makes a local decision and those decisions are reported to a fusion center that gives the final decision according to some fusion rule. In this paper, the performance of cooperative spectrum sensing is examined using a new optimization strategy to find optimal weight and threshold curves that enable each secondary user to sense the spectrum environment independently, according to a floating threshold with respect to its local environment. Our proposed approach depends on proving the convexity of the well-known optimization problem in cooperative spectrum sensing, stated as maximizing the probability of detec…
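A toy soft-fusion sketch of the scheme described above; the weights, threshold and signal model are assumptions for illustration, not the optimized curves the paper derives.

```python
import random

# Illustrative weighted soft fusion for cooperative spectrum sensing:
# each secondary user reports an energy statistic, and the fusion center
# compares the weighted sum with a global threshold. Weights, threshold and
# the signal model are hypothetical, not the paper's optimized values.

random.seed(7)

def sense(primary_present, n_users=5, snr=1.0):
    # energy statistic per user: noise power plus signal power if present
    return [1.0 + random.gauss(0, 0.1) + (snr if primary_present else 0.0)
            for _ in range(n_users)]

weights = [0.2] * 5     # assumed equal combining weights
threshold = 1.5         # assumed global decision threshold

def fuse(reports):
    stat = sum(w * y for w, y in zip(weights, reports))
    return stat > threshold

print(fuse(sense(True)), fuse(sense(False)))
```

In the paper's setting the weights and per-user thresholds would themselves be the optimization variables; here they are fixed to keep the sketch short.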
Continued downscaling of CMOS circuits with respect to feature size and threshold voltage has resulted in a dramatic increase in leakage current, so leakage power reduction is an important design issue for both active and standby modes as technology scaling continues. In this paper, a simultaneous active and standby energy optimization methodology is proposed for 22 nm sub-threshold CMOS circuits. In the first phase, we investigate the dual-threshold-voltage design for minimizing active energy per cycle. A slack-based genetic algorithm is proposed to find the optimal reverse body-bias assignment for the set of non-critical-path gates, ensuring low active energy per cycle at the maximum allowable frequency and the optimal supply vo…
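A compact sketch of a slack-based genetic search for reverse body-bias assignment, with toy slack and leakage numbers rather than the paper's 22 nm characterization: applying reverse body bias (RBB) to a gate cuts its leakage but adds delay, so an assignment is feasible only if every biased gate's delay penalty fits within its timing slack.

```python
import random

# Toy slack-based GA for reverse body-bias (RBB) assignment (illustrative
# numbers, not the paper's flow): bit i = 1 means apply RBB to gate i.

random.seed(5)

slack = [0.4, 0.1, 0.8, 0.05, 0.5]   # hypothetical timing slack per gate (ns)
d_rbb = 0.2                           # assumed delay penalty of RBB (ns)
leak  = [3.0, 5.0, 2.0, 4.0, 1.0]    # leakage saved if RBB applied (a.u.)

def fitness(bits):
    if any(b and slack[i] < d_rbb for i, b in enumerate(bits)):
        return -1.0                   # infeasible: RBB on a low-slack gate
    return sum(l for l, b in zip(leak, bits) if b)

def evolve(pop_size=12, gens=40):
    pop = [[random.randint(0, 1) for _ in slack] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(slack))
            child = a[:cut] + b[cut:]           # one-point crossover
            if random.random() < 0.2:           # bit-flip mutation
                i = random.randrange(len(slack))
                child[i] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```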
To ensure that a software/hardware product has sufficient quality and functionality, it is essential to conduct thorough testing and evaluation of the many individual software components that make up the application. Many different approaches exist for testing software, including combinatorial testing and covering arrays. The difficulty of dealing with problems such as the two-way combinatorial explosion raises yet another issue: time. Using a client-server architecture, this research introduces a parallel implementation of the TWGH algorithm. Many studies have been conducted to demonstrate the efficiency of this technique, and the findings of this experiment were used to determine the increase in speed and co…
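The parallel-evaluation idea can be sketched with a thread pool standing in for the client-server workers; TWGH itself is not reproduced here, only the two-way (pairwise) coverage counting such algorithms build on.

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import combinations

# Sketch of parallelizing two-way coverage counting (a stand-in for the
# client-server TWGH setup in the abstract, which is not reproduced here):
# worker tasks each count how many required value pairs a candidate test
# row covers, and results are gathered centrally.

params = [2, 2, 3]   # number of values per parameter (hypothetical model)

# every (parameter, value) pair combination that a covering array must hit
required = {((i, a), (j, b))
            for i, j in combinations(range(len(params)), 2)
            for a in range(params[i]) for b in range(params[j])}

def pairs_covered(row):
    return {((i, row[i]), (j, row[j]))
            for i, j in combinations(range(len(row)), 2)}

candidates = [(0, 0, 0), (1, 1, 1), (0, 1, 2), (1, 0, 2)]

with ThreadPoolExecutor(max_workers=4) as pool:
    counts = list(pool.map(lambda r: len(pairs_covered(r) & required),
                           candidates))

print(counts)
```

Each three-parameter row contains C(3,2) = 3 pairs, all valid here, so every candidate covers 3 required pairs.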
The application of test case prioritization is a key part of system testing, intended to uncover and sort out issues early in the development stage. Traditional prioritization techniques frequently fail to take into account the complexities of large-scale test suites, evolving systems and time constraints, and therefore cannot fully solve this problem. The study proposed here deals with a hybrid meta-heuristic method that addresses these modern challenges. The strategy combines genetic algorithms with a black hole algorithm to strike a smooth trade-off between exploring numerous possibilities and exploiting the best one. The proposed hybrid genetic black hole algorithm (HGBH) uses the…
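A simplified sketch of the GA/black-hole combination for prioritization; the fault matrix, operators and APFD scoring below are illustrative assumptions, not the paper's HGBH.

```python
import random

# Toy GA / black-hole hybrid for test-case prioritization (illustrative,
# not the paper's HGBH): orderings are scored by APFD, a swap mutation
# explores, and the "black hole" pull moves each ordering one step toward
# the best ordering found so far.

random.seed(3)

# faults[i][j] = 1 if test i detects fault j (hypothetical data)
faults = [[1, 0, 0, 0], [0, 1, 1, 0], [0, 0, 0, 1], [1, 1, 0, 0]]
n_tests, n_faults = len(faults), len(faults[0])

def apfd(order):
    # average percentage of faults detected for a test ordering
    first = []
    for j in range(n_faults):
        pos = next(k for k, t in enumerate(order) if faults[t][j])
        first.append(pos + 1)
    return 1 - sum(first) / (n_tests * n_faults) + 1 / (2 * n_tests)

def pull_toward(order, best):
    # black-hole style move: make one position agree with the best ordering
    new = order[:]
    i = random.randrange(n_tests)
    j = new.index(best[i])
    new[i], new[j] = new[j], new[i]
    return new

pop = [random.sample(range(n_tests), n_tests) for _ in range(8)]
for _ in range(30):
    star = max(pop, key=apfd)
    pop = [pull_toward(p, star) for p in pop]
    for p in pop:                              # swap mutation for diversity
        if random.random() < 0.2:
            a, b = random.sample(range(n_tests), 2)
            p[a], p[b] = p[b], p[a]

best = max(pop, key=apfd)
print(best, round(apfd(best), 3))
```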
The function of internal auditing has become important, aiming at objectives compatible with the developments and changes that have occurred in many countries around the world, which prompted the emergence of professional associations and institutes for the internal auditing profession. This study seeks to improve the guideline for internal audit units issued by the Federal Audit Bureau in order to enhance the efficiency of internal audit performance in Iraqi government units. The researchers adopted a statistical method to test the hypothesis, constructing a questionnaire that included three main axes: the first, supporting senior management in adopting the current guide; the second, the importance of improving t…
This paper studies the effect of cutting parameters (spindle speed and feed rate) on the delamination phenomenon during the drilling of glass-polyester composites. Drilling was done on a CNC machine with a 10 mm diameter high-speed steel (HSS) drill bit. The Taguchi technique with an L16 orthogonal array was used to analyze the parameters affecting the delamination factor. The optimal experiment was no. 13, with a spindle speed of 1273 rpm and feed of 0.05 mm/rev, giving a minimum delamination factor of 1.28.
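Taguchi analysis of a smaller-the-better response such as the delamination factor uses the signal-to-noise ratio SN = -10·log10(mean(y²)), and the run with the highest SN is preferred. The sketch below applies this to illustrative values; only run 13's factor of 1.28 comes from the abstract, the other runs are assumed for demonstration.

```python
import math

# Smaller-the-better signal-to-noise ratio used in Taguchi analysis.
# Only run 13's value (1.28) is from the abstract; runs 1 and 7 are
# hypothetical, added to make the comparison concrete.

def sn_smaller_better(values):
    return -10 * math.log10(sum(v * v for v in values) / len(values))

runs = {13: [1.28], 1: [1.45], 7: [1.36]}   # delamination factor per run
best_run = max(runs, key=lambda r: sn_smaller_better(runs[r]))
print(best_run, round(sn_smaller_better(runs[best_run]), 2))
```

The lowest delamination factor yields the highest (least negative) SN ratio, which is why run 13 comes out on top.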