Bi-level programming seeks to minimize or maximize an upper-level objective function subject to constraints that themselves contain a lower-level optimization problem. The problem has received a great deal of attention in the mathematical programming community because of the proliferation of its applications and the use of evolutionary algorithms to address it. Two non-linear bi-level programming methods are used in this paper. The goal is to reach the optimal solution through Monte Carlo simulation using different small and large sample sizes. The study found that the Branch and Bound algorithm was preferable for solving the non-linear bi-level programming problem because it produced better results.
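For reference, a generic non-linear bi-level program can be written as below; the symbols F, f, G, g, x, and y are generic placeholders rather than the paper's own notation:

```latex
\begin{aligned}
\min_{x \in X} \quad & F(x, y^{*}) \\
\text{s.t.} \quad & G(x, y^{*}) \le 0, \\
& y^{*} \in \arg\min_{y \in Y} \left\{\, f(x, y) \;:\; g(x, y) \le 0 \,\right\},
\end{aligned}
```

where the upper-level (leader) variables x are fixed while the lower-level (follower) problem in y is solved to optimality before the upper-level objective is evaluated.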
Building a system that identifies individuals from their speech recordings finds application in diverse areas such as telephone shopping, voice mail, and security control. However, building such systems is a difficult task because of the vast range of variation in the human voice, so selecting strong features is crucial for the recognition system. Therefore, a speaker recognition system based on new spin-image descriptors (SISR) is proposed in this paper. In the proposed system, circular windows (spins) are extracted from the frequency domain of the spectrogram image of the sound, and a run length matrix is then built for each spin to serve as the basis for feature extraction. Five different descriptors are generated from …
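As a rough illustration of the kind of processing described, and not the authors' exact SISR pipeline, the sketch below computes a spectrogram, quantizes it to a few grey levels, cuts out one circular window, and builds a simple horizontal grey-level run length matrix; the window centre, radius, number of levels, and the toy signal are all assumptions:

```python
import numpy as np
from scipy.signal import spectrogram

def circular_window(img, cy, cx, r):
    """Return the values of img inside a circle of radius r centred at (cy, cx)."""
    yy, xx = np.ogrid[:img.shape[0], :img.shape[1]]
    mask = (yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2
    return img[mask]

def run_length_matrix(levels_1d, n_levels):
    """Horizontal grey-level run length matrix for a 1-D sequence of quantized levels."""
    rlm = np.zeros((n_levels, len(levels_1d)), dtype=int)
    run = 1
    for i in range(1, len(levels_1d) + 1):
        if i < len(levels_1d) and levels_1d[i] == levels_1d[i - 1]:
            run += 1
        else:
            rlm[levels_1d[i - 1], run - 1] += 1   # record a run of this length
            run = 1
    return rlm

# Toy signal standing in for a speech recording (sampling rate and tone are arbitrary).
fs = 16000
t = np.arange(0, 1.0, 1 / fs)
signal = np.sin(2 * np.pi * 440 * t) + 0.1 * np.random.randn(t.size)

f, times, Sxx = spectrogram(signal, fs=fs)            # spectrogram "image"
img = 10 * np.log10(Sxx + 1e-12)                      # log-magnitude in dB
n_levels = 8
q = np.digitize(img, np.linspace(img.min(), img.max(), n_levels + 1)[1:-1])

spin = circular_window(q, cy=q.shape[0] // 2, cx=q.shape[1] // 2, r=10)
rlm = run_length_matrix(spin.ravel(), n_levels)
print("run length matrix shape:", rlm.shape)
```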
Structural and optical properties of the prepared nano-membrane were studied. The nano-membrane was deposited by the spray coating method on glass substrates of thickness 100 mm. The X-ray diffraction spectra of (CNTs, WO3) were studied, and AFM tests gave good information about the surface roughness. An electrolysis cell and a fuel cell were designed, and studies were performed on the electrochemical parameters.
The development of wireless sensor networks (WSNs) in the underwater environment leads to the underwater WSN (UWSN). It has a strong impact on the research field due to its extensive real-time applications. However, effective operation of underwater WSNs faces several problems. The main concern in a UWSN is the sensor nodes' energy depletion. Energy saving and maintaining quality of service (QoS) become highly essential for a UWSN because of the QoS requirements of applications and the constrained sensor nodes (SNs). To overcome this problem, numerous existing methods, such as adaptive data forwarding techniques, QoS-based congestion control approaches, and various other methods, have been devised with maximum throughput and minimum network lifespan …
The usage of remote sensing techniques in managing and monitoring environmental areas is increasing due to the improvement of the sensors carried by Earth-observation satellites. The resolution merge process combines a high-resolution single-band image with a low-resolution multi-band image to produce one image with both high spatial and high spectral resolution. In this work, different merging methods were tested to evaluate their enhancement capabilities for extracting different environmental areas; Principal Component Analysis (PCA), Brovey, modified Intensity-Hue-Saturation (IHS), and High Pass Filter methods were tested and subjected to visual and statistical comparison for evaluation. Both visual …
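As a point of reference for one of the listed methods, the Brovey transform fuses each multispectral band with the panchromatic band by a simple ratio. The sketch below is a minimal NumPy illustration, not the evaluation code used in the work; the synthetic arrays and the small epsilon are assumptions:

```python
import numpy as np

def brovey_fusion(ms_bands, pan, eps=1e-6):
    """Brovey transform: scale each multispectral band by pan / sum(bands).

    ms_bands: array of shape (n_bands, H, W), already resampled to the pan grid.
    pan:      array of shape (H, W), the high-resolution panchromatic band.
    """
    total = ms_bands.sum(axis=0) + eps            # avoid division by zero
    return ms_bands * (pan / total)[None, :, :]

# Tiny synthetic example: 3 multispectral bands and one pan band of size 4x4.
ms = np.random.rand(3, 4, 4)
pan = np.random.rand(4, 4)
fused = brovey_fusion(ms, pan)
print(fused.shape)    # (3, 4, 4)
```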
Background: Symptoms related to the upper gastrointestinal tract are very common. Attribution of these symptoms to upper GIT diseases is usually made on clinical grounds, which can be confirmed by Esophago Gastro Duodenoscopy (EGD). The use of such a tool might increase diagnostic accuracy for these complaints, and strict indications for upper GI endoscopy might decrease the number of negative endoscopies. Objective: To follow strict indications for Esophago Gastro Duodenoscopy in order to decrease negative endoscopy results. Methods: One thousand eight hundred and ninety cases underwent EGD from Feb. 1999 to Feb. 2009 at Alkindy Teaching Hospital and Abd-Al-Majeed private hospital in Baghdad, Iraq. A special endoscopy unit f…
A new approach is presented in this study to determine the optimal edge detection threshold value. The approach is based on extracting small homogeneous blocks from targets with unequal means. From these blocks, a small image with known edges is generated (the edges are the lines between the adjacent blocks), so these simulated edges can be taken as true edges. The true simulated edges are then compared with the edges detected in the small generated image using different threshold values. The comparison is based on computing the mean square error between the simulated edge image and the edge image produced by the edge detector. The mean square error is computed for the total edge image (Er), for edge region …
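A minimal sketch of the idea, under assumed choices (a synthetic two-block image, a plain gradient-magnitude detector, and a 0/1 mean square error over the whole image); it scans candidate thresholds and keeps the one whose detected edge map is closest to the known edge map:

```python
import numpy as np

# Synthetic image of two homogeneous blocks with different means plus noise;
# the true edge is the vertical boundary between the two blocks.
rng = np.random.default_rng(0)
img = np.hstack([np.full((64, 32), 50.0), np.full((64, 32), 120.0)])
img += rng.normal(0, 5, img.shape)

true_edges = np.zeros(img.shape, dtype=float)
true_edges[:, 31:33] = 1.0                        # known boundary between the blocks

# Simple gradient-magnitude detector (stand-in for any edge operator).
gy, gx = np.gradient(img)
grad = np.hypot(gx, gy)

best_t, best_err = None, np.inf
for t in np.linspace(grad.min(), grad.max(), 100):
    detected = (grad > t).astype(float)
    err = np.mean((detected - true_edges) ** 2)   # MSE over the total edge image
    if err < best_err:
        best_t, best_err = t, err

print(f"optimal threshold ~ {best_t:.2f}, MSE = {best_err:.4f}")
```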
This research aims to analyze and simulate real biochemical test data to uncover the relationships among the tests and how each of them affects the others. The data were acquired from a private Iraqi biochemical laboratory; however, they have many dimensions, a high rate of null values, and a large number of patients. Several experiments were applied to these data, beginning with unsupervised techniques such as hierarchical clustering and k-means, but the results were not clear. A preprocessing step was then performed to make the dataset analyzable by supervised techniques such as Linear Discriminant Analysis (LDA), Classification And Regression Tree (CART), Logistic Regression (LR), K-Nearest Neighbor (K-NN), and Naïve Bayes (NB) …
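A comparison of the supervised classifiers named above can be sketched with scikit-learn as follows; the synthetic data, the 5-fold cross-validation, and accuracy as the metric are assumptions standing in for the laboratory dataset and the paper's actual protocol:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.tree import DecisionTreeClassifier            # CART
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the preprocessed biochemical test data.
X, y = make_classification(n_samples=500, n_features=10, n_informative=6,
                           n_classes=3, random_state=0)

models = {
    "LDA":  LinearDiscriminantAnalysis(),
    "CART": DecisionTreeClassifier(random_state=0),
    "LR":   LogisticRegression(max_iter=1000),
    "K-NN": KNeighborsClassifier(n_neighbors=5),
    "NB":   GaussianNB(),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
    print(f"{name:5s} mean accuracy = {scores.mean():.3f}")
```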
In this study, the performance of an adaptive optics (AO) system was analyzed through a numerical computer simulation implemented in MATLAB. Making a phase screen involved turning computer-generated random numbers into two-dimensional arrays of phase values on a grid of sample points with matching statistics; von Karman turbulence was generated from its power spectral density. Several simulated point spread functions (PSFs) and modulation transfer functions (MTFs) for different values of the Fried coherence diameter (r0) were used to show how turbulent the atmosphere was. To evaluate the effectiveness of the optical system (telescope), the Strehl ratio (S) was computed. The compensation procedure for an AO system …
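The paper's simulation was written in MATLAB; as a language-agnostic illustration of the standard FFT-filtering recipe, the Python sketch below generates a von Karman phase screen, forms the PSF of a circular pupil, and estimates the Strehl ratio from the PSF peaks. Grid size, aperture diameter, r0, and the outer scale L0 are assumed values, and the normalization follows common references rather than the paper:

```python
import numpy as np

def von_karman_phase_screen(n, dx, r0, L0=25.0, seed=0):
    """Phase screen by filtering complex white noise with the square root of the
    von Karman PSD: Phi(f) ~ 0.023 r0^(-5/3) (f^2 + 1/L0^2)^(-11/6)."""
    rng = np.random.default_rng(seed)
    fx = np.fft.fftfreq(n, d=dx)
    kx, ky = np.meshgrid(fx, fx)
    psd = 0.023 * r0 ** (-5 / 3) * (kx ** 2 + ky ** 2 + 1 / L0 ** 2) ** (-11 / 6)
    psd[0, 0] = 0.0                                   # remove the piston term
    noise = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    # Scaling follows the usual FFT recipe with frequency step 1/(n*dx).
    return np.real(np.fft.ifft2(noise * np.sqrt(psd))) * n / dx

def strehl_ratio(phase, pupil):
    """Ratio of aberrated to diffraction-limited PSF peaks (common approximation)."""
    psf_ab = np.abs(np.fft.fft2(pupil * np.exp(1j * phase))) ** 2
    psf_dl = np.abs(np.fft.fft2(pupil)) ** 2
    return psf_ab.max() / psf_dl.max()

n, dx, r0 = 256, 0.01, 0.05          # grid size, sample spacing [m], Fried parameter [m]
yy, xx = np.mgrid[-n // 2:n // 2, -n // 2:n // 2] * dx
pupil = (np.hypot(xx, yy) <= 0.5).astype(float)      # 1 m diameter circular aperture
phase = von_karman_phase_screen(n, dx, r0)
print("Strehl ratio:", strehl_ratio(phase, pupil))
```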