This research studies dimension-reduction methods that overcome the curse of dimensionality, which arises when traditional methods fail to provide good parameter estimates, so the problem must be addressed directly. Three approaches to high-dimensional data were considered: the non-classical sliced inverse regression (SIR) method, the proposed weighted standard SIR (WSIR) method, and principal component analysis (PCA), the most widely used dimension-reduction technique. Both SIR and PCA construct linear combinations of subsets of the original explanatory variables, which may suffer from heterogeneity and from multicollinearity among most of the explanatory variables. The new linear combinations produced by these methods reduce the explanatory variables to one or more new dimensions, called the effective dimension. Root mean square error (RMSE) was used to compare the methods, and a simulation study was conducted for this purpose. The simulation results showed that the proposed weighted standard SIR (WSIR) method performed best.
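As a rough illustration of the SIR step described above, the sketch below slices the sorted response, averages the standardized predictors within each slice, and eigen-decomposes the weighted covariance of the slice means; the synthetic single-index data and every parameter choice here are illustrative assumptions, not this paper's simulation design.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, n_dirs=1):
    """Sliced inverse regression: leading effective-dimension directions."""
    n, p = X.shape
    Z = (X - X.mean(0)) / X.std(0)                     # standardize predictors
    slices = np.array_split(np.argsort(y), n_slices)   # slice by sorted response
    M = np.zeros((p, p))
    for idx in slices:                                 # weighted covariance of slice means
        m = Z[idx].mean(0)
        M += (len(idx) / n) * np.outer(m, m)
    vals, vecs = np.linalg.eigh(M)                     # eigenvalues ascending
    return vecs[:, ::-1][:, :n_dirs]                   # leading directions

# Synthetic single-index data (illustrative only)
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
beta = np.array([1.0, 1.0, 0.0, 0.0, 0.0])
y = X @ beta + 0.5 * rng.normal(size=500)
d = sir_directions(X, y)[:, 0]
cos = abs(float(d @ (beta / np.linalg.norm(beta))))    # alignment with true direction
```

With a monotone link as here, the leading SIR direction should align closely with the true index direction.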
The Adaptive Optics (AO) technique was developed to correct for atmospheric seeing. The purpose of this study is to use MATLAB to investigate the performance of an AO system with one of the most recent AO simulation tools, Object-Oriented MATLAB Adaptive Optics (OOMAO). This was achieved by studying the variables that affect image-quality correction, such as the observation wavelength band, atmospheric parameters, telescope parameters, deformable-mirror parameters, wavefront-sensor parameters, and noise parameters. The results present a detailed analysis of the factors that influence the image-correction process, as well as the impact of each AO component on that process.
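To illustrate how the observation band and the deformable-mirror geometry enter such a study, the sketch below combines two standard rules of thumb, not OOMAO itself: the wavelength^(6/5) scaling of the Fried parameter and the Maréchal approximation applied to the mirror's fitting error. The 0.28 coefficient and all numerical values are assumed, typical figures.

```python
import math

def r0_at(wavelength_m, r0_500nm):
    # Fried parameter scales as wavelength^(6/5)
    return r0_500nm * (wavelength_m / 500e-9) ** 1.2

def strehl_fitting(actuator_pitch_m, r0_m):
    # Deformable-mirror fitting-error variance in rad^2; the 0.28
    # coefficient is a typical continuous-facesheet value (assumed)
    sigma2 = 0.28 * (actuator_pitch_m / r0_m) ** (5.0 / 3.0)
    return math.exp(-sigma2)          # Marechal approximation to the Strehl ratio

# Same turbulence and mirror, two observation bands (numbers assumed)
r0_500 = 0.15                         # Fried parameter at 500 nm, metres
pitch = 0.3                           # actuator pitch projected on the pupil, metres
s_V = strehl_fitting(pitch, r0_at(550e-9, r0_500))
s_K = strehl_fitting(pitch, r0_at(2200e-9, r0_500))
```

The comparison shows the qualitative trend such simulations quantify: the same system corrects far better in the infrared than in the visible.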
The goal of this research is to develop a numerical model that simulates the sedimentation process under two scenarios: first, with the flocculation unit in service and, second, with the flocculation unit out of commission. The general equations of flow and sediment transport were solved using the finite-difference method and then coded in Matlab. The difference in removal efficiency between the coded model and the operational model for each particle-size dataset was very small, with a value of +3.01%, indicating that the model can be used to predict the removal efficiency of a rectangular sedimentation basin.
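The finite-difference idea can be sketched in one dimension: an explicit upwind scheme for pure settling, dC/dt = -v_s ∂C/∂z, with removal read off from the mass reaching the basin floor over the residence time. The geometry, settling velocity, and grid below are illustrative assumptions, not the study's model.

```python
import numpy as np

def removal_efficiency(v_s, depth, t_res, nz=100, nt=2000):
    """Explicit upwind FD for dC/dt = -v_s dC/dz (settling only; horizontal
    plug flow is folded into the residence time). Returns the removed fraction."""
    dz, dt = depth / nz, t_res / nt
    c = v_s * dt / dz
    assert c <= 1.0                      # CFL stability condition
    C = np.ones(nz)                      # uniform inlet concentration
    for _ in range(nt):
        C[1:] -= c * (C[1:] - C[:-1])    # interior upwind update
        C[0] -= c * C[0]                 # no particles settle in from above
    return 1.0 - C.mean()                # fraction that reached the basin floor

# Illustrative numbers only: v_s * t_res / depth = 0.5, so the ideal
# (Hazen-type) removal is 50% and the scheme should reproduce it closely
eff = removal_efficiency(v_s=1e-3, depth=2.0, t_res=1000.0)
```

The scheme conserves mass exactly (the only outflow is through the floor), so its predicted efficiency can be checked against the ideal-basin value.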
Abstract: In this research, nanofibers were prepared by electrospinning from poly(vinyl alcohol)/TiO2. The emission spectrum of the solution was studied at 772 nm. Several process parameters were investigated: the PVA concentration, the distance from the nozzle tip to the grounded collector (gap distance), and finally the applied high voltage. The optimum condition for preparing narrow nanofibers was found at a PVA concentration of 16 g, at which the fibers have a diameter of 20 nm.
Permeability data are of major importance in all reservoir simulation studies. Their importance increases in mature oil and gas fields because of their sensitivity to the requirements of certain improved-recovery methods. However, the industry holds a huge stock of air-permeability measurements against only a small number of liquid-permeability values, owing to the relatively high cost of special core analysis.
The current study proposes a correlation to convert the air-permeability data conventionally measured during laboratory core analysis into liquid permeability. This correlation provides a feasible estimate in cases of data loss and in poorly consolidated formations.
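One classical route from air to liquid permeability, shown here only to illustrate the kind of conversion involved (the study's own correlation is not reproduced), is the Klinkenberg correction: air permeability measured at several mean pore pressures is regressed on reciprocal pressure, and the intercept estimates the liquid (slip-free) permeability. The synthetic readings below assume k_L = 50 mD and b = 30 psi.

```python
import numpy as np

def klinkenberg_liquid_perm(k_air, p_mean):
    """Klinkenberg model: k_air = k_L * (1 + b / p_mean), so a linear
    regression of k_air on 1/p_mean has intercept k_L (liquid permeability)."""
    A = np.vstack([np.ones_like(p_mean), 1.0 / p_mean]).T
    (k_L, slope), *_ = np.linalg.lstsq(A, k_air, rcond=None)
    return k_L

# Synthetic gas-permeameter readings (k_L = 50 mD, b = 30 psi assumed)
p = np.array([1.0, 2.0, 4.0, 8.0])        # mean pore pressures, psi
k_air = 50.0 * (1.0 + 30.0 / p)           # measured air permeabilities, mD
k_L = klinkenberg_liquid_perm(k_air, p)
```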
Abstract
In this work, the blood glucose concentration of a diabetic patient subject to a disturbing meal has been controlled using two sets of advanced controllers. The first set comprises sliding mode controllers (classical and integral), and the second set comprises optimal LQR controllers (classical and Min-max). Owing to their characteristic disturbance-rejection features, the integral sliding mode controller and the LQR Min-max controller are singled out here for comparison. The Bergman minimal mathematical model was used to represent the dynamic response of a diabetic patient's blood glucose concentration to insulin injection. Simulations based on Matlab/Simulink were performed to verify the performance of each controller.
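The Bergman minimal model itself can be sketched as three coupled ODEs integrated with a plain Euler loop. All parameter values below are typical literature numbers assumed for illustration, not those of this study, and a simple insulin bolus stands in for the sliding-mode and LQR controllers, which are not reproduced here.

```python
# Bergman minimal model: glucose G, remote insulin action X, plasma insulin I
p1, p2, p3, n = 0.028, 0.025, 1.3e-5, 0.09   # rate constants, 1/min (assumed)
Gb, Ib = 80.0, 7.0                            # basal glucose (mg/dL), insulin (mU/L)
G, X, I = 250.0, 0.0, 200.0                   # hyperglycaemic start, insulin bolus
dt = 0.1                                      # Euler step, minutes
for _ in range(int(400.0 / dt)):              # simulate 400 minutes
    dG = -p1 * (G - Gb) - X * G               # glucose dynamics
    dX = -p2 * X + p3 * (I - Ib)              # remote insulin action
    dI = -n * (I - Ib)                        # plasma insulin decay
    G, X, I = G + dt * dG, X + dt * dX, I + dt * dI
# G should have been driven from 250 mg/dL back toward the basal level
```

A feedback controller would replace the one-shot bolus with an insulin infusion computed from the measured glucose at each step.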
Background: Leishmaniasis is an important public health problem owing to its impact on morbidity and mortality and to the difficulties in applying effective control measures.
Objective: The aim of the study is to evaluate the use of impregnated bed nets in the control of leishmaniasis.
Methods: The study was conducted throughout the years 2004 and 2005 in Diala Governorate (about 60 km north-east of Baghdad). This is the first study in Iraq to evaluate impregnated bed nets for the control of leishmaniasis. Two villages were selected for this purpose: the nets were distributed to the population of the first village, while the second village served as a control.
The area of character recognition has received considerable attention from researchers all over the world during the last three decades. This research explores the best sets of feature-extraction techniques and studies the accuracy of well-known classifiers for Arabic numerals, using statistical methods in two ways and comparing them. The first method, a linear discriminant function, yields results with accuracy as high as 90% of the original grouped cases correctly classified. The second method is a proposed algorithm; the results show its efficiency, achieving recognition accuracies of 92.9% and 91.4%, which is higher than that of the first method.
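A linear discriminant function of the kind named above can be sketched with class means and a pooled covariance matrix. The three-class, four-feature synthetic data stand in for Arabic-numeral feature vectors and are an illustrative assumption, not the paper's dataset.

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy stand-in for numeral feature vectors: 3 classes, 4 features each
centres = np.array([[0, 0, 0, 0], [3, 3, 0, 0], [0, 0, 3, 3]], float)
X = np.vstack([c + rng.normal(size=(40, 4)) for c in centres])
y = np.repeat(np.arange(3), 40)

# Linear discriminant functions: class means with a pooled covariance
mu = np.array([X[y == c].mean(0) for c in range(3)])
Sw = sum((X[y == c] - mu[c]).T @ (X[y == c] - mu[c]) for c in range(3))
Sinv = np.linalg.inv(Sw / (len(X) - 3))        # pooled covariance, inverted

def classify(x):
    # Score for each class; equal priors assumed, so no log-prior term
    scores = [m @ Sinv @ x - 0.5 * m @ Sinv @ m for m in mu]
    return int(np.argmax(scores))

acc = float(np.mean([classify(x) == c for x, c in zip(X, y)]))
```

On well-separated data such as this, the discriminant functions classify nearly all grouped cases correctly, mirroring the kind of accuracy figure reported above.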
A contract, in its formation, is an agreement of two wills intended to produce a particular legal effect. The contract itself, however, in its effects, is subject to the whole body of obligations shaped by what the will of its two parties intended, together with such further rules as are imposed by the legislator, by custom, by equity, or by the principles of good faith.
Codes of red, green, and blue data (RGB) extracted from a lab-fabricated colorimeter device were used to build a proposed classifier with the objective of classifying the colors of objects according to defined categories of fundamental colors. Primary, secondary, and tertiary colors, namely red, green, orange, yellow, pink, purple, blue, brown, grey, white, and black, were employed in machine learning (ML) by applying an artificial neural network (ANN) algorithm in Python. The classifier, based on the ANN algorithm, required a definition of the eleven colors mentioned above in the form of RGB codes in order to acquire the capability of classification.
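A network of the kind described can be sketched as a single-hidden-layer softmax classifier trained on RGB codes. Only five of the eleven categories are used, the prototype codes are the standard primaries rather than the device's measured values, and the network size and training settings are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
# Five of the eleven categories; RGB codes are standard primaries (assumed)
protos = {"red": (255, 0, 0), "green": (0, 255, 0), "blue": (0, 0, 255),
          "white": (255, 255, 255), "black": (0, 0, 0)}
names = list(protos)
X = np.repeat(np.array(list(protos.values()), float) / 255.0, 40, axis=0)
X += rng.normal(scale=0.05, size=X.shape)      # simulated sensor noise
X -= 0.5                                       # centre the inputs
y = np.repeat(np.arange(len(names)), 40)
T = np.eye(len(names))[y]                      # one-hot targets

# One hidden layer, ReLU, softmax output, full-batch gradient descent
W1 = rng.normal(scale=0.5, size=(3, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, len(names))); b2 = np.zeros(len(names))
for _ in range(1000):
    H = np.maximum(0.0, X @ W1 + b1)
    Z = H @ W2 + b2
    P = np.exp(Z - Z.max(1, keepdims=True)); P /= P.sum(1, keepdims=True)
    G = (P - T) / len(X)                       # softmax cross-entropy gradient
    dH = G @ W2.T; dH[H <= 0.0] = 0.0          # back through the ReLU
    W2 -= 0.5 * (H.T @ G); b2 -= 0.5 * G.sum(0)
    W1 -= 0.5 * (X.T @ dH); b1 -= 0.5 * dH.sum(0)

H = np.maximum(0.0, X @ W1 + b1)
acc = float(((H @ W2 + b2).argmax(1) == y).mean())  # training accuracy
```

Classifying an unseen RGB code then amounts to one forward pass followed by `names[argmax]`.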