Many systems dealing with image processing are used and developed on a daily basis. These systems require the deployment of basic operations such as detecting regions of interest, describing their properties, and matching those regions. These operations play a significant role in the decision making required by subsequent operations, depending on the assigned task. Various algorithms have been introduced over the years to accomplish these tasks; one of the most popular is the Scale-Invariant Feature Transform (SIFT). The strength of this algorithm lies in its performance in detection and property description, owing to the fact that it operates on a large number of keypoints; its only drawback is that it is rather time-consuming. In the suggested approach, the system deploys SIFT to perform its basic tasks of matching and description while focusing on minimizing the number of keypoints, which is achieved by applying the Fast Approximate Nearest Neighbor algorithm; this reduces the redundancy of matching and thereby speeds up the process. The proposed application has been evaluated in terms of two criteria, time and accuracy, and has achieved accuracy of up to 100% in addition to speeding up the matching and description processes.
An important factor in the success of construction projects is the ability to estimate project cost objectively and to adapt to changes in the external environment, which is affected by many elements and by the requirements of a competitive environment. These projects face several problems in achieving particular goals. To overcome these difficulties, research over the last two decades has developed and turned its focus to the role of project cost management, providing information and assisting management in planning and controlling the budget among the main elements of the project, namely time, cost, and quality. The research aims at the possibility of developing and implementing mechanisms
There are many techniques that can be used to estimate spray quality traits such as spray coverage, droplet density, droplet count, and droplet diameter. One of the most common techniques is to use water-sensitive papers (WSP) as spray collectors under field conditions and to analyze them using software. However, some droplets may merge after they deposit on the WSP, and this could affect the accuracy of the results. In this research, an image-processing technique was used for better estimation of the spray traits and to overcome the problem of droplet merger. The droplets were classified as non-merged and merged droplets based on their roundness, then the merged droplets were separated based on the average non-m
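The roundness-based classification step can be sketched as follows. The circularity formula and the 0.8 threshold are illustrative assumptions, not values taken from the paper:

```python
import math

def roundness(area, perimeter):
    """Circularity: 1.0 for a perfect circle, lower for merged blobs."""
    if perimeter == 0:
        return 0.0
    return 4.0 * math.pi * area / (perimeter ** 2)

def classify(droplets, threshold=0.8):
    """Split (area, perimeter) pairs into single and merged droplets."""
    single, merged = [], []
    for area, perimeter in droplets:
        if roundness(area, perimeter) >= threshold:
            single.append((area, perimeter))
        else:
            merged.append((area, perimeter))
    return single, merged

# A circle of radius 10 vs. an elongated blob of two merged droplets.
single, merged = classify([(math.pi * 100, 2 * math.pi * 10), (300, 90)])
print(len(single), len(merged))  # → 1 1
```

Only the blobs flagged as merged then need the more expensive separation step, which keeps the overall analysis fast.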
In Automatic Speech Recognition (ASR), the non-linear data projection provided by a one-hidden-layer Multilayer Perceptron (MLP) trained to recognize phonemes has, in previous experiments, provided feature enhancement that substantially increased ASR performance, especially in noise. Previous attempts to apply an analogous approach to speaker identification have not succeeded in improving performance, except by combining MLP-processed features with other features. We present test results for the TIMIT database which show that the advantage of MLP preprocessing for open-set speaker identification increases with the number of speakers used to train the MLP, and that improved identification is obtained as this number increases beyond sixty.
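The preprocessing idea, training a one-hidden-layer MLP and then reusing its hidden-layer activations as enhanced features, can be sketched with scikit-learn. The synthetic data, label rule, and layer size below are stand-ins; TIMIT features and phoneme targets are not reproduced here.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 13))     # e.g. 13 cepstral-like coefficients
y = (X[:, 0] > 0).astype(int)      # toy stand-in for phoneme labels

# One hidden layer with tanh activation, as in classic MLP front ends.
mlp = MLPClassifier(hidden_layer_sizes=(32,), activation="tanh",
                    max_iter=500, random_state=0)
mlp.fit(X, y)

# The non-linear projection: pass inputs through the trained hidden layer
# and use the activations as features for a downstream classifier.
hidden = np.tanh(X @ mlp.coefs_[0] + mlp.intercepts_[0])
print(hidden.shape)  # → (200, 32)
```

The downstream speaker-identification model would then consume `hidden` instead of (or alongside) the raw features.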
This research presents a study, with an application, of Principal Component Regression obtained from some of the explanatory variables, to limit the multicollinearity problem among these variables and to gain more stability in their estimates than is obtained from Ordinary Least Squares. The cost paid, on the other hand, is losing a little of the predictive regression function's power in explaining the essential variation. A numerical formula has been proposed and applied by the researchers as an optimal solution, and its efficiency was verified by a program written by the researchers themselves for this purpose, using several criteria: Cumulative Percentage Variance, Coefficient of Determination, Variance
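Principal Component Regression itself can be sketched as a two-stage pipeline, PCA followed by ordinary least squares on the retained components. The simulated collinear data and the choice of two components below are illustrative assumptions, not the paper's data or formula:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
# Two nearly collinear predictors plus one independent predictor.
X = np.column_stack([x1,
                     x1 + 0.01 * rng.normal(size=100),
                     rng.normal(size=100)])
y = 2 * x1 + X[:, 2] + 0.1 * rng.normal(size=100)

# Regress on the 2 leading principal components instead of the
# 3 collinear columns, stabilizing the coefficient estimates.
pcr = make_pipeline(PCA(n_components=2), LinearRegression())
pcr.fit(X, y)
print(round(pcr.score(X, y), 3))
```

Dropping the near-degenerate component removes the instability OLS would suffer, at the cost of a small loss of explained variation, exactly the trade-off the abstract describes.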
The logistic regression model is an important statistical model showing the relationship between a binary response variable and explanatory variables. The large number of explanatory variables usually used to explain the response leads to the problem of multicollinearity among them, which makes the estimates of the model parameters inaccurate.
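Multicollinearity among explanatory variables is commonly diagnosed with variance inflation factors (VIF), computed from the R² of each predictor regressed on the others. This is a generic diagnostic sketch with simulated data, not the method of the abstract; the threshold of 10 is a common rule of thumb:

```python
import numpy as np

def vif(X):
    """VIF_j = 1 / (1 - R^2_j); values well above 10 signal collinearity."""
    out = []
    for j in range(X.shape[1]):
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(len(X)), others])
        beta, *_ = np.linalg.lstsq(A, X[:, j], rcond=None)
        resid = X[:, j] - A @ beta
        r2 = 1 - resid.var() / X[:, j].var()
        out.append(1.0 / (1.0 - r2))
    return out

rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
# Columns 0 and 1 are nearly collinear; column 2 is independent.
X = np.column_stack([x1, x1 + 0.05 * rng.normal(size=200),
                     rng.normal(size=200)])
vals = vif(X)
print([round(v, 1) for v in vals])
```

A large VIF for a coefficient means its sampling variance is inflated by that factor, which is precisely why the parameter estimates become inaccurate.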
... Show MoreThis study was designed to compare the effect of two types of viral hepatitis A and E (HAV
and HEV) on liver functions in Iraqi individuals by the measurement of biochemical changes
associated with hepatitis.
The study was performed on 58 HEV and 66 HAV infected patients compared with 28 healthy
subjects. The measured biochemical tests included total serum bilirubin, serum transaminases (ALT
and AST), alkaline phosphatase (ALP), and gamma glutamyl transferase (GGT).
The study showed that adolescents and young adults (17-29 years) were mostly affected by
HEV, while children (5-12 years) were frequently affected by HAV. The severity of liver damage in
HEV patients was higher than in HAV patients as a result of high serum transa
Due to advancements in computer science and technology, impersonation has become more common. Today, biometrics technology is widely used in various aspects of people's lives. Iris recognition, known for its high accuracy and speed, is a significant and challenging field of study. As a result, iris recognition technology and biometric systems are utilized for security in numerous applications, including human-computer interaction and surveillance systems. It is crucial to develop advanced models to combat impersonation crimes. This study proposes sophisticated artificial intelligence models with high accuracy and speed to eliminate these crimes. The models use linear discriminant analysis (LDA) for feature extraction and mutual info
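The LDA feature-extraction step mentioned above can be sketched with scikit-learn: projecting feature vectors onto the discriminant axes that maximize between-class separability. The synthetic data below stands in for real iris feature vectors; the class count and dimensions are illustrative assumptions.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
# 3 "subjects", 40 samples each, 10-dimensional raw feature vectors.
X = np.vstack([rng.normal(loc=c, size=(40, 10)) for c in (0.0, 1.0, 2.0)])
y = np.repeat([0, 1, 2], 40)

# LDA yields at most n_classes - 1 discriminant components.
lda = LinearDiscriminantAnalysis(n_components=2)
Z = lda.fit_transform(X, y)
print(Z.shape)  # → (120, 2)
```

The low-dimensional projection `Z` would then feed the recognition model, which is where LDA's speed advantage over raw high-dimensional features comes from.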
Maximizing the net present value (NPV) of oil field development is heavily dependent on optimizing well placement. The traditional approach entails the use of expert intuition to design well configurations and locations, followed by economic analysis and reservoir simulation to determine the most effective plan. However, this approach often proves inadequate due to the complexity and nonlinearity of reservoirs. In recent years, computational techniques have been developed to optimize well placement by defining decision variables (such as well coordinates), objective functions (such as NPV or cumulative oil production), and constraints. This paper presents a study on the use of genetic algorithms for well placement optimization, a ty
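A genetic algorithm over well coordinates can be sketched as below. Everything here is an illustrative assumption: the individual encodes a single well's (x, y) grid position, and a toy quadratic fitness stands in for NPV, which in practice would come from a reservoir simulator.

```python
import random

random.seed(0)
GRID = 50  # reservoir grid is GRID x GRID cells

def fitness(well):
    """Toy NPV proxy: peaks at the center of the grid."""
    x, y = well
    return -((x - 25) ** 2 + (y - 25) ** 2)

def evolve(pop_size=30, generations=40, mutation=0.2):
    pop = [(random.randrange(GRID), random.randrange(GRID))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            (x1, _), (_, y2) = random.sample(parents, 2)
            child = (x1, y2)                    # coordinate crossover
            if random.random() < mutation:      # random-reset mutation
                child = (random.randrange(GRID), child[1])
            if random.random() < mutation:
                child = (child[0], random.randrange(GRID))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best)
```

Each generation keeps the best half of the population, recombines coordinates from pairs of parents, and occasionally resets a coordinate, so the population drifts toward high-fitness placements without needing gradients of the (highly nonlinear) objective.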
Objective(s): To assess the factors associated with the prolonged prehospital delay of patients with
acute myocardial infarction.
Methodology: A descriptive study was conducted at the Coronary Care unit (CCU) in Al-Yarmok Teaching
Hospital, Ibn AL-Nafis Hospital for Cardiovascular Diseases, AL-Kadumia Teaching Hospital, Baghdad Teaching
Hospital, and AL-Kindy Teaching Hospital during the period of the study from February 2nd, 2009 to
October 30th, 2009. A random sample of (160) patients who were admitted to the hospitals was
selected one by one. A
questionnaire was constructed for the purpose of the study, comprising four parts: (1)
sociodemographic data; (2) prehospital d