In this research, the Iraqi flagpole at Baghdad University, the tallest in Baghdad at 75 m, was monitored. Given the importance of this structure, its displacement (vertical deviation) was monitored using a Total Station device: several observations were taken at different times over the monitoring period, which ran from November 2016 until May 2017 at a rate of four observations per year. The observations were adjusted by the least-squares method with circle fitting, and the data were then processed. The deviation was computed using Matlab, where the mathematical relations were programmed in a form suited to the software; the observations were entered, corrections were applied to them, and the corrected values and the error between the observed and computed values were calculated. The deviation ranged from 0.720 m to 0.759 m over the observation period. AutoCAD and 3ds Max were used to produce two-dimensional (2D) and three-dimensional (3D) models of the structure.
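The least-squares circle fitting mentioned in the abstract can be sketched as follows. This is an illustrative Python reimplementation (the study used Matlab), using the standard Kåsa linearization of the circle equation; it is not the authors' code.

```python
import numpy as np

def fit_circle(x, y):
    """Kasa linear least-squares circle fit (illustrative sketch).

    Rewrites (x - cx)^2 + (y - cy)^2 = r^2 as the linear system
    2*cx*x + 2*cy*y + d = x^2 + y^2, with d = r^2 - cx^2 - cy^2,
    and solves it in the least-squares sense.
    """
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    rhs = x**2 + y**2
    (cx, cy, d), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return cx, cy, np.sqrt(d + cx**2 + cy**2)
```

Fitting circles to point sets observed at different epochs lets the drift of the fitted centre serve as a deviation estimate between epochs.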
Iris detection is considered a challenging image-processing task. In this study an efficient method is suggested to detect and recognize the iris. The method depends on a seed-filling algorithm and circular-region detection: the color image is converted to a gray image, and the gray image is then converted to a binary image. Seed filling is applied to the binary image, and the detected binary region of interest (ROI) is localized in terms of its center coordinates and radii (i.e., the inner and outer radius). The coefficient of variation (CV) of the iris radius was used to evaluate the localization efficiency of the suggested method. The test results indicated that the suggested method performs well for iris detection.
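The seed-filling and localization steps described in the abstract can be sketched as follows. This is a minimal illustrative version assuming a 4-connected flood fill on a binary array and a mean-radius estimate of the filled region; the function names and details are this sketch's assumptions, not the authors' implementation.

```python
from collections import deque

import numpy as np

def seed_fill(binary, seed):
    """Flood-fill (seed-fill) the connected foreground region containing `seed`."""
    h, w = binary.shape
    region = np.zeros_like(binary, dtype=bool)
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        if 0 <= r < h and 0 <= c < w and binary[r, c] and not region[r, c]:
            region[r, c] = True
            queue.extend([(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)])
    return region

def localize(region):
    """Centre (centroid) and mean radius of a filled region, as a rough circular fit."""
    rows, cols = np.nonzero(region)
    cy, cx = rows.mean(), cols.mean()
    radius = np.sqrt((rows - cy) ** 2 + (cols - cx) ** 2).mean()
    return (cy, cx), radius
```

In practice the binary image would come from thresholding the gray image, and the inner and outer iris radii would be localized by applying the same idea to the pupil and iris boundaries separately.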
Interval methods for verified integration of initial value problems (IVPs) for ODEs have been used for more than 40 years. For many classes of IVPs, these methods have the ability to compute guaranteed error bounds for the flow of an ODE, where traditional methods provide only approximations to a solution. Overestimation, however, is a potential drawback of verified methods. For some problems, the computed error bounds become overly pessimistic, or integration even breaks down. The dependency problem and the wrapping effect are particular sources of overestimations in interval computations. Berz (see [1]) and his co-workers have developed Taylor model methods, which extend interval arithmetic with symbolic computations. The latter is an ef
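The dependency problem mentioned above can be illustrated with a minimal interval-arithmetic sketch (an illustration only, not a verified library): evaluating x - x over an interval yields an enclosure wider than [0, 0], because the two occurrences of x are treated as independent quantities.

```python
class Interval:
    """Minimal closed-interval arithmetic, for illustration only."""

    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        # Each operand is varied independently over its interval, which is
        # exactly the source of the dependency problem: x - x != [0, 0].
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

x = Interval(0.0, 1.0)
print(x - x)  # a valid but pessimistic enclosure of 0
```

Taylor model methods reduce this overestimation by carrying a symbolic polynomial part alongside a small interval remainder, so cancellations such as x - x are resolved symbolically.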
Because domestic production is inflexible in response to increases in government consumption expenditure, such increases lead to more imports to meet the resulting rise in domestic demand. Since the Iraqi economy is a single-yield economy that depends on oil revenues to finance spending, and since government consumption spending expands with high elasticity when total revenues rise but contracts with very low elasticity when public revenues fall, the outcome is a deficit in the current account position. The imbalance behind that deficit is the disruption of the
The study aims to examine the relation between imported inflation and the international trade of the Iraqi economy over the period 1990-2015 using annual data. To achieve this aim, statistical and econometric methods are applied through a NARDL model to capture the non-linear relation, since this model is designed to measure non-linear relations (and most economic relations are non-linear) and to distinguish the positive and negative effects of imported inflation. A deductive approach was adopted, using a descriptive method to describe and delimit the phenomenon, alongside an inductive approach employing statistical and econometric tools to obtain the estimated model that explains the
Researchers have shown increased interest in recent years in determining the optimal sample size needed to obtain sufficient accuracy in estimation and high-precision parameters when evaluating a large number of tests in the field of diagnosis at the same time. In this research, two methods were used to determine the optimal sample size for estimating the parameters of high-dimensional data: the Bennett inequality method and the regression method. The nonlinear logistic regression model is estimated with the sample size produced by each method in high-dimensional data using an artificial neural network (ANN), as it gives a high-precision estimate commensurate with the dat
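As a hedged illustration of the Bennett-inequality route to a sample size (the function and its closed-form inversion are this sketch's assumptions, not the paper's actual procedure), Bennett's tail bound for i.i.d. variables bounded by b with variance sigma^2 can be solved for the smallest n that guarantees a given accuracy eps at confidence 1 - delta:

```python
import math

def bennett_sample_size(sigma2, b, eps, delta):
    """Smallest n such that Bennett's inequality bounds
    P(|sample mean - mu| >= eps) by delta, for i.i.d. variables
    with |X - mu| <= b and variance sigma2 (illustrative sketch)."""
    u = b * eps / sigma2
    h = (1 + u) * math.log(1 + u) - u  # Bennett's h function
    return math.ceil(b * b * math.log(2.0 / delta) / (sigma2 * h))
```

A tighter tolerance eps (or a smaller delta) pushes the required sample size up, as expected.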
Background: The anterior loop of the mental nerve is commonly described as the part of the neurovascular bundle that traverses anterior and inferior to the mental foramen, only to loop back and exit through the mental foramen. The aim of this study is to evaluate the incidence and extension of the anterior loop of the mental nerve using a digital panoramic imaging system, so as to avoid nerve damage during various surgical procedures in dentistry. Materials and Method: A panoramic image was taken for all 400 patients and stored on the computer. The horizontal and vertical extensions of the anterior loop, when present, were measured and recorded in a special case sheet prepared for each subject. Results: The results indicated that out of 400 patients there were only 25 pat
... Show MoreThe problem of present study is determined by answering the following questions:
1) What is the effect of using oral open-ended questions on students' achievement in the third stage of the Arabic Department in the College of Education? 2) What is the effect of oral open-ended questions on developing the creative thinking of students in
Abstract
In this research we estimated the survival function for data affected by the disturbances and confusion of the Iraq Household Socio-Economic Survey (IHSES II, 2012), with five-year age groups following the Generalized Gamma (GG) distribution. Two methods were used for estimation and fitting: the Principle of Maximum Entropy (POME), and a bootstrap method with nonparametric kernel smoothing, to overcome the mathematical problems posed by the integrals in this distribution, in particular the incomplete gamma function, alongside the traditional Maximum Likelihood (ML) method. The comparison on t
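For reference, the GG survival function involving the incomplete gamma function mentioned above can be evaluated numerically as follows. This is an illustrative sketch only, assuming Stacy's parameterization of the generalized gamma (shape k, scale theta, power p); it uses a series expansion of the regularized lower incomplete gamma function and is not the estimation code used in the study.

```python
import math

def reg_lower_gamma(k, x, terms=200):
    """Regularized lower incomplete gamma P(k, x) via the standard series
    gamma(k, x) = x^k e^{-x} * sum_n x^n / (k (k+1) ... (k+n))."""
    if x <= 0:
        return 0.0
    s, term = 0.0, 1.0 / k
    for n in range(1, terms):
        s += term
        term *= x / (k + n)
    return s * math.exp(-x + k * math.log(x)) / math.gamma(k)

def gg_survival(t, k, theta, p):
    """Survival function of the generalized gamma (Stacy form, assumed here):
    S(t) = 1 - P(k, (t / theta)^p)."""
    return 1.0 - reg_lower_gamma(k, (t / theta) ** p)
```

With k = p = 1 the distribution reduces to the exponential, so S(t) = exp(-t/theta), which gives a convenient sanity check.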
Entropy, defined as an uncertainty measure, has been transformed using the cumulative distribution function and the reliability function of the Burr Type-XII distribution. For data that suffer from volatility, a probability model is built on each failure in a sample once the conditions of a probability distribution are satisfied. The formula of the new transformed probability distribution was derived by applying entropy to the continuous Burr Type-XII distribution; the new function was tested and shown to satisfy the conditions of a probability function. The mean and the cumulative distribution function were derived so that they could be used in generating data for the purpose of running the simulation
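The Burr Type-XII functions referred to above can be written down directly. The following sketch uses the standard two-parameter Burr-XII with F(x) = 1 - (1 + x^c)^(-k) and also approximates the differential entropy by a simple numerical integration; this is an illustration of the ingredients, not the paper's derivation.

```python
import math

def burr12_pdf(x, c, k):
    """Burr Type-XII density f(x) = c k x^{c-1} (1 + x^c)^{-k-1}, x > 0."""
    return c * k * x ** (c - 1) * (1 + x ** c) ** (-k - 1)

def burr12_cdf(x, c, k):
    """Cumulative distribution function F(x) = 1 - (1 + x^c)^{-k}."""
    return 1.0 - (1 + x ** c) ** (-k)

def burr12_reliability(x, c, k):
    """Reliability (survival) function R(x) = (1 + x^c)^{-k}."""
    return (1 + x ** c) ** (-k)

def burr12_entropy(c, k, upper=5000.0, n=200000):
    """Differential entropy -integral of f log f, by a crude rectangle rule
    on (0, upper]; a numeric illustration, not a closed form."""
    dx = upper / n
    total = 0.0
    for i in range(1, n):
        f = burr12_pdf(i * dx, c, k)
        if f > 0:
            total -= f * math.log(f) * dx
    return total
```

For c = k = 1 the density is (1 + x)^(-2) and the entropy integral evaluates to exactly 2, which the numerical approximation should reproduce closely.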