Interval methods for verified integration of initial value problems (IVPs) for ODEs have been used for more than 40 years. For many classes of IVPs, these methods have the ability to compute guaranteed error bounds for the flow of an ODE, where traditional methods provide only approximations to a solution. Overestimation, however, is a potential drawback of verified methods. For some problems, the computed error bounds become overly pessimistic, or integration even breaks down. The dependency problem and the wrapping effect are particular sources of overestimation in interval computations. Berz (see [1]) and his co-workers have developed Taylor model methods, which extend interval arithmetic with symbolic computations. The latter is an ef…
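To illustrate the dependency problem mentioned above, here is a minimal sketch (not taken from [1]) of naive interval arithmetic overestimating the range of x(1 - x) over x in [0, 1], whose true range is [0, 0.25]:

from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float

    def __mul__(self, other):
        # Interval product: take the extremes over all endpoint combinations
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def __rsub__(self, scalar):
        # scalar - interval, needed for the expression 1 - x below
        return Interval(scalar - self.hi, scalar - self.lo)

x = Interval(0.0, 1.0)
print(x * (1 - x))   # Interval(lo=0.0, hi=1.0): overestimates the true upper bound 0.25,
                     # because the two occurrences of x are treated as independent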
An experimental and theoretical study was conducted to investigate the thermal performance of different types of air solar collectors. In this work, an air solar collector with dimensions of 120 cm x 90 cm x 12 cm was tested under the climate conditions of Baghdad city at a 43° tilt angle, using an absorber plate (1.45 mm thick, 115 cm high x 84 cm wide) manufactured from iron painted matt black.
The experimental tests covered five types of absorber:
a conventional smooth flat-plate absorber, a finned absorber, a corrugated absorber plate, an iron wire mesh on the absorber, and a matrix of porous media on the absorber.
The hourly and average efficiencies of the collectors…
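As a brief sketch of how the hourly thermal efficiency of an air collector is typically computed, eta = m_dot * c_p * (T_out - T_in) / (I * A_c); the function and numbers below are illustrative assumptions, not data from this work:

def collector_efficiency(m_dot, cp_air, t_out, t_in, irradiance, area):
    # eta = useful heat gain of the air stream / incident solar power on the collector
    useful_heat = m_dot * cp_air * (t_out - t_in)   # W
    incident_power = irradiance * area              # W
    return useful_heat / incident_power

# Hypothetical operating point (not measured values from this study)
print(collector_efficiency(m_dot=0.02, cp_air=1005.0,
                           t_out=55.0, t_in=30.0,
                           irradiance=800.0, area=1.20 * 0.90))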
A Multiple System Biometric System Based on ECG Data
This paper discusses the process of compounding two distributions using a new compounding procedure that connects a number of lifetime (continuous) distributions, where the number of these distributions is a random variable following one of the discrete distributions. Based on this procedure, the zero-truncated Poisson distribution has been compounded with the Weibull distribution to produce a new lifetime distribution with three parameters. The advantage is that its failure rate function covers many cases (increasing, decreasing, unimodal, bathtub). The properties of the resulting distribution are studied, such as the expectation, variance, cumulative distribution function, reliability function, and failure rate function.
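As a sketch of one common form of such a compounding construction (assuming the lifetime is the minimum of $N$ Weibull variables with $N$ following a zero-truncated Poisson law; the paper's exact construction may differ): let $X_1, X_2, \dots$ be i.i.d. Weibull with survival function $S_X(x) = e^{-(x/\beta)^{\alpha}}$ and let $N$ be zero-truncated Poisson with parameter $\lambda$, independent of the $X_i$. For $Y = \min\{X_1, \dots, X_N\}$,

$$ S_Y(y) = E\left[S_X(y)^N\right] = \sum_{n=1}^{\infty} \frac{\lambda^n e^{-\lambda}}{n!\,(1 - e^{-\lambda})}\, S_X(y)^n = \frac{e^{\lambda S_X(y)} - 1}{e^{\lambda} - 1}, $$

which yields a three-parameter lifetime family in $(\alpha, \beta, \lambda)$ whose failure rate can take the shapes listed above.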
This article aims to explore the importance of estimating a semiparametric regression function. We suggest a new estimator alongside the other combined estimators and then compare them using a simulation technique. The simulation results show that the suggested estimator is best for the first and second models, whereas for the third model the Burman and Chaudhuri (B&C) estimator is best.
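For context, one widely used semiparametric specification (not necessarily the exact model of this article) is the partially linear model $y_i = x_i^{\top}\beta + m(t_i) + \varepsilon_i$, in which the linear part $\beta$ is estimated parametrically while the unknown function $m(\cdot)$ is estimated nonparametrically, and a combined estimator typically blends the two estimation steps.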
In this research, kernel estimation methods (nonparametric density estimators) were relied upon to estimate a two-response (binary) logistic regression, comparing the Nadaraya-Watson method with the local scoring algorithm. The optimal smoothing parameter (bandwidth) λ was estimated by cross-validation and generalized cross-validation; the optimal bandwidth λ has a clear effect on the estimation process and plays a key role in smoothing the curve so that it approaches the real curve. The goal of using the kernel estimator is to modify the observations so that estimators with characteristics close to those of the real parameters can be obtained, based on medical data for patients with chro…
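A minimal sketch (not the paper's code) of a Nadaraya-Watson regression estimate for a binary response, with a leave-one-out cross-validation search for the bandwidth λ; the data below are simulated placeholders, not the study's medical data:

import numpy as np

def nadaraya_watson(x_train, y_train, x_eval, bandwidth):
    # Gaussian-kernel Nadaraya-Watson estimate of E[y | x] at the points x_eval
    u = (x_eval[:, None] - x_train[None, :]) / bandwidth
    weights = np.exp(-0.5 * u ** 2)
    return (weights @ y_train) / weights.sum(axis=1)

def loo_cv_score(x, y, bandwidth):
    # Leave-one-out cross-validation error for a candidate bandwidth
    idx = np.arange(len(x))
    errors = [(y[i] - nadaraya_watson(x[idx != i], y[idx != i],
                                      x[i:i + 1], bandwidth)[0]) ** 2
              for i in idx]
    return float(np.mean(errors))

# Simulated binary-response data (placeholders only)
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 200)
y = (rng.uniform(size=200) < 1.0 / (1.0 + np.exp(-(4.0 * x - 2.0)))).astype(float)

candidates = np.linspace(0.05, 0.5, 10)
best_bandwidth = min(candidates, key=lambda h: loo_cv_score(x, y, h))
print("CV-selected bandwidth:", best_bandwidth)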
Background: Several devices with different physical bases have been developed for the clinical measurement of corneal thickness; they are classified into 4 categories: Scheimpflug photography based, slit-scanning topography, optical coherence tomography (OCT) based, and ultrasound (US) based. Objective: To evaluate the precision of the new Scheimpflug-Placido disc corneal topography in measuring corneal thickness and to compare the measured values with those obtained by US pachymetry. Methods: The setting of this study was the Lasik center at the Eye Specialty Private Hospital, Baghdad, Iraq. Eyes of healthy subjects were examined with the Sirius topography. Three consecutive measurements of central (CCT) and thinnest (TCT) corneal thicknesses were obtained…
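For illustration, agreement between two pachymetry devices is often summarized with Bland-Altman statistics; the sketch below uses hypothetical paired CCT readings, not values from this study:

import numpy as np

# Hypothetical paired central corneal thickness readings in micrometers
cct_sirius = np.array([538.0, 545.0, 529.0, 551.0, 540.0])   # Scheimpflug-Placido device
cct_us = np.array([542.0, 548.0, 533.0, 556.0, 543.0])       # ultrasound pachymetry

diff = cct_sirius - cct_us
bias = diff.mean()                      # mean difference (bias) between devices
spread = 1.96 * diff.std(ddof=1)        # half-width of the 95% limits of agreement
print(f"bias = {bias:.1f} um, limits of agreement = [{bias - spread:.1f}, {bias + spread:.1f}] um")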
This study aimed to investigate cholera and diarrhea infections in two different areas of Baghdad governorate from the cultural, social, economic, and environmental points of view, during the period 3/10 to 3/12/2007; the two areas were Obiady city and Palestine street. The study included groups of patients who attended the Kindy Hospital laboratory. The researcher used a sample of 300 persons of different ages, with 150 persons from each of the two study areas. The study showed a great difference between the two areas in the percentage of infection with parasites, helminths, viruses, bacteria, and Vibrio cholerae according to age groups, with the highest infection percentage in the 1-10 years age group in Obiady city (57.5%), wh…
The exponential distribution is one of the most common distributions in studies and scientific research, with wide application in the fields of reliability and engineering and in analyzing survival functions; therefore, researchers have carried out extended studies of the characteristics of this distribution.
In this research, the survival function of the truncated exponential distribution is estimated by the maximum likelihood method, the first and second Bayes methods, the least squares method, and a Jackknife method based first on the maximum likelihood method and then on the first Bayes method; the estimators are then compared using simulation. To accomplish this task, samples of different sizes have been adopted by the researcher, us…
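For reference, under one common reading of the truncated exponential distribution (right truncation at a known point $T$; the paper's parameterization may differ), the density and survival function involved in this estimation are

$$ f(x;\theta) = \frac{\theta e^{-\theta x}}{1 - e^{-\theta T}}, \qquad 0 \le x \le T, \qquad S(x;\theta) = \frac{e^{-\theta x} - e^{-\theta T}}{1 - e^{-\theta T}}, $$

and the maximum likelihood estimate of $\theta$ maximizes $\ell(\theta) = n\log\theta - \theta\sum_{i=1}^{n} x_i - n\log\left(1 - e^{-\theta T}\right)$, which is usually solved numerically.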
It is not often easy to identify a certain group of words as a lexical bundle, since the same set of words can be, in different situations, recognized as an idiom, a collocation, a lexical phrase, or a lexical bundle. That is, there are many cases where overlap among the four types is plausible. Thus, it is important to extract the most identifiable and distinguishable characteristics with which a certain group of words, under certain conditions, can be recognized as a lexical bundle, and this is the task of this paper.