The analysis of survival and reliability is among the important topics and methods of vital statistics at the present time because of its importance in the demographic, medical, industrial, and engineering fields. This research generates random samples from the generalized gamma (GG) probability distribution using the inverse transformation method (ITM). Because the cumulative distribution function of the GG distribution contains the incomplete gamma integral, classical estimation is more difficult, so a numerical approximation method is needed before the survival function can be estimated. The survival function was estimated by Monte Carlo simulation. The entropy method was used for estimation and fitting, alongside the classical method. The best estimation method was identified using two comparison criteria: the root mean square error (RMSE) and the mean absolute percentage error (MAPE). Sample sizes of n = 18, 30, 50, and 81 were selected, where n = 18 represents the five-year age groups of the studied phenomenon and n = 81 represents the single-year age groups, and the experiment was replicated 500 times. The simulation results showed that the maximum likelihood method is best for small and medium-sized samples, and it was applied to the five-year age-group data, which suffer from disturbances and confusion, of the Iraq Household Socio-Economic Survey (IHSES II 2012). The entropy method outperformed for large samples and was applied to single-year age groups obtained by a mathematical method based on Sprague's interpolation formula. These coefficients, known as Sprague multipliers, are used to derive the numbers of deaths and the population by single year of age within a given five-year age group from the numbers of deaths and population in that group and its neighbouring five-year groups, using the Excel program; single-year age-group data were used for accuracy, so that no age exposed to the risk of death goes undetected.
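As a hedged illustration of the approach described above (not the paper's own code), the Python sketch below draws generalized gamma samples by the inverse transformation method, estimates the survival function by Monte Carlo replication, and scores the estimate with RMSE and MAPE; all parameter values are assumed for illustration.

```python
# Illustrative sketch: inverse-transform sampling from a generalized gamma
# GG(scale, d, p) distribution and a Monte Carlo estimate of its survival
# function, compared with the true survival via RMSE and MAPE.
import numpy as np
from scipy.special import gammainc, gammaincinv

rng = np.random.default_rng(0)
scale, d, p = 2.0, 3.0, 1.5        # assumed GG parameters
n, reps = 50, 500                  # one of the paper's sample sizes and its 500 replications

def gg_survival(x):
    """True survival S(x) = 1 - P(d/p, (x/scale)^p), P = regularized lower incomplete gamma."""
    return 1.0 - gammainc(d / p, (x / scale) ** p)

def gg_sample(size):
    """Inverse transformation method: x = scale * [P^{-1}(d/p, u)]^{1/p} for u ~ U(0, 1)."""
    u = rng.uniform(size=size)
    return scale * gammaincinv(d / p, u) ** (1.0 / p)

grid = np.linspace(0.5, 6.0, 20)   # evaluation points for the survival function
true_S = gg_survival(grid)

est = np.empty((reps, grid.size))
for r in range(reps):
    x = gg_sample(n)
    est[r] = [(x > t).mean() for t in grid]   # empirical (Monte Carlo) survival

S_hat = est.mean(axis=0)
rmse = np.sqrt(np.mean((S_hat - true_S) ** 2))
mape = np.mean(np.abs((S_hat - true_S) / true_S)) * 100
print(f"RMSE = {rmse:.4f}, MAPE = {mape:.2f}%")
```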
The air flow pattern in a co-current pilot-plant spray dryer fitted with a rotary disk atomizer was determined experimentally and modelled numerically using Computational Fluid Dynamics (CFD) software (ANSYS Fluent). The CFD simulation used a three-dimensional system of Reynolds-Averaged Navier-Stokes (RANS) equations, closed via the RNG k–ε turbulence model. Measurements were carried out with the atomizer rotating at 3000 rpm and with no rotation, using drying air at 25 °C and an inlet air velocity of 5 m/s without swirl. The air flow pattern was determined experimentally using cotton tufts and a digital anemometer. The CFD simulation predicted a downward central flowing air core surrounded by a slow…
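As a hedged side note (not part of the study), the Python sketch below shows the standard textbook estimates of the k and ε inlet boundary values for a k–ε RANS model from the stated 5 m/s inlet velocity; the turbulence intensity and length scale are assumed, since the abstract does not report them.

```python
# Hedged sketch: standard estimates of RANS inlet turbulence quantities
# (k and epsilon) for a k-epsilon model from the stated inlet velocity of 5 m/s.
# The turbulence intensity and length scale are assumed values for illustration.
C_MU = 0.09          # turbulence-model constant

def inlet_k_epsilon(u_inlet: float, intensity: float, length_scale: float):
    """Return (k, epsilon) from velocity, turbulence intensity and length scale."""
    k = 1.5 * (u_inlet * intensity) ** 2          # turbulent kinetic energy, m^2/s^2
    eps = C_MU ** 0.75 * k ** 1.5 / length_scale  # dissipation rate, m^2/s^3
    return k, eps

k, eps = inlet_k_epsilon(u_inlet=5.0, intensity=0.05, length_scale=0.01)
print(f"k = {k:.4f} m^2/s^2, epsilon = {eps:.4f} m^2/s^3")
```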
The primary objective of this paper is to introduce eight types of topologies on finite digraphs and to state the implications between these topologies. We also use supra-open digraphs to introduce new types of approximations for rough digraphs.
A space X is called πp–normal if for each closed set F and each π–closed set F′ in X with F ∩ F′ = ∅, there are p–open sets U and V of X with U ∩ V = ∅ such that F ⊆ U and F′ ⊆ V. Our work studies and discusses a new kind of normality in generalized topological spaces. We define ϑπp–normal, ϑ–mildly normal, ϑ–almost normal, ϑp–normal, ϑ–mildly p–normal, ϑ–almost p–normal, and ϑπ–normal spaces, and we discuss some of their properties.
In this thesis, we introduce eight types of topologies on finite digraphs and state the implications between these topologies. We also study some of Pawlak's concepts and generalized rough set theory, and we introduce new types of approximations for rough digraphs depending on supra-open digraphs. In addition, we present two different standpoints for defining generalized membership relations, and state the implications between them, in order to classify digraphs and to help measure the exactness and roughness of digraphs. On the other hand, we define several kinds of fuzzy digraphs. We also introduce a topological space induced by reflexive graphs and tolerance graphs, where the graph may be infinite. Furthermore, we offer some properties of the…
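For illustration only, the Python sketch below shows a Pawlak-style lower and upper approximation on a digraph using out-neighbourhoods as granules, together with the resulting accuracy (exactness) measure; the thesis's own supra-open-digraph approximations and membership relations may be defined differently, and the example digraph is hypothetical.

```python
# Illustrative sketch: Pawlak-style approximation on a digraph with the
# out-neighbourhood N(v) = {v} ∪ {successors of v} as the granule of each vertex.
# The lower/upper approximations give an accuracy (exactness) measure of a subset.
from typing import Dict, Set, Hashable

def neighbourhood(digraph: Dict[Hashable, Set[Hashable]], v) -> Set[Hashable]:
    return {v} | digraph.get(v, set())

def lower_upper(digraph, subset: Set[Hashable]):
    vertices = set(digraph) | {u for succ in digraph.values() for u in succ}
    lower = {v for v in vertices if neighbourhood(digraph, v) <= subset}
    upper = {v for v in vertices if neighbourhood(digraph, v) & subset}
    return lower, upper

def accuracy(digraph, subset):
    lower, upper = lower_upper(digraph, subset)
    return len(lower) / len(upper) if upper else 1.0   # 1.0 means exact, < 1 means rough

# Example digraph given as successor lists (hypothetical data).
G = {1: {2}, 2: {3}, 3: {1}, 4: {4}}
X = {1, 2, 4}
print(lower_upper(G, X), accuracy(G, X))   # ({1, 4}, {1, 2, 3, 4}) and 0.5
```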
The detour distance is one of the most common distance types used in chemistry and computer networks today. Therefore, in this paper the detour polynomials and detour indices of vertex-identified graphs built from n graphs, each connected in itself and separated from the others except at the identified vertices, will be obtained for n ≥ 3. Detour polynomials and detour indices will also be found for other graphs that have important applications in chemistry.
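As an illustrative, brute-force sketch (practical only for small graphs, since longest-path computation is NP-hard in general), the following Python code computes detour distances, the detour polynomial, and the detour index of a graph; the example graph is chosen arbitrarily and is not one of the paper's families.

```python
# Illustrative sketch: the detour distance D(u, v) is the length of a longest
# simple u-v path; the detour index sums D(u, v) over unordered vertex pairs,
# and the detour polynomial collects the x^{D(u,v)} terms.
from itertools import combinations
from collections import Counter
import networkx as nx

def detour_distance(G: nx.Graph, u, v) -> int:
    return max(len(p) - 1 for p in nx.all_simple_paths(G, u, v))

def detour_index_and_polynomial(G: nx.Graph):
    exponents = Counter(detour_distance(G, u, v) for u, v in combinations(G.nodes, 2))
    index = sum(d * c for d, c in exponents.items())
    poly = " + ".join(f"{c}x^{d}" for d, c in sorted(exponents.items()))
    return index, poly

C4 = nx.cycle_graph(4)                   # example: the 4-cycle
print(detour_index_and_polynomial(C4))   # -> (16, '2x^2 + 4x^3')
```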
Background: The concentration of hepatitis B surface antigen (HBsAg) in HBV patients can be determined with immunoassay techniques. This study aimed to measure the HBsAg titers in chronic HBV patients and to assess their correlation with patients' age and gender and with the levels of liver enzymes and total serum bilirubin. Materials and Method: Fifty-eight chronic hepatitis B-infected patients were enrolled in this study. The age and gender of the patients were recorded. HBsAg concentration was measured with an automated immunoanalyzer. The patients were also tested for ALT, AST, ALP, and TSB by an automated chemistry analyzer. Results: All the chronic HBV patients had positive HBsAg titers above the negative cutoff (0.05 U/L), with mean…
This research studies the linear regression model when the random errors are autocorrelated and normally distributed. Linear regression analysis is used for the relationship between variables, and through this relationship the value of a variable can be predicted from the values of other variables. Four methods were compared (the least squares method, the unweighted average method, the Theil method, and the Laplace method) using the mean square error (MSE) criterion and simulation; the study included four sample sizes (15, 30, 60, 100). The results showed that the least squares method is best. The four methods were then applied to data on buckwheat production and cultivated area for the provinces of Iraq for the years 2010, 2011, and 2012…
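The sketch below is a hedged, simplified version of such a simulation comparison in Python: ordinary least squares and the Theil estimator are compared by the MSE of the estimated slope under AR(1) autocorrelated normal errors at the paper's four sample sizes; the true coefficients, the autocorrelation level, and the error variance are assumed.

```python
# Hedged sketch (not the paper's code): Monte Carlo comparison of the OLS and
# Theil slope estimators when regression errors follow an AR(1) process.
import numpy as np
from scipy.stats import theilslopes

rng = np.random.default_rng(1)
beta0, beta1, rho, reps = 1.0, 2.0, 0.6, 500   # assumed true model and AR(1) coefficient

def ar1_errors(n, rho, sigma=1.0):
    e = np.empty(n)
    e[0] = rng.normal(scale=sigma / np.sqrt(1 - rho**2))
    for t in range(1, n):
        e[t] = rho * e[t - 1] + rng.normal(scale=sigma)
    return e

for n in (15, 30, 60, 100):                     # the paper's sample sizes
    ols_err, theil_err = [], []
    for _ in range(reps):
        x = rng.uniform(0, 10, size=n)
        y = beta0 + beta1 * x + ar1_errors(n, rho)
        b_ols = np.polyfit(x, y, deg=1)[0]      # OLS slope
        b_theil = theilslopes(y, x)[0]          # Theil slope (median of pairwise slopes)
        ols_err.append((b_ols - beta1) ** 2)
        theil_err.append((b_theil - beta1) ** 2)
    print(f"n={n:3d}  MSE(OLS)={np.mean(ols_err):.4f}  MSE(Theil)={np.mean(theil_err):.4f}")
```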
The adaptive optics (AO) technique has been developed to correct for atmospheric seeing. The purpose of this study is to use the MATLAB program to investigate the performance of an AO system with one of the most recent AO simulation tools, Object-Oriented MATLAB Adaptive Optics (OOMAO). This was achieved by studying the variables that affect the image-quality correction, such as observation wavelength bands, atmospheric parameters, telescope parameters, deformable mirror parameters, wavefront sensor parameters, and noise parameters. The results present a detailed analysis of the factors that influence the image correction process as well as the impact of the AO components on that process.
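As a hedged, OOMAO-independent illustration of how the observation wavelength and the atmospheric parameters enter the problem, the Python sketch below scales the Fried parameter with wavelength and compares the seeing-limited spot size with the telescope's diffraction limit; the reference r0 and the telescope diameter are assumed values.

```python
# Hedged sketch, separate from OOMAO (a MATLAB toolbox): how the Fried parameter
# r0, and hence the uncorrected seeing disc, scales with observation wavelength,
# compared with a telescope's diffraction limit.
RAD_TO_ARCSEC = 206265.0
r0_ref, lam_ref = 0.10, 500e-9      # assumed: r0 = 10 cm at 500 nm (a typical site)
D = 1.0                             # assumed telescope diameter in metres

def fried_parameter(lam):
    """r0 grows with wavelength as (lambda / lambda_ref)^(6/5)."""
    return r0_ref * (lam / lam_ref) ** (6 / 5)

for band, lam in [("V", 550e-9), ("J", 1250e-9), ("K", 2200e-9)]:
    r0 = fried_parameter(lam)
    seeing = 0.98 * lam / r0 * RAD_TO_ARCSEC          # seeing-limited FWHM, arcsec
    diffraction = 1.22 * lam / D * RAD_TO_ARCSEC      # Rayleigh diffraction limit, arcsec
    print(f"{band}-band: r0 = {r0 * 100:5.1f} cm, seeing = {seeing:.2f} arcsec, "
          f"diffraction limit = {diffraction:.2f} arcsec")
```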