The analysis of survival and reliability is among the central topics and methods of present-day vital statistics because of its importance in demographic, medical, industrial, and engineering fields. This research generates random samples from the Generalized Gamma (GG) probability distribution using the Inverse Transformation Method (ITM). Because the GG cumulative distribution function involves the incomplete gamma integral, classical estimation becomes difficult, so a numerical approximation method is needed before the survival function can be estimated. The survival function was estimated by Monte Carlo simulation. The entropy method was used for estimation and fitting, alongside the classical method. The best estimation method was identified using two comparison criteria: the Root Mean Square Error (RMSE) and the Mean Absolute Percentage Error (MAPE). Sample sizes were set to n = 18, 30, 50, and 81, where n = 18 corresponds to five-year age groups for the phenomenon under study and n = 81 to single-year age groups, and each experiment was replicated 500 times.
The simulation results showed that the maximum likelihood method is best for small and medium samples; it was applied to the five-year age-group data, which suffer from disturbances, of the Iraq Household Socio-Economic Survey (IHSES II, 2012). The entropy method outperformed it for large samples and was applied to single-year age groups derived mathematically from the five-year data using Sprague's interpolation formula. The Sprague multipliers distribute the numbers of deaths and the population of a five-year age group, together with its neighbouring five-year groups, over single years of age; the calculations were carried out in Excel. Single-year age data were used for accuracy, so that no age at risk goes undetected.
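As a minimal sketch of the generation and estimation steps described above, assuming hypothetical GG parameter values (the abstract does not state them) and using SciPy's generalized-gamma quantile in place of a hand-written incomplete-gamma inversion:

```python
import numpy as np
from scipy.stats import gengamma

# Hypothetical shape/scale values, chosen only for illustration.
a, c, scale = 2.0, 1.5, 1.0
rng = np.random.default_rng(0)

# Inverse Transformation Method: draw U ~ Uniform(0,1), set X = F^{-1}(U).
# The GG CDF involves the incomplete gamma function, so the quantile is
# evaluated numerically via SciPy's ppf.
u = rng.uniform(size=500)                # 500 replications, as in the abstract
x = gengamma.ppf(u, a, c, scale=scale)

# Monte Carlo estimate of the survival function S(t) = P(X > t)
t = 1.0
s_hat = np.mean(x > t)
s_true = gengamma.sf(t, a, c, scale=scale)
```

The same simulated samples could then feed the RMSE/MAPE comparison between estimators described in the abstract.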
In this research work, a simulator with time-domain visualizers and configurable parameters, using a continuous-time simulation approach in Matlab R2019a, is presented for modeling and investigating the performance of optical fiber and free-space quantum channels as part of a generic quantum key distribution system simulator. The modeled optical fiber quantum channel is characterized by a maximum allowable distance of 150 km with 0.2 dB/km attenuation at λ = 1550 nm, while at λ = 900 nm and λ = 830 nm the attenuation values are 2 dB/km and 3 dB/km, respectively. The modeled free-space quantum channel is characterized by 0.1 dB/km attenuation at λ = 860 nm, also with a maximum allowable distance of 150 km. The simulator was investigated in terms of the execution of the BB84 protocol …
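The channel figures quoted above follow from the standard decibel loss relation; a small sketch (the function names are illustrative, not the simulator's API):

```python
def link_loss_db(length_km, atten_db_per_km):
    """Total channel loss in dB over a given span."""
    return length_km * atten_db_per_km

def transmittance(loss_db):
    """Fraction of photons surviving the channel: 10**(-loss_db/10)."""
    return 10 ** (-loss_db / 10)

# Values from the abstract: 150 km of fiber at 0.2 dB/km (lambda = 1550 nm)
loss = link_loss_db(150, 0.2)        # 30 dB total
t_1550 = transmittance(loss)         # only ~0.1% of photons arrive
```

This illustrates why low-attenuation windows matter so much for QKD range: at 2 dB/km (900 nm) the same 150 km span would lose 300 dB.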
Abstract
The Principal Components and Partial Least Squares methods can be regarded as very important methods in regression analysis …
A protective shield against neutrons and gamma rays was designed using alternating layers of water and iron of pre-fixed dimensions, in order to study the possibility of attenuating both neutrons and gamma rays. The ANISN code was prepared and adapted for the shield calculation using radiation-dose calculations: two cross-section groups were used for each of neutrons and gamma rays, based on the one-dimensional transport equation solved by the discrete-ordinates method, with the cross-section values transformed into values independent of the number of groups. The memory size required by the code was reduced, and the results obtained agreed with those of standard accepted reference cross-section samples …
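The ANISN calculation above solves the full transport equation by discrete ordinates; as a much simpler hedged illustration of why alternating layers attenuate, narrow-beam exponential attenuation through stacked slabs (with hypothetical attenuation coefficients, not values from this work) looks like:

```python
import math

# Illustrative linear attenuation coefficients (1/cm) for a single
# gamma-ray energy -- hypothetical values, not from the abstract.
mu = {"water": 0.07, "iron": 0.45}

def layered_transmission(layers):
    """Narrow-beam transmission exp(-sum(mu_i * t_i)) through stacked slabs."""
    total = sum(mu[material] * thickness_cm for material, thickness_cm in layers)
    return math.exp(-total)

# Alternating water/iron layers, 10 cm and 5 cm thick respectively
shield = [("water", 10.0), ("iron", 5.0), ("water", 10.0), ("iron", 5.0)]
frac = layered_transmission(shield)   # fraction of the beam transmitted
```

A real shielding calculation must also account for scattering build-up and neutron moderation, which is exactly what the transport-equation treatment in the abstract provides.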
This article provides acceptance sampling plans for the generalized exponential distribution when the life-time experiment is truncated at a pre-determined time. The two parameters (α, λ) (shape and scale) are estimated by LSE, WLSE, and the best estimator for various sample sizes; the estimates are used to find the ratio of the true mean time to the pre-determined time, and to find the smallest possible sample size required to ensure the producer's risk, with a pre-fixed probability (1 − P*). The results of the estimations and of the sampling plans are provided in tables.
Keywords: Generalized Exponential Distribution, Acceptance Sampling Plan, Consumer's and Producer's Risks
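A sketch of how the smallest sample size of such a plan could be computed, assuming the usual truncated-life-test formulation with acceptance number c and hypothetical parameter values (none of these numbers are taken from the article's tables):

```python
import math
from scipy.stats import binom

def ge_cdf(t, alpha, lam):
    """Generalized exponential CDF: F(t) = (1 - exp(-lam*t))**alpha."""
    return (1.0 - math.exp(-lam * t)) ** alpha

def min_sample_size(p_star, c, t, alpha, lam, n_max=1000):
    """Smallest n with P(at most c failures by time t) <= 1 - p_star,
    so the lot is rejected with confidence at least P* when it should be."""
    p = ge_cdf(t, alpha, lam)         # probability an item fails before t
    for n in range(c + 1, n_max + 1):
        if binom.cdf(c, n, p) <= 1.0 - p_star:
            return n
    return None

# Hypothetical plan: P* = 0.95, acceptance number c = 2,
# test truncated at t = 0.5 with alpha = 2, lambda = 2.
n = min_sample_size(0.95, 2, 0.5, 2.0, 2.0)
```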
Abstract
The traffic jams occurring in the cities of the Republic of Iraq in general, and in Diwaniyah province in particular, are due to the large number of modern vehicles imported during the last ten years and the failure to retire old vehicles in the province, resulting in an accumulation of vehicles that exceeds the capacity of the city's streets. Together, these causes produce clear traffic congestion at the start of the working day in the morning. The researchers therefore chose the local network of the province's main roads, considered the most important in terms of traffic congestion, and fuzzy numbers were identified for …
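One common way such road-network uncertainty is encoded is with triangular fuzzy numbers; the following sketch, with invented travel times, shows the basic arithmetic only (the abstract is truncated before specifying which fuzzy operations the researchers actually used):

```python
# A triangular fuzzy number (a, b, c): minimum, most likely, and maximum
# travel time in minutes -- hypothetical values for illustration.

def tfn_add(x, y):
    """Component-wise addition of triangular fuzzy numbers."""
    return tuple(xi + yi for xi, yi in zip(x, y))

def defuzzify(tfn):
    """Centroid of a triangular fuzzy number: (a + b + c) / 3."""
    return sum(tfn) / 3.0

# Fuzzy travel times on two consecutive road segments
seg1 = (4.0, 6.0, 9.0)
seg2 = (3.0, 5.0, 8.0)
route = tfn_add(seg1, seg2)   # fuzzy time for the whole route
crisp = defuzzify(route)      # a single representative travel time
```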
Multicollinearity is one of the most common problems; it concerns the degree of internal correlation between the explanatory variables. It appears especially in economics and applied research. Multicollinearity has a negative effect on the regression model, such as inflated variances and unstable parameter estimates when the Ordinary Least Squares (OLS) method is used. Therefore, other methods were used to estimate the parameters of the negative binomial model, including the ridge regression estimator and the Liu-type estimator. The negative binomial regression model is a nonlinear …
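For intuition, here is a linear-model sketch of the ridge idea the abstract applies to the negative binomial model (the negative-binomial ridge and Liu-type estimators themselves require iterative likelihood fitting not shown here; the data are simulated):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)        # nearly collinear with x1
X = np.column_stack([np.ones(n), x1, x2])
y = 1.0 + 2.0 * x1 + 2.0 * x2 + rng.normal(size=n)

# OLS: (X'X)^{-1} X'y -- unstable when X'X is near-singular
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Ridge: (X'X + kI)^{-1} X'y shrinks and stabilises the estimate
k = 1.0
beta_ridge = np.linalg.solve(X.T @ X + k * np.eye(3), X.T @ y)
```

The individual OLS coefficients on x1 and x2 can swing wildly while their sum stays near 4; ridge damps those swings at the cost of a small bias, which is the trade-off the MSE comparisons in such studies quantify.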
Abstract
The use of modern scientific methods and techniques is considered an important topic for solving many of the problems facing various sectors, including industry, services, and health. The researcher always intends to use modern methods characterized by accuracy, clarity, and speed in reaching the optimal solution, while remaining easy to understand and apply.
This research presents a comparison between two solution methods for linear fractional programming models: the Charnes & Cooper linear transformation and the denominator-function restriction method, applied to the oil heaters and gas cookers plant, where it was shown after …
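A sketch of the Charnes & Cooper transformation on a small invented problem (not the plant's data): the substitution y = t·x with t = 1/(d'x + d0) turns the ratio objective into an ordinary linear program in (y, t):

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative problem:
#   maximise (2*x1 + 3*x2) / (x1 + x2 + 1)  s.t.  x1 + x2 <= 4, x >= 0
c, c0 = np.array([2.0, 3.0]), 0.0     # numerator coefficients
d, d0 = np.array([1.0, 1.0]), 1.0     # denominator (positive on the feasible set)
A, b = np.array([[1.0, 1.0]]), np.array([4.0])

# Charnes-Cooper LP:  max c'y + c0*t
#   s.t.  A y - b t <= 0,   d'y + d0*t = 1,   y, t >= 0
A_ub = np.hstack([A, -b.reshape(-1, 1)])
A_eq = np.hstack([d, [d0]]).reshape(1, -1)
obj = -np.hstack([c, [c0]])           # linprog minimises, so negate
res = linprog(obj, A_ub=A_ub, b_ub=np.zeros(1), A_eq=A_eq, b_eq=[1.0])

y, t = res.x[:2], res.x[2]
x_opt = y / t                          # recover the original variables
ratio = (c @ x_opt + c0) / (d @ x_opt + d0)
```

For this toy instance the optimum is x = (0, 4) with ratio 12/5 = 2.4; the transformation is exact whenever the denominator stays positive over the feasible region.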
In multiple linear regression analysis, the problems of multicollinearity and autocorrelation have drawn the attention of many researchers; since these two problems can appear together, with adverse effects on estimation, some researchers have devised new methods to address them simultaneously. This research compares the performance of the Principal Components Two-Parameter (PCTP) estimator, the (r, k)-class estimator, and the r-(k, d)-class estimator through a simulation study, using the mean square error (MSE) criterion to find the best way of addressing the two problems together. The results showed that the r-(k, d)-class estimator is the best estimator …
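A compact sketch of an (r, k)-class-style estimator, combining principal-components truncation with ridge shrinkage on simulated collinear data (the PCTP and r-(k, d) estimators in the study add further parameters and an autocorrelation correction not shown here):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 80, 3
Z = rng.normal(size=(n, p))
X = np.column_stack([Z[:, 0], Z[:, 0] + 0.05 * Z[:, 1], Z[:, 2]])  # collinear
beta = np.array([1.0, 1.0, 0.5])
y = X @ beta + rng.normal(scale=0.5, size=n)

# Spectral decomposition of X'X, eigenvalues sorted descending
lam, V = np.linalg.eigh(X.T @ X)
order = np.argsort(lam)[::-1]
lam, V = lam[order], V[:, order]

def rk_class(r, k):
    """(r,k)-class estimator: keep the first r principal components
    of X'X and apply ridge shrinkage k within them."""
    Vr, lr = V[:, :r], lam[:r]
    return Vr @ ((Vr.T @ X.T @ y) / (lr + k))

b_rk = rk_class(r=2, k=0.5)    # drop the smallest component, shrink the rest
b_ols = np.linalg.solve(X.T @ X, X.T @ y)
```

Dropping the near-zero eigendirection removes the variance explosion caused by collinearity, and the ridge constant k further stabilises the retained components.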
This research discusses the comparison between the partial least squares regression model and tree regression. These models represent two types of statistical method: the first, a parametric method, is partial least squares, which is adopted when the number of variables is greater than the number of observations, and also when the number of observations is larger than the number of variables; the second, a nonparametric method, is tree regression, which partitions the data hierarchically. The regression models were estimated and then compared, the comparison between these methods being made according to the Mean Square Error …
The aim of this paper is to present a new methodology for finding the private key of RSA. A new initial value, generated from a new equation, is selected to speed up the process; after this value is found, a brute-force attack is used to discover the private key. In addition, for the proposed equation, the multiplier of the Euler totient function used to find both the public key and the private key is set to 1; it follows that an equation estimating a new initial value is suitable for this small multiplier. The experimental results show that if all prime factors of the modulus are larger than 3 and the multiplier is 1, the distance between the initial value and the private key …
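Since the paper's initial-value equation is not given in the abstract, the sketch below shows only the surrounding machinery on a toy modulus: the private exponent d satisfies e·d = k·φ(n) + 1 for some multiplier k, so d can be recovered by stepping k (the paper's initial value would narrow this search):

```python
from math import gcd

# Toy RSA parameters -- small illustrative primes, not the paper's data.
p, q = 61, 53
n_mod = p * q                  # public modulus, 3233
phi = (p - 1) * (q - 1)        # Euler totient, 3120
e = 17                         # public exponent
assert gcd(e, phi) == 1

def find_private_key(e, phi):
    """Recover d with e*d = 1 (mod phi) by stepping the multiplier k in
    e*d = k*phi + 1. A stand-in for the paper's search, whose
    initial-value equation is not reproduced here."""
    for k in range(1, e):
        if (k * phi + 1) % e == 0:
            return (k * phi + 1) // e
    return None

d = find_private_key(e, phi)   # the classic textbook answer is d = 2753
```

For real key sizes this loop is infeasible, which is why the paper's contribution is a better starting point for the search rather than the search itself.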