In this research, we study the inverse Gompertz (IG) distribution and estimate its survival function using three methods: the maximum likelihood, least squares, and percentile estimators. The methods were compared to choose the best estimator, and the least squares method was found to be the best for estimating the survival function because it attained the lowest integrated mean squared error (IMSE) for all sample sizes.
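As a sketch of how a least-squares estimator of this kind works, the following fits an assumed inverse Gompertz CDF, F(x; θ, λ) = exp(−(θ/λ)(e^{λ/x} − 1)) (a hypothetical parameterization chosen for illustration, not necessarily the paper's), to plotting positions of the ordered sample and then evaluates the fitted survival function:

```python
import numpy as np
from scipy.optimize import minimize

# Assumed inverse Gompertz CDF (hypothetical parameterization for illustration):
#   F(x; theta, lam) = exp(-(theta/lam) * (exp(lam/x) - 1)),  x > 0.
def ig_cdf(x, theta, lam):
    return np.exp(-(theta / lam) * np.expm1(lam / x))

def ig_sample(n, theta, lam, rng):
    # Inverse-transform sampling: solve F(x) = u for x.
    u = rng.uniform(size=n)
    return lam / np.log1p(-(lam / theta) * np.log(u))

def lse_fit(x):
    # Least-squares estimation: minimize sum_i (F(x_(i)) - i/(n+1))^2
    # over the parameters, using the order statistics of the sample.
    xs = np.sort(x)
    n = len(xs)
    pp = np.arange(1, n + 1) / (n + 1)      # plotting positions
    def obj(p):
        theta, lam = np.exp(p)              # keep parameters positive
        return np.sum((ig_cdf(xs, theta, lam) - pp) ** 2)
    res = minimize(obj, x0=[0.0, 0.0], method="Nelder-Mead")
    return np.exp(res.x)

rng = np.random.default_rng(0)
x = ig_sample(500, theta=1.0, lam=2.0, rng=rng)
theta_hat, lam_hat = lse_fit(x)
# Fitted survival function evaluated at the sample median (should be near 0.5):
surv = 1.0 - ig_cdf(np.median(x), theta_hat, lam_hat)
```

The IMSE criterion mentioned in the abstract would then be computed by averaging the squared difference between the fitted and true survival curves over a grid of points and over many simulated samples.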
This paper solves systems of nonlinear Volterra integral equations by means of the Sumudu transform combined with the Adomian decomposition method. The numerical results are compared with the exact solutions to demonstrate the high accuracy of the approach, and the results show that it is straightforward and effective.
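To illustrate just the decomposition half of the approach (the Sumudu transform step, which converts the convolution integral into an algebraic operation, is omitted here), the following solves a simple linear Volterra equation u(t) = 1 + ∫₀ᵗ u(s) ds, whose exact solution is eᵗ, by the Adomian-style recursion u₀ = 1, u_{k+1}(t) = ∫₀ᵗ u_k(s) ds:

```python
import sympy as sp

# Adomian-style decomposition for the linear Volterra equation
#   u(t) = 1 + \int_0^t u(s) ds,   exact solution u(t) = e^t.
# Each series component is the integral of the previous one:
#   u_0 = 1,  u_{k+1}(t) = \int_0^t u_k(s) ds  ->  u_k = t^k / k!.
t, s = sp.symbols("t s")
u = sp.Integer(1)          # u_0
series_sum = u
for _ in range(10):        # ten correction terms
    u = sp.integrate(u.subs(t, s), (s, 0, t))
    series_sum += u

approx = float(series_sum.subs(t, 1))   # partial sum evaluated at t = 1
exact = float(sp.E)                     # e = 2.71828...
```

The truncated series reproduces the Taylor expansion of eᵗ, so accuracy improves rapidly with the number of components; nonlinear problems replace the plain integrand with Adomian polynomials.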
Colloidal silver nanoparticles (AgNPs) were prepared by a single-step green synthesis using aqueous thyme leaf extract at different molar concentrations of AgNO3 (1, 2, 3, 4 mM). Field emission scanning electron microscopy (FESEM), UV-Visible spectroscopy, and X-ray diffraction (XRD) were used to characterize the resulting AgNPs. The surface plasmon resonance was observed at a wavelength of 444 nm. The four intense peaks of the XRD pattern indicate the crystalline nature and the face-centered cubic structure of the AgNPs. The average crystallite size of the AgNPs ranged from 18 to 22 nm. The FESEM images showed well-dispersed, spherical nanoparticles.
The production function is one of the techniques used to evaluate the production process of any establishment or company, to explain the contribution of each independent variable and its effect on the dependent variable, and thereby to identify which factors have a significant or non-significant effect on the dependent variable.
The importance of this study therefore comes from estimating the Cobb-Douglas production function for the Al-Mansoor General Company for Engineering Industries in Iraq over the period 1989-2001, in order to explain the effect of the independent variables, such as
(N
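The Cobb-Douglas form Q = A·L^α·K^β is linear in logarithms, so its elasticities can be estimated by ordinary least squares. A minimal sketch on synthetic data (the inputs, sample size, and true coefficients below are illustrative assumptions, not the company's actual figures):

```python
import numpy as np

# Cobb-Douglas production function  Q = A * L^alpha * K^beta  becomes linear
# after taking logs:  ln Q = ln A + alpha*ln L + beta*ln K,
# so the output elasticities alpha and beta are estimable by OLS.
rng = np.random.default_rng(1)
n = 13                                    # e.g. one observation per year, 1989-2001
L = rng.uniform(50, 150, n)               # labour input (synthetic data)
K = rng.uniform(100, 300, n)              # capital input (synthetic data)
Q = 2.0 * L**0.6 * K**0.3 * np.exp(rng.normal(0, 0.02, n))   # output with noise

X = np.column_stack([np.ones(n), np.log(L), np.log(K)])
coef, *_ = np.linalg.lstsq(X, np.log(Q), rcond=None)
lnA_hat, alpha_hat, beta_hat = coef       # estimated ln A, alpha, beta
```

Significance of each input would then be judged from the usual t-statistics of the regression coefficients, and α + β indicates returns to scale.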
In data transmission, a change in a single bit of the received data may lead to misunderstanding or even disaster. Every bit of the transmitted information has high priority, especially fields such as the receiver's address, so detecting every single-bit change is a key issue in the data transmission field.
The ordinary single-parity detection method can detect an odd number of errors efficiently but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, show better results but still fail to cope with an increasing number of errors.
Two novel methods are suggested to detect binary bit-change errors when transmitting data over a noisy medium. Those methods are: 2D-Checksum me
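The weakness of single parity and the improvement from the two-dimensional scheme described above can be demonstrated in a few lines (a generic textbook illustration, not the paper's proposed methods):

```python
# Single-bit parity vs. two-dimensional parity on a small data block.
# Single parity detects any odd number of flipped bits but misses even counts;
# arranging the bits in a grid with one parity bit per row and per column
# (2D parity) also catches even-error patterns that defeat single parity.
def parity(bits):
    return sum(bits) % 2

def parity_2d(bits, width):
    rows = [bits[i:i + width] for i in range(0, len(bits), width)]
    row_par = [parity(r) for r in rows]
    col_par = [parity(col) for col in zip(*rows)]
    return row_par + col_par

data = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1]   # a 3 x 4 block
sent_parity = parity(data)
sent_2d = parity_2d(data, width=4)

# Channel flips TWO bits (an even-count error) in different rows and columns.
corrupted = data.copy()
corrupted[0] ^= 1
corrupted[5] ^= 1

single_detects = parity(corrupted) != sent_parity        # undetected
twod_detects = parity_2d(corrupted, width=4) != sent_2d  # detected
```

Even 2D parity is defeated by four flips at the corners of a rectangle in the grid, which is the kind of growing-error-count failure the abstract refers to.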
In this paper, the reliability and maintenance scheduling of some medical devices were estimated using a single variable, the time variable (failure times), on the assumption that the failure times of all devices follow the same distribution (the Weibull distribution).
The distribution parameters for each device were estimated by the ordinary least squares (OLS) method.
The main objective of this research is to determine the optimal time for preventive maintenance of medical devices. Two methods were adopted to estimate this optimal time. The first method builds the maintenance schedule from information on the cost of maintenance and the cost of stopping work and acc
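A common way to fit a Weibull distribution to failure times by OLS is the probability-plot linearization ln(−ln(1 − F(t))) = β·ln t − β·ln η with median-rank plotting positions; a sketch on synthetic failure times (the true parameters below are illustrative assumptions):

```python
import numpy as np

# OLS fit of the Weibull distribution to failure times via the linearization
#   ln(-ln(1 - F(t))) = beta*ln(t) - beta*ln(eta),
# regressing median-rank plotting positions on the ordered failure times.
rng = np.random.default_rng(2)
true_shape, true_scale = 1.8, 120.0
t = np.sort(rng.weibull(true_shape, 300) * true_scale)   # synthetic failure times

n = len(t)
F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # Benard's median-rank approximation
y = np.log(-np.log(1.0 - F))
x = np.log(t)
X = np.column_stack([np.ones(n), x])
(b0, b1), *_ = np.linalg.lstsq(X, y, rcond=None)
shape_hat = b1                    # beta (shape)
scale_hat = np.exp(-b0 / b1)      # eta (scale)

# Fitted reliability (survival) at an arbitrary time, e.g. t = 100:
R_100 = np.exp(-(100.0 / scale_hat) ** shape_hat)
```

The fitted reliability curve is what a preventive-maintenance schedule would then trade off against the maintenance and downtime costs mentioned above.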
In this paper, the reliability of the stress-strength model is derived for the probability P(Y<X) of a component whose strength X is exposed to a single independent stress Y, when X and Y follow the Gompertz Fréchet distribution with unknown shape parameters and known remaining parameters. Several methods were used to estimate the reliability R and the Gompertz Fréchet distribution parameters: maximum likelihood, least squares, weighted least squares, regression, and ranked set sampling. These estimators were compared in a simulation study based on the mean square error (MSE) criterion, and the comparison confirms that the maximum likelihood estimator performs better than the other estimators.
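Independently of the parametric model, R = P(Y < X) can always be estimated distribution-free as the proportion of (strength, stress) pairs with x > y, i.e. the Mann-Whitney statistic. This is not the paper's Gompertz Fréchet estimator, just a generic sanity check (normal samples are used here because their true R is known in closed form):

```python
import numpy as np

# Distribution-free estimate of the stress-strength reliability R = P(Y < X):
# the proportion of all (strength, stress) pairs with x > y, i.e. the
# Mann-Whitney U statistic divided by n*m.  Illustration only -- the paper
# fits Gompertz Frechet models; normal samples are used here for checking.
rng = np.random.default_rng(3)
x = rng.normal(1.0, 1.0, 400)    # strength sample
y = rng.normal(0.0, 1.0, 400)    # stress sample

R_hat = np.mean(x[:, None] > y[None, :])
# For independent normals, the true value is Phi((mu_x - mu_y)/sqrt(2)) ~ 0.76.
```

A parametric estimator such as the paper's MLE plugs fitted parameters into the closed-form expression for R instead, which is more efficient when the model is correct.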
Longitudinal data are becoming increasingly common, especially in the medical and economic fields, and various methods have been developed to analyze this type of data.
In this research, the focus was on grouping and analyzing such data: cluster analysis plays an important role in identifying and grouping co-expressed profiles over time, which are then fitted with a nonparametric smoothing cubic B-spline model. A cubic B-spline provides continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope, and it is flexible enough to capture complex patterns and fluctuations in the data.
The balanced longitudinal data profiles were compiled into subgroup
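A minimal sketch of the smoothing step on a single noisy longitudinal profile, using SciPy's cubic smoothing spline (the profile, noise level, and smoothing factor are illustrative assumptions):

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Cubic (k=3) smoothing spline fitted to one noisy longitudinal profile.
# A cubic spline has continuous first and second derivatives, so the fitted
# curve is smooth, with no abrupt changes in slope.
rng = np.random.default_rng(4)
t = np.linspace(0.0, 2.0 * np.pi, 100)            # observation times
y = np.sin(t) + rng.normal(0.0, 0.1, t.size)      # noisy profile

# s controls the smoothing; n * sigma^2 is a common starting point.
spline = UnivariateSpline(t, y, k=3, s=t.size * 0.1**2)
fitted = spline(t)
slope = spline.derivative(1)(t)    # continuous first derivative
curv = spline.derivative(2)(t)     # continuous second derivative
```

In the clustered setting, one such curve would be fitted per subgroup of co-expressed profiles rather than per individual series.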
Estimate the Rate of Contamination in Baghdad Soils By Using Numerical Method. Luma Naji Mohammed Tawfiq, Nadia H Al-Noor and Taghreed H Al-Noor, J. Phys.: Conf. Ser. 1294 (2019) 032020, DOI 10.1088/1742-6596/1294/3/032020 (open access).
In this article, a numerical method integrated with a statistical data simulation technique is introduced to solve a nonlinear system of ordinary differential equations with multiple random variable coefficients. Monte Carlo simulation combined with the central divided difference formula of the finite difference (FD) method is repeated n times, so that the variable coefficients are simulated as random samples instead of being restricted to fixed real values with respect to time. The mean of the n final solutions of this integrated technique, named in short the mean Monte Carlo finite difference (MMCFD) method, represents the final solution of the system. This method is proposed for the first time to calculate the numerical solution obtained fo
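The MMCFD idea can be sketched on a single scalar test problem (the ODE, coefficient distribution, step size, and repetition count below are illustrative assumptions, not the paper's system): draw the random coefficient, solve with a central-difference scheme, repeat, and average the final solutions.

```python
import numpy as np

# Sketch of the mean Monte Carlo finite difference (MMCFD) idea on the test
# ODE  du/dt = -k*u, u(0) = 1,  with random coefficient k ~ U(0.5, 1.5):
# each Monte Carlo repetition draws k, integrates with a central-difference
# (leapfrog) scheme, and the mean of the n final solutions is reported.
rng = np.random.default_rng(5)
h, T = 1e-3, 1.0
steps = int(T / h)

def leapfrog(k):
    u_prev, u = 1.0, 1.0 - h * k * 1.0      # one Euler step to start the scheme
    for _ in range(steps - 1):
        # central divided difference: (u_{i+1} - u_{i-1}) / (2h) = -k * u_i
        u_prev, u = u, u_prev + 2.0 * h * (-k * u)
    return u

finals = [leapfrog(k) for k in rng.uniform(0.5, 1.5, 200)]
mmcfd_mean = float(np.mean(finals))
# Analytic check: E[e^{-k}] = e^{-0.5} - e^{-1.5} ~ 0.3834.
```

Note that the mean of the solutions E[u(T; k)] generally differs from the solution at the mean coefficient u(T; E[k]), which is exactly why the repeated-sampling average is needed.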