In this paper, we estimate the parameters and the related probability functions, namely the survival function, cumulative distribution function, hazard (failure-rate) function, and failure (death) probability density function (pdf), for the two-parameter Birnbaum–Saunders distribution fitted to complete data on patients with cancer of the lymph glands. The shape and scale parameters are estimated by the maximum likelihood, regression quantile, and shrinkage methods, and the values of the related probability functions are then computed from a sample of real data describing the survival time of patients suffering from lymph gland cancer, measured from the diagnosis of the disease or the patients' admission to hospital.
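As a sketch of the maximum-likelihood step only (the regression-quantile and shrinkage estimators are not shown), the two-parameter Birnbaum–Saunders distribution is available in SciPy under the name fatiguelife; the survival times below are hypothetical, not the paper's data:

```python
import numpy as np
from scipy.stats import fatiguelife

# Hypothetical survival times (months), for illustration only.
t = np.array([3.2, 5.1, 7.4, 8.0, 11.6, 14.3, 18.9, 22.5, 27.1, 35.0])

# ML fit of the two-parameter Birnbaum-Saunders distribution
# (SciPy's "fatiguelife"); loc is fixed at 0 so only the shape
# (alpha) and scale (beta) parameters are estimated.
alpha, loc, beta = fatiguelife.fit(t, floc=0)

x = np.linspace(t.min(), t.max(), 5)
pdf = fatiguelife.pdf(x, alpha, loc, beta)   # failure (death) density
sf  = fatiguelife.sf(x, alpha, loc, beta)    # survival function
cdf = fatiguelife.cdf(x, alpha, loc, beta)   # cumulative distribution
hazard = pdf / sf                            # hazard (failure rate)
print(alpha, beta)
```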
The present work aims to achieve pulsed laser deposition of TiO2 nanostructures and to investigate their nonlinear properties using the z-scan technique. A second-harmonic Q-switched Nd:YAG laser with a repetition rate of 1 Hz and a wavelength of 532 nm, at three different laser fluences in the range 0.77-1.1 J/cm2, was used to irradiate the TiO2 target. The products of the laser-induced plasma were characterized by UV-Vis absorption spectroscopy, X-ray diffraction (XRD), atomic force microscopy (AFM), and Fourier-transform infrared (FTIR) spectroscopy. Reasonable agreement was found among the data obtained from X-ray diffraction, UV-Vis, and Raman spectroscopy. The XRD results showed that the prepared TiO2 …
This paper concerns deriving and estimating the reliability of a multicomponent system in the stress-strength model R(s,k), where the stress and strength are independent and identically distributed (iid) and follow the two-parameter Exponentiated Pareto distribution (EPD) with unknown shape and known scale parameters. Here R(s,k) denotes the probability that at least s of the k strength components exceed the common stress. A shrinkage estimation method, together with the maximum likelihood estimator (MLE), is considered. The proposed estimators are compared by simulation based on the mean squared error (MSE) criterion.
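A minimal Monte Carlo sketch of R(s,k), assuming the common EPD form F(x) = (1 − (1 + x)^−θ)^α and hypothetical parameter values (the paper's shrinkage and MLE machinery is not reproduced):

```python
import numpy as np

rng = np.random.default_rng(0)

def epd_sample(alpha, theta, size, rng):
    """Inverse-transform sampling from the Exponentiated Pareto
    distribution, F(x) = (1 - (1 + x)**-theta)**alpha, x > 0."""
    u = rng.uniform(size=size)
    return (1.0 - u**(1.0 / alpha))**(-1.0 / theta) - 1.0

def r_s_k(s, k, a_strength, a_stress, theta, n=200_000, rng=rng):
    """Monte Carlo estimate of R(s,k): the probability that at
    least s of the k strength components exceed the common stress."""
    x = epd_sample(a_strength, theta, (n, k), rng)  # strengths
    y = epd_sample(a_stress, theta, (n, 1), rng)    # stress
    return np.mean((x > y).sum(axis=1) >= s)

# Hypothetical shape parameters with a known common theta.
print(r_s_k(s=2, k=4, a_strength=3.0, a_stress=1.5, theta=2.0))
```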
The Bayesian approach promises to enrich the classification and regression tree model by exploiting prior information, by combining ensembles of trees over all the explanatory variables together and at every stage, and by obtaining posterior information at each node during tree construction. Although Bayesian estimates are generally accurate, the logistic model remains a strong competitor for binary responses through its flexibility and mathematical representation. Three methods are therefore used to process the data: the logistic model, the classification and regression tree model, and the Bayesian regression tree model.
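A minimal scikit-learn sketch contrasting the first two of those three methods on simulated binary-response data (a Bayesian regression tree would require a dedicated library such as PyMC-BART and is not shown; all data and settings below are hypothetical):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)
# Simulated binary-response data, for illustration only.
X = rng.normal(size=(500, 4))
p = 1 / (1 + np.exp(-(X[:, 0] - 2 * X[:, 1])))
y = rng.binomial(1, p)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for name, model in [("logistic", LogisticRegression()),
                    ("CART", DecisionTreeClassifier(max_depth=3))]:
    model.fit(X_tr, y_tr)
    print(name, accuracy_score(y_te, model.predict(X_te)))
```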
Utilizing the Turbo C programming language, an atmospheric Earth model is created from sea level to 86 km and used in this study to determine atmospheric parameters. Analytical derivations of these parameters are made using the balancing-forces theory and the hydrostatic equation. The effects of altitude on density, pressure, temperature, gravitational acceleration, speed of sound, scale height, and molecular weight are examined. About 50% of the mass of the atmosphere lies between sea level and 5.5 km. At an altitude of 50 km, g is about 9.65 m/s2, roughly 1.5% lower than the sea-level value of 9.8 m/s2; at 86 km, g is close to 9.51 m/s2, about 3% smaller than 9.8 m/s2.
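The paper's code is in Turbo C; as a cross-check of the quoted g values, a short Python sketch of the standard inverse-square variation of g with altitude (the constants are textbook values, not taken from the paper):

```python
# Inverse-square (Newtonian) variation of g with altitude.
G0 = 9.80665      # sea-level gravitational acceleration, m/s^2
RE = 6.371e6      # mean Earth radius, m

def g(h):
    """Gravitational acceleration at altitude h (m)."""
    return G0 * (RE / (RE + h))**2

for h_km in (0, 50, 86):
    print(f"g({h_km} km) = {g(h_km * 1e3):.3f} m/s^2")
# Prints roughly 9.81, 9.65, and 9.55 m/s^2, consistent with the
# ~1.5% and ~3% reductions quoted above.
```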
Among the applications of quantitative methods that have received explicit attention over the last two centuries is the traveling salesman method. This interest reflects a real need in many production sectors and in companies that distribute their products, whether locally made or imported, to customers or to other industrial sectors. Such companies always aspire to increase profits, imports, production quantities, exports, and so on, and at the same time want the distribution process to follow the best, shortest, or most appropriate routes.
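The abstract does not state which solution technique the paper uses; as an illustration only, a greedy nearest-neighbour heuristic for the traveling salesman problem over a hypothetical distance matrix:

```python
import numpy as np

def nearest_neighbour_tour(dist, start=0):
    """Greedy nearest-neighbour heuristic for the traveling salesman
    problem: from each city, move to the closest unvisited one, then
    return to the start."""
    n = len(dist)
    unvisited = set(range(n)) - {start}
    tour, city = [start], start
    while unvisited:
        city = min(unvisited, key=lambda j: dist[city][j])
        unvisited.remove(city)
        tour.append(city)
    return tour + [start]

# Hypothetical symmetric distance matrix between 4 customers.
d = np.array([[0, 2, 9, 10],
              [2, 0, 6, 4],
              [9, 6, 0, 8],
              [10, 4, 8, 0]])
tour = nearest_neighbour_tour(d)
cost = sum(d[a][b] for a, b in zip(tour, tour[1:]))
print(tour, cost)
```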
In this paper, we propose a method to estimate missing values of the explanatory variables in a nonparametric multiple regression model and compare it with arithmetic-mean imputation. The idea is to exploit the causal relationship between the variables to obtain an efficient estimate of the missing value. We rely on the Nadaraya–Watson kernel estimator, with the bandwidth chosen by least-squares cross-validation (LSCV), and use a simulation study to compare the two methods.
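A one-covariate sketch of the two ingredients named above, the Nadaraya–Watson estimator and an LSCV bandwidth search (the paper applies them to impute a missing explanatory value from related variables; the data here are simulated):

```python
import numpy as np

def nw(x0, x, y, h):
    """Nadaraya-Watson estimator with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x0 - x) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

def lscv_bandwidth(x, y, grid):
    """Least-squares cross-validation: pick the bandwidth that
    minimises the leave-one-out prediction error."""
    def cv(h):
        return np.mean([(y[i] - nw(x[i], np.delete(x, i),
                                   np.delete(y, i), h)) ** 2
                        for i in range(len(x))])
    return min(grid, key=cv)

rng = np.random.default_rng(2)
# Simulated pair of related variables, for illustration only.
x = rng.uniform(0, 3, 80)
y = np.sin(x) + rng.normal(0, 0.1, 80)
h = lscv_bandwidth(x, y, np.linspace(0.05, 1.0, 20))
print(h, nw(1.5, x, y, h))   # kernel estimate at the missing point
```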
The main purpose of this work is to apply a new method, the so-called LTAM, which couples the Tamimi and Ansari iterative method (TAM) with the Laplace transform (LT). The method is used to solve a model of the spread of a non-fatal disease in a society assumed to have a fixed size during the epidemic period, giving an approximate analytic solution to the nonlinear system of the intended model. Moreover, the absolute error between the numerical solutions and ten LTAM iterations of the epidemic model, along with the maximum error remainder, is calculated using the MATHEMATICA® 11.3 program to illustrate the effectiveness of the method.
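The paper works in MATHEMATICA; as a rough stand-in for the reference numerical solution against which the LTAM iterates' absolute error would be measured, a fixed-population epidemic system of the non-fatal type can be integrated with SciPy (the system and parameters below are assumed, not the paper's):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Assumed fixed-size-population, non-fatal epidemic model:
# S + I + R = N is conserved (no deaths). beta, gamma hypothetical.
beta, gamma = 0.5, 0.1

def sir(t, u):
    s, i, r = u
    return [-beta * s * i, beta * s * i - gamma * i, gamma * i]

sol = solve_ivp(sir, (0, 40), [0.99, 0.01, 0.0], dense_output=True)
print(sol.y[:, -1])  # reference values for checking iterates
```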
Convolutional neural networks (CNNs) are among the most widely used neural networks in various applications, including deep learning. In recent years, the continuing extension of CNNs into increasingly complicated domains has made their training more difficult, so researchers have adopted optimized hybrid algorithms to address this problem. In this work, a novel approach based on a chaotic black hole algorithm was created for training CNNs, optimizing their performance by avoiding entrapment in local minima. The logistic chaotic map was used to initialize the population instead of the uniform distribution. The proposed training algorithm was developed on a benchmark problem for optical character recognition.
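A minimal sketch of the one ingredient the abstract spells out, logistic-chaotic-map population initialization (the black hole update rule and the CNN itself are not shown; all sizes and bounds below are hypothetical):

```python
import numpy as np

def logistic_map_population(n_agents, dim, lo, hi, r=4.0, x0=0.7):
    """Initialise a search population with the logistic chaotic map
    x_{k+1} = r * x_k * (1 - x_k) instead of uniform sampling."""
    x = x0
    pop = np.empty((n_agents, dim))
    for i in range(n_agents):
        for j in range(dim):
            x = r * x * (1.0 - x)           # chaotic iterate in (0, 1)
            pop[i, j] = lo + x * (hi - lo)  # scale to the search range
    return pop

# Hypothetical: 20 candidate vectors for a 10-weight slice of a CNN,
# bounded in [-1, 1].
print(logistic_map_population(20, 10, -1.0, 1.0)[:2])
```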