In this research, we study the inverse Gompertz (IG) distribution and estimate its survival function. The survival function was estimated using three methods (the maximum likelihood, least squares, and percentile estimators), and the best estimation method was selected. It was found that the least squares method is the best for estimating the survival function, because it has the lowest IMSE for all sample sizes.
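For reference, under one common parameterization of the inverse Gompertz distribution with parameters α, β > 0 (an assumption; the abstract does not state the parameterization used), the survival function being estimated and the integrated mean squared error used for the comparison can be written as

```latex
\[
S(t;\alpha,\beta) = 1 - \exp\!\left[-\frac{\alpha}{\beta}\left(e^{\beta/t} - 1\right)\right], \quad t > 0,
\qquad
\mathrm{IMSE}\bigl(\hat{S}\bigr) \approx \frac{1}{L}\sum_{i=1}^{L}\frac{1}{n_t}\sum_{j=1}^{n_t}\bigl(\hat{S}_i(t_j) - S(t_j)\bigr)^{2},
\]
```

where L is the number of simulation replications and t_1, ..., t_{n_t} are the evaluation points.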
In this paper, estimation of the system reliability R(s,k) of a multi-component stress-strength model is considered, when the stress and strength are independent random variables that follow the Exponentiated Weibull distribution (EWD) with a known first shape parameter θ and an unknown second shape parameter α, using different estimation methods. Comparisons among the proposed estimators were made through a Monte Carlo simulation based on the mean squared error (MSE) criterion.
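For context, the usual definition of multi-component stress-strength reliability (not restated in the abstract) is the probability that at least s of the k independent strength variables X_1, ..., X_k, with common CDF F_X, exceed the stress Y with CDF F_Y:

```latex
\[
R_{(s,k)} = \sum_{i=s}^{k} \binom{k}{i} \int_{0}^{\infty} \bigl[1 - F_X(y)\bigr]^{i}\,\bigl[F_X(y)\bigr]^{k-i}\,\mathrm{d}F_Y(y),
\]
```

which under the EWD model is evaluated with the known first shape parameter θ and the unknown second shape parameter α replaced by its estimate.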
In this article, we study estimation of the variance of the normal distribution when the mean is unknown. A double-stage shrunken estimator that combines the unbiased estimator and the Bayes estimator of the variance is used to obtain a more efficient variance estimator when the mean is unknown, using two small samples of equal size.
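The abstract does not give the estimator in closed form; as a purely illustrative sketch, a shrunken variance estimator typically pulls a sample-based estimate towards a prior (Bayes-type) value,

```latex
\[
\tilde{\sigma}^{2} = \phi\,\hat{\sigma}^{2} + (1 - \phi)\,\sigma_{0}^{2}, \qquad 0 \le \phi \le 1,
\]
```

and in a double-stage version a preliminary test based on the first sample decides whether this shrunken estimate is retained or a second sample of the same size is drawn and the estimator is recomputed.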
Copper oxide thin films were deposited on glass substrates using the Successive Ionic Layer Adsorption and Reaction (SILAR) method at room temperature. The thickness of the thin films was around 0.43 μm. The copper oxide thin films were annealed in air at 200, 300 and 400 °C for 45 min. The film structural properties were characterized by X-ray diffraction (XRD). The XRD patterns indicated the presence of polycrystalline CuO. The average grain size was calculated from the X-ray patterns, and it was found that the grain size increased with increasing annealing temperature. An optical transmitter microscope (OTM) and an atomic force microscope (AFM) were also used. Direct band gap values were 2.2 eV for the un-annealed sample and 2, 1.5 and 1.4 eV at 200, 300 and 400 °C, respectively.
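For reference, the grain size from XRD line broadening and the direct optical band gap are commonly extracted with the Scherrer relation and a Tauc plot, respectively (the abstract does not state the exact expressions used):

```latex
\[
D = \frac{K\lambda}{\beta\cos\theta}, \qquad (\alpha h\nu)^{2} = A\,(h\nu - E_g),
\]
```

where K ≈ 0.9, λ is the X-ray wavelength, β is the full width at half maximum of the diffraction peak (in radians), θ is the Bragg angle, α is the absorption coefficient, hν is the photon energy and E_g is the direct band gap.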
The main problem when dealing with fuzzy data is that a model representing the data cannot be formed by the method of the Fuzzy Least Squares Estimator (FLSE), which gives false estimates, invalidating the method, when the problem of multicollinearity exists. To overcome this problem, the Fuzzy Bridge Regression Estimator (FBRE) method was relied upon to estimate a fuzzy linear regression model with triangular fuzzy numbers. Moreover, the problem of multicollinearity in fuzzy data can be detected by using the Variance Inflation Factor when the input variables of the model are crisp and the output variable and parameters are fuzzy. The results were compared using …
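In the crisp setting these two ingredients have standard forms, reproduced here for orientation (the fuzzy versions used in the paper generalize them to triangular fuzzy numbers):

```latex
\[
\mathrm{VIF}_j = \frac{1}{1 - R_j^{2}}, \qquad
\hat{\beta}_{\mathrm{bridge}} = \arg\min_{\beta}\;\sum_{i=1}^{n}\bigl(y_i - x_i^{\top}\beta\bigr)^{2} + \lambda\sum_{j=1}^{p}|\beta_j|^{\gamma}, \quad \lambda > 0,\; \gamma > 0,
\]
```

where R_j^2 is the coefficient of determination from regressing the j-th input on the remaining inputs; γ = 1 and γ = 2 give the lasso and ridge regression as special cases of the bridge penalty.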
This research presents results on the full energy peak efficiency of a high purity germanium (HPGe) detector for a point source as a function of photon energy and source-detector distance. The directions of photons emitted from the source and the photon path lengths in the detector were determined by a Monte Carlo technique. A major advantage of this technique is the short computation time compared to experiments. Another advantage is the flexibility of inputting detector-related parameters (such as source-detector distance, detector radius, length and attenuation coefficient) into the algorithm developed, thus making it an easy and flexible method to apply to other detector systems and configurations. It has been designed and written …
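A minimal sketch of this kind of simulation is given below for a cylindrical detector and an on-axis point source. It estimates the total interaction efficiency rather than the full-energy peak efficiency (which would additionally require tracking the deposited energy), and the geometry, parameter names and values are illustrative assumptions, not the authors' code.

```python
import numpy as np

def total_efficiency(d, R, T, mu, n_photons=1_000_000, seed=0):
    """Monte Carlo estimate of the total (interaction) efficiency of a
    cylindrical detector of radius R and thickness T (linear attenuation
    coefficient mu, in 1/cm) for an on-axis point source at distance d
    from the front face. All lengths are in cm."""
    rng = np.random.default_rng(seed)
    # Isotropic emission over 4*pi: cos(theta) uniform on [-1, 1].
    cos_t = rng.uniform(-1.0, 1.0, n_photons)
    cos_t = cos_t[cos_t > 0.0]          # only photons heading towards the detector
    sin_t = np.sqrt(1.0 - cos_t**2)
    # Radial offset where the photon crosses the front-face plane z = d.
    r_front = d * sin_t / cos_t
    hit = r_front <= R
    cos_t, sin_t = cos_t[hit], sin_t[hit]
    # Distances from the source to the entry point and the two possible exits.
    t_in = d / cos_t                     # front face
    t_back = (d + T) / cos_t             # back face
    t_side = np.where(sin_t > 0.0, R / np.maximum(sin_t, 1e-12), np.inf)  # curved side
    path = np.minimum(t_back, t_side) - t_in
    # Probability of at least one interaction along the chord inside the crystal.
    p_int = 1.0 - np.exp(-mu * path)
    # Normalise by all photons emitted into 4*pi.
    return p_int.sum() / n_photons

# Example: HPGe-like crystal, 3 cm radius, 5 cm long, source 10 cm away.
print(total_efficiency(d=10.0, R=3.0, T=5.0, mu=0.4))
```

Dividing by the number of photons emitted into 4π gives the absolute efficiency; dividing instead by the number of photons that actually hit the detector would give the intrinsic efficiency.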
The goal of this research is to review methods used to estimate the Logistic distribution parameters. An exact estimation method, the method of moments, is compared with other approximate estimators obtained essentially from White's approach, such as OLS, Ridge, and Adjusted Ridge, the latter suggested for application with this distribution. The results of all these methods are based on a simulation experiment with different models and a variety of sample sizes. The comparison was made with respect to two criteria: mean squared error (MSE) and mean absolute percentage error (MAPE).
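For orientation, the exact moment estimators for the logistic distribution with location μ and scale s follow directly from its mean and variance, and the two comparison criteria are typically computed over the L simulation replications as follows (the notation here is an assumption, since the abstract does not fix one):

```latex
\[
\hat{\mu} = \bar{x}, \qquad \hat{s} = \frac{\sqrt{3}\,S}{\pi} \;\;\bigl(\text{since } \operatorname{Var}(X) = \tfrac{\pi^{2}s^{2}}{3}\bigr), \qquad
\mathrm{MSE}(\hat{\theta}) = \frac{1}{L}\sum_{i=1}^{L}\bigl(\hat{\theta}_i - \theta\bigr)^{2}, \qquad
\mathrm{MAPE}(\hat{\theta}) = \frac{1}{L}\sum_{i=1}^{L}\left|\frac{\hat{\theta}_i - \theta}{\theta}\right|,
\]
```

where S is the sample standard deviation.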
English is spoken by its native speakers in two different forms: a reduced form, which marks colloquial and rapid speech so that it is easily produced, and a citation or unreduced form, which is characteristic of careful, emphasized and slow speech.
This paper investigates Iraqi EFL university students' production of the two forms mentioned above. The sample chosen includes twenty fourth-year students, of whom ten are males and ten are females, from the Department of English of the College of Languages of the University of Duhok in the Kurdistan Region of Iraq in the academic year 2020-2021. The material tested consists of six connective words which represent the commonest ones in every-day co…
Two simple spectrophotometric methods were suggested for the determination of Cefixime (CFX) in pure form and in pharmaceutical preparations. The first method, without cloud point extraction (CPE), is based on diazotization of the Cefixime drug with sodium nitrite at 5 °C, followed by coupling with ortho-nitrophenol in a basic medium to form an orange colour. The product was stabilized and measured at 400 nm. Beer's law was obeyed in the concentration range of 10-160 μg·mL⁻¹, Sandell's sensitivity was 0.0888 μg·cm⁻², the detection limit was 0.07896 μg·mL⁻¹, and the limit of quantitation was 0.085389 μg·mL⁻¹. The second method was cloud point extraction (CPE) using Triton X-114 as surfactant. Beer…
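The figures of merit quoted above are normally obtained from the Beer-Lambert calibration line and the standard ICH-type relations (an assumption; the abstract does not state the formulas used):

```latex
\[
A = \varepsilon\,b\,c, \qquad \mathrm{LOD} = \frac{3.3\,\sigma}{m}, \qquad \mathrm{LOQ} = \frac{10\,\sigma}{m},
\]
```

where A is the absorbance, ε the molar absorptivity, b the path length, c the concentration, σ the standard deviation of the blank response and m the slope of the calibration curve.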
The most popular medium used by people on the internet nowadays is video streaming. Nevertheless, streaming video consumes much of the internet traffic; video streaming accounts for nearly 70% of internet usage. Some constraints of interactive media, such as increased bandwidth usage and latency, might be mitigated. The need for real-time transmission of live video streams leads to employing fog computing technologies, an intermediary layer between the cloud and the end user. The latter technology has been introduced to alleviate those problems by providing highly responsive real-time service and computational resources near to the …
The main focus of this research is to examine the Travelling Salesman Problem (TSP) and the methods used to solve it. This problem is considered one of the combinatorial optimization problems that have received wide publicity and attention from researchers, due to its simple formulation, important applications, and connection to the rest of the combinatorial problems. It is based on finding the optimal path through a known number of cities, where the salesman visits each city only once before returning to the city of departure. In this research, the FMOLP algorithm is employed as one of the best methods to solve the TSP, and the algorithm is applied in conjunction with …
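The abstract does not restate the model; for reference, the classical integer linear programming formulation of the TSP with Miller-Tucker-Zemlin subtour elimination constraints, the usual starting point for LP-based solution methods, is

```latex
\[
\min \sum_{i \ne j} c_{ij}\,x_{ij}
\quad \text{s.t.} \quad
\sum_{j \ne i} x_{ij} = 1 \;\; \forall i, \qquad
\sum_{i \ne j} x_{ij} = 1 \;\; \forall j, \qquad
u_i - u_j + n\,x_{ij} \le n - 1 \;\; (2 \le i \ne j \le n), \qquad
x_{ij} \in \{0,1\},
\]
```

where c_{ij} is the distance between cities i and j, x_{ij} = 1 if the salesman travels directly from city i to city j, and the auxiliary variables u_i order the cities so that subtours are excluded.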