In this research, we study the inverse Gompertz (IG) distribution and estimate its survival function. The survival function was estimated using three methods (maximum likelihood, least squares, and percentile estimators), and the best estimation method was selected. The least-squares method was found to be the best method for estimating the survival function because it has the lowest IMSE for all sample sizes.
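The following minimal Python sketch illustrates the general idea under assumed details: it uses a common parameterization of the IG survival function, S(x) = 1 - exp(-(θ/β)(e^{β/x} - 1)), and a plotting-position least-squares fit; the parameter values, sample size, and grid are illustrative and not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Hedged sketch: survival function of the inverse Gompertz (IG) distribution under
# an assumed common parameterization, plus a plotting-position least-squares fit.

def ig_survival(x, theta, beta):
    # S(x) = 1 - exp(-(theta/beta) * (exp(beta/x) - 1))
    return 1.0 - np.exp(-(theta / beta) * np.expm1(beta / x))

def ig_sample(n, theta, beta, rng):
    # Inverse-transform sampling: solve F(x) = u for x.
    u = rng.uniform(size=n)
    return beta / np.log1p(-(beta / theta) * np.log(u))

def lse_fit(x, theta0=1.0, beta0=1.0):
    # Minimize sum_i (F_hat(x_(i)) - i/(n+1))^2 over (theta, beta).
    xs = np.sort(x)
    pp = np.arange(1, len(xs) + 1) / (len(xs) + 1)      # plotting positions
    def loss(p):
        th, be = np.exp(p)                              # keep parameters positive
        return np.sum((1.0 - ig_survival(xs, th, be) - pp) ** 2)
    res = minimize(loss, np.log([theta0, beta0]), method="Nelder-Mead")
    return np.exp(res.x)

rng = np.random.default_rng(0)
x = ig_sample(50, theta=1.5, beta=0.8, rng=rng)
theta_hat, beta_hat = lse_fit(x)
grid = np.linspace(x.min(), x.max(), 100)
# one-replication integrated squared error; average over replications for the IMSE
ise = np.mean((ig_survival(grid, theta_hat, beta_hat) - ig_survival(grid, 1.5, 0.8)) ** 2)
print(theta_hat, beta_hat, ise)
```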
In data transmission, a change in a single bit of the received data may lead to misunderstanding or even a disaster. Every bit in the transmitted information has high priority, especially information such as the address of the receiver. Detecting errors caused by every single bit change is therefore a key issue in the data transmission field.
The ordinary single-parity detection method can detect an odd number of errors efficiently but fails when the number of errors is even, as the sketch below illustrates. Other detection methods, such as two-dimensional parity and checksum, showed better results but still failed to cope with an increasing number of errors.
Two novel methods were suggested to detect binary bit-change errors when transmitting data over noisy media. Those methods were: 2D-Checksum me
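As a hedged illustration of the limitation of single parity mentioned above (not the proposed 2D-Checksum method itself), the following Python sketch shows that an even parity bit catches an odd number of bit flips but misses an even number:

```python
# Minimal sketch of single (even) parity detection: the parity bit makes the total
# number of 1s even, so any odd number of flipped bits is caught, while an even
# number of flips cancels out and goes undetected.

def add_parity(bits):
    return bits + [sum(bits) % 2]          # append parity bit

def check_parity(word):
    return sum(word) % 2 == 0              # True -> no error detected

data = [1, 0, 1, 1, 0, 0, 1]
word = add_parity(data)

one_error = word.copy();  one_error[2] ^= 1
two_errors = word.copy(); two_errors[2] ^= 1; two_errors[5] ^= 1

print(check_parity(word))        # True  (no error)
print(check_parity(one_error))   # False (odd number of flips detected)
print(check_parity(two_errors))  # True  (even number of flips missed)
```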
In this paper, a Monte Carlo simulation technique is used to compare the performance of standard Bayes estimators of the reliability function of the one-parameter exponential distribution. Three types of loss functions are adopted, namely the squared error loss function (SELF), the precautionary loss function (PELF), and the linear exponential loss function (LINEX), with informative and non-informative priors. The integrated mean square error (IMSE) criterion is employed to assess the performance of these estimators.
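A minimal Monte Carlo sketch of this kind of comparison is given below; it assumes a gamma(a, b) prior on the exponential rate, uses only the SELF estimator (whose posterior mean has a simple closed form), and picks illustrative values for the sample size, prior, and replication count rather than the paper's actual design.

```python
import numpy as np

# Assumed setup: Bayes estimation of the reliability R(t) = exp(-lambda * t) of the
# one-parameter exponential distribution under squared error loss (SELF) with a
# gamma(a, b) prior, for which the Bayes estimator is the posterior mean
# (B / (B + t))**A with A = a + n and B = b + sum(x). The IMSE is approximated by
# averaging the squared error over a grid of t and over the replications.

rng = np.random.default_rng(1)
lam_true, n, reps = 2.0, 30, 1000
a, b = 2.0, 1.0                              # informative gamma prior (assumed values)
t_grid = np.linspace(0.1, 2.0, 50)
R_true = np.exp(-lam_true * t_grid)

sq_err = np.zeros_like(t_grid)
for _ in range(reps):
    x = rng.exponential(1.0 / lam_true, size=n)
    A, B = a + n, b + x.sum()                # gamma posterior parameters
    R_self = (B / (B + t_grid)) ** A         # posterior mean of exp(-lambda * t)
    sq_err += (R_self - R_true) ** 2

imse = np.mean(sq_err / reps)                # grid-averaged mean squared error
print(imse)
```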
In this article, a numerical method integrated with a statistical data simulation technique is introduced to solve a nonlinear system of ordinary differential equations with multiple random variable coefficients. Monte Carlo simulation combined with the central divided-difference formula of the finite difference (FD) method is repeated n times to simulate values of the variable coefficients as random samples, instead of limiting them to fixed real values with respect to time. The mean of the n final solutions of this integrated technique, named mean Monte Carlo finite difference (MMCFD) method for short, represents the final solution of the system. This method is proposed for the first time to calculate the numerical solution obtained fo
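A hedged sketch of the MMCFD idea is shown below. The two-equation system, the distributions of the random coefficients, and the leapfrog form of the central divided-difference scheme are assumptions for illustration, not the system studied in the article.

```python
import numpy as np

# Assumed illustration of the mean Monte Carlo finite difference (MMCFD) idea:
# a two-equation nonlinear ODE system whose coefficients are drawn at random in
# each replication, solved with a central (leapfrog) difference scheme, with the
# final solution taken as the mean over all replications.

def solve_leapfrog(f, y0, t, h):
    y = np.zeros((len(t), len(y0)))
    y[0] = y0
    y[1] = y0 + h * f(t[0], y0)                        # one Euler step to start
    for k in range(1, len(t) - 1):
        y[k + 1] = y[k - 1] + 2.0 * h * f(t[k], y[k])  # central divided difference
    return y

rng = np.random.default_rng(2)
h = 0.01
t = np.arange(0.0, 5.0 + h, h)
n_sim = 200
runs = np.zeros((n_sim, len(t), 2))

for i in range(n_sim):
    a, b = rng.normal(1.0, 0.1), rng.normal(0.5, 0.05)  # random coefficients (assumed distributions)
    def f(_t, y):
        return np.array([a * y[0] - b * y[0] * y[1],
                         b * y[0] * y[1] - a * y[1]])
    runs[i] = solve_leapfrog(f, np.array([2.0, 1.0]), t, h)

mmcfd_solution = runs.mean(axis=0)                      # mean over the n simulated solutions
print(mmcfd_solution[-1])
```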
In this work, the modified Lyapunov-Schmidt reduction is used to find a nonlinear Ritz approximation of the Fredholm functional defined by the nonhomogeneous Camassa-Holm and Benjamin-Bona-Mahony equations. We introduce the modified Lyapunov-Schmidt reduction for nonhomogeneous problems when the dimension of the null space is equal to two. The nonlinear Ritz approximation for the nonhomogeneous Camassa-Holm equation has been found as a function of codimension twenty-four.
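For reference, the standard homogeneous forms of the two equations are given below; the nonhomogeneous problems treated in the work add a forcing term to the right-hand side.

```latex
% Standard forms of the two equations (the nonhomogeneous versions add a forcing term f(x,t)).
\begin{align}
  &\text{Camassa--Holm:} && u_t - u_{xxt} + 3\,u u_x = 2\,u_x u_{xx} + u\,u_{xxx},\\
  &\text{Benjamin--Bona--Mahony:} && u_t + u_x + u u_x - u_{xxt} = 0.
\end{align}
```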
Fuzzy assignment models (FAMs) have been explored in various works in the literature as an extension of classical crisp values, so that real-life situations can be represented more precisely. The novelty of this paper is a unique application of pentagonal fuzzy numbers to the evaluation of FAMs. The new method, namely Pascal's triangle graded mean (PT-GM), presents a new algorithm for accessing the critical path to solve the assignment problem (AP) based on the fuzzy objective function of minimising total cost. The results obtained have been compared to existing methods such as the centroid formula (CF) and centroid formula integration (CFI). It has been demonstrated that the operational efficiency of the proposed method is exquisitely deve
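A hedged sketch of the general workflow follows: the pentagonal fuzzy costs are defuzzified with the Pascal's-triangle weights (1, 4, 6, 4, 1)/16, assumed here to represent the PT-GM ranking, and the resulting crisp assignment problem is then solved; the cost values are illustrative and the exact PT-GM algorithm of the paper may differ.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Assumed PT-GM defuzzification: a pentagonal fuzzy number (a1,...,a5) is reduced
# to a crisp value using the Pascal's-triangle weights (1, 4, 6, 4, 1) / 16, and the
# crisp assignment problem is then solved by minimising total cost.

def pt_gm(pentagon):
    w = np.array([1, 4, 6, 4, 1]) / 16.0
    return float(np.dot(w, pentagon))

# 3x3 assignment problem with pentagonal fuzzy costs (illustrative values)
fuzzy_costs = [
    [(8, 9, 10, 11, 12), (4, 5, 6, 7, 8),   (6, 7, 8, 9, 10)],
    [(2, 3, 4, 5, 6),    (7, 8, 9, 10, 11), (5, 6, 7, 8, 9)],
    [(3, 4, 5, 6, 7),    (6, 7, 8, 9, 10),  (2, 3, 4, 5, 6)],
]
crisp = np.array([[pt_gm(c) for c in row] for row in fuzzy_costs])

rows, cols = linear_sum_assignment(crisp)      # minimise total crisp cost
print(list(zip(rows, cols)), crisp[rows, cols].sum())
```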
The physical behavior of the energy distribution function (EDF) of the reactant particles, as it depends on the gas (fuel) temperature, is completely described by a physical model covering the global formulas that control the EDF profile. Results for the energy distribution of the reactant system indicate a standard EDF that reaches a steady-state shape, which in turn fixes the optimum selected temperature.
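As an illustration only, the sketch below evaluates a Maxwellian energy distribution, which is one standard EDF for thermal reactant particles, and shows how its most probable energy shifts with the fuel temperature; the model in the paper may use a different functional form.

```python
import numpy as np

# Illustrative assumption: a Maxwellian energy distribution function,
# f(E) = 2*sqrt(E/pi) * (kT)**(-1.5) * exp(-E/kT), as one standard EDF for
# thermal reactant particles.

def maxwellian_edf(E, kT):
    return 2.0 * np.sqrt(E / np.pi) * kT ** -1.5 * np.exp(-E / kT)

E = np.linspace(0.01, 100.0, 1000)          # particle energy (keV, illustrative)
for kT in (5.0, 10.0, 20.0):                # fuel temperatures (keV, illustrative)
    f = maxwellian_edf(E, kT)
    print(kT, E[np.argmax(f)])              # the most probable energy shifts with temperature
```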
This research aims to study dimension-reduction methods that overcome the curse of dimensionality, a problem that must be dealt with directly when traditional methods fail to provide good estimates of the parameters. Two methods were used to handle high-dimensional data. The first is the non-classical sliced inverse regression (SIR) method together with the proposed weighted standard SIR (WSIR) method; the second is principal component analysis (PCA), the general method used for reducing dimensions. Both SIR and PCA are based on forming linear combinations of a subset of the original explanatory variables, which may suffer from the problem of heterogeneity and the problem of linear
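A minimal sketch of the standard SIR algorithm is given below (the proposed WSIR weighting is not reproduced); the data, number of slices, and number of directions are illustrative.

```python
import numpy as np

# Minimal sketch of sliced inverse regression (SIR): standardize the predictors,
# slice on the response, take slice means of the standardized predictors, and
# eigen-decompose the weighted covariance of those means.

def sir_directions(X, y, n_slices=10, n_dirs=2):
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / n
    vals, vecs = np.linalg.eigh(cov)
    inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T   # whitening transform
    Z = Xc @ inv_sqrt

    # slice the response and compute slice means of the standardized predictors
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)

    # leading eigenvectors of M, mapped back to the original scale
    _, eigvecs = np.linalg.eigh(M)
    eta = eigvecs[:, ::-1][:, :n_dirs]
    return inv_sqrt @ eta                              # columns = estimated directions

rng = np.random.default_rng(3)
X = rng.normal(size=(500, 8))
y = (X @ np.array([1, -1, 0, 0, 0, 0, 0, 0])) ** 2 + 0.1 * rng.normal(size=500)
print(sir_directions(X, y))
```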
Mixture experiments are experiments whose response variable depends on the proportions of the components of the mixture. In our research, we compare the Scheffé model with the Kronecker model for mixture experiments, especially when the experimental region is restricted.
Because mixture experiments suffer from the problem of high correlation and the problem of multicollinearity among the explanatory variables, which affects the calculation of the Fisher information matrix of the regression model, the generalized inverse and the stepwise regression procedure were used to estimate the parameters of the mixture model, as sketched below.
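The sketch below fits a Scheffé quadratic model for three components by the Moore-Penrose generalized inverse; the simulated proportions and coefficients are illustrative, and the stepwise-selection step is not reproduced.

```python
import numpy as np

# Assumed illustration: a Scheffé quadratic mixture model for three components,
# fitted with the Moore-Penrose generalized inverse, which copes with the
# ill-conditioned information matrix caused by the mixture constraint x1+x2+x3 = 1.

def scheffe_quadratic(X):
    x1, x2, x3 = X.T
    # linear blending terms plus binary interaction terms, no intercept
    return np.column_stack([x1, x2, x3, x1 * x2, x1 * x3, x2 * x3])

rng = np.random.default_rng(4)
w = rng.dirichlet(np.ones(3), size=40)           # mixture proportions on the simplex
beta_true = np.array([2.0, 1.0, 3.0, 4.0, -2.0, 1.5])
y = scheffe_quadratic(w) @ beta_true + 0.1 * rng.normal(size=40)

Z = scheffe_quadratic(w)
beta_hat = np.linalg.pinv(Z.T @ Z) @ Z.T @ y     # generalized-inverse least squares
print(beta_hat)
```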
Conditional logistic regression is often used to study the relationship between event outcomes and specific prognostic factors, in order to apply logistic regression and utilize its predictive capabilities in environmental studies. This research seeks to demonstrate a novel approach to implementing conditional logistic regression in environmental research through inference methods predicated on longitudinal data. Statistical analysis of longitudinal data therefore requires methods that can properly take into account the within-subject interdependence of the response measurements; if this correlation is ignored, inferences such as statistical tests and confidence intervals can be largely invalid.
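A minimal sketch of the conditional likelihood idea for 1:1 matched sets is given below (an assumed illustrative design rather than the paper's longitudinal analysis): each stratum contributes exp(x_case·β) / (exp(x_case·β) + exp(x_control·β)), which reduces to intercept-free logistic regression on within-pair differences.

```python
import numpy as np
from scipy.optimize import minimize

# Sketch of conditional logistic regression for 1:1 matched sets: the conditional
# log-likelihood depends only on within-pair covariate differences, so it is
# maximised as an intercept-free logistic regression on those differences.

def neg_log_conditional_likelihood(b, diffs):
    # diffs = x_case - x_control for each matched pair
    return np.sum(np.log1p(np.exp(-diffs @ b)))

rng = np.random.default_rng(5)
n_pairs, p = 300, 3
beta_true = np.array([0.8, -0.5, 0.3])            # illustrative coefficients

xa = rng.normal(size=(n_pairs, p))
xb = rng.normal(size=(n_pairs, p))
# within each pair, subject a is the case with probability exp(xa@b)/(exp(xa@b)+exp(xb@b))
p_a_case = 1.0 / (1.0 + np.exp(-(xa - xb) @ beta_true))
a_is_case = rng.uniform(size=n_pairs) < p_a_case
diffs = np.where(a_is_case[:, None], xa - xb, xb - xa)

res = minimize(neg_log_conditional_likelihood, np.zeros(p), args=(diffs,), method="BFGS")
print(res.x)                                       # should be close to beta_true
```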