In this study, we investigate improving the estimation of the third-order autoregressive model AR(3) using the Levinson-Durbin Recurrence (LDR) and Weighted Least Squares Error (WLSE) methods. Time series were generated from the AR(3) model with normally and non-normally distributed error terms, and with error terms following an ARCH(q) model of order q = 1, 2. Different sample sizes were used, and the results were obtained by simulation. In general, we conclude that, for both estimation methods (LDR and WLSE), the estimation of the autoregressive model improves as the sample size increases, for all distributions considered for the error term except the lognormal distribution. We also see that the performance of the WLSE method depends on the value of the forgetting factor parameter α, which takes values less than one (α < 1); the estimate improves for large values of α, specifically at α = 0.99. Finally, we applied both estimation methods (LDR and WLSE) to real data.
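As a pointer to how the LDR method works, here is a minimal sketch (not the paper's code; the coefficients, sample size, and seed are illustrative) of fitting an AR(3) model with the Levinson-Durbin recursion applied to sample autocovariances:

```python
# Minimal sketch: Levinson-Durbin recursion for AR(3) estimation.
import numpy as np

def levinson_durbin(r, order):
    """Solve the Yule-Walker equations via Levinson-Durbin.
    r: autocovariances r[0..order]; returns AR coefficients and noise variance."""
    a = np.zeros(order + 1)
    e = r[0]                                   # prediction error variance
    for k in range(1, order + 1):
        # reflection (partial autocorrelation) coefficient
        lam = (r[k] - a[1:k] @ r[1:k][::-1]) / e
        a_new = a.copy()
        a_new[k] = lam
        a_new[1:k] = a[1:k] - lam * a[1:k][::-1]
        a = a_new
        e *= (1.0 - lam ** 2)
    return a[1:], e

rng = np.random.default_rng(0)
phi = np.array([0.5, -0.3, 0.2])               # illustrative true AR(3) coefficients
n = 500
x = np.zeros(n)
eps = rng.standard_normal(n)
for t in range(3, n):
    x[t] = phi @ x[t-3:t][::-1] + eps[t]       # x[t-1], x[t-2], x[t-3]

# biased sample autocovariances up to lag 3
x0 = x - x.mean()
r = np.array([x0[:n-k] @ x0[k:] / n for k in range(4)])
coef, noise_var = levinson_durbin(r, 3)
print("estimated AR(3) coefficients:", coef)
```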
This research deals with estimating the reliability function of the two-parameter Exponential distribution using different estimation methods: Maximum Likelihood, Median-First Order Statistics, Ridge Regression, Modified Thompson-Type Shrinkage, and Single Stage Shrinkage. Comparisons among the estimators were made by Monte Carlo simulation based on the mean squared error (MSE) criterion, and we conclude that the shrinkage methods perform better than the other methods.
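A minimal Monte Carlo sketch of the kind of MSE comparison described above, assuming the two-parameter exponential reliability R(t) = exp(-(t - μ)/θ) and a simple linear shrinkage rule toward a prior guess θ0 (the weight k and all constants are illustrative, not the paper's):

```python
# Sketch: Monte Carlo MSE comparison of MLE vs. a simple shrinkage rule
# for the reliability of the two-parameter exponential distribution.
import numpy as np

rng = np.random.default_rng(1)
mu, theta, t0 = 1.0, 2.0, 3.0           # true location, scale, time point
R_true = np.exp(-(t0 - mu) / theta)
theta0 = 1.5                             # prior guess used by the shrinkage rule
k = 0.3                                  # illustrative shrinkage weight

n, reps = 30, 10_000
mse_mle = mse_shrink = 0.0
for _ in range(reps):
    x = mu + rng.exponential(theta, n)
    mu_hat = x.min()                     # MLE of the location parameter
    th_hat = x.mean() - mu_hat           # MLE of the scale parameter
    th_shr = k * th_hat + (1 - k) * theta0   # shrink toward theta0
    mse_mle += (np.exp(-(t0 - mu_hat) / th_hat) - R_true) ** 2
    mse_shrink += (np.exp(-(t0 - mu_hat) / th_shr) - R_true) ** 2
print("MSE MLE      :", mse_mle / reps)
print("MSE shrinkage:", mse_shrink / reps)
```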
The worldwide spread of the internet, together with the huge and growing number of users exchanging important information over it, highlights the need for new methods to protect this information from corruption or modification by intruders. This paper suggests a new method to ensure that the text of a given document cannot be modified by intruders. The method consists of a mixture of three steps. The first step borrows some concepts of the "Quran" security system to detect certain types of change(s) in a given text: a key for each paragraph is extracted from the group of letters in that paragraph whose positions are multiples of a given prime number. This step cannot detect the ch…
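A small sketch of the prime-position keying idea as described (the hash function, prime, and key length are assumptions; the paper's exact scheme is not fully shown here):

```python
# Sketch: derive an integrity key per paragraph from the letters whose
# 1-based positions are multiples of a chosen prime number.
import hashlib

def paragraph_key(paragraph: str, prime: int = 7) -> str:
    """Collect the letters at positions that are multiples of `prime`
    and hash them into a short integrity key for the paragraph."""
    letters = [c for c in paragraph if c.isalpha()]
    picked = "".join(letters[i] for i in range(prime - 1, len(letters), prime))
    return hashlib.sha256(picked.encode("utf-8")).hexdigest()[:16]

doc = ["First paragraph of the protected text.",
       "Second paragraph, whose key changes if any letter moves."]
keys = [paragraph_key(p) for p in doc]     # store alongside the document
# Verification: recompute the keys and compare with the stored ones.
assert keys == [paragraph_key(p) for p in doc]
```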
In some cases, researchers need to know the causal effect of a treatment, that is, the extent of its effect on the sample, in order to decide whether to continue giving the treatment or to stop it because it is of no use. The local weighted least squares method was used to estimate the parameters of the fuzzy regression discontinuity model, and the local polynomial method was used to estimate the bandwidth. Data were generated with sample sizes (75, 100, 125, 150) and 1000 replications. An experiment was conducted at the Innovation Institute for remedial lessons in 2021 on 72 students participating in the institute, and data were collected. Those who received the treatment had an increase in their score after…
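A sketch of a local weighted least squares estimate of the jump at the cutoff in a regression discontinuity design, assuming a triangular kernel and a fixed bandwidth h (all constants are illustrative; the fuzzy-design and bandwidth-selection details of the paper are not reproduced):

```python
# Sketch: local linear / weighted least squares fit on each side of the
# cutoff; the difference of boundary intercepts estimates the jump.
import numpy as np

def local_linear_at_cutoff(x, y, cutoff, h, side):
    """Fit y ~ 1 + (x - cutoff) by WLS on one side of the cutoff and
    return the fitted intercept, i.e. the boundary value of the fit."""
    mask = (x >= cutoff) if side == "right" else (x < cutoff)
    u = x[mask] - cutoff
    w = np.clip(1 - np.abs(u) / h, 0, None)    # triangular kernel weights
    X = np.column_stack([np.ones(u.size), u])
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y[mask])
    return beta[0]

rng = np.random.default_rng(2)
n, cutoff, effect = 150, 0.0, 1.0
x = rng.uniform(-1, 1, n)
y = 0.5 * x + effect * (x >= cutoff) + rng.normal(0, 0.3, n)
tau = (local_linear_at_cutoff(x, y, cutoff, 0.5, "right")
       - local_linear_at_cutoff(x, y, cutoff, 0.5, "left"))
print("estimated treatment effect at the cutoff:", tau)
```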
Abstract
We present a study of the estimation of the reliability of the Exponential distribution based on the Bayesian approach. In the Bayesian approach, the parameter of the Exponential distribution is assumed to be a random variable. We derive Bayes estimators of reliability under four types of prior, where the prior distribution for the scale parameter of the Exponential distribution is: Inverse Chi-square…
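For one of the four cases, a sketch under assumptions: a gamma prior on the rate λ = 1/θ (equivalent to an inverted-gamma / inverse chi-square type prior on the scale θ), squared error loss, and the posterior mean of R(t) = exp(-λt) obtained from the gamma Laplace transform:

```python
# Sketch: Bayes estimator of exponential reliability R(t) = exp(-lambda*t)
# under squared error loss with a conjugate gamma prior on the rate.
import numpy as np

def bayes_reliability(x, t, a0=2.0, b0=1.0):
    """Posterior mean of R(t) when lambda ~ Gamma(a0, rate=b0) a priori.
    The posterior is Gamma(a0 + n, b0 + sum(x)); E[exp(-lambda*t)]
    follows from the gamma Laplace transform."""
    n, T = len(x), float(np.sum(x))
    return ((b0 + T) / (b0 + T + t)) ** (a0 + n)

rng = np.random.default_rng(3)
lam = 0.5
x = rng.exponential(1 / lam, 40)
t = 2.0
print("true R(t) :", np.exp(-lam * t))
print("MLE  R(t) :", np.exp(-t / x.mean()))
print("Bayes R(t):", bayes_reliability(x, t))
```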
The present paper deals with estimation of the scale parameter θ of the Inverted Gamma (IG) distribution when the shape parameter α is known (α = 1), by preliminary-test single-stage shrinkage estimators using a suitable shrinkage weight factor and region. Expressions for the Bias and Mean Squared Error [MSE] of the proposed estimators are derived. Comparisons between the considered estimator and the usual estimator (MLE), as well as the existing estimator, are performed. The results are presented in the attached tables.
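A sketch of a preliminary-test single-stage shrinkage estimator for the IG scale with α = 1, where the MLE is θ̂ = n / Σ(1/xᵢ); the weight ψ and region R below are illustrative assumptions, not the paper's choices:

```python
# Sketch: preliminary-test single-stage shrinkage for the Inverted Gamma
# scale theta with known shape alpha = 1 (so 1/X ~ Exponential(rate=theta)).
import numpy as np

def shrunken_theta(x, theta0, psi=0.3, region=(0.5, 2.0)):
    """Shrink the MLE toward the prior guess theta0 when the ratio
    theta_hat/theta0 falls inside the pre-test region R."""
    n = len(x)
    theta_hat = n / np.sum(1.0 / x)              # MLE of theta
    if region[0] <= theta_hat / theta0 <= region[1]:   # inside region R
        return psi * theta_hat + (1 - psi) * theta0
    return theta_hat                              # outside R: keep the MLE

rng = np.random.default_rng(8)
theta = 3.0
x = 1.0 / rng.exponential(1.0 / theta, 50)        # IG(1, theta) sample
print(shrunken_theta(x, theta0=2.5))
```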
This paper is concerned with pre-test single and double stage shrunken estimators for the mean (μ) of the normal distribution when a prior estimate (μ0) of the actual value (μ) is available, using specified shrinkage weight factors ψ(·) as well as a pre-test region (R). Expressions for the Bias [B(·)], Mean Squared Error [MSE(·)], Efficiency [EFF(·)], and Expected Sample Size [E(n)] of the proposed estimators are derived. Numerical results and conclusions are drawn about the selection of the different constants included in these expressions. Comparisons between the suggested estimators and the classical estimators, in the sense of Bias and Relative Efficiency, are given. Furthermore, comparisons with earlier existing work are drawn.
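A Monte Carlo sketch of the Bias and Relative Efficiency comparison described above, for a single-stage pre-test shrunken mean (the weight φ and region half-width c are assumptions for illustration):

```python
# Sketch: Bias and Relative Efficiency of a pre-test shrunken mean
# versus the classical sample mean, by Monte Carlo simulation.
import numpy as np

rng = np.random.default_rng(9)
mu, mu0, sigma, n, reps = 10.0, 10.0, 2.0, 25, 20_000
phi, c = 0.4, 2.0                     # illustrative weight and region half-width
est, classical = np.empty(reps), np.empty(reps)
for i in range(reps):
    x = rng.normal(mu, sigma, n)
    xbar = x.mean()
    z = (xbar - mu0) / (sigma / np.sqrt(n))
    # inside region R = [-c, c]: shrink toward mu0; outside: keep xbar
    est[i] = phi * xbar + (1 - phi) * mu0 if abs(z) <= c else xbar
    classical[i] = xbar
bias = est.mean() - mu
eff = np.mean((classical - mu) ** 2) / np.mean((est - mu) ** 2)
print(f"Bias = {bias:.4f}, Relative Efficiency vs xbar = {eff:.2f}")
```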
In this paper, Bayes estimators of the parameter of the Maxwell distribution have been derived along with the maximum likelihood estimator. The non-informative priors, Jeffreys and the extension of Jeffreys prior, have been considered under two different loss functions, the squared error loss function and the modified squared error loss function, for comparison purposes. A simulation study has been carried out to gain insight into the performance on small, moderate, and large samples. The performance of these estimators has been explored numerically under different conditions, and their efficiency compared according to the mean squared error (MSE). The results of the comparison by MSE show that the efficiency of the Bayes est…
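A sketch under an assumed parameterization f(x|θ) ∝ θ^{3/2} x² exp(-θx²/2) of the Maxwell distribution, comparing the MLE θ̂ = 3n/Σxᵢ² with the posterior mean under an extended Jeffreys prior p(θ) ∝ θ^(-c) and squared error loss (the constant c and all settings are illustrative):

```python
# Sketch: MLE vs. Bayes (posterior mean, squared error loss) for the
# Maxwell parameter under an extended Jeffreys prior p(theta) ~ theta^-c.
import numpy as np

rng = np.random.default_rng(5)
theta, n, reps, c = 2.0, 20, 5_000, 1.5
se_mle = se_bayes = 0.0
for _ in range(reps):
    # under this parameterization theta * X^2 ~ chi-square(3)
    x = np.sqrt(rng.chisquare(3, n) / theta)
    T = np.sum(x ** 2)
    mle = 3 * n / T                          # maximum likelihood estimate
    # posterior is Gamma(3n/2 - c + 1, rate T/2); its mean is the estimator
    bayes = (3 * n / 2 - c + 1) / (T / 2)
    se_mle += (mle - theta) ** 2
    se_bayes += (bayes - theta) ** 2
print("MSE MLE  :", se_mle / reps)
print("MSE Bayes:", se_bayes / reps)
```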
The estimation of the parameters of linear regression is usually based on the Least Squares method, which rests on several basic assumptions, so the accuracy of the estimated model parameters depends on the validity of these assumptions. Among these assumptions are homogeneity of variance and normality of the errors, which become unrealistic when the problem under study involves complex data arising from more than one model. The most successful technique for this purpose proved to be the robust MM-estimation method, which minimizes the maximum likelihood estimator. To…
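MM-estimation itself is not available in statsmodels (it is available in R's robustbase::lmrob); the sketch below uses the closely related robust M-estimation (sm.RLM with a Tukey biweight loss) to show the behavior on contaminated data:

```python
# Sketch: classical OLS vs. robust M-estimation on data with gross outliers.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 200
x = rng.uniform(0, 10, n)
y = 2.0 + 1.5 * x + rng.normal(0, 1, n)
y[:20] += 25                                # inject gross outliers
X = sm.add_constant(x)
ols = sm.OLS(y, X).fit()                    # classical least squares
rob = sm.RLM(y, X, M=sm.robust.norms.TukeyBiweight()).fit()
print("OLS   :", ols.params)                # pulled toward the outliers
print("Robust:", rob.params)                # close to (2.0, 1.5)
```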
A recurrent condition that affects up to 10% of people worldwide is gastric ulcer disease. The presence of acidic gastric juice together with lowered mucosal defences is a prerequisite for the development of chronic ulcers. The main factors affecting the susceptibility of the mucosa to damage include Helicobacter pylori (H. pylori) infection and non-steroidal anti-inflammatory drugs (NSAIDs). Proton pump inhibitors (PPIs) and histamine-2 (H2) receptor inhibitors, two common therapies for peptic ulcers, have been linked to side effects, recurrence, or the need for a variety of pharmacological combinations. Conversely, therapeutic herbs or the chemicals they contain may be used to treat or eliminate a wide range of illnesses. Therefore, prominent pharma…
The main aim of image compression is to reduce image size for transmission and storage, and therefore many methods have appeared to compress images. One of these methods is the Multilayer Perceptron (MLP), an artificial neural network method based on the Back-Propagation algorithm. If this algorithm depended only on the number of neurons in the hidden layer, that would not be enough to reach the desired results; we must also take into consideration the standards on which the compression process depends in order to get the best results. In this research we trained a group of TIFF images of size (256*256) and compressed them using an MLP for each…
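A sketch of the bottleneck-MLP idea on 8×8 blocks (the block size, hidden width, and the synthetic stand-in image are assumptions; the paper's training standards are not reproduced):

```python
# Sketch: compress image blocks with a bottleneck MLP trained by
# backpropagation (autoencoder: the network reconstructs its own input).
import numpy as np
from sklearn.neural_network import MLPRegressor

yy, xx = np.mgrid[0:256, 0:256] / 255.0
img = 0.5 + 0.5 * np.sin(8 * xx) * np.cos(8 * yy)   # smooth synthetic 256x256 image

# split the image into flattened 8x8 blocks (64 values each)
blocks = (img.reshape(32, 8, 32, 8)
             .transpose(0, 2, 1, 3)
             .reshape(-1, 64))

# hidden layer of 16 units -> roughly 4:1 compression of each block
mlp = MLPRegressor(hidden_layer_sizes=(16,), activation="logistic",
                   max_iter=2000, random_state=0)
mlp.fit(blocks, blocks)                     # targets are the inputs themselves

recon = mlp.predict(blocks)
print("reconstruction MSE per pixel:", np.mean((recon - blocks) ** 2))
```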