In this study, we investigate improving the estimation of the third-order autoregressive model AR(3) using the Levinson-Durbin Recursion (LDR) and Weighted Least Squares Error (WLSE) methods. Time series are generated from an AR(3) model when the error term is normally and non-normally distributed, and when the error term follows an ARCH(q) model with order q = 1, 2. Different sample sizes are used, and the results are obtained by simulation. In general, we conclude that for both estimation methods (LDR and WLSE), the estimates improve as the sample size increases, for all distributions considered for the error term except the lognormal distribution. We also find that the improvement of the WLSE estimates depends on the value of the forgetting factor parameter α, which takes values less than one (α < 1); the estimates improve for large values of α, specifically at α = 0.99. Finally, both estimation methods (LDR and WLSE) are applied to real data.
The widespread use of the internet all over the world, together with the huge and growing number of users exchanging important information over it, highlights the need for new methods to protect this information from corruption or modification by intruders. This paper suggests a new method that ensures that the text of a given document cannot be modified by intruders. The method consists of a mixture of three steps. The first step borrows some concepts of the "Quran" security system to detect certain types of change(s) occurring in a given text: a key for each paragraph is extracted from the group of letters in that paragraph whose positions are multiples of a given prime number. This step cannot detect the ch
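One possible reading of that key-extraction step can be sketched as follows (the function names, the fixed prime, and the exact sampling rule are illustrative assumptions, not the paper's specification):

```python
def paragraph_key(paragraph: str, prime: int = 7) -> str:
    """Hypothetical sketch: build a paragraph key from the letters whose
    1-based position among the paragraph's letters is a multiple of the
    chosen prime."""
    letters = [c for c in paragraph if c.isalpha()]
    return "".join(letters[i - 1] for i in range(prime, len(letters) + 1, prime))

def is_unmodified(paragraph: str, stored_key: str, prime: int = 7) -> bool:
    """Re-derive the key and compare it with the stored one. Note that a
    change touching only un-sampled letters would go undetected."""
    return paragraph_key(paragraph, prime) == stored_key

text = "The quick brown fox jumps over the lazy dog"
key = paragraph_key(text)                 # stored alongside the document
ok = is_unmodified(text, key)             # unchanged text verifies
tampered_ok = is_unmodified(text.replace("quick", "quirk"), key)
```

Here the tampered copy changes a sampled letter, so verification fails; this also illustrates the limitation hinted at in the abstract, since edits that avoid the sampled positions would pass.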
In some cases, researchers need to know the causal effect of a treatment: the extent of its effect on the sample determines whether to continue giving the treatment or to stop it because it is of no use. The local weighted least squares method was used to estimate the parameters of the fuzzy regression discontinuity model, and the local polynomial method was used to estimate the bandwidth. Data were generated with sample sizes (75, 100, 125, 150) with 1000 replications. An experiment was conducted at the Innovation Institute for remedial lessons in 2021 on 72 students participating in the institute, and data were collected. Those who received the treatment had an increase in their score after
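The core of local weighted least squares in a regression-discontinuity setting can be sketched for the simpler sharp design (the paper's model is fuzzy, and its bandwidth comes from a local-polynomial rule; here a fixed bandwidth and triangular kernel are illustrative assumptions):

```python
import numpy as np

def rd_effect(x, y, cutoff=0.0, h=1.0):
    """Sharp-RD sketch: fit a kernel-weighted local linear regression on each
    side of the cutoff and difference the two fits at the cutoff."""
    def side_fit(mask):
        xs, ys = x[mask] - cutoff, y[mask]
        w = np.clip(1.0 - np.abs(xs) / h, 0.0, None)   # triangular kernel
        X = np.column_stack([np.ones_like(xs), xs])
        WX = X * w[:, None]
        beta = np.linalg.solve(X.T @ WX, X.T @ (w * ys))
        return beta[0]                                 # fitted value at cutoff
    return side_fit(x >= cutoff) - side_fit(x < cutoff)

rng = np.random.default_rng(6)
x = rng.uniform(-2, 2, 2000)
y = 1.0 + 0.5 * x + 2.0 * (x >= 0) + rng.normal(0, 0.3, 2000)  # true jump = 2
tau = rd_effect(x, y)
```

The estimated jump `tau` recovers the simulated treatment effect at the cutoff; a fuzzy design would further divide this outcome jump by the corresponding jump in treatment take-up.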
Abstract
We present a study of the estimation of the reliability of the Exponential distribution based on the Bayesian approach. In the Bayesian approach, the parameter of the Exponential distribution is assumed to be a random variable. We derive Bayes estimators of reliability under four types of prior distribution for the scale parameter of the Exponential distribution: Inverse Chi-squar
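As an illustrative analogue only (the abstract's priors include an inverse chi-square on the scale parameter; this sketch instead uses a conjugate Gamma prior on the rate, which admits a closed form), the Bayes estimate of exponential reliability R(t) = exp(-λt) under squared-error loss is the posterior mean:

```python
import numpy as np

def bayes_reliability(data, t, a0=1.0, b0=1.0):
    """Posterior-mean reliability for exponential data with a Gamma(a0, b0)
    prior on the rate lam: E[exp(-lam*t) | data] follows from the Laplace
    transform of the Gamma posterior."""
    a_post = a0 + len(data)
    b_post = b0 + np.sum(data)
    return (b_post / (b_post + t)) ** a_post

rng = np.random.default_rng(1)
sample = rng.exponential(scale=2.0, size=500)   # true rate 0.5
r_hat = bayes_reliability(sample, t=1.0)        # true R(1) = exp(-0.5)
```

With a moderate sample, the posterior-mean estimate lands close to the true reliability exp(-0.5) ≈ 0.607.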
The present paper deals with the estimation of the scale parameter θ of the Inverted Gamma (IG) distribution when the shape parameter α is known (α = 1), using preliminary-test single-stage shrinkage estimators with a suitable shrinkage weight factor and pre-test region. Expressions for the Bias and Mean Squared Error (MSE) of the proposed estimators are derived. Comparisons between the considered estimator and the usual estimator (MLE), as well as with an existing estimator, are performed. The results are presented in the attached tables.
The security of multimedia data is becoming important, especially for monitoring systems whose videos are prone to attack or leakage via the internet. To protect these videos, the proposed method combines an encryption algorithm with a signing algorithm to obtain an authenticated video. The proposed encryption algorithm secures video transmission by encrypting the video so that it becomes unintelligible. This is done by extracting the video into frames and separating each frame into three channels, Red, Green, and Blue; these channels are encrypted using three different random keys generated by a random-number-generation function. The signing algorithm is applied for authentication purposes, enabling the receiver to be sure of the identity of the sender and provide
This paper is concerned with pre-test single and double stage shrunken estimators for the mean (μ) of a normal distribution when a prior estimate (μ0) of the actual value (μ) is available, using specified shrinkage weight factors ψ(·) as well as a pre-test region (R). Expressions for the Bias, Mean Squared Error (MSE), Efficiency, and Expected Sample Size of the proposed estimators are derived. Numerical results and conclusions are drawn about the selection of the different constants included in these expressions. Comparisons between the suggested estimators and the classical estimators, in the sense of Bias and Relative Efficiency, are given. Furthermore, comparisons with earlier existing works are drawn.
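The shrinkage idea behind such estimators is to pull the classical estimate toward the prior guess. A minimal sketch (the fixed weight 0.3 and the exactly-correct prior guess are illustrative; the paper's weight ψ(·) is data-dependent and tied to a pre-test region):

```python
import numpy as np

# shrunken estimator = psi * (sample mean) + (1 - psi) * mu0
rng = np.random.default_rng(5)
mu, mu0, n, reps = 5.0, 5.0, 10, 20000
classical = rng.normal(mu, 1.0, size=(reps, n)).mean(axis=1)  # sample means
shrunken = 0.3 * classical + 0.7 * mu0
mse_classical = np.mean((classical - mu) ** 2)   # about sigma^2/n = 0.1
mse_shrunken = np.mean((shrunken - mu) ** 2)     # smaller when mu0 is accurate
```

When the prior guess is close to the true mean, the shrunken estimator dominates the classical one in MSE, which is exactly the gain the Relative Efficiency comparisons above quantify; the pre-test region guards against the opposite case, where μ0 is far from μ.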
This paper uses classical and shrinkage estimators to estimate the system reliability (R) in the stress-strength model when the stress and strength follow the Inverse Chen distribution (ICD). Comparisons of the proposed estimators are presented using a simulation based on the mean squared error (MSE) criterion.
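The quantity being estimated is R = P(stress < strength). A generic Monte Carlo sketch of it (exponential stress and strength are stand-ins here, since sampling from the Inverse Chen distribution is not shown in this excerpt):

```python
import numpy as np

# Monte Carlo estimate of stress-strength reliability R = P(X < Y)
rng = np.random.default_rng(2)
n = 200_000
stress = rng.exponential(scale=1.0, size=n)     # X, rate 1
strength = rng.exponential(scale=2.0, size=n)   # Y, rate 0.5
r_hat = np.mean(stress < strength)              # closed form here: 1/1.5
```

For exponential rates λx and λy, R = λx/(λx + λy) = 2/3 in this setup, so the simulated proportion checks the estimator; for the ICD the same `np.mean(stress < strength)` step applies once ICD samplers replace the placeholders.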
In this paper, Bayes estimators of the parameter of the Maxwell distribution have been derived along with the maximum likelihood estimator. The non-informative priors, Jeffreys and the extension of Jeffreys prior, have been considered under two different loss functions, the squared error loss function and the modified squared error loss function, for comparison purposes. A simulation study has been developed in order to gain insight into the performance on small, moderate, and large samples. The performance of these estimators has been explored numerically under different conditions. The efficiency of the estimators was compared according to the mean squared error (MSE). The results of the comparison by MSE show that the efficiency of the Bayes est
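The MSE comparison described above can be sketched for one common parametrization (not necessarily the paper's): for the Maxwell density f(x) = sqrt(2/π) x² exp(-x²/(2a²))/a³, the statistic T = Σx² satisfies T ~ a²χ²₍₃ₙ₎, giving the MLE T/(3n) for θ = a², while the Jeffreys prior π(θ) ∝ 1/θ yields an Inverse-Gamma posterior with mean (T/2)/(3n/2 − 1) under squared-error loss:

```python
import numpy as np

rng = np.random.default_rng(3)
a_true, theta_true, n, reps = 2.0, 4.0, 30, 5000
mse_mle = mse_bayes = 0.0
for _ in range(reps):
    # a Maxwell draw is the norm of a 3-d isotropic Gaussian with std a
    x = np.linalg.norm(rng.normal(0.0, a_true, size=(n, 3)), axis=1)
    t = np.sum(x ** 2)
    theta_mle = t / (3 * n)                    # maximum likelihood
    theta_bayes = (t / 2) / (3 * n / 2 - 1)    # posterior mean, Jeffreys prior
    mse_mle += (theta_mle - theta_true) ** 2 / reps
    mse_bayes += (theta_bayes - theta_true) ** 2 / reps
```

Since the MLE is unbiased here with variance 2θ²/(3n), the simulated `mse_mle` should match that value closely, which is the kind of check the paper's simulation study performs across sample sizes.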
The great progress in information and communication technology has led to a huge increase in the data available. Traditional systems cannot keep up with this growth and cannot handle this huge amount of data. Recommendation systems are one of the most important areas of research at present because they help people make decisions and find what they want among all these data. This study examined the research trends published in Google Scholar during the period 2018-2022 related to recommending, reviewing, analysing, and comparing e-book research papers. First, the research papers were collected and classified based on the recommendation model used and the year of publication, and then they were compared in terms of techniques, datasets u
The estimation of the parameters of linear regression is usually based on the ordinary least squares method, which relies on several basic assumptions; the accuracy of the estimated model parameters therefore depends on the validity of these assumptions, among them the homogeneity of the variance and the normal distribution of the errors. These assumptions are not achievable when studying a problem that may involve complex data from more than one model, and the model then becomes unrealistic. The most successful technique for this purpose proved to be the robust minimizing-maximum-likelihood estimation method (MM-estimator), which demonstrated its efficiency. To
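The robust-regression idea can be sketched with iteratively reweighted least squares using Huber weights; this is an illustrative stand-in for the MM-estimator named above, which additionally starts from a high-breakdown S-estimate and uses a bisquare ρ-function rather than the OLS start and Huber weights assumed here:

```python
import numpy as np

def huber_irls(X, y, c=1.345, iters=100):
    """M-estimation sketch: iteratively reweighted least squares with
    Huber weights and a MAD-based robust scale estimate."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]           # OLS start
    for _ in range(iters):
        r = y - X @ beta
        s = np.median(np.abs(r - np.median(r))) / 0.6745  # robust scale (MAD)
        u = np.abs(r) / (c * s)
        w = np.minimum(1.0, 1.0 / np.maximum(u, 1e-12))   # Huber weights
        beta = np.linalg.solve(X.T @ (X * w[:, None]), X.T @ (w * y))
    return beta

# line with 10% gross outliers: the robust fit should stay near the true line
rng = np.random.default_rng(4)
x = np.linspace(0, 10, 100)
y = 2.0 + 3.0 * x + rng.normal(0, 0.5, 100)
y[::10] += 40.0                                           # contaminate 10%
X = np.column_stack([np.ones_like(x), x])
beta_rob = huber_irls(X, y)
```

The outlying points receive small weights, so the fitted intercept and slope stay close to the generating values where plain OLS would be pulled toward the contamination, which is exactly the failure of the normality assumption discussed above.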