In this research we assumed that the number of emissions of radiation particles by time t follows a Poisson distribution with parameter θt, where θ > 0 is the intensity of radiation. We conclude that the time of the first emission is exponentially distributed with parameter θ, while the time of the k-th emission (k = 2, 3, 4, …) is gamma distributed with parameters (k, θ). We used real data to show that the Bayes estimator θ* for θ is more efficient than the maximum likelihood estimator θ̂, using the derived variances of both estimators as a statistical indicator of efficiency.
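A minimal simulation sketch of the comparison the abstract describes, not the paper's exact derivation: for n observed emissions, the MLE of the intensity is n/T (with T the time of the n-th emission), and under an assumed conjugate Gamma(a0, b0) prior the Bayes estimator is the posterior mean (n + a0)/(T + b0). The prior hyperparameters and all numeric values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true = 2.0          # true emission intensity (assumed)
n = 50                    # number of observed emissions
a0, b0 = 1.0, 0.5         # assumed Gamma prior hyperparameters

reps = 5000
mle = np.empty(reps)
bayes = np.empty(reps)
for r in range(reps):
    waits = rng.exponential(1.0 / theta_true, size=n)  # inter-emission times
    T = waits.sum()                                    # time of the n-th emission
    mle[r] = n / T                                     # maximum likelihood estimator
    bayes[r] = (n + a0) / (T + b0)                     # posterior mean under Gamma prior

print("var(MLE)  =", mle.var())
print("var(Bayes)=", bayes.var())   # typically smaller, matching the efficiency claim
```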
Modern teaching methods and their importance in achieving the desired learning goals for the individual and society have been addressed, as it is necessary to develop the methods, approaches, and strategies used in teaching the intermediate stages in various fields in general, and in physical education in particular. The importance of the research lies in the effect of using the similarities strategy in teaching some basic basketball skills to second-intermediate students. As for the research problem, the researcher noted teachers' limited use of the similarities strategy in educational units despite its importance, and after study and analysis the researcher found it necessary to …
This research deals with an unusual approach to analyzing simple linear regression via linear programming, using the two-phase method known in Operations Research (O.R.). The estimate is found by solving an optimization problem after adding artificial variables Ri. Another method for analyzing simple linear regression is also introduced in this research, in which the conditional median of y is considered by minimizing the sum of absolute residuals, instead of finding the conditional mean of y, which depends on minimizing the sum of squared residuals; this is called median regression. Also, an iteratively reweighted least squares procedure based on the absolute residuals as weights is performed here as another method to …
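A minimal sketch of median (LAD) regression posed as a linear program, in the spirit of the abstract; scipy's HiGHS solver stands in for the hand-worked two-phase method, and the data values are illustrative assumptions. Each residual is split as u_i − v_i with u_i, v_i ≥ 0, and the objective minimizes Σ(u_i + v_i).

```python
import numpy as np
from scipy.optimize import linprog

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.0, 9.8])
n = len(x)

# Decision vector: [b0, b1, u_1..u_n, v_1..v_n], residual r_i = u_i - v_i
c = np.concatenate([[0.0, 0.0], np.ones(2 * n)])   # minimize sum(u_i + v_i)
A_eq = np.hstack([np.ones((n, 1)), x[:, None], np.eye(n), -np.eye(n)])
b_eq = y                                           # b0 + b1*x_i + u_i - v_i = y_i
bounds = [(None, None), (None, None)] + [(0, None)] * (2 * n)

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
b0, b1 = res.x[:2]
print(f"median regression line: y = {b0:.3f} + {b1:.3f} x")
```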
A new distribution, the Epsilon Skew Gamma (ESΓ) distribution, first introduced by Abdulah [1], is applied to near-gamma data. We first redefine the ESΓ distribution, its properties, and its characteristics, then estimate its parameters using the maximum likelihood and moment estimators. Finally, we use these estimators to fit the data with the ESΓ distribution.
Researchers have a special interest in studying Markov chains as probability models with many applications in different fields. This study deals with the changes that occur in budget expenditures using statistical methods, and Markov chains express this best, as they are regarded as reliable models in the prediction process. A transition matrix is built for three expenditure states (increase, decrease, stability) of one budget expenditure item (base salary) across three directorates (Baghdad, Nineveh, Diyala) of one of the ministries. Results are analyzed by applying the maximum likelihood estimation and ordinary least squares methods, resulting …
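A minimal sketch of the maximum likelihood step for a 3-state chain (illustrative data, not the study's figures): the MLE of each transition probability is the row-normalized transition count, p̂_ij = n_ij / Σ_k n_ik.

```python
import numpy as np

states = {"increase": 0, "decrease": 1, "stability": 2}
# hypothetical monthly sequence of expenditure states for one directorate
seq = [0, 0, 2, 1, 0, 2, 2, 1, 1, 0, 2, 0, 1, 2, 2, 0]

counts = np.zeros((3, 3))
for s, t in zip(seq[:-1], seq[1:]):
    counts[s, t] += 1                                # tally observed transitions

P = counts / counts.sum(axis=1, keepdims=True)       # MLE of the transition matrix
print(np.round(P, 3))
```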
This research studied and analyzed hybrid parallel-series systems of asymmetrical components by applying different simulation experiments to estimate the reliability function of those systems, using the maximum likelihood method as well as the standard Bayes method under both symmetric and asymmetric loss functions, with a Rayleigh distribution and an informative prior distribution. The simulation experiments included different sample sizes and default parameters, which were then compared with one another based on mean squared error. This was followed by applying the standard Bayes method with the entropy loss function, which proved successful on the experimental side in finding the reliability function …
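A minimal sketch of the MLE side of the comparison, under an assumed system layout (two parallel branches, each with two components in series); the lifetimes are simulated, not the paper's data. For the Rayleigh distribution the scale MLE is σ̂² = Σt_i²/(2n) and the reliability is R(t) = exp(−t²/(2σ²)).

```python
import numpy as np

rng = np.random.default_rng(1)

def rayleigh_mle_sigma2(t):
    """MLE of the Rayleigh scale: sigma^2 = sum(t_i^2) / (2n)."""
    t = np.asarray(t)
    return (t ** 2).sum() / (2 * len(t))

def rayleigh_R(t, sigma2):
    """Rayleigh reliability function R(t) = exp(-t^2 / (2 sigma^2))."""
    return np.exp(-t ** 2 / (2 * sigma2))

# four asymmetrical components: lifetimes drawn with different true scales
samples = [rng.rayleigh(scale=s, size=30) for s in (1.0, 1.5, 2.0, 2.5)]
R = [rayleigh_R(1.0, rayleigh_mle_sigma2(x)) for x in samples]  # each R_i at t=1

# series within a branch, parallel across branches (assumed structure)
R_sys = 1 - (1 - R[0] * R[1]) * (1 - R[2] * R[3])
print(f"estimated system reliability at t=1: {R_sys:.4f}")
```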
Breast cancer has received much attention in recent years, as it is one of the complex diseases that can threaten people's lives. It can be detected from the levels of secreted proteins in the blood. In this project, we developed a method for finding a threshold to classify the probability of being affected in a population, based on the levels of the related proteins in relatively small case-control samples. We applied our method to simulated and real data. The results showed that the method was accurate in estimating the probability of being diseased in both the simulated and real data. Moreover, we were able to calculate the sensitivity and specificity under the null hypothesis of our research question of being diseased …
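A minimal sketch of the threshold idea, not the project's exact procedure: pick a protein-level cutoff from case-control samples by maximizing Youden's J = sensitivity + specificity − 1, then report the two rates. The data are simulated assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
controls = rng.normal(1.0, 0.4, size=40)   # protein levels, healthy
cases = rng.normal(1.8, 0.5, size=40)      # protein levels, diseased

levels = np.sort(np.concatenate([controls, cases]))
best = max(
    levels,
    key=lambda c: (cases >= c).mean() + (controls < c).mean() - 1,  # Youden's J
)
sens = (cases >= best).mean()
spec = (controls < best).mean()
print(f"threshold={best:.3f}  sensitivity={sens:.3f}  specificity={spec:.3f}")
```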
This research aims to solve the stock-selection problem using a clustering algorithm. The optimal portfolio is formed using the single index model, and the real data consist of stocks from the Iraq Stock Exchange over the period 1/1/2007 to 31/12/2019. Because the data series have missing values, we used the two-stage missing-value compensation method. The knowledge gap was the inability of portfolio models to reduce estimation error; the inaccuracy of the cut-off rate and the Treynor ratio combined stocks into the portfolio in a way that caused their performance to decline. All these problems required employing a clustering technique to mine the data and regroup it within clusters of similar characteristics to outperform the portfolio …
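A minimal sketch of the clustering idea: estimate each stock's single-index-model beta against the market, then cluster stocks on (beta, residual variance) with k-means before forming a portfolio. The returns, cluster count, and features are illustrative assumptions, not the study's data.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
T, n_stocks = 250, 12
market = rng.normal(0.0005, 0.01, size=T)
betas_true = rng.uniform(0.3, 1.8, size=n_stocks)
stocks = market[:, None] * betas_true + rng.normal(0, 0.02, size=(T, n_stocks))

features = []
for j in range(n_stocks):
    beta, alpha = np.polyfit(market, stocks[:, j], 1)     # single index model fit
    resid = stocks[:, j] - (alpha + beta * market)
    features.append([beta, resid.var()])

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
print("cluster of each stock:", labels)
```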
This research presents a study, with an application, of principal component regression built from some of the explanatory variables, to limit the multicollinearity problem among these variables and to gain more stability in their estimates than is obtained from ordinary least squares. The cost we pay, on the other hand, is losing a little of the predictive regression function's power in explaining the essential variation. A numerical formula has been proposed and applied by the researchers as an optimal solution, and its efficiency was verified by a program written by the researchers themselves for this purpose, using several criteria: cumulative percentage of variance, coefficient of determination, variance …
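A minimal sketch of principal component regression: project the predictors onto their leading principal components, then regress y on those scores. The number of retained components and the collinear data are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
n = 100
x1 = rng.normal(size=n)
X = np.column_stack([x1, x1 + rng.normal(0, 0.05, n), rng.normal(size=n)])  # collinear
y = 2 * x1 + X[:, 2] + rng.normal(0, 0.5, n)

pcr = make_pipeline(StandardScaler(), PCA(n_components=2), LinearRegression())
pcr.fit(X, y)
print("cumulative % variance:", pcr.named_steps["pca"].explained_variance_ratio_.cumsum())
print("R^2 of PCR fit:", pcr.score(X, y))
```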
Survival analysis is widely applied to data described by the length of time until the occurrence of an event of interest, such as death or other important events. The purpose of this paper is to use a dynamic methodology, which provides a flexible method, especially for the analysis of discrete survival time, to estimate the effect of covariates over time in the survival analysis of dialysis patients with kidney failure until death occurs. The estimation process is based entirely on the Bayes approach, using two estimation methods: the maximum a posteriori (MAP) estimator combined with iteratively weighted Kalman filter smoothing (IWKFS), and in combination with the expectation maximization (EM) algorithm. While the other …
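This is not the paper's IWKFS machinery; a minimal sketch of the discrete-time survival setup it builds on: expand each subject into person-period rows and fit a logistic hazard with time as a covariate. The patient data and coefficients are simulated assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
rows, events = [], []
for _ in range(200):                      # hypothetical dialysis patients
    z = rng.normal()                      # baseline covariate
    for t in range(1, 11):                # discrete follow-up periods
        h = 1 / (1 + np.exp(-(-3.0 + 0.15 * t + 0.8 * z)))  # true hazard (assumed)
        died = rng.random() < h
        rows.append([t, z])
        events.append(int(died))
        if died:
            break                         # no rows after the event

model = LogisticRegression().fit(rows, events)
print("logit-hazard coefficients (time, covariate):", model.coef_[0])
```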
Air is one of the necessities of human life, and the purer the air, the better human health is. Recently, air has become affected by pollutants; several variables affect air purity, and air now affects human health because of the pollutants it carries, which affect the human body and its health. To determine the extent of air pollution in all regions of the earth, an index for measuring air pollution was built based on several variables, called the air impact factor. To show the variables with the greatest effect …
Survival analysis is the analysis of data that take the form of times, measured from a time origin until the occurrence of an end event. In medical research, the time origin is the date on which an individual or patient is enrolled in a study, such as clinical trials comparing two or more types of drug. If the end point is the patient's death or the individual's disappearance, the data resulting from this process are called survival times; if the end point is not death, the resulting data are called time-to-event data …
Regression discontinuity (RD) refers to a study that exposes a defined group to the effect of a treatment. The uniqueness of this design lies in classifying the study population into two groups based on a specific threshold or cutoff point, determined in advance according to the terms and requirements of the study. Thinking therefore focused on finding a solution to the issue of workers' retirement and on proposing a scenario for granting an end-of-service reward to fill the gap (discontinuity point) had it not been granted. The regression discontinuity method has been used to study and estimate the effect of the end-of-service reward at the cutoff for insured workers, as well as …
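A minimal sketch of a sharp regression-discontinuity estimate: fit a line on each side of the cutoff and take the jump in the fitted values at the cutoff. The cutoff, bandwidth, and data are illustrative assumptions, not the study's records.

```python
import numpy as np

rng = np.random.default_rng(6)
cutoff, effect = 50.0, 4.0
x = rng.uniform(30, 70, size=400)                       # running variable, e.g. age
y = 0.2 * x + effect * (x >= cutoff) + rng.normal(0, 1, 400)

h = 10.0                                                # bandwidth around cutoff
left = (x < cutoff) & (x > cutoff - h)
right = (x >= cutoff) & (x < cutoff + h)

bl = np.polyfit(x[left] - cutoff, y[left], 1)           # local line, left side
br = np.polyfit(x[right] - cutoff, y[right], 1)         # local line, right side
print(f"estimated treatment effect at cutoff: {br[1] - bl[1]:.3f}")  # intercept jump
```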
Segmented regression consists of several segments separated by different breakpoints, so heterogeneity arising from the separation of segments appears within the research sample. This research is concerned with estimating the location of the change point between segments and estimating the model parameters, and with proposing a robust estimation method and comparing it with some methods used in segmented linear regression. One of the traditional methods (Muggeo's method) was used to find the maximum likelihood estimators by the iterative …
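A minimal sketch of change-point estimation in segmented regression, simpler than Muggeo's iterative scheme: a grid search over candidate breakpoints, keeping the one with the smallest residual sum of squares for a two-segment broken-line fit. The data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
x = np.linspace(0, 10, 120)
y = np.where(x < 6, 1 + 2 * x, 13 + 0.5 * (x - 6)) + rng.normal(0, 0.4, x.size)

def sse_at(c):
    # broken-line design: intercept, x, and hinge term (x - c)_+
    X = np.column_stack([np.ones_like(x), x, np.maximum(x - c, 0)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return ((y - X @ beta) ** 2).sum()

grid = np.linspace(1, 9, 161)                 # candidate change points
c_hat = min(grid, key=sse_at)
print(f"estimated change point: {c_hat:.2f}")
```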
Copula modeling is widely used in modern statistics. The boundary bias problem is one of the problems faced in nonparametric estimation, and kernel estimators are the most common nonparametric estimators. In this paper, the copula density function was estimated using the probit-transformation nonparametric method, in order to remove the boundary bias problem that kernel estimators suffer from. A simulation study covered three nonparametric methods for estimating the copula density function, and we proposed a new method that outperformed the others, using five types of copulas with different sample sizes, different levels of correlation between the copula variables, and different parameters for the function. The …
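A minimal sketch of the probit-transformation idea: map pseudo-observations from the unit square to the plane with the normal quantile function, run an ordinary Gaussian kernel density estimate there (where there are no boundaries), and map back with the Jacobian. The Gaussian copula sample and correlation are illustrative assumptions.

```python
import numpy as np
from scipy.stats import gaussian_kde, norm

rng = np.random.default_rng(8)
rho = 0.6
zz = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=500)
u = norm.cdf(zz)                      # hypothetical copula sample on [0,1]^2

z = norm.ppf(u)                       # probit transform back to R^2
kde = gaussian_kde(z.T)               # boundary-free kernel estimate

def copula_density(u1, u2):
    z1, z2 = norm.ppf(u1), norm.ppf(u2)
    return kde([[z1], [z2]])[0] / (norm.pdf(z1) * norm.pdf(z2))  # Jacobian correction

print("c(0.5, 0.5) ≈", copula_density(0.5, 0.5))
```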
Multilocus haplotype analysis of candidate variants with genome-wide association study (GWAS) data may provide evidence of association with disease even when the individual loci themselves do not. Unfortunately, when a large number of candidate variants are investigated, identifying risk haplotypes can be very difficult. To meet this challenge, a number of approaches have been put forward in recent years. However, most of them are not directly linked to the disease penetrances of haplotypes and thus may not be efficient. To fill this gap, we propose a mixture-model-based approach for detecting risk haplotypes. Under the mixture model, haplotypes are clustered directly according to their estimated disease …
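A minimal sketch of the clustering idea, not the authors' exact model: treat each haplotype's estimated penetrance as a draw from a two-component mixture and label the high-penetrance component as the risk cluster. The penetrance values are illustrative assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(9)
penetrance = np.concatenate([
    rng.normal(0.05, 0.01, size=18),   # background haplotypes (assumed)
    rng.normal(0.20, 0.03, size=4),    # hypothetical risk haplotypes
])[:, None]

gm = GaussianMixture(n_components=2, random_state=0).fit(penetrance)
risk_comp = int(np.argmax(gm.means_.ravel()))   # component with higher mean penetrance
labels = gm.predict(penetrance)
print("risk haplotype indices:", np.where(labels == risk_comp)[0])
```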
In this paper, new methods are presented based on the differencing technique: the difference-based modified jackknifed generalized ridge regression estimator (DMJGR) and the difference-based generalized jackknifed ridge regression estimator (DGJR), for estimating the parameters of the linear part of the partially linear model. The nonlinear part, represented by the nonparametric function, was estimated using the Nadaraya-Watson smoother. These proposed methods were compared with other difference-based estimators of the partially linear model using the MSE criterion in a simulation study.
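A minimal sketch of the Nadaraya-Watson smoother used for the nonparametric part: a kernel-weighted local average with a Gaussian kernel. The bandwidth and data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(10)
t = np.sort(rng.uniform(0, 1, 150))
g = np.sin(2 * np.pi * t) + rng.normal(0, 0.2, t.size)   # noisy smooth component

def nadaraya_watson(t0, t, y, h=0.05):
    """Kernel-weighted local average: sum(w_i y_i) / sum(w_i)."""
    w = np.exp(-0.5 * ((t0 - t) / h) ** 2)               # Gaussian kernel weights
    return (w * y).sum() / w.sum()

grid = np.linspace(0.05, 0.95, 10)
print([round(nadaraya_watson(t0, t, g), 3) for t0 in grid])
```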
The primary aim of this study was to identify the effect of using the simultaneous electronic presentations strategy in teaching basic basketball skills to second-grade intermediate students. The study had a parallel-group, pre-post experimental design. The students of the Salah al-Din Intermediate School for the academic year 2020-2021 constituted the research community, a total of 75 students. Of these, 16 students within the age group of 13-14 years were selected as participants, making up 21.33% of the total. Based on the results of th…
In this research we estimated the survival function for data affected by the disturbances and confusion of the Iraq Household Socio-Economic Survey (IHSES II 2012), with data for five-year age groups following the generalized gamma (GG) distribution. Two methods were used for estimation and fitting: the principle of maximum entropy (POME) and the bootstrap with a nonparametric kernel smoothing function, to overcome the mathematical problems posed by the integrals in this distribution, in particular the incomplete gamma function, along with the traditional maximum likelihood (ML) method. The comparison was made on the basis of …
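A minimal sketch using scipy's generalized gamma, not the paper's POME or bootstrap machinery: fit by maximum likelihood and evaluate the survival function directly, sidestepping hand-worked incomplete-gamma integrals. The sample and parameter values are simulated assumptions.

```python
import numpy as np
from scipy.stats import gengamma

rng = np.random.default_rng(11)
data = gengamma.rvs(a=2.0, c=1.5, scale=10.0, size=300, random_state=rng)

a, c, loc, scale = gengamma.fit(data, floc=0)        # ML fit, location fixed at 0
t = np.array([5.0, 10.0, 20.0, 40.0])
print("S(t) =", gengamma.sf(t, a, c, loc=loc, scale=scale))  # survival function
```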
In this research, the methods of classic dynamic programming (CDP) and fuzzy dynamic programming (FDP) were used to control the inventory of a single item over N periods, in order to minimize the total cost and determine the required quantity for the Rusafa principal warehouse of the Ministry of Commerce. A comparison was made between the two techniques, and we found that the fuzzy total cost is lower than the classic total cost.
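A minimal sketch of the classic dynamic programming side for a one-item, N-period inventory problem: choose order quantities to meet known demand at minimum ordering-plus-holding cost. All costs and demands are illustrative assumptions.

```python
from functools import lru_cache

demand = [3, 2, 4, 1]         # hypothetical demand per period
K, h = 5.0, 1.0               # fixed ordering cost, per-unit holding cost
MAX_INV = sum(demand)

@lru_cache(maxsize=None)
def cost(t, stock):
    """Minimum cost from period t onward, starting with `stock` on hand."""
    if t == len(demand):
        return 0.0
    best = float("inf")
    for q in range(MAX_INV - stock + 1):          # candidate order quantities
        if stock + q < demand[t]:                 # must cover this period's demand
            continue
        left = stock + q - demand[t]
        order_cost = K if q > 0 else 0.0
        best = min(best, order_cost + h * left + cost(t + 1, left))
    return best

print("minimum total cost:", cost(0, 0))
```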
The purpose of this paper is to model and forecast white oil prices over the period 2012-2019 using GARCH-class volatility models. After showing that the squared returns of white oil have significant long memory in volatility, fractional GARCH models for the return series are estimated, and the mean and volatility are forecast by quasi-maximum likelihood (QML) as a traditional method, while the competing approach uses machine learning with support vector regression (SVR). Results showed that the best model among many candidates for forecasting volatility was selected according to the lowest Akaike and Schwarz information criterion values, with the requirement that the parameters be significant. In addition, the residuals …
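A minimal sketch of the machine-learning competitor: forecast next-day squared returns (a common volatility proxy) from lagged squared returns with support vector regression. The return series, lag count, and SVR settings are illustrative assumptions, not the white-oil data or the paper's tuning.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(12)
r = rng.normal(0, 0.01, size=500)          # stand-in return series
r2 = r ** 2                                # squared returns as volatility proxy

p = 5                                      # number of lags used as features
X = np.column_stack([r2[i:len(r2) - p + i] for i in range(p)])
y = r2[p:]

model = SVR(kernel="rbf", C=1.0, epsilon=1e-5).fit(X[:-50], y[:-50])
pred = model.predict(X[-50:])              # out-of-sample volatility forecasts
print("mean absolute error:", np.abs(pred - y[-50:]).mean())
```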
The research uses the spatial autoregressive model (SAR) and the spatial error model (SEM) in an attempt to provide practical evidence of the importance of spatial analysis, with a particular focus on the importance of using spatial regression models, all of which involve spatial dependence, whose presence can be tested with Moran's test. Ignoring this dependence may lead to the loss of important information about the phenomenon under study, which is ultimately reflected in the power of the statistical estimation, as these models are the link between ordinary regression models and time-series models. Spatial analysis had …
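A minimal sketch of Moran's I, the statistic the abstract uses to test for spatial dependence: I = (n/S0) · (z'Wz)/(z'z) for a spatial weights matrix W, with S0 the sum of all weights. The grid layout and values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(13)
side = 5
y = rng.normal(size=side * side)

# rook-contiguity weights on a 5x5 grid
n = side * side
W = np.zeros((n, n))
for i in range(side):
    for j in range(side):
        k = i * side + j
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            a, b = i + di, j + dj
            if 0 <= a < side and 0 <= b < side:
                W[k, a * side + b] = 1.0

z = y - y.mean()
I = (n / W.sum()) * (z @ W @ z) / (z @ z)
print(f"Moran's I = {I:.4f}")   # near -1/(n-1) under spatial independence
```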
Grey system theory is a multidisciplinary scientific approach that deals with systems having partially unknown information (small samples and uncertain information). Grey modeling, an important component of the theory, gives successful results with a limited amount of data. Grey models are divided into two types: univariate and multivariate. The univariate grey model with a first-order derivative equation, GM(1,1), is the cornerstone of the theory; it is the time-series prediction model, but it does not take the relevant factors into account. The traditional multivariate grey model GM(1,M) takes those factors into account, but it has a complex structure and some defects in modeling mechanism, parameter estimation, and …
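A minimal sketch of the GM(1,1) model the abstract calls the cornerstone of the theory: accumulate the series, estimate the development coefficient a and grey input b by least squares on the background values, and predict with the standard time-response function. The sample values are illustrative assumptions.

```python
import numpy as np

x0 = np.array([10.0, 11.2, 12.1, 13.5, 14.2])    # original small sample
x1 = np.cumsum(x0)                               # accumulated generating series

# background values z1(k) = 0.5*(x1(k) + x1(k-1)), k = 2..n
z1 = 0.5 * (x1[1:] + x1[:-1])
B = np.column_stack([-z1, np.ones_like(z1)])
a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]

def predict(k):
    """Prediction of x0 at step k (k = 1 is the first observation)."""
    x1_hat = (x0[0] - b / a) * np.exp(-a * (k - 1)) + b / a
    x1_prev = (x0[0] - b / a) * np.exp(-a * (k - 2)) + b / a
    return x1_hat - x1_prev if k > 1 else x0[0]

print("one-step-ahead forecast:", round(predict(len(x0) + 1), 3))
```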