This paper proposes new estimators that combine sample information with prior information, covering both the case where the two sources are equally important in the model and the case where they are not. The prior information is expressed as linear stochastic restrictions. We study the properties and performance of these estimators relative to other common estimators, using the mean squared error as the comparison criterion. A numerical example and a simulation study illustrate the performance of the estimators.
The study discussed three areas of strategic thinking, namely patterns, elements, and outcomes. It aimed to measure the extent to which strategic leaders exhibit particular patterns of strategic thinking, the extent of their use of the elements of strategic thinking, and the outcomes of strategic thinking for managers at various levels, and to identify the relationship between the patterns, elements, and outcomes of strategic thinking in organizations. The study covered five banks, four hospitals, and four colleges and universities; the research sample consisted of 168 individuals distributed across positions (Director General, Director of Directorate, Director of
The research used the spatial autoregressive model (SAR) and the spatial error model (SEM) in an attempt to provide practical evidence of the importance of spatial analysis, with a particular focus on spatial regression models that incorporate spatial dependence, whose presence can be tested using Moran's test. Ignoring this dependence may lead to the loss of important information about the phenomenon under study, which is ultimately reflected in the strength of the statistical estimation, as these models link the usual regression models with time-series models. Spatial analysis had
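As an illustrative sketch (not the paper's code), the Moran test mentioned above is based on Moran's I statistic, which measures spatial autocorrelation from a vector of observations and a spatial weights matrix:

```python
import numpy as np

def morans_i(y, W):
    """Moran's I statistic for spatial autocorrelation.
    y: values at n locations; W: n x n spatial weights matrix."""
    y = np.asarray(y, dtype=float)
    z = y - y.mean()                   # deviations from the mean
    n = len(y)
    S0 = W.sum()                       # sum of all spatial weights
    num = z @ W @ z                    # weighted cross-products of deviations
    den = (z ** 2).sum()
    return (n / S0) * (num / den)
```

Values near +1 indicate clustering of similar values, values near -1 indicate a checkerboard-like pattern, and values near the expectation -1/(n-1) indicate no spatial dependence.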
New nitrone and selenonitrone compounds were synthesized. Condensation of N-(2-hydroxyethyl)hydroxylamine with substituted carbonyl compounds such as benzil, 4,4′-dichlorobenzil, and 2,2′-dinitrobenzil afforded a variety of new nitrone compounds, while condensation of N-benzylhydroxylamine with substituted selenocarbonyl compounds such as di(4-fluorobenzoyl) diselenide and 4-chlorobenzoyl selenonitrile gave selenonitrone compounds. Condensation of N-4-chlorophenylhydroxylamine with dibenzoyl diselenide gave another type of selenonitrone compound. The structures of the synthesized compounds were assigned based on spectroscopic data (FT-IR,
... Show MoreIn this paper, a discussion of the principles of stereoscopy is presented, and the phases
of 3D image production of which is based on the Waterfall model. Also, the results are based
on one of the 3D technology which is Anaglyph and it's known to be of two colors (red and
cyan).
A 3D anaglyph image and visualization technologies will appear as a threedimensional
by using a classes (red/cyan) as considered part of other technologies used and
implemented for production of 3D videos (movies). And by using model to produce a
software to process anaglyph video, comes very important; for that, our proposed work is
implemented an anaglyph in Waterfall model to produced a 3D image which extracted from a
video.
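As a minimal sketch of the red/cyan anaglyph principle described above (an assumption about the standard technique, not the paper's implementation): the red channel is taken from the left-eye view and the green and blue (cyan) channels from the right-eye view.

```python
import numpy as np

def make_anaglyph(left, right):
    """Red/cyan anaglyph from two RGB views.
    left, right: H x W x 3 arrays (left- and right-eye frames).
    Red channel comes from the left view; green/blue from the right."""
    out = right.copy()
    out[..., 0] = left[..., 0]  # replace red channel with the left view's red
    return out
```

Viewed through red/cyan glasses, each eye then receives only its own view, which the brain fuses into a depth impression.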
The wavelet shrinkage estimator is an attractive technique for estimating nonparametric regression functions, but it is very sensitive to correlation in the errors. In this research, a low-degree polynomial model was used to address the boundary problem in wavelet shrinkage, together with level-dependent threshold values for the case of correlated errors, which handle the coefficients at each level separately, unlike universal threshold values that treat all levels simultaneously, such as the VisuShrink, False Discovery Rate, Improvement Thresholding, and SureShrink methods. The study was conducted on real monthly data representing rates of theft crimes f
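As an illustrative sketch (under the standard definitions, not the paper's code), wavelet shrinkage applies a threshold to the detail coefficients; the VisuShrink method mentioned above uses the universal threshold sigma * sqrt(2 log n) with sigma estimated from the finest-level coefficients:

```python
import numpy as np

def soft_threshold(w, lam):
    """Soft-threshold wavelet coefficients: shrink toward zero by lam."""
    w = np.asarray(w, dtype=float)
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

def visushrink_lambda(detail_coeffs):
    """Universal (VisuShrink) threshold sigma * sqrt(2 log n), with the
    noise level sigma estimated from the median absolute deviation of
    the finest-level detail coefficients."""
    d = np.asarray(detail_coeffs, dtype=float)
    sigma = np.median(np.abs(d)) / 0.6745   # robust noise estimate
    return sigma * np.sqrt(2.0 * np.log(d.size))
```

Level-dependent schemes, as used in the research, replace the single universal lambda with a separate threshold computed per resolution level.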
Purpose: The research aims to estimate models representing phenomena that follow the logic of circular (angular) data, accounting for the 24-hour periodicity in measurement. Theoretical framework: The regression model is developed to account for the periodic nature of the circular scale, considering the periodicity in the dependent variable y, the explanatory variables x, or both. Design/methodology/approach: Two estimation methods were applied: a parametric model, represented by the Simple Circular Regression (SCR) model, and a nonparametric model, represented by the Nadaraya-Watson Circular Regression (NW) model. The analysis used real data from 50 patients at Al-Kindi Teaching Hospital in Baghdad. Findings: The Mean Circular Erro
Background: Diabetes mellitus is a major risk factor for chronic periodontitis (CP), and hyperglycemia plays an important role in increasing the severity of periodontitis. It has been reported that the progression of CP shifts the balance between bone formation and resorption toward osteoclastic resorption, leading to the release of collagenous bone breakdown products into the local tissues and the systemic circulation. Cross-linked N-telopeptide of type I collagen (NTx) is the amino-terminal peptide of type I collagen released during bone resorption. This study was conducted to determine the effects of nonsurgical periodontal therapy on the serum level of NTx in type 2 diabetic patients
Support vector machines (SVMs) are supervised learning models that analyze data for classification or regression. For classification, SVM is widely used, selecting an optimal hyperplane that separates two classes. SVM has very good accuracy and is extremely robust compared with some other classification methods such as logistic regression, random forest, k-nearest neighbor, and the naïve Bayes model. However, working with large datasets can cause problems such as long training times and inefficient results. In this paper, the SVM has been modified by using a stochastic gradient descent process. The modified method, stochastic gradient descent SVM (SGD-SVM), was evaluated using two simulated datasets. Since the classification of different ca
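As an illustrative sketch of the general idea (a Pegasos-style subgradient scheme under assumed hyperparameters, not the paper's exact algorithm), a linear SVM can be trained by stochastic gradient descent on the L2-regularized hinge loss, visiting one example at a time rather than solving the full quadratic program:

```python
import numpy as np

def sgd_svm(X, y, lam=0.01, epochs=50, seed=0):
    """Linear SVM via stochastic (sub)gradient descent on the hinge loss
    with L2 regularization. y must contain labels in {-1, +1}."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b, t = np.zeros(d), 0.0, 0
    for _ in range(epochs):
        for i in rng.permutation(n):       # one stochastic step per example
            t += 1
            eta = 1.0 / (lam * t)          # decaying step size
            if y[i] * (X[i] @ w + b) < 1:  # example violates the margin
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
                b += eta * y[i]
            else:                          # only shrink w (regularization)
                w = (1 - eta * lam) * w
    return w, b
```

Each update touches a single example, so the cost per step is independent of the dataset size, which is the source of the speedup on large datasets mentioned in the abstract.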
Some experiments require an assessment of whether they remain useful enough to be continued. This can be done through the fuzzy regression discontinuity model, where the Epanechnikov kernel and the triangular kernel were used to estimate the model on data generated in a Monte Carlo experiment, and the results were compared. It was found that the Epanechnikov kernel yields the smallest mean squared error.
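For reference, the two kernel functions compared above have standard closed forms (these are the textbook definitions, not code from the study):

```python
import numpy as np

def epanechnikov(u):
    """Epanechnikov kernel: K(u) = 0.75 * (1 - u^2) for |u| <= 1, else 0."""
    u = np.asarray(u, dtype=float)
    return np.where(np.abs(u) <= 1, 0.75 * (1 - u ** 2), 0.0)

def triangular(u):
    """Triangular kernel: K(u) = 1 - |u| for |u| <= 1, else 0."""
    u = np.asarray(u, dtype=float)
    return np.where(np.abs(u) <= 1, 1 - np.abs(u), 0.0)
```

In a regression discontinuity design these kernels weight observations by their scaled distance u from the cutoff, so points near the threshold dominate the local estimate on each side.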