This paper proposes new estimators that combine the sample with prior information, covering both the case where the two sources are equally important in the model and the case where they are not. The prior information is expressed as linear stochastic restrictions. We study the properties of these estimators and compare their performance with that of other common estimators, using the mean squared error as the comparison criterion. A numerical example and a simulation study illustrate the performance of the estimators.
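The abstract does not specify the new estimators, but the classical mixed (Theil-Goldberger) estimator illustrates how sample information and stochastic linear restrictions r = Rβ + v can be combined. The sketch below assumes equal weighting of the two sources and uses made-up toy data; the function name and restriction are illustrative only.

```python
import numpy as np

def mixed_estimator(X, y, R, r, Omega):
    """Theil-Goldberger mixed estimator for y = X beta + e under the stochastic
    restriction r = R beta + v with Var(v) proportional to Omega
    (sample and prior information treated as equally important)."""
    Omega_inv = np.linalg.inv(Omega)
    A = X.T @ X + R.T @ Omega_inv @ R
    b = X.T @ y + R.T @ Omega_inv @ r
    return np.linalg.solve(A, b)

# toy data: 3 regressors, one stochastic restriction beta1 + beta2 ~ 1
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
beta_true = np.array([0.4, 0.6, -1.0])
y = X @ beta_true + rng.normal(scale=0.5, size=100)
R = np.array([[1.0, 1.0, 0.0]])
r = np.array([1.0])
Omega = np.array([[0.1]])          # variance of the restriction error

print("OLS  :", np.linalg.lstsq(X, y, rcond=None)[0])
print("Mixed:", mixed_estimator(X, y, R, r, Omega))
```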
In this paper, we investigate the connection between hierarchical models and the power prior distribution in quantile regression (QReg). For a given quantile, we develop an expression for the power parameter that calibrates the power prior distribution for quantile regression to a corresponding hierarchical model. In addition, we estimate the relation between the power parameter and the quantile level via the hierarchical model. The proposed methodology is illustrated with a real data example.
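As a rough illustration of the power prior idea in quantile regression, the sketch below uses the asymmetric-Laplace (check-loss) working likelihood and simply discounts the historical data's loss by a power parameter a0. The function names, toy data, and MAP-style optimization are assumptions for illustration, not the paper's hierarchical calibration.

```python
import numpy as np
from scipy.optimize import minimize

def check_loss(u, tau):
    """Quantile (check) loss rho_tau(u) = u * (tau - I(u < 0))."""
    return np.sum(u * (tau - (u < 0)))

def power_prior_qreg(X, y, X0, y0, tau, a0):
    """MAP-style quantile regression: current check loss plus the historical
    loss discounted by the power parameter a0 in [0, 1] (a sketch only)."""
    def objective(beta):
        return (check_loss(y - X @ beta, tau)
                + a0 * check_loss(y0 - X0 @ beta, tau))
    beta0 = np.zeros(X.shape[1])
    return minimize(objective, beta0, method="Nelder-Mead").x

rng = np.random.default_rng(1)
X0 = np.column_stack([np.ones(80), rng.normal(size=80)])   # historical data
y0 = X0 @ np.array([1.0, 2.0]) + rng.standard_t(df=3, size=80)
X = np.column_stack([np.ones(40), rng.normal(size=40)])    # current data
y = X @ np.array([1.0, 2.0]) + rng.standard_t(df=3, size=40)

print(power_prior_qreg(X, y, X0, y0, tau=0.5, a0=0.5))
```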
Background: Multiple sclerosis is a chronic autoimmune inflammatory demyelinating disease of the central nervous system of unknown etiology. Different techniques and magnetic resonance image sequences are widely used and compared with each other to improve the detection of multiple sclerosis lesions in the spinal cord. Objective: To evaluate the ability of MRI short tau inversion recovery sequences to improve the detection of multiple sclerosis spinal cord lesions compared with T2-weighted image sequences. Type of the study: A retrospective study. Methods: This study was conducted from 15th August 2013 to 30th June 2014 at Baghdad Teaching Hospital. 22 clinically definite MS patients with clinical features suggestive of spinal cord involvement ...
This research studies dimension-reduction methods that overcome the curse of dimensionality, which arises when traditional methods fail to provide good parameter estimates and therefore must be dealt with directly. Two approaches were used to handle high-dimensional data: the non-classical sliced inverse regression (SIR) method together with the proposed weighted standard SIR (WSIR) method, and principal component analysis (PCA), the general method used for reducing dimensions. Both SIR and PCA work with linear combinations of a subset of the original explanatory variables, which may suffer from the problem of heterogeneity and the problem of linear ...
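A minimal sketch of the classical SIR algorithm (not the proposed WSIR variant) is given below, assuming standardization of the predictors, slicing on the response, and an eigendecomposition of the slice-mean covariance; the variable names and toy model are illustrative.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, n_directions=1):
    """Sliced inverse regression: estimate dimension-reduction directions from
    the slice means of the standardized predictors (classical SIR sketch)."""
    n, p = X.shape
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - mu) @ Sigma_inv_sqrt           # standardized predictors

    # slice on the ordered response and average Z within each slice
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    M = np.zeros((p, p))
    for idx in slices:
        zbar = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(zbar, zbar)

    # leading eigenvectors of M, mapped back to the original scale
    w, v = np.linalg.eigh(M)
    top = v[:, np.argsort(w)[::-1][:n_directions]]
    return Sigma_inv_sqrt @ top

rng = np.random.default_rng(2)
X = rng.normal(size=(500, 6))
y = (X @ np.array([1, -1, 0, 0, 0, 0])) ** 3 + rng.normal(scale=0.1, size=500)
print(sir_directions(X, y).ravel())   # roughly proportional to (1, -1, 0, 0, 0, 0)
```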
Chaotic systems have proved to be useful and effective for cryptography. In this work, a new Feistel cipher based on chaotic systems and the Feistel network structure, with a dynamic secret key size that follows the message size, is proposed. In classical Feistel-based ciphers, of which the Data Encryption Standard (DES) is the common example, the permutation choice boxes and substitution choice boxes that carry out confusion and diffusion are generated once and are therefore static, whereas the suggested system uses chaotic maps and is called ...
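The proposed cipher itself is not reproduced here; the sketch below only illustrates the general idea of a Feistel network whose round subkeys are derived from a chaotic logistic map, with a toy SHA-256-based round function standing in for the paper's dynamic substitution and permutation boxes.

```python
import hashlib

def logistic_subkeys(x0, r=3.99, n_rounds=8):
    """Derive per-round subkeys from the logistic map x_{k+1} = r*x_k*(1-x_k)
    (a toy chaotic key schedule, not the cipher proposed in the paper)."""
    keys, x = [], x0
    for _ in range(n_rounds):
        x = r * x * (1.0 - x)
        keys.append(int(x * 2**32) & 0xFFFFFFFF)
    return keys

def round_function(half, subkey):
    # simple keyed mixing of a 32-bit half block
    digest = hashlib.sha256(f"{half}:{subkey}".encode()).digest()
    return int.from_bytes(digest[:4], "big")

def feistel_encrypt(block64, subkeys):
    left, right = block64 >> 32, block64 & 0xFFFFFFFF
    for k in subkeys:
        left, right = right, left ^ round_function(right, k)
    return (right << 32) | left            # output with the final swap undone

def feistel_decrypt(block64, subkeys):
    left, right = block64 >> 32, block64 & 0xFFFFFFFF
    for k in reversed(subkeys):
        left, right = right, left ^ round_function(right, k)
    return (right << 32) | left

subkeys = logistic_subkeys(x0=0.654321)
ct = feistel_encrypt(0x0123456789ABCDEF, subkeys)
assert feistel_decrypt(ct, subkeys) == 0x0123456789ABCDEF
print(hex(ct))
```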
This study represents an attempt to develop a model that demonstrates the relationship between HRM practices, governmental support, and the organizational performance of small businesses. Furthermore, this study attempts to unfold the so-called “Black Box” and clarify the ambiguous relationship between HRM practices and organizational performance by considering the pathway of logical sequential influence. The model of this study consists of two parts: the first part is devoted to examining the causal relationships among HRM practices, employees’ outcomes, and organizational performance; the second part assesses the direct relationship between governmental support and organizational performance. It is hypothesized that HRM practices positively influence ...
In this paper, we discuss the performance of Bayesian computational approaches for estimating the parameters of a logistic regression model. Markov Chain Monte Carlo (MCMC) algorithms were the base estimation procedure. We present two algorithms: Random Walk Metropolis (RWM) and Hamiltonian Monte Carlo (HMC). We also applied these approaches to a real data set.
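A minimal sketch of the Random Walk Metropolis sampler for logistic regression coefficients is given below (HMC is not shown); the normal prior, step size, and simulated data are assumptions made for illustration.

```python
import numpy as np

def log_posterior(beta, X, y, prior_sd=10.0):
    """Bernoulli log-likelihood plus a vague normal prior on the coefficients."""
    eta = X @ beta
    loglik = np.sum(y * eta - np.log1p(np.exp(eta)))
    logprior = -0.5 * np.sum((beta / prior_sd) ** 2)
    return loglik + logprior

def random_walk_metropolis(X, y, n_iter=5000, step=0.05, seed=0):
    """Random Walk Metropolis: propose a normal jump, accept with the usual
    Metropolis probability, otherwise keep the current state."""
    rng = np.random.default_rng(seed)
    beta = np.zeros(X.shape[1])
    current_lp = log_posterior(beta, X, y)
    draws = np.empty((n_iter, X.shape[1]))
    for i in range(n_iter):
        proposal = beta + step * rng.normal(size=X.shape[1])
        proposal_lp = log_posterior(proposal, X, y)
        if np.log(rng.uniform()) < proposal_lp - current_lp:
            beta, current_lp = proposal, proposal_lp
        draws[i] = beta
    return draws

rng = np.random.default_rng(3)
X = np.column_stack([np.ones(300), rng.normal(size=(300, 2))])
beta_true = np.array([-0.5, 1.0, -2.0])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta_true)))
draws = random_walk_metropolis(X, y)
print(draws[2500:].mean(axis=0))   # posterior means after burn-in
```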
The Dagum Regression Model, introduced to address limitations in traditional econometric models, provides enhanced flexibility for analyzing data characterized by heavy tails and asymmetry, which are common in income and wealth distributions. This paper develops and applies the Dagum model, demonstrating its advantages over other distributions such as the Log-Normal and Gamma distributions. The model's parameters are estimated using Maximum Likelihood Estimation (MLE) and the Method of Moments (MoM). A simulation study evaluates both methods' performance across various sample sizes, showing that MoM tends to offer more robust and precise estimates, particularly in small samples. These findings provide valuable insights into the analysis ...
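The sketch below illustrates the MLE step only (the Method of Moments is not shown): it writes out the three-parameter Dagum log-density and optimizes it on the log scale so the parameters stay positive. The parameter values and simulated data are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def dagum_logpdf(x, a, b, p):
    """Log-density of the Dagum distribution with shapes a, p and scale b."""
    return (np.log(a) + np.log(p) + (a * p - 1) * np.log(x)
            - a * p * np.log(b) - (p + 1) * np.log1p((x / b) ** a))

def dagum_mle(x):
    """Maximum likelihood estimates of (a, b, p) via Nelder-Mead on the log scale."""
    def nll(theta):
        a, b, p = np.exp(theta)
        return -np.sum(dagum_logpdf(x, a, b, p))
    res = minimize(nll, x0=np.log([2.0, np.median(x), 1.0]), method="Nelder-Mead")
    return np.exp(res.x)

# simulate Dagum data via the inverse CDF: F(x) = (1 + (x/b)^-a)^-p
rng = np.random.default_rng(4)
a, b, p = 3.0, 2.0, 0.8
u = rng.uniform(size=1000)
x = b * (u ** (-1.0 / p) - 1.0) ** (-1.0 / a)
print(dagum_mle(x))   # should be close to (3.0, 2.0, 0.8)
```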
The main problem when dealing with fuzzy data is that a model representing the data cannot be fitted by the Fuzzy Least Squares Estimator (FLSE), which gives false estimates and becomes invalid when multicollinearity is present. To overcome this problem, the Fuzzy Bridge Regression Estimator (FBRE) method was relied upon to estimate a fuzzy linear regression model with triangular fuzzy numbers. Moreover, multicollinearity in the fuzzy data can be detected using the Variance Inflation Factor when the input variables of the model are crisp and the output variable and parameters are fuzzy. The results were compared using ...
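The fuzzy, triangular-number version of the estimator is not reproduced here; the sketch below shows only the crisp bridge penalty, minimizing the least-squares criterion plus lambda * sum(|beta_j|**gamma) on a deliberately collinear toy design, as a rough guide to the penalty the FBRE builds on.

```python
import numpy as np
from scipy.optimize import minimize

def bridge_regression(X, y, lam=1.0, gamma=1.5):
    """Crisp bridge regression: residual sum of squares plus lam * sum(|beta|**gamma)
    (a sketch of the bridge penalty only, not the fuzzy estimator)."""
    def objective(beta):
        resid = y - X @ beta
        return resid @ resid + lam * np.sum(np.abs(beta) ** gamma)
    beta0 = np.linalg.lstsq(X, y, rcond=None)[0]
    return minimize(objective, beta0, method="Powell").x

# collinear design: x3 is nearly a copy of x1, inflating the OLS variance
rng = np.random.default_rng(5)
x1 = rng.normal(size=120)
x2 = rng.normal(size=120)
x3 = x1 + rng.normal(scale=0.01, size=120)
X = np.column_stack([x1, x2, x3])
y = 2 * x1 + 1 * x2 + rng.normal(scale=0.5, size=120)

print("OLS   :", np.linalg.lstsq(X, y, rcond=None)[0])
print("Bridge:", bridge_regression(X, y, lam=5.0, gamma=1.2))
```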
Recently, Genetic Algorithms (GAs) have frequently been used to optimize the solution of estimation problems. One of the main advantages of these techniques is that they require no knowledge or gradient information about the response surface. The poor behavior of genetic algorithms on some problems, sometimes attributed to the design of the operators, has led to the development of other types of algorithms. One such class is the compact Genetic Algorithm (cGA), which dramatically reduces the number of bits required to store the population and has a faster convergence speed. In this paper, the compact Genetic Algorithm is used to optimize the maximum likelihood estimator of the first-order moving average model MA(1). Simulation results ...
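A minimal sketch of a compact GA applied to an MA(1) model is shown below. The bit encoding, population size, and the use of the conditional sum of squares in place of the exact likelihood are assumptions made for illustration, not the paper's setup.

```python
import numpy as np

def ma1_css(theta, y):
    """Conditional sum of squares for y_t = e_t + theta * e_{t-1}; smaller is better."""
    e = np.zeros_like(y)
    for t in range(len(y)):
        e[t] = y[t] - (theta * e[t - 1] if t > 0 else 0.0)
    return np.sum(e ** 2)

def decode(bits):
    """Map a bit string to theta in (-1, 1)."""
    value = int("".join(map(str, bits)), 2)
    return -1.0 + 2.0 * value / (2 ** len(bits) - 1)

def compact_ga(y, n_bits=12, pop_size=100, n_iter=3000, seed=0):
    """Compact GA: evolve a probability vector instead of storing a full population."""
    rng = np.random.default_rng(seed)
    prob = np.full(n_bits, 0.5)
    for _ in range(n_iter):
        a = (rng.uniform(size=n_bits) < prob).astype(int)
        b = (rng.uniform(size=n_bits) < prob).astype(int)
        # the individual with the smaller CSS wins the tournament
        winner, loser = (a, b) if ma1_css(decode(a), y) < ma1_css(decode(b), y) else (b, a)
        prob = np.clip(prob + (winner - loser) / pop_size, 0.0, 1.0)
    return decode((prob > 0.5).astype(int))

rng = np.random.default_rng(6)
e = rng.normal(size=300)
theta_true = 0.6
y = e + theta_true * np.roll(e, 1)
y[0] = e[0]
print(compact_ga(y))   # estimate should move close to theta_true = 0.6
```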
Often, especially in practical applications, it is difficult to obtain data that is not tainted by problems such as non-constant error variance or other issues that impede the use of the usual methods, chiefly ordinary least squares (OLS), for estimating the parameters of multiple linear models. This is why many statisticians resort to robust estimation methods, especially in the presence of outliers as well as error-variance instability. Two robust methods were adopted: the robust weighted least squares (RWLS) and the two-step robust weighted least squares (TSRWLS), and their performance was verified ...
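The paper's RWLS and TSRWLS procedures are not reproduced here; the sketch below shows a generic iteratively reweighted least squares scheme with Huber weights and a MAD scale estimate, run on toy data with injected outliers, to illustrate the robust weighted least squares idea.

```python
import numpy as np

def huber_weights(resid, scale, c=1.345):
    """Huber weight function: downweights observations with large scaled residuals."""
    u = np.abs(resid) / scale
    return np.where(u <= c, 1.0, c / u)

def robust_wls(X, y, n_iter=50, tol=1e-8):
    """Iteratively reweighted least squares with Huber weights
    (a generic robust WLS sketch, not the paper's RWLS or TSRWLS)."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(n_iter):
        resid = y - X @ beta
        scale = np.median(np.abs(resid - np.median(resid))) / 0.6745  # MAD scale
        w = huber_weights(resid, scale)
        beta_new = np.linalg.solve(X.T @ np.diag(w) @ X, X.T @ np.diag(w) @ y)
        if np.max(np.abs(beta_new - beta)) < tol:
            break
        beta = beta_new
    return beta

rng = np.random.default_rng(7)
X = np.column_stack([np.ones(100), rng.normal(size=100)])
y = X @ np.array([1.0, 3.0]) + rng.normal(scale=0.5, size=100)
y[:5] += 20.0                      # inject a few outliers
print("OLS   :", np.linalg.lstsq(X, y, rcond=None)[0])
print("Robust:", robust_wls(X, y))
```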