Provisions of Economic Concentration of Projects: A Comparative Study
This research aims to study dimension-reduction methods that overcome the curse of dimensionality when traditional methods fail to provide good parameter estimates, so that the problem is dealt with directly. Two methods were used to handle the problem of high-dimensional data. The first is the non-classical sliced inverse regression (SIR) method, together with the proposed weighted standard SIR (WSIR) method; the second is principal component analysis (PCA), the general method used for dimension reduction. Both SIR and PCA are based on forming linear combinations of a subset of the original explanatory variables, which may suffer from the problem of heterogeneity and the problem of linear
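As a hedged illustration of the PCA side of the comparison only (the SIR slicing step is not shown), the projection onto linear combinations of the explanatory variables can be sketched as follows; the data and function names are illustrative, not the study's actual implementation:

```python
import numpy as np

def pca_reduce(X, k):
    """Project X (n x p) onto its first k principal components."""
    Xc = X - X.mean(axis=0)                        # center each explanatory variable
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                           # k linear combinations of the originals

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))   # 100 observations, 10 explanatory variables
Z = pca_reduce(X, 2)
print(Z.shape)  # (100, 2)
```

The components come out ordered by explained variance, so the first retained combination always carries at least as much variance as the second.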
In this research, the weights used are estimated with generalized least squares to estimate the simple linear regression parameters when the dependent variable consists of a two-class attribute variable (for the heteroscedasticity problem), depending on a sequential Bayesian approach instead of the classical approach used before. The Bayes approach provides a mechanism for tackling observations one by one in a sequential way, i.e., each new observation adds a new piece of information for estimating the probability parameter of a certain phenomenon of Bernoulli trials that represents the dependent variable in the simple linear regression equation, in addition to the information deduced from the past exper
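The sequential mechanism described above, where each Bernoulli observation updates the probability estimate, can be sketched with the conjugate Beta prior; the prior and the observation sequence are illustrative assumptions, not the study's data:

```python
def beta_update(a, b, x):
    """One sequential Bayes step: Beta(a, b) prior plus a Bernoulli observation x."""
    return a + x, b + 1 - x

a, b = 1.0, 1.0                  # uniform Beta(1, 1) prior (an illustrative choice)
for x in [1, 0, 1, 1, 0, 1]:     # observations arrive one by one
    a, b = beta_update(a, b, x)
p_hat = a / (a + b)              # posterior mean estimate of the success probability
print(p_hat)  # 0.625
```

Each pass through the loop folds one new observation into the posterior, which then serves as the prior for the next observation, exactly the one-by-one scheme the abstract describes.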
Abstract
This study investigates the mechanical compression properties of tin-lead and lead-free solder alloy spherical balls, using more than 500 samples to identify the statistical variability of the properties in each alloy. Isothermal aging was performed to study and compare the aging effect on the microstructure and properties.
The results showed significant elastic and plastic anisotropy of the tin phase in lead-free tin-based solder, which was compared with simulation using a Crystal Plasticity Finite Element (CPFE) method that has the anisotropy of Sn built in. The simulations and experiments were in good agreement, indicating the range of values expected with anisotropic properties.
Encryption of data means translating data into another shape or symbol that only people with access to the secret key or a password can read. Encrypted data are generally referred to as ciphertext, while unencrypted data are known as plaintext. Entropy can be used as a measure of the number of bits needed to code the data of an image: as the pixel values within an image are dispersed over more gray levels, the entropy increases. The aim of this research is to compare the CAST-128 method with a proposed adaptive key against the RSA encryption method on video frames, to determine the more accurate method with the highest entropy. The first method is achieved by applying the "CAST-128" and
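The entropy measure described above, bits per pixel given the gray-level distribution, can be sketched as follows; the frames are synthetic, and this is not the paper's CAST-128 or RSA code:

```python
import numpy as np

def shannon_entropy(frame):
    """Bits per pixel needed to code the frame's gray-level distribution."""
    counts = np.bincount(frame.ravel(), minlength=256)
    p = counts[counts > 0] / frame.size
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(1)
spread = rng.integers(0, 256, size=(64, 64)).astype(np.uint8)  # pixels over many gray levels
flat = np.zeros((64, 64), dtype=np.uint8)                      # a single gray level
print(shannon_entropy(spread) > shannon_entropy(flat))  # more dispersion, higher entropy
```

A well-encrypted frame should look like the `spread` case, with entropy approaching the 8-bit maximum, which is why entropy serves as the comparison criterion here.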
Background: The gold standard in assessing asthma control is the Global Initiative for Asthma (GINA) criteria. Because of the difficulty of access to pulmonary function tests, the ACT has the added advantage that it does not require lung function assessment.
Objectives: The aim of this study is to assess asthma control through the ACT score and the GINA guideline, and to determine whether the ACT can be as useful as the GINA-guideline criteria in assessing asthma control in Iraq. Patients and methods: A cross-sectional study comparing the ACT vs. the GINA guideline for asthma control level. This study was conducted at the respiratory consultation unit of the Iraqi National Center of Early Detection of Cancer, Baghdad-Ira
The bi-level programming problem is to minimize or maximize an objective function while another objective function sits inside its constraints. This problem has received a great deal of attention in the programming community due to the proliferation of applications and the use of evolutionary algorithms in addressing this kind of problem. Two non-linear bi-level programming methods are used in this paper. The goal is to reach the optimal solution through simulation using the Monte Carlo method with different small and large sample sizes. The research found that the branch-and-bound algorithm was preferable for solving the non-linear bi-level programming problem, because its results were better.
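A minimal sketch of the Monte Carlo simulation idea on a toy bi-level problem, assuming a follower problem with a closed-form solution; the objective functions and names are illustrative, not the paper's test problems:

```python
import random

def follower(x):
    """Lower level: y* = argmin_y (y - x)**2, known here in closed form (y* = x)."""
    return x

def leader(x, y):
    """Upper level objective, evaluated at the follower's optimal reaction."""
    return (x - 1.0) ** 2 + y ** 2

random.seed(0)
best_x, best_F = None, float("inf")
for _ in range(20000):               # Monte Carlo sampling of the leader's variable
    x = random.uniform(0.0, 2.0)
    F = leader(x, follower(x))       # each sample triggers one lower-level solve
    if F < best_F:
        best_x, best_F = x, F
# The true optimum of this toy problem is x = 0.5 with F = 0.5
print(round(best_x, 2), round(best_F, 2))
```

Branch-and-bound replaces the blind sampling with a systematic partition of the leader's feasible region, which is why it can reach the optimum with fewer lower-level solves.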
In this research, a comparison has been made between the robust M-estimators for the cubic smoothing splines technique, used to avoid the problem of non-normality or error contamination in the data, and the traditional estimation method of the cubic smoothing splines technique, using two differentiation criteria (MADE, WASE) for different sample sizes and contamination levels, to estimate the time-varying coefficient functions for balanced longitudinal data. Such data are characterized by observations obtained from (n) independent subjects, each of which is measured repeatedly at a group of specific time points (m), since the repeated measurements within subjects are almost correlated an
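The full robust smoothing-spline fit is too long to sketch here, but the core M-estimation idea, shown for a simple location parameter with Huber weights via iteratively reweighted least squares, is (the tuning constant and data are illustrative assumptions):

```python
import numpy as np

def huber_location(y, c=1.345, iters=50):
    """Iteratively reweighted least squares for a Huber M-estimate of location."""
    mu = float(np.median(y))
    for _ in range(iters):
        r = y - mu
        s = float(np.median(np.abs(r))) / 0.6745          # robust scale (MAD)
        if s == 0.0:
            s = 1.0
        w = np.minimum(1.0, c * s / np.maximum(np.abs(r), 1e-12))  # Huber weights
        mu = float(np.sum(w * y) / np.sum(w))             # reweighted mean
    return mu

rng = np.random.default_rng(2)
y = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(50.0, 1.0, 5)])  # 5% contamination
print(abs(huber_location(y)) < abs(float(y.mean())))  # the M-estimate resists the outliers
```

In the spline setting the same downweighting is applied to the residuals of the penalized fit at each iteration, which is what protects the estimated coefficient functions from contaminated errors.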
The objective of the study is to demonstrate which has the better predictive ability, the logistic regression model or the linear discriminant function, using the original data first and then the principal components to reduce the dimensionality of the variables, for the data of the socio-economic survey of the family for the province of Baghdad in 2012. The data included a sample of 615 observations with 13 variables, 12 of them explanatory variables, with the dependent variable being the number of workers and unemployed.
A comparison was conducted between the two methods above, and it became clear from the comparison that the logistic regression model is better than the linear discriminant function written
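A hedged sketch of the kind of comparison described, using synthetic two-class data rather than the Baghdad survey; both classifiers are implemented from scratch purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
X0 = rng.normal([0.0, 0.0], 1.0, size=(n, 2))   # class 0 observations
X1 = rng.normal([2.0, 2.0], 1.0, size=(n, 2))   # class 1 observations
X = np.vstack([X0, X1])
y = np.r_[np.zeros(n), np.ones(n)]

# Logistic regression fitted by gradient ascent on the log-likelihood
Xb = np.c_[np.ones(len(X)), X]                   # add an intercept column
w = np.zeros(3)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-Xb @ w))
    w += 0.01 * Xb.T @ (y - p) / len(y)
acc_logit = np.mean((Xb @ w > 0) == y)

# Fisher's linear discriminant with a pooled covariance matrix
m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
Sp = (np.cov(X0.T) + np.cov(X1.T)) / 2
beta = np.linalg.solve(Sp, m1 - m0)              # discriminant direction
thresh = beta @ (m0 + m1) / 2                    # midpoint decision rule
acc_lda = np.mean((X @ beta > thresh) == y)

print(acc_logit, acc_lda)
```

On clean Gaussian classes like these the two classifiers perform almost identically; the study's finding in favor of logistic regression typically reflects survey data violating the normality assumption that the discriminant function relies on.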
In this research, the empirical Bayes method is used to estimate the affiliation parameter in clinical trials, and this is then compared with the moment estimates of this parameter using Monte Carlo simulation. We assumed that the distribution of the observations is binomial, while the distribution with the unknown random parameter is a beta distribution. Finally, we conclude that the empirical Bayes method for the random affiliation parameter is more efficient, using mean squared error (MSE), for different sample sizes.
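The beta-binomial empirical Bayes setup described above can be sketched as follows, fitting the beta hyperparameters by the method of moments and comparing MSE against the raw per-trial proportions; all values are simulated, as in a Monte Carlo design, and the hyperparameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)
a_true, b_true, m, n = 2.0, 5.0, 400, 20
theta = rng.beta(a_true, b_true, size=m)     # unknown random parameter of each trial
x = rng.binomial(n, theta)                   # observed successes per trial

# Method-of-moments fit of the Beta(a, b) prior from the observed proportions
p = x / n
mu, v = p.mean(), p.var()
M = (1 - 1 / n) * mu * (1 - mu) / (v - mu * (1 - mu) / n) - 1   # estimate of a + b
a_hat, b_hat = mu * M, (1 - mu) * M

eb = (a_hat + x) / (M + n)                   # empirical Bayes posterior means
mse_eb = np.mean((eb - theta) ** 2)
mse_raw = np.mean((p - theta) ** 2)
print(mse_eb < mse_raw)                      # shrinkage toward the prior mean lowers MSE
```

The posterior mean shrinks each trial's proportion toward the estimated prior mean, which is the mechanism behind the MSE advantage the abstract reports.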
This research compares some order selection criteria (FPE, AIC, SBC, H-Q) for the first-order autoregressive model when the white noise follows a normal distribution and some non-Gaussian distributions (log-normal, exponential, and Poisson), by using simulation.
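The AIC and SBC criteria for autoregressive order selection can be sketched as follows on a simulated Gaussian AR(1) series; FPE and H-Q follow the same pattern with different penalty terms, and the series parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)
T, phi = 500, 0.6
e = rng.normal(size=T)                       # Gaussian white noise
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi * y[t - 1] + e[t]             # the true model is AR(1)

def resid_var(y, p):
    """OLS fit of an AR(p) model; returns the residual variance."""
    Y = y[p:]
    X = np.column_stack([y[p - k:T - k] for k in range(1, p + 1)])
    coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return float(np.mean((Y - X @ coef) ** 2))

def aic(p): return np.log(resid_var(y, p)) + 2 * p / T
def sbc(p): return np.log(resid_var(y, p)) + p * np.log(T) / T

orders = range(1, 6)
aic_best = min(orders, key=aic)
sbc_best = min(orders, key=sbc)
print(aic_best, sbc_best)
```

SBC's heavier log(T) penalty makes it more conservative than AIC, which is the kind of behavioral difference such simulation comparisons are designed to expose across noise distributions.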