This research studies dimension-reduction methods that address the curse of dimensionality, which arises when traditional methods fail to provide good parameter estimates, so the problem must be dealt with directly. Two approaches were used for high-dimensional data: the non-classical sliced inverse regression (SIR) method, together with the proposed weighted standard SIR (WSIR) method, and principal component analysis (PCA), the general method used for reducing dimensions. Both SIR and PCA construct linear combinations of a subset of the original explanatory variables, which may suffer from heterogeneity and from multicollinearity among most of the explanatory variables. The new linear combinations produced by the two methods reduce the number of explanatory variables to one or more new dimensions, called the effective dimension. The root mean square error is used to compare the methods and show which is preferable, and a simulation study was conducted for this comparison. The simulation results showed that the proposed weighted standard SIR method is the best.
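As an illustration of the two reduction approaches being compared, the sketch below implements plain PCA and a basic sliced inverse regression in Python; the slice count, the simulated data, and all names are illustrative assumptions, and the proposed WSIR weighting is not reproduced here.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, n_components=2):
    """Basic sliced inverse regression (SIR): eigenvectors of the
    between-slice covariance of whitened X give the reduction directions."""
    n, p = X.shape
    mu, cov = X.mean(axis=0), np.cov(X, rowvar=False)
    # Whiten X so that Cov(Z) = I
    L = np.linalg.cholesky(np.linalg.inv(cov))
    Z = (X - mu) @ L
    # Slice the response and average Z within each slice
    order = np.argsort(y)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        m = Z[idx].mean(axis=0)
        M += len(idx) / n * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original scale
    vals, vecs = np.linalg.eigh(M)
    return L @ vecs[:, ::-1][:, :n_components]

def pca_directions(X, n_components=2):
    """PCA directions: leading eigenvectors of the covariance of X."""
    vals, vecs = np.linalg.eigh(np.cov(X, rowvar=False))
    return vecs[:, ::-1][:, :n_components]

# Illustrative simulated data (not the paper's design)
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))
y = X[:, 0] + 0.5 * X[:, 1] ** 2 + rng.normal(scale=0.3, size=500)
print(sir_directions(X, y, n_components=1).ravel())
print(pca_directions(X, n_components=1).ravel())
```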
This paper constructs a mixed probability distribution from an exponential distribution with scale parameter (β) and a Gamma distribution with parameters (2, β), with mixing proportions ( ). First, the probability density function (p.d.f.), the cumulative distribution function (c.d.f.), and the reliability function are obtained. The parameters of the mixed distribution ( , β) are estimated by three different methods: maximum likelihood, the method of moments, and a proposed method, the Differential Least Squares Method (DLSM). The comparison is carried out using a simulation procedure, and all results are reported in tables.
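A minimal sketch of the mixture described above, assuming a mixing weight p on the exponential component (the paper's actual mixing proportions are not shown in the excerpt); it evaluates the p.d.f., c.d.f., and reliability function numerically.

```python
import numpy as np
from scipy.stats import expon, gamma

def mix_pdf(x, p, beta):
    """p.d.f. of the mixture: p*Exponential(scale=beta) + (1-p)*Gamma(2, scale=beta)."""
    return p * expon.pdf(x, scale=beta) + (1 - p) * gamma.pdf(x, a=2, scale=beta)

def mix_cdf(x, p, beta):
    """c.d.f. of the mixture."""
    return p * expon.cdf(x, scale=beta) + (1 - p) * gamma.cdf(x, a=2, scale=beta)

def mix_reliability(t, p, beta):
    """Reliability (survival) function R(t) = 1 - F(t)."""
    return 1.0 - mix_cdf(t, p, beta)

# Illustrative values only: p = 0.4, beta = 2
x = np.linspace(0, 10, 5)
print(mix_pdf(x, 0.4, 2.0))
print(mix_reliability(x, 0.4, 2.0))
```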
Business organizations have faced many challenges in recent times, the most important of which is information technology, because it is widespread and easy to use. Its use has increased the amount of data that business organizations deal with in an unprecedented manner. The amount of data available through the internet is a problem that many parties seek to solve: why is it available in such huge amounts, and so haphazardly? Many forecasts indicated that by 2017 the number of devices connected to the internet would be roughly three times the population of the Earth, and that in 2015 more than one and a half billion gigabytes of data were transferred every minute globally. Thus, so-called data mining emerged as a …
This paper deals with estimation of the reliability function and one shape parameter of the two-parameter Burr–XII distribution, when the other shape parameter is known (taking the values 0.5, 1, 1.5) and the initial value of the estimated parameter is 1, using different sample sizes (n = 10, 20, 30, 50). The results are based on an empirical study: simulation experiments are applied to compare four methods of estimation and to compute the reliability function. The mean square error results indicate that the jackknife estimator is better than the other three estimators, for all sample sizes and parameter values.
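As background to the estimation problem, the sketch below uses the standard Burr–XII form F(x) = 1 − (1 + x^c)^(−k): when the shape parameter c is known, the maximum likelihood estimator of k has the closed form k̂ = n / Σ ln(1 + x_i^c), and the reliability function is R(x) = (1 + x^c)^(−k). The jackknife and the other estimators compared in the paper are not reproduced here, and the parameter names are assumptions.

```python
import numpy as np

def burr12_reliability(x, c, k):
    """Reliability of the two-parameter Burr-XII: R(x) = (1 + x^c)^(-k)."""
    return (1.0 + x ** c) ** (-k)

def mle_k_given_c(sample, c):
    """Closed-form MLE of the shape parameter k when the other shape c is known."""
    return len(sample) / np.sum(np.log1p(sample ** c))

# Illustrative experiment: simulate Burr-XII data via inverse-CDF sampling
rng = np.random.default_rng(1)
c_true, k_true, n = 1.5, 1.0, 50
u = rng.uniform(size=n)
sample = ((1 - u) ** (-1 / k_true) - 1) ** (1 / c_true)   # x = F^{-1}(u)
k_hat = mle_k_given_c(sample, c_true)
print(k_hat, burr12_reliability(1.0, c_true, k_hat))
```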
The concept of training is no longer understood in the traditional sense of merely organizing conventional training courses; it has become a strategic choice in the investment and development of the human resources system. The study tries to answer its core problem, which
is the extent to which the training process, in its traditional form, meets the company's needs for developing intellectual capital. The research aims to identify the impact of the dimensions of the training process (the training role, top management support, training programs, modern technology) on the components of intellectual capital (human capital, structural capital, customer capital) and to provide the top management of the company, for the development of sci…
Preventing bankruptcy not only prolongs the economic life of a company and improves its financial performance, but also helps to improve the general economic well-being of the country. Therefore, forecasting financial distress can involve various factors and affect different aspects of the company, including dividends. In this regard, this study examines the prediction of financial distress using the logistic regression method and its impact on the earnings per share of companies listed on the Iraqi Stock Exchange. The research period runs from 2015 to 2020, and 33 companies listed on the Iraqi Stock Exchange were selected as the sample, and the res…
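A minimal sketch of the kind of logistic regression classifier the study describes, using scikit-learn on hypothetical financial-ratio features; the features, data, and threshold are illustrative assumptions and not the paper's actual model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical firm-year data: columns could stand for liquidity, leverage, profitability
rng = np.random.default_rng(42)
X = rng.normal(size=(198, 3))                 # e.g. 33 firms x 6 years
y = (X[:, 1] - X[:, 2] + rng.normal(scale=0.5, size=198) > 0).astype(int)  # 1 = distressed

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)

# Predicted probability of financial distress for each held-out firm-year
p_hat = model.predict_proba(X_te)[:, 1]
print("accuracy:", accuracy_score(y_te, p_hat > 0.5))
```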
This paper considers a new double integral transform called the double Sumudu-Elzaki transform (DSET). The DSET is combined with a semi-analytical method, namely the variational iteration method, to form the DSETVIM scheme for obtaining numerical solutions of nonlinear PDEs with fractional-order derivatives. The proposed dual method reduces the number of calculations required, so combining the two methods speeds up computation of the solution. The suggested technique is tested on four problems. The results demonstrate that solving these types of equations using the DSETVIM is more advantageous and efficient.
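For context, a hedged sketch of the building blocks: the standard single-variable Sumudu and Elzaki transforms are written below; the double transform used in the paper presumably applies one transform in each variable, but its exact definition is not given in the excerpt, so the combined form shown last is only an assumed illustration.

```latex
% Standard single-variable transforms (assumed background, not the paper's DSET)
\begin{align}
  S[f(t)](u) &= \frac{1}{u}\int_{0}^{\infty} e^{-t/u}\, f(t)\, dt ,
  && \text{(Sumudu transform)}\\
  E[f(t)](v) &= v\int_{0}^{\infty} e^{-t/v}\, f(t)\, dt ,
  && \text{(Elzaki transform)}
  \intertext{A plausible double Sumudu--Elzaki transform of $f(x,t)$,
  applying Sumudu in $x$ and Elzaki in $t$ (illustrative assumption only):}
  S_x E_t[f(x,t)](u,v) &= \frac{v}{u}\int_{0}^{\infty}\!\!\int_{0}^{\infty}
  e^{-x/u - t/v}\, f(x,t)\, dx\, dt .
\end{align}
```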
This research aims to predict the maximum daily loss that the fixed-income securities portfolio of Qatar National Bank - Syria may suffer. For this purpose, data were collected on the risk factors that affect the value of the portfolio, represented by the term structure of interest rates in the United States of America over the period 2017-2018, in addition to data on the composition of the bond portfolio of Qatar National Bank - Syria in 2017. Monte Carlo simulation models were then employed to predict the maximum loss to which this portfolio may be exposed in the future. The Monte Carlo simulation results showed that the value at risk may decrease in the future due to the dec…
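The sketch below illustrates the general Monte Carlo value-at-risk procedure the abstract refers to: simulate shocks to the risk factor (the yield curve), revalue the bond portfolio under each scenario, and read off the loss quantile. The cash flows, yield volatility, and confidence level are illustrative assumptions, not the bank's actual data.

```python
import numpy as np

def bond_value(cash_flows, times, yields):
    """Present value of a stream of cash flows discounted at the given yields."""
    return np.sum(cash_flows / (1.0 + yields) ** times)

# Hypothetical portfolio: annual cash flows over 5 years and a flat 3% curve
cash_flows = np.array([50.0, 50.0, 50.0, 50.0, 1050.0])
times = np.arange(1, 6)
base_yields = np.full(5, 0.03)
v0 = bond_value(cash_flows, times, base_yields)

# Monte Carlo: simulate daily parallel yield shocks (assumed 10 bp daily volatility)
rng = np.random.default_rng(7)
shocks = rng.normal(loc=0.0, scale=0.001, size=10_000)
losses = np.array([v0 - bond_value(cash_flows, times, base_yields + s) for s in shocks])

# 99% one-day value at risk = 99th percentile of the simulated loss distribution
var_99 = np.percentile(losses, 99)
print(f"portfolio value: {v0:.2f}, 99% 1-day VaR: {var_99:.2f}")
```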
The parameters of the two-parameter Gamma distribution in the case of missing data are estimated using two important approaches: the maximum likelihood method and the shrinkage method. The former involves three methods for solving the non-linear maximum likelihood equation to obtain the maximum likelihood estimators: the Newton-Raphson, Thom, and Sinha methods. The Thom and Sinha methods are developed by the researcher to suit the case of missing data. Furthermore, the Bowman, Shenton and Lam method, which relies on the three-parameter Gamma distribution to obtain the maximum likelihood estimators, has been developed. A comparison between the methods is made on the experimental side to find the best method…
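To make the non-linear likelihood equation concrete, here is a hedged sketch of the Newton-Raphson iteration for the complete-data case of the Gamma(shape α, scale β) distribution: the score equation ln α − ψ(α) = ln x̄ − mean(ln x) is solved for α, and then β̂ = x̄ / α̂. The Thom, Sinha, shrinkage, and missing-data adjustments discussed in the paper are not reproduced here.

```python
import numpy as np
from scipy.special import digamma, polygamma

def gamma_mle_newton(x, tol=1e-10, max_iter=100):
    """Newton-Raphson solution of the Gamma(shape, scale) likelihood equation
    for complete data: ln(a) - digamma(a) = ln(mean(x)) - mean(ln(x))."""
    s = np.log(np.mean(x)) - np.mean(np.log(x))
    a = (3 - s + np.sqrt((s - 3) ** 2 + 24 * s)) / (12 * s)   # common starting value
    for _ in range(max_iter):
        g = np.log(a) - digamma(a) - s
        g_prime = 1.0 / a - polygamma(1, a)                    # trigamma term
        a_new = a - g / g_prime
        if abs(a_new - a) < tol:
            a = a_new
            break
        a = a_new
    return a, np.mean(x) / a                                    # (shape, scale)

# Illustrative check on simulated complete data
rng = np.random.default_rng(3)
sample = rng.gamma(shape=2.5, scale=1.8, size=200)
print(gamma_mle_newton(sample))
```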
Dimension-reduction and variable-selection techniques are very important topics in multivariate statistical analysis. When two or more predictor variables are linked by exact or near-exact linear relationships, a multicollinearity problem occurs, which violates one of the basic assumptions of the ordinary least squares method and leads to unreliable estimates.
Several methods have been proposed to address this problem, including partial least squares (PLS), which is used to reduce the dimension of the regression analysis. It applies linear transformations that convert a set of highly correlated variables into a set of new independent and unc…
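As an illustration of the partial least squares approach mentioned above, here is a minimal scikit-learn sketch on collinear simulated data; the number of components and the data-generating model are illustrative assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Simulated predictors with strong multicollinearity
rng = np.random.default_rng(5)
z = rng.normal(size=(200, 2))
X = np.column_stack([z[:, 0], z[:, 0] + 0.01 * rng.normal(size=200),
                     z[:, 1], z[:, 1] + 0.01 * rng.normal(size=200)])
y = 2.0 * z[:, 0] - z[:, 1] + rng.normal(scale=0.1, size=200)

# PLS replaces the correlated predictors with a few orthogonal latent components
pls = PLSRegression(n_components=2).fit(X, y)
print("R^2 on training data:", pls.score(X, y))
print("latent scores shape:", pls.transform(X).shape)
```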
The aim of this research is to diagnose the impact of the competitive dimensions, represented by quality, cost, time, and flexibility, on the efficiency of e-learning. The research adopted the descriptive analytical method to identify the impact of these dimensions on the efficiency of e-learning, along with statistical methods for deriving the results. The research concluded that the competitive dimensions do affect the efficiency of e-learning: the models for each of the research hypotheses are statistically significant at the 5% level, and each of these dimensions has a positive impact on the dependent variable. The research recommended …