Bi-level programming is the problem of minimizing or maximizing a target function while another target function is embedded within the constraints. This problem has received a great deal of attention in the mathematical programming community due to the proliferation of applications and the use of evolutionary algorithms in addressing problems of this kind. Two non-linear bi-level programming methods are used in this paper. The goal is to reach the optimal solution by simulation, using the Monte Carlo method with different small and large sample sizes. The research found that the Branch and Bound algorithm was preferable for solving the non-linear bi-level programming problem because it gave better results.
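As an illustration of the Monte Carlo side of the comparison, the following sketch samples the leader's variable and evaluates a closed-form follower response; the toy objectives, bounds, and sample sizes are assumptions for illustration, not the paper's model:

```python
import random

# Toy non-linear bi-level problem (illustrative assumption, not the paper's model):
# upper level:  min_x  F(x, y*(x)) = (x - 1)**2 + y*(x)**2
# lower level:  y*(x) = argmin_y (y - x)**2   ->  y*(x) = x
def lower_level(x):
    # closed-form follower response for this toy problem
    return x

def upper_objective(x):
    y = lower_level(x)
    return (x - 1) ** 2 + y ** 2

def monte_carlo_bilevel(n_samples, lo=-5.0, hi=5.0, seed=0):
    """Sample the leader variable uniformly and keep the best objective."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(n_samples):
        x = rng.uniform(lo, hi)
        f = upper_objective(x)
        if f < best_f:
            best_x, best_f = x, f
    return best_x, best_f

# Larger samples approach the analytic optimum x = 0.5, F = 0.5
for n in (50, 5000):
    x, f = monte_carlo_bilevel(n)
    print(n, round(x, 3), round(f, 3))
```

Increasing the sample size tightens the approximation, which is the trade-off the abstract's small-versus-large sample comparison examines.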
Summary
In this research, we examined factorial experiments and studied the significance of the main effects, the interactions of the factors, and their simple effects by the F test (ANOVA) to analyze the data of a factorial experiment. It is known that the analysis of variance requires several assumptions to hold; therefore, when one of these conditions is violated, we transform the data so that it meets the conditions of the analysis of variance. It has been noted, however, that these transformations do not always produce accurate results, so we resort to non-parametric tests or methods that serve as a solution or an alternative to the parametric tests; these methods
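A minimal sketch of the non-parametric alternative mentioned above, on assumed illustrative data with one shifted factor level; the Kruskal-Wallis H statistic is computed from ranks and compared with the chi-square critical value:

```python
import random

def kruskal_wallis_h(*groups):
    """Kruskal-Wallis H statistic (no tie correction; assumes continuous data)."""
    pooled = sorted((x, g) for g, grp in enumerate(groups) for x in grp)
    n = len(pooled)
    rank_sums = [0.0] * len(groups)
    for rank, (_, g) in enumerate(pooled, start=1):
        rank_sums[g] += rank
    return 12.0 / (n * (n + 1)) * sum(
        rs ** 2 / len(grp) for rs, grp in zip(rank_sums, groups)
    ) - 3 * (n + 1)

# Illustrative data (assumed, not the paper's experiment): three factor levels
rng = random.Random(1)
a = [rng.gauss(10, 1) for _ in range(20)]
b = [rng.gauss(10, 1) for _ in range(20)]
c = [rng.gauss(13, 1) for _ in range(20)]  # one shifted level

h = kruskal_wallis_h(a, b, c)
# chi-square critical value for df = k - 1 = 2 at alpha = 0.05 is about 5.99
print(h > 5.99)
```

Because the test uses ranks rather than raw values, it remains valid when the normality assumption behind the F test fails.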
Each phenomenon contains several variables. By studying these variables, we find a mathematical formula for the joint distribution, and the copula is a useful tool for measuring the amount of correlation. The survival function was used to measure the relationship of age with the level of creatinine remaining in a person's blood. The SPSS program was also used to extract the influential variables from a group of variables using factor analysis, and then the Clayton copula function, which builds bivariate joint distributions from multivariate distributions, was applied: the bivariate distribution was calculated, and then the survival function value was calculated for a sample of size (50) drawn from Yarmouk Ho
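A sketch of the Clayton copula named above, with an assumed dependence parameter theta; the arguments and parameter values are illustrative, not those estimated from the sample:

```python
# Clayton copula: C_theta(u, v) = (u**-theta + v**-theta - 1)**(-1/theta), theta > 0
def clayton_copula(u, v, theta):
    return (u ** (-theta) + v ** (-theta) - 1.0) ** (-1.0 / theta)

# Joint survival can be formed by applying the copula to the marginal
# survival functions S_X(x) and S_Y(y) (survival-copula construction).
def joint_survival(sx, sy, theta):
    return clayton_copula(sx, sy, theta)

print(round(clayton_copula(0.5, 0.5, 2.0), 4))   # strong lower-tail dependence
print(round(clayton_copula(0.5, 0.5, 1e-9), 4))  # theta -> 0 approaches u * v = 0.25
```

Larger theta means stronger dependence; as theta tends to zero the copula reduces to the independence copula.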
A mixture model is used to model data that come from more than one component. In recent years, it has become an effective tool for drawing inferences about the complex data that we might come across in real life. Moreover, it can serve as a powerful confirmatory tool for classifying observations based on the similarities amongst them. In this paper, several mixture regression-based methods were conducted under the assumption that the data come from a finite number of components. A comparison of these methods has been made according to their results in estimating the component parameters. Observation membership has also been inferred and assessed for these methods. The results showed that the flexible mixture model outperformed the
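The finite-mixture idea, including the inference of observation membership, can be sketched with a minimal EM algorithm for a two-component univariate Gaussian mixture; this is an illustrative stand-in under assumed data, not the paper's flexible mixture-regression model:

```python
import math, random

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def em_two_gaussians(data, iters=200):
    """EM for a two-component Gaussian mixture; returns (weight, mean1, mean2)."""
    pi, mu1, mu2, s1, s2 = 0.5, min(data), max(data), 1.0, 1.0
    for _ in range(iters):
        # E step: membership probability of component 1 for each observation
        r = []
        for x in data:
            p1 = pi * normal_pdf(x, mu1, s1)
            p2 = (1 - pi) * normal_pdf(x, mu2, s2)
            r.append(p1 / (p1 + p2))
        # M step: update weight, means, and standard deviations
        n1 = sum(r); n2 = len(data) - n1
        pi = n1 / len(data)
        mu1 = sum(ri * x for ri, x in zip(r, data)) / n1
        mu2 = sum((1 - ri) * x for ri, x in zip(r, data)) / n2
        s1 = max(1e-3, math.sqrt(sum(ri * (x - mu1) ** 2 for ri, x in zip(r, data)) / n1))
        s2 = max(1e-3, math.sqrt(sum((1 - ri) * (x - mu2) ** 2 for ri, x in zip(r, data)) / n2))
    return pi, mu1, mu2

rng = random.Random(0)
data = [rng.gauss(0, 1) for _ in range(200)] + [rng.gauss(6, 1) for _ in range(200)]
pi, mu1, mu2 = em_two_gaussians(data)
print(round(pi, 2), round(mu1, 1), round(mu2, 1))
```

The E-step responsibilities are exactly the inferred membership probabilities the abstract refers to; mixture-regression methods replace the component means with per-component regression lines.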
Nonlinear time series analysis is one of the most complex problems, especially for the nonlinear autoregressive model with exogenous variable (NARX). Model identification and determination of the correct orders are the most important problems in the analysis of time series. In this paper, we proposed a splines estimation method for model identification, and we then used three criteria for determining the correct orders. The proposed method estimates additive splines for model identification, and the order determination depends on the additive property to avoid the curse of dimensionality. The proposed method is a nonparametric method, and the simulation results give a
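The NARX structure can be illustrated by building the lagged design data that any identification method (splines included) is fitted to; the orders p and q here are assumed for illustration:

```python
# NARX model form: y_t = f(y_{t-1}, ..., y_{t-p}, x_{t-1}, ..., x_{t-q}) + e_t
def narx_design(y, x, p, q):
    """Build (inputs, target) pairs of lagged y and exogenous x values."""
    start = max(p, q)
    rows, targets = [], []
    for t in range(start, len(y)):
        rows.append(y[t - p:t] + x[t - q:t])
        targets.append(y[t])
    return rows, targets

y = [1, 2, 3, 4, 5, 6]
x = [0, 1, 0, 1, 0, 1]
rows, targets = narx_design(y, x, p=2, q=1)
print(rows[0], targets[0])  # [1, 2, 1] -> 3
```

Each added lag multiplies the dimension of f's domain, which is why an additive decomposition of f into one-dimensional spline terms sidesteps the curse of dimensionality.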
This paper considers a new double integral transform called the Double Sumudu-Elzaki Transform (DSET). The DSET is combined with a semi-analytical method, namely the variational iteration method (DSETVIM), to arrive at numerical solutions of nonlinear PDEs with fractional-order derivatives. The proposed dual method decreases the number of calculations required, so combining these two methods speeds up computing the solution. The suggested technique is tested on four problems. The results demonstrated that solving these types of equations using the DSETVIM was more advantageous and efficient.
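For reference, the standard single-variable Sumudu and Elzaki transforms that the double transform combines can be sketched as follows; the paper's exact double-kernel definition may differ in detail:

```latex
% Sumudu transform of f(t):
S[f](u) = \int_0^\infty f(ut)\, e^{-t}\, dt
% Elzaki transform of f(t):
E[f](v) = v \int_0^\infty f(t)\, e^{-t/v}\, dt
% A double transform applies one kernel per variable, e.g. Sumudu in x
% and Elzaki in t for f(x, t).
```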
Multicollinearity is a problem that occurs whenever two or more predictor variables are correlated with each other. It constitutes a breach of one of the basic assumptions of the ordinary least squares method and yields biased estimation results. Several methods have been proposed to handle this problem. In this research, comparisons are made between a biased method and an unbiased method, together with a Bayesian method using the Gamma distribution, in addition to the ordinary least squares method
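A common biased remedy for multicollinearity is ridge regression, sketched below as an illustration (the abstract does not name its biased method, and the simulated near-collinear data are assumptions):

```python
import numpy as np

# Ridge estimator: beta = (X'X + k I)^(-1) X' y; k = 0 recovers OLS.
def ridge(X, y, k):
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = x1 + rng.normal(scale=0.01, size=100)   # nearly collinear with x1
X = np.column_stack([x1, x2])
y = 2 * x1 + 3 * x2 + rng.normal(scale=0.5, size=100)

beta_ols = ridge(X, y, k=0.0)    # unstable split between the two coefficients
beta_ridge = ridge(X, y, k=1.0)  # shrinks the ill-determined direction
print(np.round(beta_ols, 2), np.round(beta_ridge, 2))
```

The penalty k stabilizes the nearly singular X'X at the cost of a small bias, which is exactly the bias-variance trade-off the comparison in the abstract evaluates.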
The purpose of this research is a comparison between two types of multivariate GARCH models, BEKK and DVECH, for forecasting with financial time series: the daily Iraqi dinar exchange rate against the dollar, the global daily oil price in dollars, and the global daily gold price in dollars, for the period from 01/01/2014 to 01/01/2016. The estimation, testing, and forecasting were computed with the RATS program. The three time series were transformed into three asset-return series to obtain stationarity; tests were conducted including Ljung-Box, multivariate Q, and multivariate ARCH on the returns series and residuals series for both models, with comparison between the estimation and for
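The price-to-returns transformation used to obtain stationarity can be sketched as follows, with illustrative prices rather than the paper's series:

```python
import math

# Log returns: r_t = ln(P_t / P_{t-1}), the usual stationarity transform
# applied to price series before GARCH modelling.
def log_returns(prices):
    return [math.log(p1 / p0) for p0, p1 in zip(prices, prices[1:])]

prices = [100.0, 102.0, 101.0, 103.0]
print([round(r, 4) for r in log_returns(prices)])
```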
Abstract
This research provides theoretical aspects of one of the most important statistical distributions, the Lomax distribution, which has many applications in several areas. A set of estimation methods was used (MLE, LSE, GWPM) and compared with the (RRE) estimation method. In order to find the best estimation method, a set of (36) simulation experiments with many replications was run to obtain the mean square error used for the comparison; the simulation experiments varied the estimation method, the sample size, and the values of the location and shape parameters. The results show that the estimation methods are affected by the simulation experiment factors, and that there is the possibility of using other estimation methods such as shrinkage and jackknife
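As a sketch of the maximum likelihood method in this setting: for the Lomax density f(x) = (alpha/lam)(1 + x/lam)^-(alpha+1), the shape MLE has a closed form when the scale is known. The parameter values and sample size below are assumptions for illustration, not the paper's simulation design:

```python
import math, random

def sample_lomax(alpha, lam, n, rng):
    """Inverse-CDF sampling from F(x) = 1 - (1 + x/lam)**-alpha."""
    return [lam * ((1 - rng.random()) ** (-1 / alpha) - 1) for _ in range(n)]

def mle_shape(data, lam):
    """Closed-form shape MLE with known scale: n / sum(log(1 + x/lam))."""
    return len(data) / sum(math.log(1 + x / lam) for x in data)

rng = random.Random(2)
data = sample_lomax(alpha=3.0, lam=2.0, n=5000, rng=rng)
print(round(mle_shape(data, lam=2.0), 2))  # close to the true shape 3.0
```

Repeating such draws over many replications and averaging the squared estimation error gives the mean square error criterion used in the comparison.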
Control charts are one of the scientific statistical tools used to control production. A chart always contains three lines, a central line and upper and lower limits, to control the quality of production; it represents a set of numbers from which it is decided whether the productive operation is under control or not, depending on the actual observations. Sometimes the calculated control charts are not accurate and not confirming; therefore, fuzzy control charts are used instead of crisp process control charts, as this method is more sensitive, accurate, and economical, assisting the decision maker to control the operating system at an early time. In this project, a set of data fr
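The crisp Shewhart X-bar limits that the fuzzy variant extends can be sketched as follows, with assumed subgroup data rather than the project's:

```python
import math

# X-bar chart: center line at the grand mean, control limits at
# grand mean +/- 3 * sigma / sqrt(n) for subgroups of size n.
def xbar_limits(subgroups):
    means = [sum(s) / len(s) for s in subgroups]
    grand = sum(means) / len(means)
    n = len(subgroups[0])
    # pooled within-subgroup variance (simple sketch, equal subgroup sizes)
    var = sum(sum((x - sum(s) / len(s)) ** 2 for x in s) / (len(s) - 1)
              for s in subgroups) / len(subgroups)
    half = 3 * math.sqrt(var) / math.sqrt(n)
    return grand - half, grand, grand + half

subgroups = [[10.1, 9.9, 10.0], [10.2, 10.0, 9.8], [9.9, 10.1, 10.0]]
lcl, cl, ucl = xbar_limits(subgroups)
print(round(lcl, 3), round(cl, 3), round(ucl, 3))
```

A fuzzy chart replaces the crisp in/out-of-control decision at these limits with membership degrees, so borderline points carry graded evidence instead of a hard verdict.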
Parametric programming is considered a type of sensitivity analysis. This research studies the effect of variations in a linear programming model (objective function coefficients and right-hand side) on the optimal solution, for parameter values (-5 ≤ θ ≤ 5). As for the results: the objective function equals zero and the decision variables are non-basic when the parameter θ = -5; the objective function value increases when θ = 5 and the decision variables are basic, with the exception of X24 and X34. Whenever the parameter value increases, the objective
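The parametric idea can be sketched on a toy linear program in which both an objective coefficient and the right-hand side depend on θ; the model below is an assumed illustration, not the paper's:

```python
# Toy parametric LP (illustrative):
#   maximize  (3 + theta) * x1 + 2 * x2
#   subject to x1 + x2 <= 5 + theta,  x1 >= 0, x2 >= 0
# With a single constraint the optimum lies at a vertex of the feasible
# region, so it can be found by direct enumeration.
def solve_for_theta(theta):
    b = 5 + theta
    vertices = [(0.0, 0.0), (b, 0.0), (0.0, b)]
    return max((3 + theta) * x1 + 2 * x2 for x1, x2 in vertices)

for theta in (-5, 0, 5):
    print(theta, solve_for_theta(theta))
```

At θ = -5 the feasible region collapses to the origin and the objective is zero, while raising θ enlarges the region and the objective value grows, mirroring the pattern reported above.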