The theory of probabilistic programming may be conceived in several different ways. As a method of programming, it analyses the implications of probabilistic variations in the parameter space of a linear or nonlinear programming model. The generating mechanism of such probabilistic variations in economic models may be incomplete information about changes in demand, production and technology; specification errors in the econometric relations presumed for different economic agents; uncertainty of various sorts; and the consequences of imperfect aggregation or disaggregation of economic variables. In this research we discuss the probabilistic programming problem in which the coefficient b_i is a random variable with a given Laplace distribution.
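The abstract does not reproduce the full formulation, but a common way to handle a Laplace-distributed right-hand-side coefficient is to replace the probabilistic constraint by its deterministic equivalent through the Laplace quantile. The sketch below illustrates this for a single hypothetical chance constraint; all coefficients, the distribution parameters and the confidence level alpha are illustrative assumptions, not values from the study.

```python
import numpy as np
from scipy.stats import laplace
from scipy.optimize import linprog

# Hypothetical chance-constrained LP:
#   maximize  c^T x
#   subject to P(a^T x <= b) >= alpha,  x >= 0,
# where b ~ Laplace(mu, scale).  The probabilistic constraint is replaced by
# its deterministic equivalent  a^T x <= Q_b(1 - alpha),
# with Q_b the Laplace quantile function (inverse CDF).

c = np.array([3.0, 5.0])          # illustrative objective coefficients
a = np.array([2.0, 4.0])          # illustrative constraint coefficients
mu, scale, alpha = 40.0, 5.0, 0.95

b_det = laplace.ppf(1 - alpha, loc=mu, scale=scale)  # deterministic right-hand side

# linprog minimizes, so negate c to maximize.
res = linprog(c=-c, A_ub=[a], b_ub=[b_det], bounds=[(0, None), (0, None)])
print("deterministic RHS:", round(b_det, 3), "solution:", res.x)
```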
Arabic calligraphy has great importance in printing designs, which are often based on the written character, both to perform functional goals and to achieve aesthetic values in the design work. Major developments in computer manufacturing and in design and layout software have increased the need to deal with the programmed character for typesetting purposes. This research therefore addresses the Arabic character programmed for typesetting, in order to detect the most important design interventions that the written Arabic letter undergoes to turn it into a programmed image intended for printing and typesetting. Within its theoretical framework it has addressed the types of
Beta Distribution
Abstract
The Gamma and Beta distributions are of great practical importance in various areas of statistics and its applications, such as reliability and quality control of production. There are a number of methods for generating data that behave according to these distributions. These methods are based primarily on the shape parameters of each distribution and on the relationships between these distributions and some other probability distributions.
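One of the relationships alluded to above can be turned directly into a generation method: a Beta variate can be produced from two independent Gamma variates sharing the same scale. The sketch below is a minimal illustration of that idea; the shape parameters and sample size are arbitrary choices, not values from the paper.

```python
import numpy as np
from scipy import stats

# Classical relationship between the two distributions: if X ~ Gamma(a, 1) and
# Y ~ Gamma(b, 1) are independent, then X / (X + Y) ~ Beta(a, b).
rng = np.random.default_rng(0)
a, b, n = 2.5, 4.0, 100_000        # illustrative shape parameters and sample size

x = rng.gamma(shape=a, scale=1.0, size=n)
y = rng.gamma(shape=b, scale=1.0, size=n)
beta_samples = x / (x + y)

# Sanity check against the theoretical Beta(a, b) distribution.
ks = stats.kstest(beta_samples, "beta", args=(a, b))
print("sample mean:", beta_samples.mean(), "theoretical mean:", a / (a + b))
print("KS statistic:", ks.statistic)
```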
In this research, the linear mixed model, one of the most important and most widely used models in many applications, is considered; it is widely used to analyze longitudinal data characterized by repeated measures. The linear mixed model is estimated by two methods (parametric and nonparametric), which are used to estimate the conditional mean and the marginal mean of the model. A comparison between a number of models is made to obtain the best model to represent the mean wind speed in Iraq. The application concerns 8 meteorological stations in Iraq selected randomly; monthly data on wind speed over ten years were taken and then averaged over each month in the corresponding year, so we g
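As a rough illustration of the parametric side of such an analysis, the sketch below fits a linear mixed model with a fixed month effect and a random station intercept using statsmodels. The data file and column names (wind_speed.csv, station, month, wind_speed) are hypothetical placeholders, not the study's actual data.

```python
import pandas as pd
import statsmodels.formula.api as smf

# A minimal parametric linear mixed model sketch for repeated-measures
# wind-speed data; file and column names are assumptions for illustration.
df = pd.read_csv("wind_speed.csv")     # hypothetical monthly records per station

# Fixed effect of calendar month, random intercept per meteorological station:
# the fixed part describes the marginal (population-average) mean structure,
# while the station-specific random intercepts shift it to a conditional mean.
model = smf.mixedlm("wind_speed ~ C(month)", data=df, groups=df["station"])
result = model.fit()

print(result.summary())
print(result.fe_params)        # fixed-effect estimates
print(result.random_effects)   # estimated station-level random intercepts
```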
In this paper, the normality set is investigated. The study then highlights some concepts, properties, and important results. In addition, it is proved that every operator with a normality set has a non-trivial invariant subspace.
The exploitation of all available resources and benefiting from them is one of the most important problems facing decision makers at the present time. In order to exploit these resources, it is necessary to organize the conflicting objectives, which is the main work in project management; this enables the development of a plan that decision makers can use to shorten the total completion time and reduce the total cost of the project through the use of modern scientific techniques. Therefore, the researcher applied the critical path method together with goal programming to find more efficient ways to make appropriate decisions, working to solve the problems in the construction of the Departm
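To make the critical path idea concrete, the sketch below runs the standard forward and backward passes on a small hypothetical activity network; the activities, durations and precedence relations are illustrative only and do not come from the case study.

```python
# A minimal critical path method (CPM) sketch on a hypothetical activity
# network; durations and precedence relations are illustrative only.
activities = {            # activity: (duration, list of predecessors)
    "A": (4, []),
    "B": (6, ["A"]),
    "C": (3, ["A"]),
    "D": (5, ["B", "C"]),
}

# Forward pass: earliest start / earliest finish.
es, ef = {}, {}
for act in activities:                       # insertion order is a valid
    dur, preds = activities[act]             # topological order here
    es[act] = max((ef[p] for p in preds), default=0)
    ef[act] = es[act] + dur

project_duration = max(ef.values())

# Backward pass: latest finish / latest start, then zero-slack activities.
lf, ls = {}, {}
for act in reversed(list(activities)):
    dur, _ = activities[act]
    successors = [a for a, (_, ps) in activities.items() if act in ps]
    lf[act] = min((ls[s] for s in successors), default=project_duration)
    ls[act] = lf[act] - dur

critical_path = [a for a in activities if ls[a] == es[a]]
print("duration:", project_duration, "critical path:", critical_path)
```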
A new algorithm is proposed to compress speech signals using the wavelet transform and linear predictive coding. Signal compression is based on the concept of selecting a small number of approximation coefficients produced by wavelet decomposition (Haar and db4) at a suitably chosen level while ignoring the detail coefficients; the approximation coefficients are then windowed by a rectangular window and fed to the linear predictor. The Levinson-Durbin algorithm is used to compute the LP coefficients, reflection coefficients and prediction error. The compressed files contain the LP coefficients and the previous sample. These files are very small in size compared to the size of the original signals. The compression ratio is calculated from the size of th
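As a rough sketch of this kind of pipeline, the code below keeps only the db4 approximation coefficients of a stand-in signal and then fits a linear predictor from their autocorrelation sequence by solving the Toeplitz normal equations (the same system the Levinson-Durbin recursion solves). The signal, wavelet level and predictor order are illustrative assumptions.

```python
import numpy as np
import pywt
from scipy.linalg import solve_toeplitz

# Stand-in "speech" signal (a noisy tone); parameters are illustrative.
fs = 8000
t = np.arange(0, 1.0, 1.0 / fs)
speech = np.sin(2 * np.pi * 200 * t) + 0.1 * np.random.randn(t.size)

# Wavelet decomposition (db4, level 3); keep only the approximation coefficients.
coeffs = pywt.wavedec(speech, "db4", level=3)
approx = coeffs[0]

# Autocorrelation method: solve the order-p normal equations R a = r,
# where R is Toeplitz in the autocorrelation sequence of the approximation.
p = 10
r = np.correlate(approx, approx, mode="full")[approx.size - 1:]
lp_coeffs = solve_toeplitz(r[:p], r[1:p + 1])   # predictor coefficients a_1..a_p

# Prediction residual (error) of the fitted linear predictor.
residual = approx[p:] - np.array(
    [lp_coeffs @ approx[i - p:i][::-1] for i in range(p, approx.size)]
)
print("approximation length:", approx.size, "LP order:", p,
      "prediction error variance:", residual.var())
```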
In this paper, a new technique is offered for solving three types of linear integral equations of the 2nd kind: Volterra-Fredholm integral equations (LVFIE) (as a general case), and Volterra integral equations (LVIE) and Fredholm integral equations (LFIE) (as special cases). The new technique depends on approximating the solution by a polynomial of a given degree, thereby reducing the problem to a linear programming problem (LPP), which is solved to find the approximate solution of the LVFIE. Moreover, quadrature methods including the trapezoidal rule (TR), Simpson's 1/3 rule (SR), Boole's rule (BR), and the Romberg integration formula (RI) are used to approximate the integrals that appear in the LVFIE. Also, a comparison between those methods i
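The quadrature side of the comparison can be illustrated on a test integral with a known exact value. The sketch below applies the trapezoidal rule, Simpson's 1/3 rule and Romberg integration from SciPy to the integral of e^x over [0, 1], whose exact value is e - 1; the test function is ours, and Boole's rule is omitted because SciPy has no ready-made helper for it.

```python
import numpy as np
from scipy import integrate

# Compare quadrature rules on a test integral with known exact value e - 1.
n = 2 ** 6 + 1                      # romb needs 2**k + 1 equally spaced samples
x = np.linspace(0.0, 1.0, n)
y = np.exp(x)
exact = np.e - 1.0

tr = integrate.trapezoid(y, x)          # trapezoidal rule (TR)
sr = integrate.simpson(y, x=x)          # Simpson's 1/3 rule (SR)
ri = integrate.romb(y, dx=x[1] - x[0])  # Romberg integration (RI) on the samples

for name, val in [("TR", tr), ("SR", sr), ("RI", ri)]:
    print(f"{name}: {val:.10f}  abs error: {abs(val - exact):.2e}")
```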
This paper develops a fuzzy multi-objective model for solving aggregate production planning problems that involve multiple products and multiple periods in uncertain environments. We seek to minimize total production cost and total labor cost. We adopt a new method that utilizes Zimmermann's approach to determine the tolerance and aspiration levels. The actual performance of an industrial company was used to prove the feasibility of the proposed model. The application shows that the proposed method is useful, generalizable, and can be applied to APP problems with other parameters.
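A minimal version of Zimmermann's max-min approach can be written as an ordinary linear program: an auxiliary variable lambda is maximized subject to the linear membership function of each fuzzy objective being at least lambda. The sketch below does this for two products and two cost objectives; all cost coefficients, demands, aspiration and tolerance levels are illustrative assumptions, not figures from the industrial company.

```python
import numpy as np
from scipy.optimize import linprog

# Zimmermann max-min sketch for two fuzzy minimization objectives
# (total production cost, total labour cost) and two products.
c1 = np.array([4.0, 6.0])          # production cost per unit of products 1, 2
c2 = np.array([2.0, 3.0])          # labour cost per unit
z1_lo, z1_hi = 300.0, 500.0        # aspiration / tolerance for production cost
z2_lo, z2_hi = 150.0, 260.0        # aspiration / tolerance for labour cost

# Decision vector: [x1, x2, lam];  maximize lam  <=>  minimize -lam.
obj = [0.0, 0.0, -1.0]
A_ub = [
    [c1[0], c1[1], z1_hi - z1_lo],  # c1.x + lam*(z1_hi - z1_lo) <= z1_hi
    [c2[0], c2[1], z2_hi - z2_lo],  # c2.x + lam*(z2_hi - z2_lo) <= z2_hi
    [1.0,   1.0,   0.0],            # capacity:  x1 + x2 <= 100
    [-1.0,  0.0,   0.0],            # demand:    x1 >= 40
    [0.0,  -1.0,   0.0],            # demand:    x2 >= 30
]
b_ub = [z1_hi, z2_hi, 100.0, -40.0, -30.0]
bounds = [(0, None), (0, None), (0, 1)]

res = linprog(obj, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
x1, x2, lam = res.x
print(f"plan: x1={x1:.1f}, x2={x2:.1f}, overall satisfaction lam={lam:.3f}")
```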
Linear discriminant analysis and logistic regression are the most widely used multivariate statistical methods for the analysis of data with categorical outcome variables. Both are appropriate for the development of linear classification models. Linear discriminant analysis assumes that the explanatory variables follow a multivariate normal distribution, while logistic regression makes no assumptions about the distribution of the explanatory data. Hence, logistic regression is assumed to be the more flexible and more robust method in case of violations of these assumptions.
In this paper we focus on the comparison between three forms of classification for data belonging
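A comparison of this kind is straightforward to reproduce in outline with scikit-learn; the sketch below cross-validates linear discriminant analysis and logistic regression on a synthetic data set, since the paper's own data are not reproduced here.

```python
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic binary-outcome data standing in for the paper's data sets.
X, y = make_classification(n_samples=500, n_features=6, n_informative=4,
                           n_classes=2, random_state=0)

for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                  ("Logistic regression", LogisticRegression(max_iter=1000))]:
    acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
    print(f"{name}: mean CV accuracy = {acc.mean():.3f}")
```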