The theory of probabilistic programming may be conceived in several different ways. As a method of programming, it analyses the implications of probabilistic variations in the parameter space of a linear or nonlinear programming model. The generating mechanism of such probabilistic variations in economic models may be incomplete information about changes in demand, production and technology, specification errors in the econometric relations presumed for different economic agents, uncertainty of various sorts, and the consequences of imperfect aggregation or disaggregation of economic variables. In this research we discuss the probabilistic programming problem when the coefficient b_i is a random variable with a given Laplace distribution.
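As a minimal illustration of the setting described above (not the paper's own algorithm), the sketch below converts a single chance constraint with a Laplace-distributed right-hand side b_i into its deterministic equivalent via the Laplace quantile function and solves the resulting linear program; the objective, constraint coefficients, location, scale, and confidence level are all hypothetical values.

```python
import numpy as np
from scipy.stats import laplace
from scipy.optimize import linprog

# Hypothetical problem: maximize c @ x subject to P(a @ x <= b_i) >= alpha,
# where b_i ~ Laplace(loc=mu, scale=s). linprog minimizes, so c is negated.
c = np.array([3.0, 2.0])           # objective coefficients (assumed)
a = np.array([1.0, 1.0])           # constraint coefficients (assumed)
mu, s, alpha = 10.0, 1.5, 0.95     # Laplace parameters and confidence level (assumed)

# Deterministic equivalent: P(a @ x <= b_i) >= alpha  <=>  a @ x <= F^{-1}(1 - alpha),
# where F is the Laplace CDF of b_i.
rhs = laplace.ppf(1 - alpha, loc=mu, scale=s)

res = linprog(c=-c, A_ub=a.reshape(1, -1), b_ub=[rhs], bounds=[(0, None)] * 2)
print("deterministic right-hand side:", rhs)
print("optimal x:", res.x)
```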
The research deals with the crises of the global recession from its different facets and calls for the need to think outside ordinary theory and to find arguments within the theory that accommodate the evolution of life, globalization and technological change, the standard of living of individuals, and the size of the disparity in income distribution, not only at the national level but also at the global level, without paying attention to the potential resistance of the usual classical thought. The greater the returns to the factors of production, the more consumption will increase; the marginal propensity to consume may rise, and rise at greater rates among low-income segments (the mouths of the poor) wi
The aim of this research is to estimate the parameters of the linear regression model with errors following an ARFIMA model by using the wavelet method, depending on maximum likelihood and approximate generalized least squares as well as ordinary least squares. We use the estimators in a practical application on real data, namely the monthly data on inflation and the dollar exchange rate obtained from the Central Statistical Organization (CSO) for the period from 1/2005 to 12/2015. The results show that the wavelet maximum likelihood (WML) estimator was more reliable and efficient than the other estimators, and also that changing the fractional difference parameter (d) does not affect the results.
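For context only (not the paper's estimator), the short sketch below generates the moving-average weights of an ARFIMA(0, d, 0) process and uses them to simulate a long-memory error series; the value of d, the series length, and the truncation point are arbitrary choices for illustration.

```python
import numpy as np

def arfima_weights(d, m):
    """MA(infinity) weights of (1 - B)^(-d): psi_0 = 1, psi_k = psi_{k-1} * (k - 1 + d) / k."""
    w = np.empty(m)
    w[0] = 1.0
    for k in range(1, m):
        w[k] = w[k - 1] * (k - 1 + d) / k
    return w

rng = np.random.default_rng(0)
d, n, m = 0.3, 500, 300             # fractional parameter, series length, truncation (assumed)
eps = rng.standard_normal(n + m)    # white-noise innovations
psi = arfima_weights(d, m)
errors = np.convolve(eps, psi)[m:m + n]   # truncated ARFIMA(0, d, 0) long-memory errors
print(errors[:5])
```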
In this paper, we investigate the connection between hierarchical models and the power prior distribution in quantile regression (QReg). For a specific quantile, we develop an expression for the power parameter that calibrates the power prior distribution for quantile regression to a corresponding hierarchical model. In addition, we estimate the relation between the power parameter and the quantile level via the hierarchical model. Our proposed methodology is illustrated with a real data example.
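As background (standard definitions rather than the paper's specific derivation), the power prior raises the historical-data likelihood to a power a_0 in [0, 1], and Bayesian quantile regression typically works with the asymmetric Laplace (check-loss) likelihood:

\[
\pi(\beta \mid D_0, a_0) \;\propto\; L(\beta \mid D_0)^{a_0}\,\pi_0(\beta), \qquad 0 \le a_0 \le 1,
\]
\[
L(\beta \mid D) \;\propto\; \prod_{i=1}^{n} \tau(1-\tau)\exp\!\bigl\{-\rho_\tau\bigl(y_i - x_i^{\top}\beta\bigr)\bigr\}, \qquad \rho_\tau(u) = u\bigl(\tau - I(u<0)\bigr),
\]

where \(\tau\) is the quantile level, \(a_0 = 0\) discards the historical data \(D_0\), and \(a_0 = 1\) pools it fully with the current data.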
In this paper, the error distribution function of the single index model is estimated by the empirical distribution function and by the kernel distribution function. The refined minimum average variance estimation (RMAVE) method is used to estimate the single index model. We use simulation experiments to compare the two estimators of the error distribution function for different sample sizes; the results show that the kernel distribution function is better than the empirical distribution function.
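To make the two estimators concrete (a generic sketch on hypothetical residuals, not the paper's RMAVE pipeline), the code below compares the empirical distribution function with a kernel-smoothed distribution function built from the integrated Gaussian kernel; the Silverman-type bandwidth is chosen here only for illustration.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
e = rng.standard_normal(200)          # hypothetical model residuals

def ecdf(x, data):
    """Empirical distribution function evaluated at points x."""
    return np.mean(data[None, :] <= x[:, None], axis=1)

def kernel_cdf(x, data, h=None):
    """Kernel distribution function: average of Gaussian CDFs centred at the residuals."""
    if h is None:
        h = 1.06 * data.std(ddof=1) * len(data) ** (-1 / 5)   # Silverman-type bandwidth
    return norm.cdf((x[:, None] - data[None, :]) / h).mean(axis=1)

grid = np.linspace(-3, 3, 7)
print("ECDF  :", np.round(ecdf(grid, e), 3))
print("kernel:", np.round(kernel_cdf(grid, e), 3))
```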
The distribution or retention of profits is the third financial management decision in terms of priority, whether at the level of theory or of practice. The issue of distribution or retention involves multiple parties in terms of influence and impact, and determining the optimal percentage for each component is still the subject of intellectual debate, because these decisions are linked to the future of the organization and to several considerations. The research focuses on the nature of the policies followed by the Iraqi banking sector. The sample, chosen by the intentional (purposive) sampling method, was represented by the Commercial Bank of
The goal of this work is to demonstrate, through gradient observation of a linear system of type ( -systems), the possibility of reducing the effect of any disturbances (pollution, radiation, infection, etc.) asymptotically, by a suitable choice of actuators related to these systems. Thus, a class of ( -systems) was developed based on finite-time ( -systems). Furthermore, definitions and some properties of this concept of ( -system) and of the asymptotically gradient controllable system ( -controllable) were stated and studied. More precisely, asymptotically gradient efficient actuators ensuring weak asymptotic gradient compensation of the system ( -system) for known or unknown disturbances are examined. Consequently, under convenient hypo
This paper develops a nonlinear transient three-dimensional heat transfer finite element model and a rate-independent three-dimensional deformation model for CO2 laser welding simulations of the Al-6061-T6 alloy. Simulations are performed using an indirect coupled thermal-structural method for the welding process. Temperature-dependent thermal properties of Al-6061-T6, the effect of the latent heat of fusion, and convective and radiative boundary conditions are included in the model. The heat input to the model is assumed to be a Gaussian heat source. The finite element code ANSYS12, along with a few FORTRAN subroutines, is employed to obtain the numerical results. The benefit of the proposed methodology is that it
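For reference only, a commonly used form of the Gaussian surface heat source is sketched below; the laser power, absorption efficiency, and beam radius are hypothetical values, and the specific distribution parameters used in the paper are not reproduced here.

```python
import numpy as np

def gaussian_surface_flux(r, power, efficiency, beam_radius):
    """Gaussian surface heat flux q(r) = 3*eta*P / (pi*r_b^2) * exp(-3 r^2 / r_b^2) [W/m^2]."""
    peak = 3.0 * efficiency * power / (np.pi * beam_radius ** 2)
    return peak * np.exp(-3.0 * r ** 2 / beam_radius ** 2)

# Hypothetical process parameters for a CO2 laser weld.
P, eta, r_b = 2000.0, 0.7, 0.3e-3      # laser power [W], absorptivity, beam radius [m]
r = np.linspace(0.0, 2 * r_b, 5)       # radial positions from the beam centre
print(gaussian_surface_flux(r, P, eta, r_b))
```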
Single Point Incremental Forming (SPIF) is a forming technique for sheet material based on layered manufacturing principles. The sheet part is locally deformed through horizontal slices. The moving locus of the forming tool (called the toolpath) through these slices, which builds up the finished part, is executed by CNC technology. The toolpath is created directly from the CAD model of the final product. The forming tool is a ball-end tool, which is moved along the toolpath while the edges of the sheet material are clamped rigidly in a fixture.
This paper presents an investigation of the thinning distribution of conical shapes produced by incremental forming and a validation of the finite element method to evaluate the limits of the p
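As an illustration of the slicing idea described above (a hypothetical toolpath generator, not the authors' CNC code), the sketch below builds z-level circular contours for a truncated cone, where each step down reduces the contour radius according to the wall angle; all geometric parameters are assumed.

```python
import numpy as np

def conical_spif_toolpath(top_radius, depth, wall_angle_deg, step_down, points_per_circle=72):
    """Return (x, y, z) points of z-level circular contours for a cone formed by SPIF."""
    wall = np.radians(wall_angle_deg)                # wall angle measured from the sheet plane
    theta = np.linspace(0.0, 2.0 * np.pi, points_per_circle, endpoint=False)
    path, z = [], 0.0
    while z < depth:
        z = min(z + step_down, depth)
        r = top_radius - z / np.tan(wall)            # radius shrinks with depth for a conical wall
        if r <= 0:
            break
        path.append(np.column_stack([r * np.cos(theta), r * np.sin(theta), -z * np.ones_like(theta)]))
    return np.vstack(path)

# Hypothetical geometry: 40 mm top radius, 20 mm depth, 60 deg wall angle, 0.5 mm step down.
toolpath = conical_spif_toolpath(40.0, 20.0, 60.0, 0.5)
print(toolpath.shape)
```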
This paper discusses estimating the two scale parameters of the Exponential-Rayleigh distribution for singly type-I censored data, one of the most important forms of right-censored data, using the maximum likelihood estimation method (MLEM), one of the most popular and widely used classical methods, based on an iterative procedure such as Newton-Raphson to find estimates of these two scale parameters. Real COVID-19 data were used, taken from the Iraqi Ministry of Health and Environment, AL-Karkh General Hospital. The duration of the study was the interval from 4/5/2020 until 31/8/2020, equivalent to 120 days, during which the number of patients who entered the (study) hospital gave a sample size of (n = 785). The number o
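To illustrate the general shape of such an estimation problem (a generic sketch, not the paper's Exponential-Rayleigh likelihood or its data), the code below maximizes a type-I right-censored log-likelihood for a two-parameter Weibull stand-in, using scipy's quasi-Newton optimizer in place of a hand-coded Newton-Raphson; all data are simulated.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

rng = np.random.default_rng(2)
true_shape, true_scale, T = 1.5, 30.0, 60.0          # hypothetical parameters and censoring time (days)
t = weibull_min.rvs(true_shape, scale=true_scale, size=785, random_state=rng)
observed = np.minimum(t, T)                          # singly type-I censored at time T
delta = (t <= T).astype(float)                       # 1 = event observed, 0 = censored

def neg_loglik(params):
    """Censored log-likelihood: events contribute log f(t), censored cases contribute log S(T)."""
    shape, scale = np.exp(params)                    # work on the log scale to keep parameters positive
    logf = weibull_min.logpdf(observed, shape, scale=scale)
    logS = weibull_min.logsf(observed, shape, scale=scale)
    return -np.sum(delta * logf + (1 - delta) * logS)

fit = minimize(neg_loglik, x0=np.log([1.0, 20.0]), method="BFGS")
print("estimated shape, scale:", np.exp(fit.x))
```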
The purchase of a home and access to housing is one of the most important requirements for the life of the individual and the stability of living. House prices in general, and in Baghdad in particular, are affected by several factors, including the base area of the house, the age of the house, the neighborhood in which the housing is located, and the basic services available. The statistical SSM model was used to model house prices over the period from 2000 to 2018 and to forecast until 2025. The research is concerned with enhancing the importance of this model and describing it as a standard and important one compared to the models used in time series analysis, after obtaining the
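Assuming SSM here refers to a state space model for the price series (an interpretation, not stated explicitly in the abstract), the sketch below fits a local linear trend state space model with statsmodels to a hypothetical annual price series and forecasts seven years ahead; the data are synthetic.

```python
import numpy as np
from statsmodels.tsa.statespace.structural import UnobservedComponents

# Hypothetical annual house-price index for 2000-2018 (synthetic numbers, 19 observations).
rng = np.random.default_rng(3)
prices = 100 + np.cumsum(5 + rng.normal(0, 2, 19))

# Local linear trend state space model: observed price = level + noise,
# where the level and its slope each follow a random walk.
model = UnobservedComponents(prices, level="local linear trend")
result = model.fit(disp=False)
print(result.forecast(steps=7))   # seven steps ahead, i.e. 2019-2025
```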