The theory of probabilistic programming may be conceived in several different ways. As a method of programming it analyses the implications of probabilistic variations in the parameter space of linear or nonlinear programming models. Such probabilistic variations in economic models may arise from incomplete information about changes in demand, production and technology, from specification errors in the econometric relations presumed for different economic agents, from uncertainty of various sorts, and from the consequences of imperfect aggregation or disaggregation of economic variables. In this research we discuss the probabilistic programming problem in which the coefficient b_i is a random variable with a given Laplace distribution.
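As a hedged illustration of this setting (the paper's exact formulation may differ), a chance constraint whose right-hand side b_i follows a Laplace(mu_i, lambda_i) distribution can be replaced by a deterministic equivalent through the Laplace quantile function:

```latex
P\!\left(\sum_{j} a_{ij} x_j \le b_i\right) \ge \alpha_i
\;\Longleftrightarrow\;
\sum_{j} a_{ij} x_j \le F_{b_i}^{-1}(1-\alpha_i),
\qquad
F_{b_i}^{-1}(p) =
\begin{cases}
\mu_i + \lambda_i \ln(2p), & p \le \tfrac{1}{2},\\[2pt]
\mu_i - \lambda_i \ln\bigl(2(1-p)\bigr), & p > \tfrac{1}{2}.
\end{cases}
```

The random constraint is thus reduced to an ordinary linear constraint that can be handled by standard linear or nonlinear programming methods.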
This article aims to estimate the partially linear model using two methods, the wavelet and kernel smoothers. Simulation experiments are used to study small-sample behavior for different functions, sample sizes, and variances. The results show that the wavelet smoother is the best according to the mean average squared error criterion in all cases considered.
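A minimal sketch of how such a simulation comparison can be set up, assuming a toy partially linear model y = beta*x + g(t) + e, a Gaussian Nadaraya-Watson kernel smoother, and an illustrative bandwidth; the paper's actual designs, smoothers and error criterion may differ:

```python
import numpy as np

def nadaraya_watson(t_train, y_train, t_eval, h):
    """Gaussian-kernel Nadaraya-Watson estimate of E[y | t] at the points t_eval."""
    w = np.exp(-0.5 * ((t_eval[:, None] - t_train[None, :]) / h) ** 2)
    return (w @ y_train) / w.sum(axis=1)

def one_replicate(n=100, beta=2.0, sigma=0.5, h=0.1, rng=None):
    """One simulation replicate: fit the kernel smoother and return its average squared error."""
    rng = np.random.default_rng() if rng is None else rng
    x = rng.normal(size=n)
    t = rng.uniform(size=n)
    g = np.sin(2 * np.pi * t)                       # true nonparametric component
    y = beta * x + g + rng.normal(scale=sigma, size=n)
    # Robinson-type profiling: remove the t-effect from y and x, then estimate beta by OLS
    y_s = nadaraya_watson(t, y, t, h)
    x_s = nadaraya_watson(t, x, t, h)
    beta_hat = np.sum((x - x_s) * (y - y_s)) / np.sum((x - x_s) ** 2)
    g_hat = nadaraya_watson(t, y - beta_hat * x, t, h)
    return np.mean((g_hat - g) ** 2)

# mean average squared error over replicates -- the comparison criterion
mase = np.mean([one_replicate(rng=np.random.default_rng(i)) for i in range(200)])
print(f"kernel smoother MASE: {mase:.4f}")
```

The same loop, with the kernel step swapped for a wavelet-based smoother, would provide the second column of such a comparison.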
The design of components subjected to contact stress, a local compressive stress, is important in engineering applications, especially in ball-and-socket joints. Two kinds of contact stress arise in the ball-and-socket joint: the first from normal contact and the other from sliding contact. In addition, joining two long links (as in the drive shaft of a car steering system) introduces flexural and torsional buckling stresses in the hollow columns, transmitted through the ball-and-socket ends, which affect the failure condition of the joining mechanism. In this paper the combined effect of buckling load and contact stress on ball-and-socket joints is considered, especially the stress distribution in the contact area. Different ...
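For orientation only (the paper's own analysis may use different relations), the two loadings mentioned above are classically described by the Hertz contact formulas for a ball of radius R_1 in a conforming socket of radius R_2 and by the Euler buckling load of the hollow link:

```latex
\frac{1}{R} = \frac{1}{R_1} - \frac{1}{R_2}, \qquad
\frac{1}{E^{*}} = \frac{1-\nu_1^{2}}{E_1} + \frac{1-\nu_2^{2}}{E_2}, \qquad
a = \left(\frac{3 F R}{4 E^{*}}\right)^{1/3}, \qquad
p_{\max} = \frac{3F}{2\pi a^{2}},
% Euler buckling of the hollow column (effective length L_e, second moment of area I)
P_{cr} = \frac{\pi^{2} E I}{L_e^{2}}.
```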
The research aims to determine the optimal production mix when several conflicting objectives must be achieved at the same time. The discussion therefore covers the concept of goal programming and the approaches used to solve it, presents the general formulation of the goal programming model, and finally determines the optimal production mix by applying a goal programming model to a hypothetical case.
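A minimal, purely hypothetical sketch of a weighted goal-programming model of this kind, written as a linear program with deviation variables (the products, targets and weights below are invented for illustration and solved with scipy.optimize.linprog):

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: two products, three conflicting goals (illustrative only).
# Variable order: [x1, x2, d1-, d1+, d2-, d2+, d3-, d3+]
# Goal 1: profit 5*x1 + 4*x2 -> target 200 (penalize under-achievement d1-)
# Goal 2: labour 2*x1 + 3*x2 -> target 120 (penalize over-use d2+)
# Goal 3: demand x1           -> target 30  (penalize shortfall d3-)
A_eq = np.array([
    [5, 4, 1, -1, 0,  0, 0,  0],
    [2, 3, 0,  0, 1, -1, 0,  0],
    [1, 0, 0,  0, 0,  0, 1, -1],
])
b_eq = np.array([200, 120, 30])
weights = np.array([0, 0, 3, 0, 0, 2, 1, 0])   # weights on the unwanted deviations only

res = linprog(c=weights, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 8, method="highs")
x1, x2 = res.x[:2]
print(f"production mix: x1={x1:.1f}, x2={x2:.1f}, weighted deviation={res.fun:.2f}")
```

Each goal is written as an equality with under- and over-achievement variables d- and d+, and only the unwanted deviations receive positive weights in the objective.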
The concepts of higher bi-homomorphism and Jordan higher bi-homomorphism are introduced, and the relation between Jordan and ordinary higher bi-homomorphisms is studied. The concepts of co-higher bi-homomorphism and co-Jordan higher bi-homomorphism are also introduced, and the relation between them in Banach algebras is studied.
The study aims to elucidate differences in the distribution of the labor force by occupation in Sulaymaniyah governorate for the year 2013, based on the results of a field study of the governorate, and to explain the spatial differences in the labor force by occupation. The study finds that executive staff, clerks and related workers account for the highest share of the total labor force; specialists, technicians and related workers rank second; and production workers, transport-equipment operators and related workers rank third in the total labor force. Executive staff and related workers also rank highest within the male labor force, with production workers ranking second for the male labor force, while the female labor for ...
In this paper, an algorithm is introduced through which we can embed more data than regular spatial-domain methods allow. The secret data are compressed using Huffman coding, and the compressed data are then embedded using the Laplacian sharpening method. Laplacian filters are used to determine the effective hiding places; based on a threshold value, the places with the highest filter responses are selected for embedding the watermark. Our aim in this work is to increase the capacity of the embedded information by using Huffman coding, and at the same time to increase the security of the algorithm by hiding the data in places that have the highest edge values and are therefore less noticeable. The perform ...
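A minimal sketch of the position-selection and embedding idea described above. The Huffman compression of the secret data is assumed to have already produced the bit string, and the kernel, threshold and least-significant-bit embedding used here are illustrative choices rather than the paper's exact procedure:

```python
import numpy as np
from scipy.ndimage import convolve

LAPLACIAN = np.array([[0,  1, 0],
                      [1, -4, 1],
                      [0,  1, 0]], dtype=float)

def embed(cover, bits, threshold):
    """Hide a bit sequence in the LSBs of the strongest-edge pixels of a grayscale image."""
    response = np.abs(convolve(cover.astype(float), LAPLACIAN, mode="reflect"))
    candidates = np.argwhere(response > threshold)            # the effective hiding places
    # visit candidate pixels from the strongest to the weakest Laplacian response
    order = np.argsort(-response[candidates[:, 0], candidates[:, 1]])
    stego = cover.copy()
    for bit, (r, c) in zip(bits, candidates[order]):          # capacity = len(candidates) bits
        stego[r, c] = (stego[r, c] & 0xFE) | int(bit)         # overwrite the least significant bit
    return stego

# Toy usage: a random "cover image" and a payload assumed to be Huffman-compressed already.
rng = np.random.default_rng(0)
cover = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
payload_bits = rng.integers(0, 2, size=200)
stego = embed(cover, payload_bits, threshold=40)
print("pixels modified:", int(np.count_nonzero(stego != cover)))
```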
The research deals with a comparative study, using simulation, of some semi-parametric estimation methods for the partially linear single-index model. Two approaches to estimating this model are considered: the two-stage procedure and MADE. Simulations were used to study the finite-sample performance of the estimation methods for different single-index models, error variances, and sample sizes, with the mean average squared error used as the comparison criterion. The results showed a preference for the two-stage procedure in all the cases that were considered.
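For reference, and using assumed notation that may differ from the paper's, the partially linear single-index model has the general form

```latex
Y_i = X_i^{\top}\beta + g\!\left(Z_i^{\top}\theta\right) + \varepsilon_i,
\qquad \lVert \theta \rVert = 1, \qquad i = 1,\dots,n,
```

where g is an unknown univariate link function, and the mean average squared error over R simulation replicates, used as the comparison criterion, can be written as

```latex
\mathrm{MASE}(\hat m) = \frac{1}{R}\sum_{r=1}^{R}\frac{1}{n}\sum_{i=1}^{n}
\bigl(\hat m_r(X_i, Z_i) - m(X_i, Z_i)\bigr)^{2},
\qquad m(x,z) = x^{\top}\beta + g\!\left(z^{\top}\theta\right).
```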
Facial recognition has been an active field of imaging science. With recent progress in computer vision, it is extensively applied in various areas, especially in law enforcement and security. The human face is a viable biometric that can be used effectively in both identification and verification. Thus far, regardless of the facial model and metrics employed, its main shortcoming is that it requires a facial image against which the comparison is made. Therefore, closed-circuit televisions and a facial database are always needed in an operational system. Unfortunately, over the last few decades we have experienced an emergence of asymmetric warfare, where acts of terrorism are often committed in secluded areas with no ...
This paper is devoted to comparing the performance of non-Bayesian estimators, represented by the maximum likelihood estimator of the scale parameter and reliability function of the inverse Rayleigh distribution, with Bayesian estimators obtained under two types of loss function, namely the linear-exponential (LINEX) loss function and the entropy loss function, taking into consideration informative and non-informative priors. The performance of the estimators is assessed on the basis of the mean square error (MSE) criterion. Monte Carlo simulation experiments are conducted in order to obtain the required results.
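For context, and using one common parameterization with scale parameter theta (the paper's notation may differ), the inverse Rayleigh density, reliability function, maximum likelihood estimator, and the two loss functions are:

```latex
f(x;\theta) = \frac{2\theta}{x^{3}}\, e^{-\theta/x^{2}}, \qquad
R(t) = 1 - e^{-\theta/t^{2}}, \qquad
\hat{\theta}_{\mathrm{MLE}} = \frac{n}{\sum_{i=1}^{n} x_i^{-2}},
% LINEX and entropy losses for an estimator \hat\theta of \theta
L_{\mathrm{LINEX}}(\hat\theta,\theta) = e^{a(\hat\theta-\theta)} - a(\hat\theta-\theta) - 1,
\qquad
L_{\mathrm{E}}(\hat\theta,\theta) = \frac{\hat\theta}{\theta} - \ln\frac{\hat\theta}{\theta} - 1.
```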
Polarization modulation plays an important role in polarization encoding for quantum key distribution. By using polarization modulation, quantum key distribution systems become more compact and less vulnerable, since a single laser source is used instead of multiple laser sources that may open side-channel attacks. Metasurfaces, with their exceptional optical properties, have led to the development of versatile ultrathin optical devices. They consist of planar arrays of resonant or nearly resonant subwavelength elements and provide complete control over reflected and transmitted electromagnetic waves, opening several possibilities for the development of innovative optical components. In this work, the Si nanowire metasurface grating polarize ...