The aim of this paper is to find the Bayes estimator under a new loss function that combines symmetric and asymmetric loss functions, namely the proposed entropy loss function, which merges the entropy loss function with the squared log error loss function, a loss that is quite asymmetric in nature. The Bayes estimators of the exponential distribution under the proposed loss function are then compared with those under its component loss functions, using the standard mean square error (MSE) and the mean bias (MBias). Random data are generated by simulation to estimate the exponential distribution parameters for different sample sizes (n = 10, 50, 100) with N = 1000 replications, taking initial values for the parameters and an initial value of b, in order to obtain a balanced estimator that combines the two loss functions. In addition, the optimal sample size is determined under the proposed entropy loss function.
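As background for the simulation design just described, the following is a minimal Monte Carlo sketch that compares estimators of the exponential rate by MSE and mean bias (MBias) over N = 1000 replications for n = 10, 50, 100. The gamma(a, b) prior and the two classical Bayes estimators shown here (under the entropy loss and the squared-log error loss) are illustrative assumptions; the paper's proposed combined estimator is not reproduced.

```python
# Minimal Monte Carlo sketch: compare two classical Bayes estimators of the
# exponential rate theta by MSE and MBias. Prior and estimator forms are
# illustrative assumptions, not the paper's proposed combined estimator.
import numpy as np
from scipy.special import digamma

rng = np.random.default_rng(0)
theta, a, b, N = 1.5, 2.0, 1.0, 1000      # true rate, gamma prior (shape, rate), replications

def bayes_entropy(x):
    # Bayes estimator under entropy loss: 1 / E[1/theta | x], posterior gamma(a+n, b+sum)
    return (a + len(x) - 1) / (b + x.sum())

def bayes_sqlog(x):
    # Bayes estimator under squared-log error loss: exp(E[ln theta | x])
    return np.exp(digamma(a + len(x))) / (b + x.sum())

for n in (10, 50, 100):
    for name, est in (("entropy", bayes_entropy), ("squared-log", bayes_sqlog)):
        vals = np.array([est(rng.exponential(1.0 / theta, n)) for _ in range(N)])
        print(f"n={n:3d} {name:12s} MSE={np.mean((vals - theta)**2):.4f} "
              f"MBias={np.mean(vals - theta):+.4f}")
```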
DeepFakes are a concern for celebrities and everyone else because they are simple to create. DeepFake images, especially high-quality ones, are difficult to detect by people, by local descriptors, and by current approaches. Video manipulation detection, on the other hand, is more accessible than image manipulation detection, and many state-of-the-art systems address it; moreover, detecting video manipulation depends entirely on detecting it in the constituent images. Many studies have worked on DeepFake detection in images, but they involve complex mathematical calculations in their preprocessing steps and have many limitations, including that the face must be frontal, the eyes have to be open, and the mouth should be open with the teeth showing, etc. Also, the accuracy of their counterfeit detection ...
The purpose of this study is mainly to improve the competitive position of the products of economic units by using the target cost technique and the reverse engineering method. The technique was applied in one of the public sector companies (the General Company for Vegetable Oils); it is important for detecting the prices accepted in the market for items similar to the company's products and for treating the problem of high cost, drawing the attention of managerial and technical leadership to the weaknesses that need to be improved through the introduction of new, innovative solutions that make appropriate changes to satisfy consumers' needs more cheaply, so as to influence the customer's decision to buy, especially purchases from the economic units ...
In this paper we apply the philosophy of Darwinian selection as a synthesis method, the genetic algorithm (GA), and introduce a new merit function with a simpler form than those used in other work, to design one kind of multilayer optical filter, the high-reflection mirror. We investigate solutions to several practical problems. The designed high-reflection mirrors show good performance with a reduced number of layers, which makes it possible to control the effect of layer-thickness errors on the final product; such a solution can be obtained in a very short time by controlling the length of the chromosome and choosing optimal genetic operators. Results ...
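To illustrate the kind of synthesis described above, the sketch below runs a small real-coded GA whose chromosome is the vector of layer thicknesses of an alternating high/low-index stack and whose merit is the normal-incidence reflectance at the design wavelength, computed with the standard characteristic-matrix method. The merit function, material indices, and GA settings are illustrative assumptions, not the paper's exact formulation.

```python
# GA sketch for a high-reflection mirror: maximize reflectance at the design
# wavelength over the layer thicknesses (characteristic-matrix method).
import numpy as np

rng = np.random.default_rng(1)
lam0, n0, ns = 550e-9, 1.0, 1.52                 # design wavelength, air, glass substrate
n_layers = 12
n = np.where(np.arange(n_layers) % 2 == 0, 2.35, 1.46)   # alternating high/low index

def reflectance(d, lam=lam0):
    """Reflectance of the stack with physical thicknesses d at wavelength lam."""
    M = np.eye(2, dtype=complex)
    for nj, dj in zip(n, d):
        phi = 2 * np.pi * nj * dj / lam
        M = M @ np.array([[np.cos(phi), 1j * np.sin(phi) / nj],
                          [1j * nj * np.sin(phi), np.cos(phi)]])
    B, C = M @ np.array([1.0, ns])
    r = (n0 * B - C) / (n0 * B + C)
    return abs(r) ** 2

def ga(pop_size=40, gens=200, d_max=200e-9, p_mut=0.2):
    pop = rng.uniform(10e-9, d_max, (pop_size, n_layers))
    for _ in range(gens):
        fit = np.array([reflectance(ind) for ind in pop])
        parents = pop[np.argsort(fit)[-pop_size // 2:]]       # truncation selection
        children = []
        while len(children) < pop_size:
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n_layers)                   # single-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            mask = rng.random(n_layers) < p_mut               # Gaussian mutation
            child = np.clip(child + mask * rng.normal(0, 10e-9, n_layers), 1e-9, d_max)
            children.append(child)
        pop = np.array(children)
    best = max(pop, key=reflectance)
    return best, reflectance(best)

d_best, R = ga()
print(f"best reflectance at {lam0 * 1e9:.0f} nm: {R:.4f}")
```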
The aim of this work is to evaluate the one-electron expectation value from the radial electronic density function D(r1) for different wave functions of the 2S state of the Be atom. The wave functions used were published in 1960, 1974, and 1993, respectively. Using a Hartree-Fock wave function written as a Slater determinant, the partitioning technique was applied to analyze the open-shell system of the Be (1s²2s²) state; the Be atom was analyzed into six electron-pair wave functions, two of them for the intra-shells (K, L) and the rest for the inter-shells (KL). The results are obtained numerically using computer programs (Mathcad).
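For context only (this relation is standard textbook material, not quoted from the paper, and the normalization of D(r1) to unity is an assumption about how the one-electron density is defined), the one-electron expectation values are the radial moments of D(r1):

```latex
\langle r_{1}^{\,k} \rangle \;=\; \int_{0}^{\infty} D(r_{1})\, r_{1}^{\,k}\, dr_{1},
\qquad
\int_{0}^{\infty} D(r_{1})\, dr_{1} = 1 .
```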
In this paper, reliable computational methods (RCMs) based on monomial standard polynomials are executed to solve the problem of Jeffery-Hamel flow (JHF). In addition, convenient basis functions, namely the Bernoulli, Euler, and Laguerre polynomials, are used to enhance the reliability of the computational methods. Using such functions turns the problem into a solvable set of nonlinear algebraic equations that Mathematica® 12 can solve. The JHF problem is solved with the help of improved reliable computational methods (I-RCMs), and a review of the methods is given. Published results are also used for comparison. As further evidence of the accuracy and dependability of the proposed methods, the maximum error remainder ...
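The sketch below shows, in a minimal form, the idea of turning the flow problem into a nonlinear algebraic system: a monomial trial polynomial is substituted into the reduced Jeffery-Hamel equation F''' + 2·alpha·Re·F·F' + 4·alpha²·F' = 0 with F(0) = 1, F'(0) = 0, F(1) = 0, and the coefficients are found by collocation. This is not the paper's exact I-RCM; the half-angle, Reynolds number, and polynomial degree are assumed values.

```python
# Collocation sketch for the reduced Jeffery-Hamel equation using a monomial basis.
import numpy as np
from numpy.polynomial import Polynomial
from scipy.optimize import fsolve

alpha, Re, m = np.deg2rad(5.0), 50.0, 8          # half-angle, Reynolds number, degree
xc = np.linspace(0.0, 1.0, m)[1:-1]              # m - 2 interior collocation points

def residuals(c):
    p = Polynomial(c)
    dp, d3p = p.deriv(1), p.deriv(3)
    ode = d3p(xc) + 2 * alpha * Re * p(xc) * dp(xc) + 4 * alpha**2 * dp(xc)
    bcs = [p(0.0) - 1.0, dp(0.0), p(1.0)]        # F(0)=1, F'(0)=0, F(1)=0
    return np.concatenate([ode, bcs])            # (m - 2) + 3 = m + 1 equations

c0 = np.zeros(m + 1)
c0[0], c0[2] = 1.0, -1.0                         # start from F ~ 1 - eta**2
coeffs = fsolve(residuals, c0)
print("F(0.5) ~", Polynomial(coeffs)(0.5))
```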
In this research, several estimators of the hazard function are introduced using a nonparametric method, namely the kernel function, for censored data with varying bandwidth and boundary kernels. Two types of bandwidth are used: local bandwidth and global bandwidth. Moreover, four types of boundary kernel are used, namely the Rectangle, Epanechnikov, Biquadratic, and Triquadratic kernels, and the proposed function is employed with all kernel functions. Two different simulation techniques are also used, in two experiments, to compare these estimators. In most cases, the results show that the local bandwidth is the best for all types of boundary kernel functions ...
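For orientation, the following is a minimal sketch of a kernel-smoothed hazard estimator for right-censored data, built from Nelson-Aalen increments with an Epanechnikov kernel and a single global bandwidth. The paper's boundary corrections and local bandwidths are not reproduced; names and parameter values are illustrative.

```python
# Kernel-smoothed hazard estimator for right-censored data (global bandwidth).
import numpy as np

def epanechnikov(u):
    return np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u**2), 0.0)

def kernel_hazard(t_grid, times, events, b):
    """times: observed times; events: 1 = event, 0 = censored; b: global bandwidth."""
    order = np.argsort(times)
    times, events = times[order], events[order]
    n = len(times)
    at_risk = n - np.arange(n)                  # number still at risk at each ordered time
    increments = events / at_risk               # Nelson-Aalen jumps d_i / n_i
    u = (t_grid[:, None] - times[None, :]) / b
    return (epanechnikov(u) * increments).sum(axis=1) / b

# tiny usage example with simulated exponential lifetimes and uniform censoring
rng = np.random.default_rng(0)
T = rng.exponential(1.0, 200)
C = rng.uniform(0.0, 3.0, 200)
times, events = np.minimum(T, C), (T <= C).astype(float)
grid = np.linspace(0.2, 2.0, 10)
print(kernel_hazard(grid, times, events, b=0.3))   # true hazard is 1 everywhere
```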
Background: The aim of the study was to evaluate the amount of change in the horizontal and vertical maxillary arch dimension measurements following the premature loss of primary molars. Materials and methods: The sample consisted of 50 children in the mixed dentition stage with a unilaterally prematurely extracted first or second primary molar. Results and conclusions: The results show that there was an increase in the vertical incisor-to-canine distance (A) with premature loss of both the first and second primary molars, due to distal movement of the primary canines; at the same time, there was a significant loss of space in the extraction site with premature loss of the second primary molar, due to mesial movement of the maxillary first permanent ...
Companies today compete intensely with each other, so they need to focus on innovation to develop their products and keep them competitive. Lean product development is the ideal way to develop products, foster innovation, maximize value, and reduce time. Set-Based Concurrent Engineering (SBCE) is a proven lean product-development mechanism built on the creation of a number of alternative designs at the subsystem level. These designs are improved and tested simultaneously, and the weaker choices are removed gradually until the optimum solution is finally reached. SBCE has been implemented extensively in the automotive industry, and there are a few case studies in the aerospace industry. This research describes the use of ...
Many production companies suffer large losses because of high production costs and low profits, for several reasons, including high raw-material prices, the absence of taxes on imported goods, and the deactivation of the consumer protection, national product, and customs laws; as a result, most consumers buy imported goods because they offer modern specifications at low prices.
The production company also suffers from uncertainty in cost, production volume, sales, and the availability of raw materials and workers, because these vary with the seasons of the year.
In this research, a fuzzy linear programming model with fuzzy figures was adopted ...
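As a rough illustration of one common way to handle such a model, the sketch below represents uncertain profits, resource usage, and seasonal capacities as triangular fuzzy numbers, defuzzifies them by the centroid (l + m + u) / 3, and solves the resulting crisp linear program. The products, coefficients, and defuzzification rule are illustrative assumptions, not taken from the paper.

```python
# Fuzzy LP sketch: defuzzify triangular fuzzy coefficients, then solve a crisp LP.
import numpy as np
from scipy.optimize import linprog

def centroid(tri):
    """Defuzzify a triangular fuzzy number (low, mode, high)."""
    return sum(tri) / 3.0

# fuzzy unit profits for two hypothetical products
profit = [centroid((4.0, 5.0, 6.5)), centroid((3.0, 4.0, 4.5))]
# fuzzy resource usage per unit and fuzzy seasonal capacities (raw material, labour)
usage = [[centroid((1.5, 2.0, 2.5)), centroid((0.8, 1.0, 1.2))],
         [centroid((2.5, 3.0, 3.5)), centroid((1.8, 2.0, 2.2))]]
capacity = [centroid((90.0, 100.0, 110.0)), centroid((140.0, 150.0, 160.0))]

# maximize profit  <=>  minimize its negative
res = linprog(c=-np.array(profit), A_ub=usage, b_ub=capacity, bounds=[(0, None)] * 2)
print("production plan:", res.x, "max profit:", -res.fun)
```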