This paper is concerned with pre-test single and double stage shrunken estimators for the mean (θ) of a normal distribution when a prior estimate (θ0) of the actual value (θ) is available, using specified shrinkage weight factors ψ(·) as well as a pre-test region (R). Expressions for the bias B(·), mean squared error MSE(·), efficiency EFF(·), and expected sample size E(n/θ) of the proposed estimators are derived. Numerical results are given, and conclusions are drawn about the selection of the different constants included in these expressions. Comparisons between the suggested estimators and the classical estimators, in the sense of bias and relative efficiency, are given. Furthermore, comparisons with earlier existing works are drawn.
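The core idea above, shrinking the sample mean toward a prior estimate when it falls inside the pre-test region, can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact estimator; the weight `psi` and region half-width `r` are assumed names, not the paper's notation.

```python
import numpy as np

def shrunken_estimate(sample, theta0, psi=0.5, r=1.0):
    """Single-stage pre-test shrunken estimator for a normal mean (illustrative).

    If the sample mean falls inside the pre-test region
    R = [theta0 - r, theta0 + r], shrink it toward the prior estimate
    theta0 with weight factor psi; otherwise keep the classical
    estimator (the sample mean). psi and r are illustrative parameters.
    """
    xbar = np.mean(sample)
    if abs(xbar - theta0) <= r:            # pre-test: is xbar inside R?
        return psi * xbar + (1.0 - psi) * theta0
    return xbar                            # fall back to the classical estimator

# Example: prior estimate 5.4, sample mean 5.0 -> shrunk estimate 5.2
est = shrunken_estimate([4.0, 6.0], theta0=5.4)
```

When the prior estimate is close to the true mean, such estimators typically trade a small bias for a reduced mean squared error relative to the classical estimator.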
In this work, a diode planar magnetron sputtering device was designed and fabricated. The device consists of two aluminum discs, 8 cm in diameter and 5 mm thick. The distance between the two electrodes was set to 2 cm, 3 cm, 4 cm, and 5 cm.
A double probe of tungsten wire, 0.1 mm in diameter and 1.2 mm in length, was designed and constructed to investigate the electron temperature and the electron and ion densities at different distances between the cathode and anode. The probes were situated in the center of the plasma, between the anode and cathode.
The results of this work show that, as the distance between the cathode and anode increases, the electron temperature decreases. Also, the electron density increases with increasing …
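The electron temperature measurement described above can be illustrated with the standard symmetric double-probe relation, in which the slope of the I-V characteristic at the floating point satisfies dI/dV|_{V=0} = I_sat / (2·Te[eV]). A minimal sketch, assuming that standard relation (the numerical values are illustrative, not measurements from this work):

```python
def electron_temperature_ev(i_sat, slope_at_zero):
    """Estimate Te (in eV) from symmetric double-probe I-V data.

    For a symmetric double probe, the slope of the characteristic at
    V = 0 satisfies dI/dV|_{V=0} = I_sat / (2 * Te[eV]), hence
    Te[eV] = I_sat / (2 * dI/dV|_{V=0}).
    """
    return i_sat / (2.0 * slope_at_zero)

# Illustrative numbers: ion saturation current 2 mA, slope 0.25 mA/V -> Te = 4 eV
te = electron_temperature_ev(i_sat=2.0e-3, slope_at_zero=0.25e-3)
```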
In this research, a nonparametric technique is presented to estimate the time-varying coefficient functions for longitudinal balanced data, characterized by observations obtained from (n) independent subjects, each of which is measured repeatedly at a group of (m) specified time points. Although the measurements are independent among different subjects, they are mostly correlated within each subject; the applied technique is the local linear kernel (LLPK) technique. To avoid the problems of dimensionality and heavy computation, a two-step method is used to estimate the coefficient functions by means of the aforementioned technique. Since the two-
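The local linear kernel building block described above can be sketched as a weighted least-squares fit at a target time point. This is a generic sketch of one local linear fit, not the paper's full two-step procedure; the Gaussian kernel and bandwidth `h` are assumptions.

```python
import numpy as np

def local_linear_fit(t, y, t0, h):
    """Local linear kernel estimate of m(t0) from observation pairs (t, y).

    Weighted least-squares fit of y ~ b0 + b1*(t - t0) with Gaussian
    kernel weights of bandwidth h; the intercept b0 is the estimate
    m_hat(t0). A sketch of the LLPK building block.
    """
    t, y = np.asarray(t, float), np.asarray(y, float)
    w = np.exp(-0.5 * ((t - t0) / h) ** 2)           # Gaussian kernel weights
    X = np.column_stack([np.ones_like(t), t - t0])   # local linear design
    beta, *_ = np.linalg.lstsq(X * np.sqrt(w)[:, None],
                               y * np.sqrt(w), rcond=None)
    return beta[0]
```

A useful property: the local linear estimator reproduces straight-line trends exactly, which is one reason it is preferred over the local constant (Nadaraya-Watson) smoother near boundaries.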
Due to advancements in computer science and technology, impersonation has become more common. Today, biometrics technology is widely used in various aspects of people's lives. Iris recognition, known for its high accuracy and speed, is a significant and challenging field of study. As a result, iris recognition technology and biometric systems are utilized for security in numerous applications, including human-computer interaction and surveillance systems. It is crucial to develop advanced models to combat impersonation crimes. This study proposes sophisticated artificial intelligence models with high accuracy and speed to eliminate these crimes. The models use linear discriminant analysis (LDA) for feature extraction and mutual information …
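The LDA feature-extraction step mentioned above can be illustrated with a minimal two-class Fisher discriminant, which projects samples onto the direction that best separates the classes. This is a generic textbook sketch, not the paper's iris-recognition pipeline.

```python
import numpy as np

def fisher_lda_direction(X0, X1):
    """Two-class Fisher LDA: the direction w maximizing between-class
    scatter over within-class scatter, w ∝ Sw^{-1} (m1 - m0).
    Projections X @ w can serve as a 1-D discriminant feature."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    Sw = (np.cov(X0, rowvar=False) * (len(X0) - 1)
          + np.cov(X1, rowvar=False) * (len(X1) - 1))  # within-class scatter
    w = np.linalg.solve(Sw, m1 - m0)
    return w / np.linalg.norm(w)
```

For classes separated along one axis with equal spherical scatter, the recovered direction aligns with that axis, as expected.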
Mixed-effects conditional logistic regression is evidently more effective in the study of qualitative differences in longitudinal pollution data, as well as their implications for heterogeneous subgroups. This study argues that conditional logistic regression is a robust evaluation method for environmental studies, through the analysis of environmental pollution as a function of oil production and environmental factors. Consequently, it has been established theoretically that the primary objective of model selection in this research is to identify the candidate model that is optimal for the conditional design. The candidate model should achieve generalizability, goodness of fit, and parsimony, and establish an equilibrium between bias and variab…
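The balance between goodness of fit and parsimony described above is commonly scored with information criteria. A minimal generic sketch (AIC and BIC as standard textbook formulas, not necessarily the criterion used in this study; the numbers are illustrative):

```python
import math

def aic(log_likelihood, k):
    """Akaike information criterion: 2k - 2*lnL (lower is better)."""
    return 2 * k - 2 * log_likelihood

def bic(log_likelihood, k, n):
    """Bayesian information criterion: k*ln(n) - 2*lnL (lower is better)."""
    return k * math.log(n) - 2 * log_likelihood

# Illustrative candidate: lnL = -120.0, 5 parameters, n = 200 observations
a = aic(-120.0, 5)        # penalizes parameters linearly
b = bic(-120.0, 5, 200)   # penalizes parameters more heavily for large n
```

BIC's stronger penalty for model size tends to favor more parsimonious candidates than AIC as the sample size grows.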
This article aims to determine the time-dependent heat coefficient together with the temperature solution for a type of semi-linear time-fractional inverse source problem, by applying a method based on a finite difference scheme and Tikhonov regularization. An unconditionally stable implicit finite difference scheme is used as the direct (forward) solver, while the inverse problem is reformulated as a nonlinear least-squares minimization and solved efficiently by the MATLAB routine lsqnonlin from the Optimization Toolbox. Since the problem is, in general, ill-posed, any error in the input data will produce a large error in the output data. Therefore, the Tikhonov regularization technique is applied …
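The Tikhonov step can be illustrated in its simplest linear form: minimizing ||Ax − b||² + λ||x||², solved through the regularized normal equations. This is a generic sketch of the regularization idea, not the paper's lsqnonlin-based nonlinear setup.

```python
import numpy as np

def tikhonov_solve(A, b, lam):
    """Solve min ||A x - b||^2 + lam * ||x||^2 via the regularized
    normal equations (A^T A + lam * I) x = A^T b. The penalty lam
    damps the noise amplification typical of ill-posed problems."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# As lam -> 0 the solution approaches ordinary least squares;
# larger lam trades fidelity to the data for stability.
```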
In this study, we present a new steganography method based on quantizing the bands of perceptual color spaces. Four perceptual color spaces are used to test the new method, namely HSL, HSV, Lab, and Luv, where different algorithms are used to calculate the last two color spaces. The results demonstrate the validity of this method as a steganographic method, and an analysis of the effects of the quantization and stego processes on the quality of the cover image and of the perceptual color space bands is presented.
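One common way to hide data by quantization, which the method above is in the spirit of, is to snap a band value to a multiple of a step q whose parity encodes the hidden bit. This is a hedged, generic sketch; the paper's exact quantizer for the HSL/HSV/Lab/Luv bands may differ, and the step size q = 4 is an assumption.

```python
def embed_bit(value, bit, q=4):
    """Hide one bit in a band value by quantizing it to the nearest
    multiple of q whose parity (even/odd multiple) encodes the bit."""
    k = round(value / q)
    if k % 2 != bit:
        # move to the nearest multiple with the right parity
        k += 1 if value / q >= k else -1
    return k * q

def extract_bit(value, q=4):
    """Recover the hidden bit from the parity of the quantized value."""
    return round(value / q) % 2
```

The distortion per band value is bounded by q, which is what ties the quantization step directly to the cover-image quality analyzed in the study.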
In this paper, a computational method for solving optimal control problems is presented, using an indirect method (a spectral technique) based on Boubaker polynomials. In this method, the state and adjoint variables are approximated by Boubaker polynomials with unknown coefficients; thus, the optimal control problem is transformed into algebraic equations which can be solved easily, and then the numerical value of the performance index is obtained. The operational matrices of differentiation and integration are also deduced for the same polynomials to make solving the problems easier. A numerical example is given to show the applicability and efficiency of the method. Some characteristics of these polynomials, which can be used for solving …
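The Boubaker polynomials used as the approximation basis above satisfy (to the best of my knowledge, stated with hedging since the source does not spell it out) the three-term recurrence B_0(x) = 1, B_1(x) = x, B_2(x) = x² + 2, and B_m(x) = x·B_{m−1}(x) − B_{m−2}(x) for m > 2. A minimal sketch evaluating them:

```python
def boubaker(m, x):
    """Evaluate the Boubaker polynomial B_m at x via the recurrence
    B_m = x * B_{m-1} - B_{m-2}, with B_0 = 1, B_1 = x, B_2 = x^2 + 2.
    For example, B_3(x) = x^3 + x and B_4(x) = x^4 - 2."""
    if m == 0:
        return 1.0
    if m == 1:
        return x
    prev, cur = x, x * x + 2.0   # B_1, B_2
    for _ in range(3, m + 1):
        prev, cur = cur, x * cur - prev
    return cur
```

In a spectral method, the state and adjoint are expanded in such a basis, and the operational matrices mentioned in the abstract express differentiation and integration of the basis as matrix actions on the coefficient vectors.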