This research presents a comparative simulation study of semi-parametric estimation methods for the partially linear single-index model. Two approaches to estimating this model are considered: the two-stage procedure and MADE. Simulations were used to study the finite-sample performance of the estimation methods under different single-index models, error variances, and sample sizes, and the mean average squared error was used as the comparison criterion between the methods. The results showed a preference for the two-stage procedure in all of the cases considered.
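As a rough illustration of the comparison criterion described above, the sketch below averages the mean average squared error of competing estimators over Monte Carlo replications. The `simulate_data`, `true_function`, and estimator callables are hypothetical placeholders; the two-stage and MADE estimators themselves are not reproduced here.

```python
import numpy as np

def mase(fitted, true_vals):
    """Mean average squared error of fitted values against the true regression function."""
    return np.mean((np.asarray(fitted) - np.asarray(true_vals)) ** 2)

def compare_methods(estimators, simulate_data, true_function, n_reps=500, n=100, seed=0):
    """Average the MASE of each estimator over Monte Carlo replications (a minimal sketch)."""
    rng = np.random.default_rng(seed)
    scores = {name: [] for name in estimators}
    for _ in range(n_reps):
        x, y = simulate_data(n, rng)           # one sample from the single-index design
        truth = true_function(x)               # true mean response at the design points
        for name, fit in estimators.items():
            scores[name].append(mase(fit(x, y), truth))
    return {name: float(np.mean(vals)) for name, vals in scores.items()}
```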
This paper presents a routine laboratory core analysis study of ten sandstone core samples taken from the Zubair Reservoir, West Quarna Oil Field. The petrophysical properties of the rock, such as porosity, permeability, grain size, roundness and sorting, mineral type, and the volume of shale within the samples, were measured with several instruments at the Petroleum Technology Department, University of Technology, including the OFITE BLP-530 gas porosimeter, the PERG-200TM gas permeameter and liquid permeameter, the GeoSpec2 apparatus (NMR method), scanning electron microscopy (SEM), and the OFITE Spectral Gamma Ray Logger. Comparing the porosity and permeability results measured by these instruments, a significant variation is apparent.
The basic concept of diversity is that two or more inputs at the receiver are used to obtain uncorrelated signals. The aim of this paper is to compare some possible combinations of diversity reception and MLSE detection techniques. Several diversity combining techniques can be distinguished: Equal Gain Combining (EGC), Maximal Ratio Combining (MRC), Selection Combining (SC), and Selection Switching Combining (SS). The simulation results show that MRC gives better performance than the other combining types (about 1 dB compared with EGC and 2.5~3 dB compared with selection and selection switching combining).
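For readers unfamiliar with the combining rules being compared, the following sketch shows how MRC, EGC, and selection combining weight the branches of a single received symbol under a flat-fading model with known branch gains. It is an illustration of the standard definitions, not the paper's simulation setup.

```python
import numpy as np

def combine(received, channel, method="mrc"):
    """Combine L diversity branches r_l = h_l * s + n_l into one decision variable.

    received : complex array of shape (L,), the branch observations
    channel  : complex array of shape (L,), the branch gains h_l (assumed known)
    """
    received = np.asarray(received)
    channel = np.asarray(channel)
    if method == "mrc":
        # Maximal Ratio Combining: weight each branch by its conjugate channel gain
        return np.sum(np.conj(channel) * received) / np.sum(np.abs(channel) ** 2)
    if method == "egc":
        # Equal Gain Combining: co-phase the branches, then add them with equal weight
        return np.sum(np.exp(-1j * np.angle(channel)) * received) / np.sum(np.abs(channel))
    if method == "sc":
        # Selection Combining: keep only the branch with the largest instantaneous SNR
        best = np.argmax(np.abs(channel))
        return received[best] / channel[best]
    raise ValueError(f"unknown method: {method}")
```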
The Weibull distribution is considered one of the Type-I Generalized Extreme Value (GEV) distributions, and it plays a crucial role in modeling extreme events in various fields, such as hydrology, finance, and environmental sciences. Bayesian methods play a strong, decisive role in estimating the parameters of the GEV distribution due to their ability to incorporate prior knowledge and handle small sample sizes effectively. In this research, we compare several shrinkage Bayesian estimation methods based on the squared error and linear exponential (LINEX) loss functions, adopted and compared by Monte Carlo simulation. The performance of these methods is assessed based on their accuracy and computational efficiency in estimating the parameters.
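The two loss functions lead to different point estimates from the same posterior: squared error loss gives the posterior mean, while the LINEX loss with shape parameter a gives -(1/a) log E[exp(-a*theta) | data]. The sketch below computes both from posterior draws; the shrinkage prior and the sampler that would produce those draws are not reproduced here, and the gamma draws are purely illustrative.

```python
import numpy as np

def bayes_estimates_from_posterior(theta_draws, a=1.0):
    """Point estimates of a parameter from posterior draws under two loss functions.

    Squared error loss      -> posterior mean.
    LINEX loss with shape a -> -(1/a) * log E[exp(-a * theta) | data].
    """
    theta_draws = np.asarray(theta_draws, dtype=float)
    sel = theta_draws.mean()                                  # squared error loss estimate
    linex = -np.log(np.mean(np.exp(-a * theta_draws))) / a    # LINEX loss estimate
    return {"squared_error": float(sel), "linex": float(linex)}

# Example: hypothetical posterior draws of a Weibull shape parameter (illustrative values only)
draws = np.random.default_rng(1).gamma(shape=20.0, scale=0.1, size=5_000)
print(bayes_estimates_from_posterior(draws, a=0.5))
```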
In this paper, the transfer function model in time series was estimated using different methods: a parametric approach represented by the conditional likelihood function, and two nonparametric approaches, local linear regression and the cubic smoothing spline. The research aims to compare these methods for the nonlinear transfer function model through simulation, studying two models for the output variable and one model for the input variable, in addition to …
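One of the nonparametric tools named above, the cubic smoothing spline, can be illustrated with a short sketch. The synthetic input-output data and the smoothing factor below are assumptions for illustration; the paper's transfer-function formulation and simulation design are not reproduced.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Illustrative data only: a noisy nonlinear response y to an input series x
rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0, 10, 150))
y = np.sin(x) + 0.5 * x + rng.normal(scale=0.3, size=x.size)

# Cubic smoothing spline: k=3 gives a cubic spline, s controls the smoothness trade-off
spline = UnivariateSpline(x, y, k=3, s=x.size * 0.3 ** 2)
y_hat = spline(x)
print("residual sum of squares:", float(np.sum((y - y_hat) ** 2)))
```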
We present a reliable algorithm for solving homogeneous or inhomogeneous nonlinear ordinary delay differential equations with initial conditions. The solution is obtained as a series with easily computable components. Four examples are considered as numerical illustrations of this method. The results reveal that the semi-analytic iterative method (SAIM) is effective, simple, and very close to the exact solution, demonstrating the reliability and efficiency of the method for such problems.
This paper shows how to estimate the parameters of the generalized exponential Rayleigh (GER) distribution by three estimation methods: the maximum likelihood estimation method, the moment employing estimation method (MEM), and the ranked set sampling estimation method (RSSEM). Simulation is used with all of these estimation methods to find the parameters of the generalized exponential Rayleigh distribution. Finally, the mean squared error criterion is used to compare the estimation methods and determine which of them performs best.
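The ranked set sampling scheme underlying RSSEM can be sketched as follows: for each cycle, draw m independent sets of m units, rank each set, and keep the i-th order statistic from the i-th set. This is the standard RSS scheme only; the GER density and the paper's specific estimators are not reproduced, and the exponential parent below is purely illustrative.

```python
import numpy as np

def ranked_set_sample(draw, set_size, cycles, rng):
    """One ranked set sample from the standard RSS scheme (a minimal sketch)."""
    sample = []
    for _ in range(cycles):
        for i in range(set_size):
            units = np.sort(draw(set_size, rng))   # one set of units, ranked
            sample.append(units[i])                # keep its i-th smallest unit
    return np.array(sample)

# Example with an exponential parent distribution (illustrative only)
rng = np.random.default_rng(3)
rss = ranked_set_sample(lambda n, r: r.exponential(scale=2.0, size=n), set_size=4, cycles=25, rng=rng)
print(rss.mean())   # RSS-based moment estimate of the exponential scale
```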
Image merging is one of the most important technologies in remote sensing applications and geographic information systems. In this study, a simulation process using a camera was carried out for fused images by resizing images with interpolation methods (nearest neighbour, bilinear, and bicubic). Statistical techniques were used as an efficient merging technique in the image integration process, employing different models, namely Local Mean Matching (LMM) and Regression Variable Substitution (RVS), together with spatial frequency techniques including the high-pass filter additive method (HPFA). In the current research, statistical measures were then used to check the quality of the merged images, by calculating the correlation and …
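A small sketch of two of the building blocks mentioned above, resizing with different interpolation orders and a correlation-based quality measure, is given below. The spline-based zoom stands in for the nearest/bilinear/bicubic resizing, and the synthetic band replaces a real camera frame; the LMM, RVS, and HPFA fusion models themselves are not shown.

```python
import numpy as np
from scipy.ndimage import zoom

def resize(image, factor, method):
    """Resize a 2-D image with a chosen spline order (0=nearest, 1=bilinear-like, 3=bicubic-like)."""
    order = {"nearest": 0, "bilinear": 1, "bicubic": 3}[method]
    return zoom(image, factor, order=order)

def correlation(reference, fused):
    """Correlation coefficient between a reference band and the fused result."""
    return float(np.corrcoef(reference.ravel(), fused.ravel())[0, 1])

# Illustrative synthetic band instead of a real camera frame
rng = np.random.default_rng(4)
band = rng.random((64, 64))
upsampled = resize(band, 2.0, "bicubic")
downsampled = resize(upsampled, 0.5, "bilinear")
print("correlation with original:", correlation(band, downsampled))
```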
Radiation therapy plays an important role in improving breast cancer cases. In order to obtain an appropriate estimate of the number of radiation doses given to the patient after tumor removal, some nonparametric regression methods were compared. The kernel method with the Nadaraya-Watson estimator was used to estimate the regression function for smoothing the data, with the smoothing parameter h chosen according to the normal scale method (NSM), the least squares cross-validation method (LSCV), and the golden rate method (GRM). These methods were compared by simulation for samples of three sizes; the NSM method proved best according to the average Mean Squared Error criterion, and the LSCV method proved best according to the average Mean Absolute Error criterion.
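The Nadaraya-Watson estimator and two of the bandwidth selectors named above can be sketched briefly. The Gaussian kernel, the 1.06*sigma*n^(-1/5) rule-of-thumb, and the candidate grid are assumptions for illustration; the golden rate method and the paper's simulation settings are not reproduced.

```python
import numpy as np

def nadaraya_watson(x0, x, y, h):
    """Nadaraya-Watson kernel regression estimate at the points x0 with a Gaussian kernel."""
    x0 = np.atleast_1d(x0)
    weights = np.exp(-0.5 * ((x0[:, None] - x[None, :]) / h) ** 2)   # K((x0 - x_i) / h)
    return weights @ y / weights.sum(axis=1)

def normal_scale_bandwidth(x):
    """Normal-scale (rule-of-thumb) choice of the smoothing parameter h."""
    return 1.06 * np.std(x, ddof=1) * len(x) ** (-1 / 5)

def lscv_bandwidth(x, y, grid):
    """Least squares cross-validation: pick the h minimising the leave-one-out prediction error."""
    def loo_error(h):
        preds = [nadaraya_watson(x[i], np.delete(x, i), np.delete(y, i), h)[0] for i in range(len(x))]
        return np.mean((y - np.array(preds)) ** 2)
    return min(grid, key=loo_error)
```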
The theory of probabilistic programming may be conceived in several different ways. As a programming method, it analyses the implications of probabilistic variations in the parameter space of a linear or nonlinear programming model. The generating mechanism of such probabilistic variations in economic models may be incomplete information about changes in demand, production, and technology; specification errors in the econometric relations presumed for different economic agents; uncertainty of various sorts; and the consequences of imperfect aggregation or disaggregation of economic variables. In this research we discuss the probabilistic programming problem when the coefficient b_i is a random variable.
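One common way to handle a random right-hand side b_i is a chance-constrained formulation: require P(a_i'x <= b_i) >= alpha, which, when b_i is normal with mean mu_i and standard deviation sigma_i, reduces to the deterministic constraint a_i'x <= mu_i + sigma_i * z_{1-alpha}. The sketch below solves such a deterministic equivalent with illustrative numbers; it shows this standard construction, not the specific model studied in the paper.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import linprog

# Maximise c'x subject to P(a_i'x <= b_i) >= alpha, with b_i ~ Normal(mu_i, sigma_i^2).
c = np.array([3.0, 5.0])             # objective coefficients (illustrative)
A = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [3.0, 2.0]])
mu = np.array([4.0, 12.0, 18.0])     # means of the random right-hand sides b_i
sigma = np.array([0.5, 1.0, 1.5])    # their standard deviations
alpha = 0.95

# Deterministic equivalent: replace each b_i by its (1 - alpha) quantile
b_det = mu + sigma * norm.ppf(1 - alpha)
res = linprog(-c, A_ub=A, b_ub=b_det, bounds=[(0, None)] * 2)   # linprog minimises, so negate c
print(res.x, -res.fun)
```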