This paper deals with estimating unmeasured points of spatial data when the number of sampled locations is small. A small spatial sample is unfavourable for the estimation process, since the larger the data set, the better the estimates at unmeasured points and the smaller the estimation variance. The idea of this paper is therefore to exploit secondary (auxiliary) data that are strongly correlated with the primary (basic) data in order to estimate the primary variable at unmeasured points and to measure the estimation variance. The co-kriging technique is used to build the spatial predictors for this purpose. The idea is then applied to real data on wheat cultivation in Iraq, where the quantity of production is taken as the primary variable to be estimated at unmeasured points and the cultivated area as the secondary variable. All calculations were programmed in Matlab.
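As a rough illustration of the co-kriging idea described above, the sketch below solves a simple co-kriging system for one unmeasured location, using a sparse primary variable supported by a denser, correlated secondary variable. It is written in Python rather than the authors' Matlab code, and the exponential covariance and cross-covariance models, their parameters, and the synthetic data are all assumptions made for the example, not values from the study.

```python
import numpy as np

# --- Hypothetical covariance models (exponential), chosen for illustration only ---
def cov_pp(h):   # primary-primary covariance (e.g. production)
    return 1.0 * np.exp(-h / 30.0)

def cov_ss(h):   # secondary-secondary covariance (e.g. cultivated area)
    return 0.8 * np.exp(-h / 30.0)

def cov_ps(h):   # primary-secondary cross-covariance
    return 0.6 * np.exp(-h / 30.0)

def simple_cokriging(x0, xp, zp, xs, zs, mp, ms):
    """Simple co-kriging of the primary variable at location x0.

    xp, zp : coordinates (n1 x 2) and values of primary samples
    xs, zs : coordinates (n2 x 2) and values of secondary samples
    mp, ms : (assumed known) means of the primary and secondary variables
    Returns the estimate and the co-kriging variance.
    """
    X = np.vstack([xp, xs])
    n1 = len(zp)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)   # pairwise distances
    d0 = np.linalg.norm(X - x0, axis=1)                          # distances to target

    # Assemble the joint covariance matrix block by block
    C = np.empty_like(d)
    C[:n1, :n1] = cov_pp(d[:n1, :n1])
    C[n1:, n1:] = cov_ss(d[n1:, n1:])
    C[:n1, n1:] = cov_ps(d[:n1, n1:])
    C[n1:, :n1] = cov_ps(d[n1:, :n1])

    c0 = np.concatenate([cov_pp(d0[:n1]), cov_ps(d0[n1:])])      # data-to-target covariances
    w = np.linalg.solve(C, c0)                                    # co-kriging weights

    residuals = np.concatenate([zp - mp, zs - ms])
    estimate = mp + w @ residuals
    variance = cov_pp(0.0) - w @ c0
    return estimate, variance

# Tiny synthetic example (coordinates in km, values in arbitrary units)
rng = np.random.default_rng(0)
xp = rng.uniform(0, 100, (8, 2));  zp = rng.normal(10, 1, 8)    # sparse primary (production)
xs = rng.uniform(0, 100, (40, 2)); zs = rng.normal(5, 0.9, 40)  # denser secondary (area)
est, var = simple_cokriging(np.array([50.0, 50.0]), xp, zp, xs, zs, mp=10.0, ms=5.0)
print(f"co-kriging estimate: {est:.3f}, variance: {var:.3f}")
```

Because the secondary variable also enters the covariance system, the resulting estimation variance is never larger than that of kriging on the sparse primary data alone, which is exactly the benefit the abstract describes.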
The current research is concerned with measuring the damage caused by currency conversion when the Iraqi Ministry of Higher Education and Scientific Research converts its funds from Iraqi dinars to US dollars through the Trade Bank of Iraq. The damage results from the difference between the exchange rate adopted by the Central Bank of Iraq and the market exchange rate applied by the Trade Bank of Iraq. These losses, borne by the ministry, led to a reduction in the financial allocations for students licensed to study outside Iraq, which in turn reduced the number of students sent abroad on scholarships.
The loss (damage) suffered by the Ministry of Higher Education was estimated …
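The damage described above is essentially the gap between converting at the official rate and converting at the market rate. A minimal sketch of that arithmetic, with entirely hypothetical rates and amounts (not figures from the research), might look like this:

```python
# Illustrative only: hypothetical rates and amount, not figures from the study.
amount_usd   = 1_000_000      # dollars the ministry needs to transfer
cbi_rate_iqd = 1182           # official CBI exchange rate (IQD per USD), assumed
tbi_rate_iqd = 1300           # market rate applied by the Trade Bank of Iraq, assumed

cost_at_cbi = amount_usd * cbi_rate_iqd   # dinars the transfer would cost at the official rate
cost_at_tbi = amount_usd * tbi_rate_iqd   # dinars actually paid at the market rate
damage_iqd  = cost_at_tbi - cost_at_cbi   # extra dinars borne by the ministry
print(f"estimated damage: {damage_iqd:,} IQD for {amount_usd:,} USD transferred")
```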
The research aims to identify the application of corporate governance requirements according to the international standard (ISO 26000:2010) in the National Insurance Company. Strengths and weaknesses were identified in order to assess the current state of application of these requirements in the company under investigation. The descriptive-analytical approach was used through a checklist derived from ISO 26000:2010, and several personal interviews and field visits were conducted to determine the extent of application and documentation, supported by various statistical methods. The results revealed a level of application …
In multivariate survival analysis, estimating the multivariate distribution functions and then measuring the association between survival times are of great interest. Copula functions, such as Archimedean copulas, are commonly used to estimate the unknown bivariate distributions from known marginal functions. In this paper, the feasibility of using the idea of local dependence to identify the most efficient copula model among some Archimedean copulas, which is then used to construct a bivariate Weibull distribution for bivariate survival times, is explored. Furthermore, to evaluate the efficiency of the proposed procedure, a simulation study is implemented. It is shown that this approach is useful for practical situations and applicable for …
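To make the construction concrete, the sketch below builds a bivariate Weibull distribution for survival times from one Archimedean family, the Clayton copula, by conditional sampling, and then checks the implied association with Kendall's tau. The choice of the Clayton copula, the dependence parameter, and the Weibull marginal parameters are assumptions for illustration; the paper's local-dependence selection procedure is not reproduced here.

```python
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(1)

def sample_clayton(n, theta):
    """Draw n pairs (u, v) from a Clayton copula (theta > 0) by conditional inversion."""
    u = rng.uniform(size=n)
    w = rng.uniform(size=n)
    v = ((w ** (-theta / (1.0 + theta)) - 1.0) * u ** (-theta) + 1.0) ** (-1.0 / theta)
    return u, v

def weibull_quantile(p, shape, scale):
    """Inverse CDF of the Weibull distribution F(t) = 1 - exp(-(t/scale)**shape)."""
    return scale * (-np.log(1.0 - p)) ** (1.0 / shape)

# Hypothetical parameters: dependence theta and two Weibull marginals
theta = 2.0
u, v = sample_clayton(5000, theta)
t1 = weibull_quantile(u, shape=1.5, scale=10.0)   # first survival time
t2 = weibull_quantile(v, shape=0.8, scale=25.0)   # second survival time

# Kendall's tau implied by the Clayton copula: tau = theta / (theta + 2)
print("implied Kendall's tau :", theta / (theta + 2))
print("empirical Kendall's tau:", kendalltau(t1, t2)[0])
```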
Mechanical rock properties are essential to minimize many well problems during drilling and production operations. While these properties are crucial for designing optimum mud weights during drilling operations, they are also necessary to reduce the sanding risk during production operations. This study was conducted on the Zubair sandstone reservoir, located in the south of Iraq. The primary purpose of this study is to develop a set of empirical correlations that can be used to estimate the mechanical rock properties of sandstone reservoirs. The correlations are established using laboratory (static) measurements and well logging (dynamic) data. The results support the evidence that porosity and sonic travel time are consistent …
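As a sketch of how such static-dynamic correlations are typically built (not the study's actual correlations or data), the snippet below computes dynamic elastic moduli from density and velocity logs using standard elasticity relations and fits a simple linear correlation between dynamic and static Young's modulus on hypothetical paired values.

```python
import numpy as np

def dynamic_moduli(rho, vp, vs):
    """Dynamic elastic moduli from bulk density (kg/m3) and P/S velocities (m/s).

    Returns dynamic Young's modulus (Pa) and dynamic Poisson's ratio,
    using the standard elasticity relations G = rho*vs**2 and E = 2G(1 + nu).
    """
    g = rho * vs ** 2
    nu = (vp ** 2 - 2 * vs ** 2) / (2 * (vp ** 2 - vs ** 2))
    e = 2 * g * (1 + nu)
    return e, nu

# Hypothetical paired data: log-derived (dynamic) vs. core (static) Young's modulus, in GPa
e_dyn_gpa  = np.array([28.0, 31.5, 35.2, 40.1, 44.8, 50.3])
e_stat_gpa = np.array([14.1, 16.0, 18.3, 21.2, 24.0, 27.5])

# Simple linear correlation of the form E_static = a * E_dynamic + b
a, b = np.polyfit(e_dyn_gpa, e_stat_gpa, 1)
print(f"E_static ≈ {a:.3f} * E_dynamic + {b:.3f}  (GPa)")
```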
In this paper, point estimation of the parameter of the Maxwell-Boltzmann distribution is investigated using a simulation technique. The parameter is estimated by two groups of methods: the first includes non-Bayesian estimation methods (the maximum likelihood estimator and the moment estimator), while the second includes standard Bayesian estimation using two different priors, the inverse chi-square prior and Jeffreys' prior (a standard Bayes estimator and a Bayes estimator based on Jeffreys' prior). Comparisons among these methods were made using the mean square error measure, with the simulation run for different sample sizes.
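A minimal version of the non-Bayesian part of this comparison, assuming the scale parameterization used by scipy.stats.maxwell and arbitrary simulation settings, could look like the following; the Bayes estimators under the inverse chi-square and Jeffreys priors are not reproduced here.

```python
import numpy as np
from scipy.stats import maxwell

rng = np.random.default_rng(2)
theta_true = 2.0                     # scale parameter of the Maxwell-Boltzmann distribution
n, reps = 25, 1000                   # sample size and number of simulation replicates

mle, mom = np.empty(reps), np.empty(reps)
for r in range(reps):
    x = maxwell.rvs(scale=theta_true, size=n, random_state=rng)
    mle[r] = np.sqrt(np.sum(x ** 2) / (3 * n))        # maximum likelihood estimator
    mom[r] = np.mean(x) * np.sqrt(np.pi / 2) / 2      # moment estimator, since E[X] = 2*theta*sqrt(2/pi)

print("MSE, MLE   :", np.mean((mle - theta_true) ** 2))
print("MSE, moment:", np.mean((mom - theta_true) ** 2))
```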
The term "tight reservoir" is commonly used to refer to reservoirs with low permeability. Tight oil reservoirs have caused worry owing to its considerable influence upon oil output throughout the petroleum sector. As a result of its low permeability, producing from tight reservoirs presents numerous challenges. Because of their low permeability, producing from tight reservoirs is faced with a variety of difficulties. The research aim is to performing hydraulic fracturing treatment in single vertical well in order to study the possibility of fracking in the Saady reservoir. Iraq's Halfaya oil field's Saady B reservoir is the most important tight reservoir. The diagnostic fracture injection test is determined for HF55using GOHFER soft
This paper deals with Bayesian estimation of the parameters of the Gamma distribution under a generalized weighted loss function, based on Gamma and Exponential priors for the shape and scale parameters, respectively. Moment estimators, maximum likelihood estimators, and Lindley's approximation have been used in the Bayesian estimation. Based on a Monte Carlo simulation, these estimators are compared in terms of their mean squared errors (MSEs).
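The sketch below reproduces only the Monte Carlo MSE-comparison framework for Gamma parameters, using maximum likelihood and moment estimators; the Bayes estimators under the generalized weighted loss function and Lindley's approximation from the paper are not implemented, and the true parameter values and simulation settings are arbitrary.

```python
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(3)
alpha_true, beta_true = 2.0, 1.5          # shape and scale of the simulated Gamma data
n, reps = 30, 1000

est = {"MLE": [], "moments": []}
for _ in range(reps):
    x = gamma.rvs(alpha_true, scale=beta_true, size=n, random_state=rng)
    a_mle, _, b_mle = gamma.fit(x, floc=0)            # maximum likelihood (location fixed at 0)
    m, v = np.mean(x), np.var(x)
    a_mom, b_mom = m ** 2 / v, v / m                   # moment estimators
    est["MLE"].append((a_mle, b_mle))
    est["moments"].append((a_mom, b_mom))

for name, vals in est.items():
    mse = np.mean((np.array(vals) - [alpha_true, beta_true]) ** 2, axis=0)
    print(f"{name}: MSE(shape) = {mse[0]:.4f}, MSE(scale) = {mse[1]:.4f}")
```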
In this paper, the parameters and the reliability function of the transmuted power function (TPF) distribution are estimated using several estimation methods, including a proposed new technique, for the White, percentile, least squares, weighted least squares, and modified moment methods. A simulation was used to generate random data that follow the TPF distribution in three experiments (E1, E2, E3) with different true parameter values, sample sizes (n = 10, 25, 50, 100), and N = 1000 replicated samples, at selected reliability times t. Comparisons between the obtained estimators were made using the mean square error (MSE). The results showed the …
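As one example of the listed methods, the sketch below implements ordinary least-squares estimation for a TPF sample, assuming the common transmuted parameterization F(x) = (1 + λ)G(x) − λG(x)² with the power-function baseline G(x) = (x/β)^α; the data, starting values, and bounds are illustrative only and do not correspond to the paper's experiments.

```python
import numpy as np
from scipy.optimize import minimize

def tpf_cdf(x, alpha, beta, lam):
    """Assumed CDF of the transmuted power function distribution:
    F(x) = (1 + lam)*G(x) - lam*G(x)**2 with G(x) = (x/beta)**alpha, 0 <= x <= beta."""
    g = np.clip(x / beta, 0.0, 1.0) ** alpha
    return (1.0 + lam) * g - lam * g ** 2

def ls_objective(params, x_sorted):
    """Ordinary least-squares criterion: sum_i (F(x_(i)) - i/(n+1))**2."""
    alpha, beta, lam = params
    n = len(x_sorted)
    plot_pos = np.arange(1, n + 1) / (n + 1)
    return np.sum((tpf_cdf(x_sorted, alpha, beta, lam) - plot_pos) ** 2)

# Hypothetical data: a plain power-function sample (the lam = 0 special case) for illustration
rng = np.random.default_rng(4)
alpha0, beta0 = 2.0, 5.0
sample = np.sort(beta0 * rng.uniform(size=50) ** (1.0 / alpha0))

res = minimize(ls_objective, x0=[1.0, sample.max() * 1.05, 0.0], args=(sample,),
               bounds=[(0.1, 20.0), (sample.max(), 50.0), (-1.0, 1.0)],
               method="L-BFGS-B")
print("least-squares estimates (alpha, beta, lambda):", res.x)
```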
This paper is concerned with estimating the unknown parameters of the generalized Rayleigh distribution based on singly Type-I censored samples. The probability density function of the generalized Rayleigh distribution is defined together with its properties. The maximum likelihood method is used to derive point estimates of all unknown parameters through an iterative procedure (the Newton-Raphson method), and confidence interval estimates are then derived based on the Fisher information matrix. Finally, it is tested whether the current model (GRD) fits a set of real data, and the survival function and hazard function are computed for these data.
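A sketch of the Type-I censored likelihood is given below, assuming the Burr-X form of the generalized Rayleigh distribution, F(x) = (1 − e^(−(λx)²))^α, and maximizing it with a general-purpose optimizer rather than the Newton-Raphson iterations used in the paper; the simulated data and censoring time are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, x_obs, n_cens, t_cens):
    """Negative log-likelihood for singly Type-I censored data from the generalized
    Rayleigh distribution, assumed here in the Burr-X form F(x) = (1 - exp(-(lam*x)**2))**alpha."""
    alpha, lam = params
    if alpha <= 0 or lam <= 0:
        return np.inf
    z = (lam * x_obs) ** 2
    logf = (np.log(2 * alpha) + 2 * np.log(lam) + np.log(x_obs)
            - z + (alpha - 1) * np.log1p(-np.exp(-z)))                 # uncensored contributions
    logS_T = np.log1p(-(1 - np.exp(-(lam * t_cens) ** 2)) ** alpha)    # log survival at censoring time
    return -(np.sum(logf) + n_cens * logS_T)

# Hypothetical censored sample: simulate GRD(alpha = 1.5, lambda = 0.2), censor at T = 8
rng = np.random.default_rng(5)
alpha_true, lam_true, T = 1.5, 0.2, 8.0
u = rng.uniform(size=60)
x = np.sqrt(-np.log(1 - u ** (1 / alpha_true))) / lam_true    # inverse-CDF sampling
x_obs, n_cens = x[x <= T], int(np.sum(x > T))

res = minimize(neg_loglik, x0=[1.0, 0.1], args=(x_obs, n_cens, T), method="Nelder-Mead")
print("MLEs (alpha, lambda):", res.x, " censored:", n_cens, "of", len(x))
```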
Akaike's Information Criterion (AIC) is a popular method for estimating the number of sources impinging on an array of sensors, which is a problem of great interest in several applications. The performance of AIC degrades under low signal-to-noise ratio (SNR). This paper is concerned with the development and application of quadrature mirror filters (QMF) for improving the performance of AIC. A new system is proposed to estimate the number of sources by applying AIC to the outputs of a filter bank consisting of quadrature mirror filters (QMF). The proposed system can estimate the number of sources under low SNR.
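The eigenvalue form of AIC that such a system would apply to each filter-bank output can be sketched as follows; the array geometry, source directions, and SNR are assumptions for the example, and the QMF decomposition itself is not included.

```python
import numpy as np

def aic_num_sources(X):
    """Estimate the number of sources from array snapshots X (p sensors x N snapshots)
    using the eigenvalue form of Akaike's Information Criterion (Wax & Kailath, 1985)."""
    p, N = X.shape
    R = X @ X.conj().T / N                        # sample covariance matrix
    eig = np.sort(np.linalg.eigvalsh(R))[::-1]    # eigenvalues in descending order

    aic = np.empty(p)
    for k in range(p):
        tail = eig[k:]                            # the (p - k) smallest eigenvalues
        g = np.exp(np.mean(np.log(tail)))         # geometric mean
        a = np.mean(tail)                         # arithmetic mean
        aic[k] = -2.0 * N * (p - k) * np.log(g / a) + 2.0 * k * (2 * p - k)
    return int(np.argmin(aic)), aic

# Hypothetical example: 8-element half-wavelength ULA, 2 narrowband sources, roughly 0 dB SNR
rng = np.random.default_rng(6)
p, N, d = 8, 500, 2
A = np.exp(1j * np.outer(np.arange(p), np.pi * np.sin(np.radians([10.0, 35.0]))))  # steering matrix
S = (rng.standard_normal((d, N)) + 1j * rng.standard_normal((d, N))) / np.sqrt(2)
noise = (rng.standard_normal((p, N)) + 1j * rng.standard_normal((p, N))) / np.sqrt(2)
X = A @ S + noise

k_hat, _ = aic_num_sources(X)
print("estimated number of sources:", k_hat)
```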