Shear wave velocity is an important parameter in seismic exploration that can be utilized in reservoir characterization and development strategy. Its applications in petrophysics, seismics, and geomechanics for predicting rock elastic and inelastic properties are essential to well stability, fracture orientation, identification of matrix minerals, and detection of gas-bearing formations. However, shear wave velocity is usually obtained from core analysis, which is an expensive and time-consuming process, and the dipole sonic imager tool is not commonly available in all wells. In this study, a statistical method is presented to predict shear wave velocity from wireline log data. The model predicts shear wave velocity from petrophysical parameters and any pair of compressional wave velocity, porosity, and density in carbonate rocks. The established method can estimate shear wave velocity in carbonate rocks with a correlation coefficient close to unity.
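A minimal sketch, not the authors' model: fitting a multiple linear regression that predicts shear wave velocity (Vs) from compressional wave velocity (Vp), porosity, and bulk density, the kind of wireline-log inputs the abstract names. The synthetic data, coefficients, and units are assumptions for illustration only.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic stand-in for log data: Vp (km/s), porosity (fraction), density (g/cc)
rng = np.random.default_rng(0)
n = 200
vp = rng.uniform(3.0, 6.5, n)
phi = rng.uniform(0.02, 0.30, n)
rhob = rng.uniform(2.3, 2.8, n)
# Assumed linear relation plus noise, purely for demonstration
vs = 0.55 * vp - 1.2 * phi + 0.3 * rhob - 0.4 + rng.normal(0, 0.05, n)

X = np.column_stack([vp, phi, rhob])
model = LinearRegression().fit(X, vs)                 # statistical prediction model
r = np.corrcoef(model.predict(X), vs)[0, 1]           # correlation coefficient
print("coefficients:", model.coef_, "intercept:", model.intercept_)
print("correlation coefficient:", round(r, 3))
```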
In this paper, some probability characteristic functions (moments, variances, covariance, and spectral density functions) are found, depending upon the smallest variance of the solution of a stochastic Fredholm integral equation that contains the sine wave function as a known function.
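A generic form of the kind of equation described, assuming a stochastic Fredholm integral equation of the second kind in which the sine wave enters as the known term; the kernel K, the integration limits, and the statistics of the random solution are placeholders, not the paper's exact formulation:

\[
x(t;\omega) = \sin t + \lambda \int_{a}^{b} K(t,s)\, x(s;\omega)\, ds, \qquad
m(t) = \mathbb{E}[x(t;\omega)], \quad
R(t_1,t_2) = \mathbb{E}\big[(x(t_1;\omega)-m(t_1))\,(x(t_2;\omega)-m(t_2))\big].
\]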
The poverty phenomenon is a very substantial topic that determines the future of societies and governments and the way they deal with education, health, and the economy. Sometimes poverty takes multidimensional forms through education and health. This research aims to study multidimensional poverty in Iraq by using penalized regression methods to analyze big data sets from demographic surveys collected by the Central Statistical Organization in Iraq. We choose a classical penalized regression method, represented by ridge regression; moreover, we choose another penalized method, the Smooth Integration of Counting and Absolute Deviation (SICA), to analyze big data sets related to the different forms of poverty in Iraq. Euclidean Distance ...
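A minimal sketch of the two penalties mentioned, assuming synthetic data in place of the survey variables: a ridge (L2-penalized) fit via scikit-learn, and the SICA penalty function evaluated coordinate-wise. This is not the study's pipeline, and the tuning constants are arbitrary.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 20))           # stand-in for survey covariates
beta = np.zeros(20); beta[:3] = [1.5, -2.0, 0.8]
y = X @ beta + rng.normal(0, 1, 500)     # stand-in poverty indicator

ridge = Ridge(alpha=1.0).fit(X, y)       # classical ridge regression
print("ridge coefficients:", np.round(ridge.coef_[:5], 2))

def sica_penalty(t, lam=0.5, a=0.1):
    """SICA penalty lam*(a+1)*|t|/(a+|t|); bridges L0 and L1 as 'a' varies."""
    t = np.abs(t)
    return lam * (a + 1.0) * t / (a + t)

print("SICA penalty on ridge coefficients:", np.round(sica_penalty(ridge.coef_[:5]), 3))
```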
In this paper, point estimation for the parameter of the Maxwell-Boltzmann distribution has been investigated using a simulation technique. The parameter is estimated by two groups of methods: the first includes non-Bayesian estimation methods (the maximum likelihood estimator and the moment estimator), while the second includes standard Bayesian estimation using two different priors (Inverse Chi-Square and Jeffreys), namely the standard Bayes estimator and the Bayes estimator based on Jeffreys' prior. Comparisons among these methods were made using the mean square error measure, with the simulation run for different sample sizes.
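A simulation sketch in the spirit of the comparison described, covering only the two non-Bayesian estimators of the Maxwell-Boltzmann scale parameter and ranking them by mean squared error; the Bayesian estimators with Inverse Chi-Square and Jeffreys priors are not reproduced here, and the true parameter, sample size, and replication count are assumptions.

```python
import numpy as np
from scipy.stats import maxwell

theta_true = 2.0
n, reps = 50, 1000
rng = np.random.default_rng(2)

mle, mom = [], []
for _ in range(reps):
    x = maxwell.rvs(scale=theta_true, size=n, random_state=rng)
    mle.append(np.sqrt(np.sum(x**2) / (3 * n)))        # maximum likelihood estimator
    mom.append(np.mean(x) * np.sqrt(np.pi / 8.0))      # method-of-moments estimator

mse = lambda est: np.mean((np.array(est) - theta_true) ** 2)
print("MSE(MLE):   ", round(mse(mle), 5))
print("MSE(Moment):", round(mse(mom), 5))
```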
This paper deals with Bayesian estimation of the parameters of the Gamma distribution under a generalized weighted loss function, based on Gamma and Exponential priors for the shape and scale parameters, respectively. Moment estimators, maximum likelihood estimators, and Lindley's approximation have been used in the Bayesian estimation. Based on a Monte Carlo simulation, these estimators are compared in terms of mean squared error (MSE).
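A heavily simplified sketch of Bayesian estimation for a Gamma parameter, assuming the shape is known, an Exponential prior on the rate, and squared-error loss (posterior mean); the paper's generalized weighted loss function and Lindley's approximation are not reproduced, and the numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
alpha_true, beta_true = 2.0, 1.5                 # Gamma shape and rate
x = rng.gamma(shape=alpha_true, scale=1.0 / beta_true, size=100)

lam = 1.0                                        # Exponential(lam) prior on the rate
n = x.size
post_shape = n * alpha_true + 1.0                # conjugate Gamma posterior update
post_rate = lam + x.sum()

bayes_rate = post_shape / post_rate              # posterior mean (squared-error Bayes rule)
mle_rate = alpha_true / x.mean()                 # MLE of the rate when the shape is known
print("Bayes estimate:", round(bayes_rate, 3), "MLE:", round(mle_rate, 3))
```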
The current research presents a comparative analysis of the estimation of Meixner process parameters via the wavelet packet transform. Of particular relevance, it compares the moment method and the wavelet packet estimator for the four parameters of the Meixner process. The research focuses on finding the best threshold value using the square-root-log and modified square-root-log methods with wavelet packets in the presence of noise, to enhance the efficiency and effectiveness of denoising the financial asset market signal. In this regard, a simulation study compares the performance of moment estimation and wavelet packets for different sample sizes. The results show that wavelet packet ...
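An illustrative denoising sketch with PyWavelets, assuming a universal sqrt(2*log n)-type threshold rather than the paper's exact square-root-log variants: decompose a noisy signal into wavelet packets, soft-threshold the packet coefficients, and reconstruct. The Meixner-process parameter estimation itself is not shown; the signal, wavelet, and decomposition level are assumptions.

```python
import numpy as np
import pywt

rng = np.random.default_rng(4)
n = 1024
t = np.linspace(0, 1, n)
clean = np.sin(8 * np.pi * t)                       # stand-in for an asset-market signal
noisy = clean + rng.normal(0, 0.3, n)

# Noise scale estimated from the finest detail coefficients (MAD / 0.6745)
cA, cD = pywt.dwt(noisy, 'db4')
sigma = np.median(np.abs(cD)) / 0.6745
thr = sigma * np.sqrt(2.0 * np.log(n))              # universal threshold

wp = pywt.WaveletPacket(data=noisy, wavelet='db4', mode='symmetric', maxlevel=4)
for node in wp.get_level(4, order='natural'):
    node.data = pywt.threshold(node.data, thr, mode='soft')   # soft thresholding

denoised = wp.reconstruct(update=False)[:n]
print("RMSE before:", round(np.sqrt(np.mean((noisy - clean) ** 2)), 3),
      "after:", round(np.sqrt(np.mean((denoised - clean) ** 2)), 3))
```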
In this paper, estimates have been made of the parameters and the reliability function of the transmuted power function (TPF) distribution using several estimation methods: a proposed new technique, and the white, percentile, least squares, weighted least squares, and modified moment methods. A simulation was used to generate random data following the TPF distribution in three experiments (E1, E2, E3) with real values of the parameters, with sample sizes (n = 10, 25, 50, and 100), iteration samples (N = 1000), and reliability times (0 < t < 0). Comparisons were made between the results obtained from the estimators using mean square error (MSE). The results showed the ...
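A hedged sketch of one of the listed methods, ordinary least squares estimation, assuming a transmuted power function distribution on the unit interval with CDF F(x) = (1 + lam) * x**b - lam * x**(2*b); the proposed new technique and the white, percentile, and weighted variants are not reproduced, and the parameter values are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def tpf_cdf(x, b, lam):
    g = x ** b
    return (1.0 + lam) * g - lam * g ** 2

def tpf_sample(n, b, lam, rng):
    u = rng.uniform(size=n)
    g = ((1 + lam) - np.sqrt((1 + lam) ** 2 - 4 * lam * u)) / (2 * lam)   # inverse CDF
    return g ** (1.0 / b)

rng = np.random.default_rng(5)
b_true, lam_true = 2.0, 0.5
x = np.sort(tpf_sample(100, b_true, lam_true, rng))
p = np.arange(1, x.size + 1) / (x.size + 1)          # plotting positions i/(n+1)

def ls_objective(theta):
    b, lam = theta
    return np.sum((tpf_cdf(x, b, lam) - p) ** 2)      # least-squares criterion

res = minimize(ls_objective, x0=[1.0, 0.1], bounds=[(0.1, 10.0), (-1.0, 1.0)])
print("LS estimates (b, lambda):", np.round(res.x, 3))
```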
Today, there are large amounts of geospatial data available on the web, such as Google Maps (GM), OpenStreetMap (OSM), the Flickr service, Wikimapia, and others. All of these services are called open-source geospatial data. Geospatial data from different sources often have variable accuracy due to different data collection methods; therefore, data accuracy may not meet user requirements in different organizations. This paper aims to develop a tool to assess the quality of GM data by comparing it with formal data, such as spatial data from the Mayoralty of Baghdad (MB). The tool was developed in the Visual Basic language and validated on two different study areas in Baghdad, Iraq (Al-Karada and Al-Kadhumiyah). The positional accuracy was assessed ...
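A simple illustration of the kind of positional-accuracy check described, not the authors' Visual Basic tool: comparing points digitized from an open-source map against reference coordinates from a formal dataset and reporting the horizontal RMSE. The coordinates below are hypothetical projected eastings/northings in metres.

```python
import numpy as np

# (E, N) of the same well-defined points in the open-source and reference datasets
open_source = np.array([[445120.3, 3685410.7], [445612.9, 3685988.2], [446101.4, 3686477.5]])
reference   = np.array([[445123.1, 3685408.9], [445610.2, 3685991.0], [446104.8, 3686474.1]])

diff = open_source - reference
rmse_e = np.sqrt(np.mean(diff[:, 0] ** 2))           # easting RMSE
rmse_n = np.sqrt(np.mean(diff[:, 1] ** 2))           # northing RMSE
rmse_h = np.sqrt(rmse_e ** 2 + rmse_n ** 2)          # horizontal (planimetric) RMSE
print(f"RMSE_E={rmse_e:.2f} m, RMSE_N={rmse_n:.2f} m, RMSE_H={rmse_h:.2f} m")
```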
This paper is devoted to investigating, experimentally and theoretically, the structural behavior of reinforced concrete hollow beams with internal transverse ribs under shear. The number of internal ribs is the major variable adopted in this research, while the other variables are kept constant for all tested specimens. The experimental part includes casting and testing four (200 x 300 x 1200 mm) beam specimens; three of these specimens were hollow with different locations of internal ribs and one was solid. The experimental results indicated that the shear strength increased by 33% to 60% for beams containing internal ribs in comparison with the reference beam. Also, the change of the beam state from hollow ...
Corpus linguistics is a methodology for studying language through corpus-based research. It differs from a traditional (prescriptive) approach to studying a language in its insistence on the systematic study of authentic examples of language in use (a descriptive approach). A "corpus" is a large body of machine-readable, structurally collected, naturally occurring linguistic data, either written texts or transcriptions of recorded speech, which can be used as a starting point for linguistic description or as a means of verifying hypotheses about a language. In the past decade, interest has grown tremendously in the use of language corpora for language education. The ways in which corpora have been employed in language pedagogy ...
The deployment of UAVs is one of the key challenges in UAV-based communications when using UAVs for IoT applications. In this article, a new scheme for energy-efficient data collection with a deadline time for the Internet of Things (IoT) using unmanned aerial vehicles (UAVs) is presented. We provide a new data collection method designed to collect IoT node data through efficient deployment and mobility of multiple UAVs, used to collect data from ground IoT devices within a given deadline time. In the proposed method, data collection is done with minimum energy consumption of the IoT devices as well as the UAVs. In order to find an optimal solution to this problem, we first provide a mixed integer linear programming model ...
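A toy formulation, not the paper's exact model: assign each IoT ground node to one UAV so that total transmission energy is minimized while each UAV's collection time stays within the deadline, solved as a small MILP with PuLP. The node set, energy costs, collection times, and deadline are hypothetical.

```python
import pulp

nodes = ["n1", "n2", "n3", "n4"]
uavs = ["u1", "u2"]
energy = {("n1", "u1"): 2, ("n1", "u2"): 5, ("n2", "u1"): 4, ("n2", "u2"): 1,
          ("n3", "u1"): 3, ("n3", "u2"): 3, ("n4", "u1"): 6, ("n4", "u2"): 2}
time = {k: 1.0 for k in energy}             # collection time per node-UAV pair
deadline = 2.0                              # per-UAV time budget

x = pulp.LpVariable.dicts("assign", energy, cat="Binary")
prob = pulp.LpProblem("uav_data_collection", pulp.LpMinimize)
prob += pulp.lpSum(energy[k] * x[k] for k in energy)              # total energy objective
for n in nodes:                                                   # each node served exactly once
    prob += pulp.lpSum(x[(n, u)] for u in uavs) == 1
for u in uavs:                                                    # deadline constraint per UAV
    prob += pulp.lpSum(time[(n, u)] * x[(n, u)] for n in nodes) <= deadline

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print({k: int(x[k].value()) for k in energy if x[k].value() > 0.5})
```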