Shear wave velocity is an important parameter in seismic exploration that can be utilized in reservoir characterization and development strategy. Its applications in petrophysics, seismic analysis, and geomechanics for predicting the elastic and inelastic properties of rocks are essential to wellbore stability, fracture orientation, and the identification of matrix minerals and gas-bearing formations. However, shear wave velocity is usually obtained from core analysis, which is an expensive and time-consuming process, and the dipole sonic imager tool is not available in all wells. In this study, a statistical method is presented to predict shear wave velocity from wireline log data. The model predicts shear wave velocity from petrophysical parameters and any pair of compressional wave velocity, porosity, and density in carbonate rocks. The established method can estimate shear wave velocity in carbonate rocks with a correlation coefficient close to unity.
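The abstract does not state the regression form, so the following is only a minimal sketch of a least-squares fit of shear wave velocity against compressional velocity, porosity, and density; the data, variables, and coefficients are synthetic assumptions, not the study's model.

```python
import numpy as np

# Hypothetical illustration: fit Vs as a linear function of Vp, porosity (phi)
# and bulk density (rho) by ordinary least squares on synthetic data.
rng = np.random.default_rng(0)
n = 200
vp = rng.uniform(3000, 6000, n)        # compressional velocity, m/s
phi = rng.uniform(0.05, 0.30, n)       # porosity, fraction
rho = rng.uniform(2.3, 2.8, n)         # bulk density, g/cm^3
vs = 0.55 * vp - 1500 * phi + 200 * rho + rng.normal(0, 50, n)  # synthetic "true" Vs

X = np.column_stack([np.ones(n), vp, phi, rho])   # design matrix with intercept
coef, *_ = np.linalg.lstsq(X, vs, rcond=None)     # OLS fit
vs_hat = X @ coef
r = np.corrcoef(vs, vs_hat)[0, 1]                 # correlation coefficient
print("coefficients:", np.round(coef, 3), "R:", round(r, 3))
```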
A new species of the Cerambycinae belonging to the genus Hesperophanes was found, new to the fauna of Iraq and to science. H. testaceus was studied in detail and the male genitalia were illustrated. The type, paratypes, and locality of this newly described species are recorded.
The current research presents an overall comparative analysis of the estimation of Meixner process parameters via the wavelet packet transform. Of particular relevance, it compares the moment method and the wavelet packet estimator for the four parameters of the Meixner process. The research focuses on finding the best threshold value using the square-root-log and modified square-root-log methods with wavelet packets in the presence of noise, to enhance the efficiency and effectiveness of denoising the financial asset market signal. In this regard, a simulation study compares the performance of moment estimation and wavelet packets for different sample sizes. The results show that wavelet packets …
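As a rough illustration of wavelet-packet denoising with a square-root-log (universal) threshold, the sketch below soft-thresholds the leaf nodes of a wavelet packet decomposition; the test signal, wavelet choice, and decomposition level are assumptions for illustration, not the paper's settings.

```python
import numpy as np
import pywt  # PyWavelets

# Sketch of wavelet-packet soft-threshold denoising with the universal
# ("square root log") threshold sigma * sqrt(2 * log n).
rng = np.random.default_rng(1)
n = 1024
t = np.linspace(0, 1, n)
clean = np.sin(2 * np.pi * 5 * t)
noisy = clean + rng.normal(0, 0.3, n)

wp = pywt.WaveletPacket(data=noisy, wavelet='db4', mode='symmetric', maxlevel=4)

# Noise scale from the median absolute deviation of the level-1 detail node.
sigma = np.median(np.abs(wp['d'].data)) / 0.6745
thresh = sigma * np.sqrt(2 * np.log(n))          # universal threshold

for node in wp.get_level(4, order='natural'):    # threshold every leaf node
    node.data = pywt.threshold(node.data, thresh, mode='soft')

denoised = wp.reconstruct(update=True)[:n]
print("RMSE after denoising:", np.sqrt(np.mean((denoised - clean) ** 2)))
```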
This paper deals with Bayesian estimation of the parameters of the Gamma distribution under a generalized weighted loss function, based on Gamma and Exponential priors for the shape and scale parameters, respectively. Moment and maximum likelihood estimators and Lindley's approximation have been used in the Bayesian estimation. Based on the Monte Carlo simulation method, these estimators are compared in terms of their mean squared errors (MSEs).
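For context, a minimal Monte Carlo comparison of the (non-Bayesian) moment and maximum-likelihood estimators of a Gamma distribution by MSE might look as follows; the Bayesian and Lindley-approximation estimators of the paper are not reproduced, and the true parameter values, sample size, and replication count are arbitrary.

```python
import numpy as np
from scipy import stats

# Monte Carlo sketch: moment vs. maximum-likelihood estimates of a
# Gamma(shape=alpha, scale=beta) distribution, compared by MSE.
alpha_true, beta_true, n, reps = 2.0, 1.5, 50, 1000
rng = np.random.default_rng(2)
mom, mle = [], []

for _ in range(reps):
    x = rng.gamma(alpha_true, beta_true, n)
    m, v = x.mean(), x.var(ddof=1)
    mom.append((m * m / v, v / m))                 # moment estimates (alpha, beta)
    a_hat, _, b_hat = stats.gamma.fit(x, floc=0)   # MLE with location fixed at 0
    mle.append((a_hat, b_hat))

def mse(est, truth):
    return np.mean((np.asarray(est) - truth) ** 2, axis=0)

print("MSE (alpha, beta), moments:", mse(mom, (alpha_true, beta_true)))
print("MSE (alpha, beta), MLE:    ", mse(mle, (alpha_true, beta_true)))
```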
In this paper, estimates have been made for the parameters and the reliability function of the transmuted power function (TPF) distribution using several estimation methods: a proposed new technique together with the White, percentile, least squares, weighted least squares, and modified moment methods. A simulation was used to generate random data that follow the TPF distribution in three experiments (E1, E2, E3) of real values of the parameters, with sample sizes (n = 10, 25, 50, and 100), (N = 1000) repeated samples, and reliability times (t > 0). Comparisons have been made between the results obtained from the estimators using the mean square error (MSE). The results showed the …
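The TPF parameterization is not given in the abstract; assuming the generic transmuted family F(x) = (1 + λ)G(x) − λG(x)² with a power-function baseline G(x) = (x/β)^α on [0, β], a hedged sketch of inverse-transform sampling and the reliability function is shown below, with arbitrary parameter values.

```python
import numpy as np

# Assumed transmuted power function (TPF): F(x) = (1 + lam)*G(x) - lam*G(x)**2,
# |lam| <= 1, with baseline CDF G(x) = (x / beta)**alpha on [0, beta].
def rtpf(n, alpha, beta, lam, rng):
    u = rng.uniform(size=n)
    if lam == 0:
        g = u
    else:
        # root of lam*g^2 - (1 + lam)*g + u = 0 lying in [0, 1]
        g = ((1 + lam) - np.sqrt((1 + lam) ** 2 - 4 * lam * u)) / (2 * lam)
    return beta * g ** (1.0 / alpha)

def reliability(t, alpha, beta, lam):
    g = (t / beta) ** alpha
    return 1.0 - ((1 + lam) * g - lam * g ** 2)     # R(t) = 1 - F(t)

rng = np.random.default_rng(3)
x = rtpf(1000, alpha=2.0, beta=1.0, lam=0.5, rng=rng)
print("sample mean:", x.mean(), "R(0.5):", reliability(0.5, 2.0, 1.0, 0.5))
```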
In this paper, point estimation of the parameter θ of the Maxwell-Boltzmann distribution has been investigated using a simulation technique. The parameter is estimated by two groups of methods: the first includes non-Bayesian estimation methods (the maximum likelihood estimator and the moment estimator), while the second includes standard Bayesian estimation using two different priors, inverse chi-square and Jeffreys (the standard Bayes estimator and the Bayes estimator based on Jeffreys' prior). Comparisons among these methods were made by employing the mean square error measure, with simulation for different sample sizes.
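A small simulation sketch of the non-Bayesian part of such a comparison, i.e. the maximum-likelihood and moment estimators of the Maxwell-Boltzmann scale parameter compared by MSE, is given below; the Bayesian estimators are omitted, and the true scale value and sample sizes are arbitrary illustration choices.

```python
import numpy as np
from scipy import stats

# MSE comparison of MLE and moment estimators of the Maxwell scale parameter.
theta_true, reps = 1.0, 2000
rng = np.random.default_rng(4)

for n in (10, 30, 100):
    mle, mom = [], []
    for _ in range(reps):
        x = stats.maxwell.rvs(scale=theta_true, size=n, random_state=rng)
        mle.append(np.sqrt(np.sum(x ** 2) / (3 * n)))   # MLE of the scale
        mom.append(x.mean() * np.sqrt(np.pi / 8))       # moment estimator, E[X] = 2*theta*sqrt(2/pi)
    print(f"n={n:4d}  MSE(MLE)={np.mean((np.array(mle) - theta_true) ** 2):.5f}"
          f"  MSE(moment)={np.mean((np.array(mom) - theta_true) ** 2):.5f}")
```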
The Tor (The Onion Router) network was designed to enable users to browse the Internet anonymously. It is known for its anonymity and privacy protection against agents who wish to observe users' locations or track their browsing habits. This anonymity stems from the encryption and decryption of Tor traffic: the client's traffic is encrypted and decrypted before sending and after receiving, which leads to delay and even interruption in data flow. The exchange of cryptographic keys between network devices plays a pivotal and critical role in facilitating secure communication and ensuring the integrity of cryptographic procedures. This essential process is time-consuming, which causes delays …
Objective: This research investigates real breast cancer data for Iraqi women, acquired manually from several Iraqi hospitals for the early detection of breast cancer. Data mining techniques are used to discover hidden knowledge, unexpected patterns, and new rules from the dataset, which contains a large number of attributes. Methods: Data mining techniques handle redundant or simply irrelevant attributes in order to discover interesting patterns. The dataset is processed via the Weka (Waikato Environment for Knowledge Analysis) platform. The OneR technique is used as a machine learning classifier to evaluate attribute worth according to the class value. Results: The evaluation is performed using …
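Weka's OneR classifier builds a single-attribute rule set: for each attribute it predicts the majority class per attribute value and keeps the attribute whose rule makes the fewest training errors. A minimal illustration of that idea on a toy categorical dataset follows; the attribute names and records are hypothetical and not the study's data, and Weka's numeric discretization step is not shown.

```python
from collections import Counter, defaultdict

def one_r(rows, attributes, target):
    """Return (best attribute, its value->class rule, training errors)."""
    best = None
    for attr in attributes:
        by_value = defaultdict(Counter)           # class counts per attribute value
        for row in rows:
            by_value[row[attr]][row[target]] += 1
        rule = {v: counts.most_common(1)[0][0] for v, counts in by_value.items()}
        errors = sum(1 for row in rows if rule[row[attr]] != row[target])
        if best is None or errors < best[2]:
            best = (attr, rule, errors)
    return best

# Toy, entirely hypothetical records.
data = [
    {"lump": "yes", "pain": "no",  "class": "malignant"},
    {"lump": "yes", "pain": "yes", "class": "malignant"},
    {"lump": "no",  "pain": "no",  "class": "benign"},
    {"lump": "no",  "pain": "yes", "class": "benign"},
]
attr, rule, errors = one_r(data, ["lump", "pain"], "class")
print("best attribute:", attr, "rule:", rule, "errors:", errors)
```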
A multivariate multisite hydrological data forecasting model was derived and checked using a case study. The philosophy is to use simultaneously the cross-variable correlations, the cross-site correlations, and the time-lag correlations. The case study involves two variables, monthly rainfall and evaporation, at three sites: Sulaimania, Dokan, and Darbandikhan. The model form is similar to the first-order autoregressive model, but in matrix form. A matrix for the different relative correlations mentioned above and another for their relative residuals were derived and used as the model parameters. A mathematical filter was used for both matrices to obtain their elements. The application of this model indicates …
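The abstract describes a matrix analogue of a first-order autoregressive model; a generic sketch of that form, Z_t = A Z_{t-1} + B e_t, with the parameter matrices estimated from lag-0 and lag-1 covariance matrices (a standard approach, not the paper's own filter), is shown below with placeholder data in place of the standardized rainfall and evaporation series.

```python
import numpy as np

# Matrix AR(1) multisite model: Z_t = A Z_{t-1} + B e_t,
# with A = M1 M0^{-1} and B B^T = M0 - A M1^T (classical lag-covariance approach).
rng = np.random.default_rng(5)
T, k = 600, 6                      # k = 2 variables x 3 sites (hypothetical layout)
Z = rng.standard_normal((T, k))    # placeholder for standardized monthly series

M0 = np.cov(Z[:-1].T)                                                   # lag-0 covariance
M1 = (Z[1:] - Z[1:].mean(0)).T @ (Z[:-1] - Z[:-1].mean(0)) / (T - 2)    # lag-1 covariance
A = M1 @ np.linalg.inv(M0)
BBt = M0 - A @ M1.T
B = np.linalg.cholesky(BBt + 1e-9 * np.eye(k))     # jitter guards near-singular cases

z_next = A @ Z[-1] + B @ rng.standard_normal(k)    # one-step stochastic forecast
print("one-step forecast:", np.round(z_next, 3))
```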
Corpus linguistics is a methodology for studying language through corpus-based research. It differs from a traditional (prescriptive) approach to studying a language in its insistence on the systematic study of authentic examples of language in use (a descriptive approach). A “corpus” is a large, structured, machine-readable body of naturally occurring linguistic data, either written texts or transcriptions of recorded speech, which can be used as a starting point for linguistic description or as a means of verifying hypotheses about a language. In the past decade, interest has grown tremendously in the use of language corpora for language education. The ways in which corpora have been employed in language pedagogy …
In this research, a 4×4 factorial experiment applied in a randomized complete block design with a given number of observations was studied. The design of experiments is used to study the effect of treatments on experimental units and thus obtain data representing the experiment's observations. Applying these treatments under different environmental and experimental conditions causes noise that affects the observation values and thus increases the mean square error of the experiment. To reduce this noise, multiple wavelet shrinkage was used as a filter for the observations, by suggesting an improved threshold that takes into account the different transformation levels based on the logarithm of the …