The gravity method measures variations in the Earth's gravitational field caused by lateral variations in rock density. In the current research, a new technique is applied to the Bouguer map compiled from last-century gravity surveys (conducted from 1940 to 1950) over selected areas of the south-western desert of Iraq, within the administrative boundaries of the Najaf and Anbar provinces. According to the theory of gravity inversion, gravity values can be related to density-contrast variations with depth; gravity data inversion can therefore be used to calculate density and velocity models for four selected depth slices: 9.63 km, 1.1 km, 0.682 km and 0.407 km. These depths were selected using power-spectrum analysis of the gravity data. The gravity data are inverted from the gravitational anomalies of each depth slice, together with the equivalent depth data extracted from available wells, using a density-velocity relationship that is largely consistent with the standard Nafe-Drake curve. The inverted gravity images highlight the behaviour of anomalies and structures in the density/velocity domain, which can be used, in parallel with available well information, in the processing of the recorded seismic data and in time-to-depth conversion within the study area of south-western Iraq.
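The two generic steps named in this abstract, estimating a mean source depth from the radially averaged power spectrum and mapping inverted densities to velocities along the Nafe-Drake trend, can be illustrated with a minimal sketch. It assumes the widely used polynomial fit of Brocher (2005) to the Nafe-Drake curve and a Spector-Grant style slope estimate; the function names and conventions are illustrative, not the exact workflow of the study.

import numpy as np

# Brocher's (2005) polynomial fit to the Nafe-Drake curve:
# density [g/cc] as a function of P-wave velocity [km/s], valid for roughly 1.5-8.5 km/s.
def nafe_drake_density(vp_km_s):
    vp = np.asarray(vp_km_s, dtype=float)
    return (1.6612 * vp - 0.4721 * vp**2 + 0.0671 * vp**3
            - 0.0043 * vp**4 + 0.000106 * vp**5)

# Invert the curve numerically to obtain velocity from a density model
# (e.g. a density slice produced by gravity inversion).
def velocity_from_density(rho_g_cc, vp_grid=np.linspace(1.5, 8.5, 2000)):
    rho_grid = nafe_drake_density(vp_grid)   # monotonically increasing on this range
    return np.interp(rho_g_cc, rho_grid, vp_grid)

# Spector-Grant style depth estimate: for an ensemble of sources at mean depth h,
# the radially averaged power spectrum decays roughly as P(k) ~ exp(-2*h*k),
# with k in rad/km, so h follows from the slope of ln P versus k.
def mean_depth_from_spectrum(k_rad_per_km, power):
    slope, _ = np.polyfit(k_rad_per_km, np.log(power), 1)
    return -slope / 2.0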
A plerion is a supernova remnant whose centre is filled, at radio and X-ray frequencies, by emission powered by its pulsar. Nine naked plerionic systems are known in our galaxy, of which the Crab Nebula is the best-known example. This system has been studied in order to investigate how the pulsar energy affects the distribution and evolution of the remnant, as well as the pulsar kick velocity and its influence on the remnant. The results show that the pulsar of the Crab Nebula injects about (2-3) x 10^47 erg of energy into the remnant; although this is small compared with the supernova explosion energy of about 10^51 erg, it still plays a significant role in the distribution and the m…
The research aims to shed light on the possibility of measuring intellectual capital in the Iraqi Insurance Company using accounting models, as well as disclosing it in the company's financial statements. Human capital was measured using the present-value factor model for discounted future revenues, and structural capital was measured using the intellectual value-added factor model; disclosure in the financial statements was based on stakeholder theory. The research problem lies in the fact that the Iraqi Insurance Company does not measure or disclose its intellectual capital, even though intellectual capital has recently become an important source of the company's progress in the labour market. T…
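As an illustration of the discounting step only, a minimal sketch of a present-value calculation for future revenues is given below; the revenue figures and the 10% discount rate are hypothetical and are not values from the study.

# Present value of a stream of projected future revenues (placeholder numbers),
# discounted year by year as in a present-value factor model.
def present_value(revenues, discount_rate):
    return sum(r / (1.0 + discount_rate) ** t
               for t, r in enumerate(revenues, start=1))

# Example: five years of projected revenues (monetary units) at a 10% discount rate.
print(present_value([100, 110, 120, 130, 140], 0.10))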
The cross section for the (α,n) reaction was evaluated according to the available International Atomic Energy Agency (IAEA) data and other published experimental data, which are the most recent, together with the well-known international libraries such as ENDF, JENDL and JEFF. An energy range from threshold to 25 MeV was considered, in intervals of 1 MeV. The weighted average cross sections over all available experimental and theoretical (JENDL) data were calculated for all the considered isotopes. The cross section of each element was then calculated from the cross sections of the isotopes of that element, taking their abundances into account. A mathematical representative equation for each of the element…
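The abundance-weighting step, and one common choice of weighting for averaging several data sets (inverse-variance weights, which the abstract does not specify), can be sketched as follows; all abundances, cross sections and uncertainties below are placeholders, not the evaluated data of the study.

# Elemental cross section from isotopic cross sections, weighted by natural abundance.
def element_cross_section(isotope_data):
    # isotope_data: list of (abundance_fraction, cross_section_barns) at one energy
    return sum(a * sigma for a, sigma in isotope_data)

# One possible weighted average of several data sets for an isotope,
# using inverse-variance weights (an assumption, not the study's stated method).
def weighted_average(values, uncertainties):
    weights = [1.0 / u**2 for u in uncertainties]
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

# Hypothetical example at a single energy point:
print(element_cross_section([(0.602, 0.12), (0.398, 0.15)]))      # barns
print(weighted_average([0.12, 0.13, 0.11], [0.01, 0.02, 0.015]))  # barns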
In this research, several estimators of the hazard function are introduced using one of the nonparametric methods, namely the kernel method for censored data, with varying bandwidth and boundary kernels. Two types of bandwidth are used: local bandwidth and global bandwidth. Four types of boundary kernel are used, namely Rectangle, Epanechnikov, Biquadratic and Triquadratic, and the proposed function was employed with all kernel functions. Two different simulation techniques are used in two experiments to compare these estimators. In most of the cases, the results have shown that the local bandwidth is the best for all the…
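A minimal sketch of a kernel hazard estimator for right-censored data is given below, assuming an Epanechnikov kernel and a single global bandwidth; the boundary corrections and local bandwidths studied in the paper are omitted for brevity, and the example data are hypothetical.

import numpy as np

def epanechnikov(u):
    # Epanechnikov kernel, supported on [-1, 1]
    return np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u**2), 0.0)

def kernel_hazard(t_grid, times, events, bandwidth):
    # Smoothed Nelson-Aalen hazard estimate for right-censored data.
    # times  : observed times (event or censoring)
    # events : 1 if the observation is an event, 0 if censored
    t_grid = np.asarray(t_grid, dtype=float)
    order = np.argsort(times)
    times = np.asarray(times, dtype=float)[order]
    events = np.asarray(events, dtype=float)[order]
    n = len(times)
    at_risk = n - np.arange(n)            # number still at risk at each observed time
    increments = events / at_risk         # Nelson-Aalen jump sizes
    u = (t_grid[:, None] - times[None, :]) / bandwidth
    return (epanechnikov(u) * increments).sum(axis=1) / bandwidth

# Hypothetical example:
t = np.linspace(0.1, 5.0, 50)
h = kernel_hazard(t, times=[0.5, 1.2, 1.9, 2.4, 3.3], events=[1, 1, 0, 1, 1], bandwidth=1.0)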
The influx of data in bioinformatics is primarily in the form of DNA, RNA and protein sequences, which places a significant burden on scientists and computers. Some genomics studies depend on clustering techniques to group similarly expressed genes into one cluster. Clustering is a type of unsupervised learning that divides unlabelled data into clusters; it is a common approach for partitioning an input space into several homogeneous zones and can be achieved with a variety of algorithms, of which k-means and fuzzy c-means (FCM) are two examples. This study used three models to cluster a brain tumor dataset. The first model uses FCM, whic…
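A minimal sketch of the standard fuzzy c-means update loop is shown below; it illustrates the algorithm named in the abstract on random placeholder data, not the study's actual brain tumor dataset or pipeline.

import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, n_iter=100, tol=1e-5, seed=0):
    # Basic fuzzy c-means: alternate fuzzy-membership and centroid updates.
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)                  # random initial memberships
    for _ in range(n_iter):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]   # weighted centroids
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U_new = 1.0 / (d ** (2.0 / (m - 1.0)))
        U_new /= U_new.sum(axis=1, keepdims=True)      # standard membership update
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return centers, U

# Hypothetical example on random data with 3 clusters:
X = np.random.default_rng(1).random((150, 4))
centers, memberships = fuzzy_c_means(X, c=3)
labels = memberships.argmax(axis=1)    # hard assignment, comparable to a k-means labelling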
Amplitude variation with offset (AVO) analysis is an efficient tool for hydrocarbon detection and for the identification of elastic rock properties and fluid types. It has been applied in the present study to reprocessed pre-stack 2D seismic data (1992, Caulerpa) from the north-west of the Bonaparte Basin, Australia. The AVO response along the 2D pre-stack seismic data over the Laminaria High, NW Shelf of Australia, was also investigated. Three hypotheses were proposed to investigate the AVO behaviour of the amplitude anomalies, in which three different factors were tested: fluid substitution, porosity and thickness (wedge model). The AVO models with the synthetic gathers were analysed using log information to find which of these is the…
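AVO modelling of this kind is typically based on an approximation to the Zoeppritz equations; a minimal sketch using the two-term Shuey form (intercept R0 and gradient G) is given below. The interface properties in the example are placeholders, not values from the Laminaria High data.

import numpy as np

def shuey_two_term(vp1, vs1, rho1, vp2, vs2, rho2, theta_deg):
    # Two-term Shuey approximation: R(theta) ~ R0 + G * sin^2(theta).
    theta = np.radians(np.asarray(theta_deg, dtype=float))
    vp, vs, rho = (vp1 + vp2) / 2, (vs1 + vs2) / 2, (rho1 + rho2) / 2
    dvp, dvs, drho = vp2 - vp1, vs2 - vs1, rho2 - rho1
    r0 = 0.5 * (dvp / vp + drho / rho)                                          # intercept
    g = 0.5 * dvp / vp - 2.0 * (vs / vp) ** 2 * (drho / rho + 2.0 * dvs / vs)   # gradient
    return r0 + g * np.sin(theta) ** 2

# Hypothetical shale-over-gas-sand interface (velocities in m/s, densities in g/cc):
angles = np.arange(0, 41, 5)
refl = shuey_two_term(2750, 1250, 2.35, 2900, 1700, 2.10, angles)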