Amplitude variation with offset (AVO) analysis is an efficient tool for hydrocarbon detection and for identifying elastic rock properties and fluid types. It has been applied in the present study using reprocessed pre-stack 2D seismic data (1992, Caulerpa) from the north-west of the Bonaparte Basin, Australia. The AVO response along the 2D pre-stack seismic data in the Laminaria High, NW Shelf of Australia, was also investigated. Three hypotheses were proposed to investigate the AVO behaviour of the amplitude anomalies, in which three different factors were tested: fluid substitution, porosity and thickness (wedge model). The AVO models with synthetic gathers were analysed using log information to determine which of these is the controlling parameter in the AVO analysis. AVO cross plots from the real pre-stack seismic data reveal AVO class IV (a negative intercept with amplitude magnitude decreasing with offset). This result matches our modelled result of fluid substitution for the seismic synthetics. It is concluded that fluid substitution is the controlling parameter in the AVO analysis and, therefore, that the high-amplitude anomaly at the seabed and the target horizon is the result of changing fluid content and lithology along the target horizons, whereas changing the porosity has little effect on the amplitude variation with offset within the AVO cross plot. Finally, results from the wedge models show that a small change in thickness causes a change in amplitude; however, this change in thickness gives a different AVO characteristic and a mismatch with the AVO result of the real 2D pre-stack seismic data. Therefore, a constant thin layer with changing fluids is more likely to be the cause of the high amplitude anomalies.
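The class IV signature mentioned above (negative intercept, reflectivity magnitude decaying with offset) can be illustrated with the two-term Shuey approximation, R(theta) ≈ A + B sin²(theta). The sketch below computes the intercept A and gradient B for a single interface; the layer properties are hypothetical stand-ins for a hard cap rock over a softer gas sand and are not values from the Laminaria High dataset.

```python
import numpy as np

def shuey_intercept_gradient(vp1, vs1, rho1, vp2, vs2, rho2):
    """Two-term Shuey approximation: R(theta) ~ A + B * sin^2(theta)."""
    vp, vs, rho = (vp1 + vp2) / 2, (vs1 + vs2) / 2, (rho1 + rho2) / 2
    dvp, dvs, drho = vp2 - vp1, vs2 - vs1, rho2 - rho1
    # Intercept A: normal-incidence reflectivity
    A = 0.5 * (dvp / vp + drho / rho)
    # Gradient B (Shuey, 1985)
    B = 0.5 * dvp / vp - 2.0 * (vs / vp) ** 2 * (drho / rho + 2.0 * dvs / vs)
    return A, B

# Hypothetical hard cap (layer 1) over a softer gas sand (layer 2); illustrative only
A, B = shuey_intercept_gradient(vp1=3200, vs1=1900, rho1=2.45,
                                vp2=2700, vs2=1600, rho2=2.05)
theta = np.radians(np.arange(0, 41, 5))
refl = A + B * np.sin(theta) ** 2
print(f"intercept A = {A:.3f}, gradient B = {B:.3f}")
# A < 0 with B > 0 plots in the class IV quadrant of the intercept-gradient
# cross plot: the reflectivity stays negative but its magnitude decreases with offset.
```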
A rapid high-performance liquid chromatography method for the determination of sphinganine (Sa) and sphingosine (So) in urine samples, employing a silica-based monolithic column, is described. The samples were first extracted using ethyl acetate and derivatized with ortho-phthaldialdehyde in the presence of 2-mercaptoethanol. C20 sphinganine was used as the internal standard. Under the optimized conditions, separation was achieved using a mixture of methanol:water (93:7, v/v), a column temperature of 30°C, a flow rate of 1 mL min−1, and an injection volume of 10 μL. Good linearity was obtained for Sa and So over the concentration range 20–500 ng mL−1 (correlation coefficients ≥0.9978). The detection limits were 0.45 ng mL−1 for Sa and
The Bangestan reservoir, which occurs in the Ahwaz oilfield, consists of the middle Cretaceous limestones of the Ilam and Sarvak Formations, deposited in the Zagros Basin. The reservoir is divided into ten zones (A to J) formed in the upper Albian–Santonian and contains considerable hydrocarbon accumulations. The limestones were deposited on an extensive shallow carbonate platform on a passive margin and are dominated by rudist biostrome and grainstone facies. Paleogeographical changes mean that identification of the facies is complex. Seismic stratigraphy and isotopic data are used to better understand the structural and geological setting and to develop an understanding of the sedimentary environment. The results show that the rudist biostr
In this research we study the wavelet characteristics of the well-known sunspot time series, with the aim of verifying the periodicities that other researchers have obtained using the spectral transform, and of observing the variation in period length on the one hand and the shifting of the periods on the other.
A continuous wavelet analysis is first performed for this series and its dominant periods are identified. For greater accuracy, the series is decomposed into its approximation and detail components over five levels; these components are filtered using a fixed threshold in one case and an independent threshold in the other, yielding the noise series, which represents the difference between
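A minimal sketch of this decompose, threshold, and reconstruct workflow is given below using PyWavelets on a synthetic stand-in for the sunspot series (the study itself uses the observed series). The five-level decomposition follows the text; the db4 wavelet, the universal fixed threshold, and the reading of the "independent" threshold as one estimated separately per detail level are assumptions made for illustration.

```python
import numpy as np
import pywt

# Synthetic stand-in for the sunspot series: an ~11-year (132-month) cycle plus noise
t = np.arange(1024)
series = 60 + 50 * np.sin(2 * np.pi * t / 132) + 10 * np.random.randn(t.size)

wavelet, levels = "db4", 5
coeffs = pywt.wavedec(series, wavelet, level=levels)   # [cA5, cD5, ..., cD1]

# Fixed (universal) threshold estimated once from the finest detail level
sigma = np.median(np.abs(coeffs[-1])) / 0.6745
fixed_thr = sigma * np.sqrt(2 * np.log(series.size))
den_fixed = [coeffs[0]] + [pywt.threshold(c, fixed_thr, mode="soft") for c in coeffs[1:]]

# Independent threshold: a separate value estimated for each detail level
den_indep = [coeffs[0]]
for c in coeffs[1:]:
    s = np.median(np.abs(c)) / 0.6745
    den_indep.append(pywt.threshold(c, s * np.sqrt(2 * np.log(c.size)), mode="soft"))

smooth_fixed = pywt.waverec(den_fixed, wavelet)[: series.size]
noise_fixed = series - smooth_fixed   # noise series = original minus filtered series
```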
In data mining, classification is a form of data analysis that can be used to extract models describing important data classes. Two of the well-known algorithms used in data mining classification are the Backpropagation Neural Network (BNN) and Naïve Bayesian (NB). This paper investigates the performance of these two classification methods using the Car Evaluation dataset. Two models were built, one for each algorithm, and the results were compared. Our experimental results indicate that the BNN classifier yields higher accuracy than the NB classifier but is less efficient, because it is time-consuming and difficult to analyse due to its black-box implementation.
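As a concrete illustration of this comparison, the sketch below trains a multilayer perceptron (a backpropagation neural network) and a Naïve Bayes classifier on the UCI Car Evaluation data with scikit-learn. The file path, encoders, network size, and train/test split are assumptions made for illustration and do not reproduce the paper's exact experimental setup.

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import OneHotEncoder, OrdinalEncoder
from sklearn.neural_network import MLPClassifier
from sklearn.naive_bayes import CategoricalNB
from sklearn.metrics import accuracy_score

# UCI Car Evaluation data: six categorical attributes and a class label
cols = ["buying", "maint", "doors", "persons", "lug_boot", "safety", "class"]
df = pd.read_csv("car.data", names=cols)          # local file path is an assumption
X_raw, y = df[cols[:-1]], df["class"]

X_tr, X_te, y_tr, y_te = train_test_split(X_raw, y, test_size=0.3, random_state=0)

# Backpropagation neural network on one-hot encoded inputs
ohe = OneHotEncoder(handle_unknown="ignore")
bnn = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
bnn.fit(ohe.fit_transform(X_tr), y_tr)
acc_bnn = accuracy_score(y_te, bnn.predict(ohe.transform(X_te)))

# Naive Bayes on ordinal-encoded categorical features
oe = OrdinalEncoder()
nb = CategoricalNB()
nb.fit(oe.fit_transform(X_tr), y_tr)
acc_nb = accuracy_score(y_te, nb.predict(oe.transform(X_te)))

print(f"BNN accuracy: {acc_bnn:.3f}   NB accuracy: {acc_nb:.3f}")
```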
Cloud-based Electronic Health Records (EHRs) have seen a substantial increase in usage in recent years, especially for remote patient monitoring. Researchers are interested in investigating the use of Healthcare 4.0 in smart cities. This involves using Internet of Things (IoT) devices and cloud computing to remotely access medical processes. Healthcare 4.0 focuses on the systematic gathering, merging, transmission, sharing, and retention of medical information at regular intervals. Protecting the confidential and private information of patients presents several challenges in terms of thwarting illegal intrusion by hackers. Therefore, it is essential to prioritize the protection of patient medical data that is stored, accessed, and shared on
Purpose – Cloud computing (CC) and its services have enabled the information centers of organizations to adapt their informatic and technological infrastructure, making it more suitable for developing flexible information systems that respond to the informational and knowledge needs of their users. In this context, cloud data governance has become more complex and dynamic, requiring an in-depth understanding of the data management strategy at these centers in terms of organizational structure and regulations, people, technology, process, and roles and responsibilities. Therefore, our paper discusses these dimensions as challenges facing information centers with regard to their data governance and the impa
Cointegration (joint integration) is an important concept in applied macroeconomics. The idea is due to Granger (1981), and it was explained in detail by Engle and Granger in Econometrica (1987). The introduction of cointegration analysis into econometrics in the mid-1980s is one of the most important developments in the empirical approach to modelling, and its advantage is simplicity of computation and use: it requires only familiarity with ordinary least squares.
Cointegration describes equilibrium relations among time series in the long run, even if all the sequences contain t
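The "only ordinary least squares" point can be illustrated with the Engle-Granger two-step procedure: regress one series on the other by OLS, then test the residuals for stationarity. The sketch below uses synthetic random-walk data and statsmodels; the data, lag choices, and the plain ADF test are illustrative assumptions (statsmodels.tsa.stattools.coint applies the residual-based critical values appropriate for this test).

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.stattools import adfuller

# Two synthetic I(1) series sharing a common stochastic trend (illustrative only)
rng = np.random.default_rng(0)
common = np.cumsum(rng.normal(size=500))            # random walk
x = common + rng.normal(scale=0.5, size=500)
y = 2.0 * common + 1.0 + rng.normal(scale=0.5, size=500)

# Engle-Granger step 1: estimate the long-run relation by ordinary least squares
ols = sm.OLS(y, sm.add_constant(x)).fit()
residuals = ols.resid

# Step 2: test the residuals for stationarity; stationary residuals indicate
# that x and y are cointegrated. Note the plain ADF p-value is only indicative
# here; statsmodels.tsa.stattools.coint uses the proper residual-based critical values.
adf_stat, p_value, *_ = adfuller(residuals)
print(f"long-run slope = {ols.params[1]:.2f}, ADF p-value = {p_value:.4f}")
```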