Amplitude variation with offset (AVO) analysis is an efficient tool for hydrocarbon detection and for identifying elastic rock properties and fluid types. It has been applied in the present study using reprocessed pre-stack 2D seismic data (1992, Caulerpa) from the north-west of the Bonaparte Basin, Australia. The AVO response along the 2D pre-stack seismic data in the Laminaria High, NW shelf of Australia, was also investigated. Three hypotheses were proposed to investigate the AVO behaviour of the amplitude anomalies, in which three different factors were tested: fluid substitution, porosity, and thickness (wedge model). The AVO models with the synthetic gathers were analysed using log information to find which of these is the controlling parameter on the AVO response. AVO cross-plots from the real pre-stack seismic data reveal AVO class IV (a negative intercept with amplitude magnitude decreasing with offset). This result matches our modelled fluid-substitution result for the seismic synthetics. It is concluded that fluid substitution is the controlling parameter on the AVO analysis, and therefore the high-amplitude anomaly on the seabed and the target horizon 9 is the result of changes in fluid content and lithology along the target horizons, whereas changing the porosity has little effect on the amplitude variation with offset within the AVO cross-plot. Finally, results from the wedge models show that a small change in thickness causes a change in amplitude; however, this change in thickness gives a different AVO characteristic and a mismatch with the AVO result of the real 2D pre-stack seismic data. Therefore, a thin layer of constant thickness with changing fluids is more likely to be the cause of the high-amplitude anomalies.
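As background for how such intercept/gradient cross-plots are read, the sketch below evaluates the standard two-term Shuey approximation R(θ) ≈ A + B·sin²θ for a single interface. The layer properties are hypothetical values chosen to produce a class IV response (negative intercept A, positive gradient B), not values taken from the Laminaria data.

```python
import numpy as np

def shuey_two_term(vp1, vs1, rho1, vp2, vs2, rho2):
    """Two-term Shuey approximation: R(theta) ~ A + B*sin^2(theta).

    Returns the AVO intercept A and gradient B for an interface
    between an upper layer (1) and a lower layer (2).
    """
    vp, vs, rho = (vp1 + vp2) / 2, (vs1 + vs2) / 2, (rho1 + rho2) / 2
    dvp, dvs, drho = vp2 - vp1, vs2 - vs1, rho2 - rho1
    A = 0.5 * (dvp / vp + drho / rho)
    B = 0.5 * dvp / vp - 2.0 * (vs / vp) ** 2 * (drho / rho + 2.0 * dvs / vs)
    return A, B

# Hypothetical hard shale over soft gas sand (m/s, m/s, g/cc); illustrative only.
A, B = shuey_two_term(3000, 1600, 2.40,    # upper shale (assumed)
                      2200, 1000, 1.95)    # soft gas sand (assumed)
theta = np.radians(np.arange(0, 41, 5))
R = A + B * np.sin(theta) ** 2
print(f"intercept A = {A:.3f}, gradient B = {B:.3f}")
# A < 0 with B > 0 (|R| shrinking toward far offsets) is the class IV signature.
```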
Delays occur commonly in construction projects. Assessing the impact of delay is sometimes a contentious issue. Several delay analysis methods are available, but no single method can be universally used in all situations. The selection of the proper analysis method depends upon a variety of factors, including the information available, the time of analysis, the capabilities of the methodology, and the time, funds, and effort allocated to the analysis. This paper presents a computerized schedule analysis program that uses the daily windows analysis method, as it is recognized as one of the most credible methods and is one of the few techniques much more likely to be accepted by courts than any other method. A simple case study has been implemented …
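To make the mechanics of the daily windows method concrete, here is a minimal sketch: the project is walked day by day, and each day's slippage of the forecast completion date is charged to the party responsible for the event in that window. The event structure and the single-path "schedule" are hypothetical simplifications; a full implementation would recompute the CPM network in every window.

```python
from dataclasses import dataclass

@dataclass
class DelayEvent:
    day: int          # day on which the delay impacted the schedule
    duration: int     # days added to the controlling (critical) path
    party: str        # "owner", "contractor", or "neither" (e.g. weather)

def daily_windows(planned_finish: int, events: list[DelayEvent]) -> dict:
    """Walk the project day by day; whenever the forecast completion
    date moves, charge that day's slippage to the responsible party."""
    apportioned = {"owner": 0, "contractor": 0, "neither": 0}
    forecast = planned_finish
    for day in range(1, max(e.day for e in events) + 1):
        for e in (e for e in events if e.day == day):
            forecast += e.duration          # completion slips in this window
            apportioned[e.party] += e.duration
    return {"forecast_finish": forecast, **apportioned}

events = [DelayEvent(3, 2, "owner"),
          DelayEvent(7, 1, "contractor"),
          DelayEvent(9, 1, "neither")]
print(daily_windows(planned_finish=30, events=events))
# {'forecast_finish': 34, 'owner': 2, 'contractor': 1, 'neither': 1}
```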
ABSTRACT
In this paper, some semi-parametric spatial models were estimated: the semi-parametric spatial error model (SPSEM), which suffers from the problem of spatial error dependence, and the semi-parametric spatial autoregressive model (SPSAR). The method of maximum likelihood was used to estimate the spatial error parameter (λ) in the SPSEM model and the spatial dependence parameter (ρ) in the SPSAR model, and non-parametric methods were used to estimate the smoothing function m(x) for these two models. These non-parametric methods include the local linear estimator (LLE), which requires finding the smoothing …
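As background for the non-parametric part, the sketch below shows a plain local linear estimator of m(x) with a Gaussian kernel on synthetic data. The bandwidth h is assumed fixed here; in practice it would be chosen by a data-driven rule such as cross-validation.

```python
import numpy as np

def local_linear(x0, x, y, h):
    """Fit a kernel-weighted straight line around x0; its intercept
    is the local linear estimate m_hat(x0)."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)          # Gaussian kernel weights
    X = np.column_stack([np.ones_like(x), x - x0])  # intercept + local slope
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta[0]

# Synthetic data for illustration: m(x) = sin(2*pi*x) plus noise.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 200))
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, 200)
grid = np.linspace(0.05, 0.95, 10)
m_hat = np.array([local_linear(x0, x, y, h=0.08) for x0 in grid])
print(np.round(m_hat, 2))   # should roughly track sin(2*pi*grid)
```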
Composite steel-concrete sections offer a broad benefit by increasing structural strength as well as minimizing self-weight. All past research has been concerned with pre-installed shear connectors (PRSC) in the manufacturing of composite sections. A new fabrication technique for steel-concrete-steel composite sections is presented in the current study, using post-installed shear connectors (POSC) passed through embedded polyvinyl chloride (PVC) pipes. The performance of normal-strength concrete prisms with a specified strength of 32 MPa connected to square steel tubes (SST) was investigated. Six specimens were fabricated using both methodologies, PRSC and POSC, and were experimentally tested by the push-out test. The spacing …
In this paper, a dynamic investigation is carried out for strip, rectangular, and square machine foundations at the top surface of two-layer dry sand in various states (i.e., loose on medium sand and dense on medium sand). The dynamic investigation is performed numerically using the finite element program PLAXIS 3D. The soil is assumed to be an elastic perfectly plastic material that obeys the Mohr-Coulomb yield criterion. A harmonic load is applied at the base with an amplitude of 6 kPa at frequencies of 2 and 6 Hz, and a seismic load is applied using the acceleration-time record of the earthquake that hit Halabja city in northern Iraq. A parametric study is carried out to evaluate the influence of changing the L/B ratio (length = 12, 6, 3 m and width = 3 m) and the type of sand …
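For reference, the harmonic excitation described above is simply p(t) = p0·sin(2πft); the short sketch below generates it for both study frequencies. The time step and signal duration are assumed here for illustration and are not taken from the paper.

```python
import numpy as np

p0 = 6.0                                   # load amplitude, kPa (from the study)
for f in (2.0, 6.0):                       # excitation frequencies, Hz
    t = np.arange(0.0, 2.0, 0.001)         # 2 s record, 1 ms step (assumed)
    p = p0 * np.sin(2 * np.pi * f * t)     # harmonic pressure history
    print(f"f = {f} Hz: peak = {p.max():.1f} kPa, cycles = {f * t[-1]:.0f}")
```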
In this study, three methods, Williamson-Hall, size-strain plot, and Halder-Wagner, were used to analyse X-ray diffraction lines to determine the crystallite size and lattice strain of nickel oxide nanoparticles, and the results of these methods were then compared with those of two other methods. The crystallite sizes calculated by these methods are 0.42554 nm, 1.04462 nm, and 3.60880 nm, and the lattice strains are 0.56603, 1.11978, and 0.64606, respectively; these were compared with the results of the Scherrer method (0.29598 nm, 0.34245) and the modified Scherrer method (0.97497). A difference in the calculated results was observed for each of the methods in this study.
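For context, all of the compared methods build on peak-broadening analysis; the simplest is the Scherrer estimate D = Kλ/(β·cosθ), sketched below. The peak position and width used are hypothetical illustration values, not the study's NiO diffraction data.

```python
import numpy as np

K = 0.9                      # shape factor (dimensionless)
wavelength = 0.15406         # Cu K-alpha wavelength, nm

def scherrer(two_theta_deg, fwhm_deg):
    """Crystallite size in nm estimated from one diffraction peak."""
    theta = np.radians(two_theta_deg / 2)
    beta = np.radians(fwhm_deg)          # peak FWHM in radians
    return K * wavelength / (beta * np.cos(theta))

print(f"D = {scherrer(43.3, 0.8):.2f} nm")   # illustrative NiO (200) peak
```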
The transmitting and receiving of data consume the most resources in Wireless Sensor Networks (WSNs). The energy supplied by the battery is the most important resource impacting a WSN's lifespan at the sensor node. Because sensor nodes run on limited batteries, energy saving is necessary. Data aggregation can be defined as a procedure applied to eliminate redundant transmissions; it provides fused information to the base stations, which in turn improves energy effectiveness and increases the lifespan of energy-constrained WSNs. In this paper, a Perceptually Important Points Based Data Aggregation (PIP-DA) method for wireless sensor networks is suggested to reduce redundant data before sending them to the base station …
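To illustrate the core idea behind PIP-based aggregation, the sketch below selects perceptually important points by vertical distance: only the k most shape-defining samples of a sensor series are kept. This is a generic PIP selection routine, not PIP-DA itself, whose thresholds and in-network details are not reproduced here.

```python
import numpy as np

def pip_select(y, k):
    """Return indices of the k perceptually important points of series y."""
    pips = [0, len(y) - 1]                    # always keep the endpoints
    while len(pips) < k:
        best_i, best_d = None, -1.0
        ordered = sorted(pips)
        for left, right in zip(ordered, ordered[1:]):
            for i in range(left + 1, right):
                # vertical distance from y[i] to the chord left -> right
                t = (i - left) / (right - left)
                chord = y[left] + t * (y[right] - y[left])
                d = abs(y[i] - chord)
                if d > best_d:
                    best_i, best_d = i, d
        if best_i is None:                    # series shorter than k
            break
        pips.append(best_i)                   # keep the farthest point
    return sorted(pips)

# Hypothetical temperature readings with one short spike.
y = np.array([20.1, 20.2, 20.1, 24.8, 25.0, 24.9, 20.3, 20.2, 20.1])
print(pip_select(y, k=5))   # -> [0, 2, 3, 4, 8]: endpoints plus the spike
```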
The Tor (The Onion Routing) network was designed to enable users to browse the Internet anonymously. It is known for its anonymity and privacy protection against agents who wish to observe users' locations or track their browsing habits. This anonymity stems from the encryption and decryption of Tor traffic: the client's traffic must be encrypted and decrypted before the sending and receiving process, which leads to delay and even interruption in data flow. The exchange of cryptographic keys between network devices plays a pivotal and critical role in facilitating secure communication and ensuring the integrity of cryptographic procedures. This essential process is time-consuming, which causes delays …
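As a minimal sketch of the layered ("onion") cryptography that produces this per-hop overhead: the client wraps the payload once per relay, and each relay peels exactly one layer. Real Tor uses negotiated circuit keys and fixed-size cells rather than Fernet; this is purely illustrative.

```python
from cryptography.fernet import Fernet

hop_keys = [Fernet.generate_key() for _ in range(3)]  # guard, middle, exit

def wrap(payload: bytes, keys) -> bytes:
    """Client side: add one encryption layer per hop, exit layer first."""
    for key in reversed(keys):
        payload = Fernet(key).encrypt(payload)
    return payload

def peel(cell: bytes, key: bytes) -> bytes:
    """Relay side: remove exactly one layer with this hop's key."""
    return Fernet(key).decrypt(cell)

cell = wrap(b"GET / HTTP/1.1", hop_keys)
for key in hop_keys:                      # each hop in path order
    cell = peel(cell, key)
print(cell)                               # b'GET / HTTP/1.1'
```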
Objective: This research investigates real breast cancer data for Iraqi women, acquired manually from several Iraqi hospitals for the early detection of breast cancer. Data mining techniques are used to discover hidden knowledge, unexpected patterns, and new rules in the dataset, which comprises a large number of attributes. Methods: Data mining techniques handle redundant or simply irrelevant attributes to discover interesting patterns. The dataset is processed via the Weka (Waikato Environment for Knowledge Analysis) platform. The OneR technique is used as a machine learning classifier to evaluate attribute worth according to the class value. Results: The evaluation is performed using …
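For readers unfamiliar with OneR, the sketch below shows the algorithm in plain Python: for each attribute, build one rule per value (predict the majority class) and keep the attribute whose rule set makes the fewest training errors. The toy records are hypothetical, not the Iraqi patient data, and the study itself uses Weka's implementation.

```python
from collections import Counter, defaultdict

def one_r(rows, class_idx):
    """Return (attribute index, value -> class rule, training errors)
    for the single best attribute, per the OneR algorithm."""
    best = None
    for a in (i for i in range(len(rows[0])) if i != class_idx):
        counts = defaultdict(Counter)
        for r in rows:
            counts[r[a]][r[class_idx]] += 1        # class tally per value
        rule = {v: c.most_common(1)[0][0] for v, c in counts.items()}
        errors = sum(sum(c.values()) - max(c.values())
                     for c in counts.values())     # misclassified rows
        if best is None or errors < best[2]:
            best = (a, rule, errors)
    return best

rows = [("dense", "old", "malignant"), ("dense", "young", "malignant"),
        ("fatty", "old", "benign"),   ("fatty", "young", "benign"),
        ("dense", "old", "benign")]
attr, rule, errors = one_r(rows, class_idx=2)
print(attr, rule, errors)   # 0 {'dense': 'malignant', 'fatty': 'benign'} 1
```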
In this research, a 4×4 factorial experiment applied in a randomized complete block design was studied, with a specified number of observations. Design of experiments is used to study the effect of treatments on experimental units and thus to obtain data representing the experiment's observations. Applying these treatments under different environmental and experimental conditions causes noise that affects the observed values and thus increases the mean square error of the experiment. To reduce this noise, multiple wavelet shrinkage was used as a filter for the observations, by suggesting an improved threshold that takes into account the different transformation levels based on the logarithm of the …
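To illustrate wavelet shrinkage as an observation filter, the sketch below decomposes a noisy signal, soft-thresholds the detail coefficients, and reconstructs. The universal threshold used here is a standard textbook choice, not the paper's improved level-dependent threshold, which is not reproduced; the data are synthetic.

```python
import numpy as np
import pywt

rng = np.random.default_rng(1)
clean = np.sin(np.linspace(0, 4 * np.pi, 256))
noisy = clean + rng.normal(0, 0.3, 256)         # synthetic experiment noise

coeffs = pywt.wavedec(noisy, "db4", level=4)    # multilevel decomposition
sigma = np.median(np.abs(coeffs[-1])) / 0.6745  # noise scale, finest details
thr = sigma * np.sqrt(2 * np.log(len(noisy)))   # universal threshold
denoised_coeffs = [coeffs[0]] + [
    pywt.threshold(c, thr, mode="soft")         # shrink detail coefficients
    for c in coeffs[1:]
]
denoised = pywt.waverec(denoised_coeffs, "db4")[:256]
print(f"noisy MSE {np.mean((noisy - clean) ** 2):.4f} -> "
      f"denoised MSE {np.mean((denoised - clean) ** 2):.4f}")
```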