AVO analysis for high amplitude anomalies using 2D pre-stack seismic data

Amplitude variation with offset (AVO) analysis is an efficient tool for hydrocarbon detection and for identifying elastic rock properties and fluid types. In the present study it was applied to reprocessed pre-stack 2D seismic data (1992, Caulerpa) from the north-west of the Bonaparte Basin, Australia. The AVO response along the 2D pre-stack seismic data in the Laminaria High, NW shelf of Australia, was also investigated. Three hypotheses were proposed to explain the AVO behaviour of the amplitude anomalies, in which three different factors were tested: fluid substitution, porosity, and thickness (wedge model). The AVO models with synthetic gathers were analysed using log information to determine which of these parameters controls the AVO response. AVO cross plots from the real pre-stack seismic data reveal AVO class IV (a negative intercept whose amplitude magnitude decreases with offset). This result matches the modelled fluid-substitution response of the seismic synthetics. It is concluded that fluid substitution is the controlling parameter on the AVO analysis, and therefore the high amplitude anomaly on the seabed and the target horizon results from changes in fluid content and lithology along the target horizons. Changing the porosity, by contrast, has little effect on the amplitude variation with offset in the AVO cross plot. Finally, results from the wedge models show that a small change in thickness causes a change in amplitude; however, this change in thickness gives a different AVO characteristic and a mismatch with the AVO result of the real 2D pre-stack seismic data. Therefore, a constant thin layer with changing fluids is the more likely cause of the high amplitude anomalies.
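For illustration only, a minimal sketch of the intercept-gradient cross plotting behind an AVO classification of this kind, using the two-term Shuey approximation R(θ) ≈ A + B·sin²θ. The picked amplitudes, angle range, and class boundaries below are assumptions for the example, not values from the study.

    import numpy as np

    def shuey_intercept_gradient(angles_deg, amplitudes):
        # Least-squares fit of the two-term Shuey approximation R(theta) ~ A + B*sin^2(theta)
        s2 = np.sin(np.radians(angles_deg)) ** 2
        design = np.column_stack([np.ones_like(s2), s2])
        (intercept, gradient), *_ = np.linalg.lstsq(design, amplitudes, rcond=None)
        return intercept, gradient

    def avo_class(intercept, gradient, tol=0.02):
        # Rough placement on the intercept-gradient cross plot
        if intercept < -tol and gradient > 0:
            return "Class IV"   # negative intercept, amplitude magnitude decreasing with offset
        if intercept < -tol:
            return "Class III"
        if abs(intercept) <= tol:
            return "Class II"
        return "Class I"

    # Hypothetical amplitudes picked along a horizon (illustrative values only)
    angles = np.array([5, 10, 15, 20, 25, 30])
    amps = np.array([-0.18, -0.17, -0.15, -0.13, -0.11, -0.09])
    A, B = shuey_intercept_gradient(angles, amps)
    print(A, B, avo_class(A, B))   # a Class IV response for these values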

Publication Date
Thu Feb 01 2024
Journal Name
Baghdad Science Journal
A New Green Approach of CFIA Technique for Direct Assay with a High Throughput of Sulfamethoxazole Drugs Using Condensation Reaction with NQS Agent

A new flow injection (FI) manifold design coupled with a merging-zone technique was studied for the spectrophotometric determination of sulfamethoxazole. The semi-automated FI method has many advantages: it is fast, simple, highly accurate, and economical, with high sample throughput. The suggested method is based on the formation of an orange-colored product of SMZ with 1,2-naphthoquinone-4-sulphonic acid sodium salt (NQS) in alkaline (NaOH) medium, measured at λmax 496 nm. The linear range for sulfamethoxazole was 3-100 μg mL⁻¹, the limit of detection (LOD) was 0.593 μg mL⁻¹, the RSD% was about 1.25, and the recovery was 100.73%. All the physical and chemical parameters that affect the stability and development of …
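As a small aside, a sketch of how figures of merit like these relate to a linear calibration, assuming hypothetical absorbance readings and a blank standard deviation and using the common 3.3·σ/slope convention for the LOD; none of the numbers below are taken from the paper.

    import numpy as np

    # Hypothetical calibration series spanning the reported 3-100 ug/mL linear range
    conc = np.array([3, 10, 25, 50, 75, 100], dtype=float)      # ug/mL
    absorbance = np.array([0.021, 0.070, 0.176, 0.352, 0.527, 0.704])

    slope, intercept = np.polyfit(conc, absorbance, 1)

    # LOD from the 3.3*sigma/slope convention; sigma_blank is an assumed value
    sigma_blank = 0.0013
    lod = 3.3 * sigma_blank / slope
    print(f"slope={slope:.4f} AU per ug/mL, LOD={lod:.2f} ug/mL")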

Publication Date
Sat Dec 30 2023
Journal Name
Iraqi Journal Of Science
Application of 2D Electrical Resistivity Method and Ground Penetrating Radar for Detection of the Archaeological Remains in Kish Site, Babylon, Iraq

The 2D electrical resistivity imaging (ERI) method is a non-destructive technique with good efficiency for detecting shallow subsurface features. Archaeological subsurface features are investigated with this method in most cases with the assistance of other methods such as the ground penetrating radar (GPR) method. Eleven 2D ERI profiles were carried out to investigate the subsurface archaeological features at the Kish site in the Babylon area. The 2D electrical resistivity survey was acquired with an ABEM Terrameter LS2 instrument and 30 electrodes at 1-meter spacing between adjacent electrodes along each profile. Each profile is 29 meters long and the spacing between adjacent profiles is 3 meters. The software RES2DINV was used to obtain the final inverted …
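For context, a minimal sketch of the apparent-resistivity calculation that underlies a survey like this, assuming a Wenner electrode array (the array type is not stated in this excerpt) and illustrative voltage and current readings.

    import numpy as np

    def wenner_apparent_resistivity(spacing_m, delta_v, current_a):
        # Apparent resistivity (ohm.m) for a Wenner array: rho_a = 2 * pi * a * dV / I
        return 2 * np.pi * spacing_m * delta_v / current_a

    # Illustrative reading at the 1 m electrode spacing used along each profile
    print(wenner_apparent_resistivity(spacing_m=1.0, delta_v=0.12, current_a=0.05))  # ~15.1 ohm.m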

Publication Date
Sun Oct 01 2017
Journal Name
Journal Of Economics And Administrative Sciences
The use of factor analysis to identify the leading factors of high blood pressure: A field study in Baghdad hospitals

Abstract:

High blood pressure is one of the serious human diseases that a person can develop without feeling it, and it is caused by many factors. It therefore became necessary to research this subject and to express these many factors in terms of specific underlying causes by studying them using factor analysis.

The researcher arrived at five factors that explain only 71% of the total variation in the phenomenon under study, where overweight, heavy alcohol consumption, smoking, and lack of exercise are the causes that most influence the incidence of this disease.
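For illustration, a minimal factor-analysis sketch of the kind of computation behind a "five factors explaining 71% of the variance" result, using scikit-learn on randomly generated stand-in data; the questionnaire items, sample size, and loadings are assumptions, not the study's data.

    import numpy as np
    from sklearn.decomposition import FactorAnalysis
    from sklearn.preprocessing import StandardScaler

    # Hypothetical survey matrix: rows = patients, columns = risk-factor items
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 12))

    Xs = StandardScaler().fit_transform(X)
    fa = FactorAnalysis(n_components=5, rotation="varimax").fit(Xs)

    # Share of total (standardized) variance per factor: sum of squared loadings
    # for each factor divided by the number of variables
    loadings = fa.components_.T                       # shape (n_variables, n_factors)
    explained = (loadings ** 2).sum(axis=0) / Xs.shape[1]
    print(explained, explained.sum())                 # the study reports ~0.71 cumulatively for five factors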

Publication Date
Wed Feb 06 2013
Journal Name
Eng. & Tech. Journal
A proposal to detect computer worms (malicious codes) using data mining classification algorithms

Malicious software (malware) performs a malicious function that compromises a computer system's security. Many methods have been developed to improve the security of computer system resources, among them firewalls, encryption, and intrusion detection systems (IDS). An IDS can detect new, unrecognized attack attempts and raise an early alarm to inform the system about the suspicious intrusion attempt. This paper proposes a hybrid IDS for intrusion detection, especially malware detection, that considers both network packet and host features. The hybrid IDS is designed using data mining (DM) classification methods, chosen for their ability to detect new, previously unseen intrusions accurately and automatically. It uses both anomaly and misuse detection …
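A minimal sketch of the hybrid anomaly/misuse idea described above, assuming scikit-learn and randomly generated stand-in features; the paper's actual feature set, classifiers, and training data are not reproduced here.

    import numpy as np
    from sklearn.ensemble import IsolationForest
    from sklearn.tree import DecisionTreeClassifier

    # Hypothetical feature matrix mixing network-packet and host features
    # (packet size, connection rate, failed logins, ...); labels 1 = known malware, 0 = benign
    rng = np.random.default_rng(1)
    X_train = rng.normal(size=(500, 6))
    y_train = rng.integers(0, 2, size=500)
    X_new = rng.normal(size=(10, 6))

    misuse = DecisionTreeClassifier(max_depth=5).fit(X_train, y_train)    # misuse (signature-like) detection
    anomaly = IsolationForest(random_state=1).fit(X_train[y_train == 0])  # anomaly detection on benign traffic

    # Raise an alert when either detector fires, mirroring a hybrid anomaly/misuse design
    alerts = (misuse.predict(X_new) == 1) | (anomaly.predict(X_new) == -1)
    print(alerts)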

Publication Date
Mon May 15 2017
Journal Name
Journal Of Theoretical And Applied Information Technology
Anomaly detection in text data represented as a graph using the DBSCAN algorithm

Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all data into a graph concept frame (CFG). As is well known, DBSCAN groups data points of the same kind into clusters, while points that fall outside the clusters are treated as noise or anomalies. DBSCAN can therefore detect abnormal points that lie beyond a certain threshold (extreme values). However, anomalies are not only those cases that are abnormal, unusual, or far from a specific group; there is also a type of data that does not occur repeatedly but is considered abnormal with respect to the known group. The analysis showed that DBSCAN using the …
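For illustration, a minimal DBSCAN anomaly-detection sketch in which points that fall outside every cluster are reported as anomalies (label -1); the feature vectors are randomly generated stand-ins, and the paper's conversion of text to a graph concept frame is not reproduced.

    import numpy as np
    from sklearn.cluster import DBSCAN

    # Hypothetical feature vectors (e.g. extracted from a text/graph representation)
    rng = np.random.default_rng(2)
    dense = rng.normal(loc=0.0, scale=0.3, size=(95, 2))
    outliers = rng.uniform(low=3.0, high=5.0, size=(5, 2))
    X = np.vstack([dense, outliers])

    labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(X)
    anomalies = np.where(labels == -1)[0]   # DBSCAN marks points in no cluster as noise (-1)
    print(anomalies)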

Publication Date
Tue Jan 01 2019
Journal Name
Journal Of Southwest Jiaotong University
Recognizing Job Apathy Patterns of Iraqi Higher Education Employees Using Data Mining Techniques

Psychological research centers help to indirectly reach professionals from the fields of human life, the job environment, family life, and psychological support for psychiatric patients. This research aims to detect job apathy patterns from the behavior of employee groups at the University of Baghdad and the Iraqi Ministry of Higher Education and Scientific Research. This investigation presents an approach using data mining techniques to acquire new knowledge, and it differs from statistical studies in terms of supporting the researchers' evolving needs. These techniques handle redundant or irrelevant attributes in order to discover interesting patterns. The principal issue is to identify several important and affective questions taken from …
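As a sketch of the attribute-selection step that such a study implies, the snippet below ranks questionnaire items by mutual information with an apathy label, using scikit-learn on randomly generated stand-in responses; the actual survey items and labels are assumptions, not the study's data.

    import numpy as np
    from sklearn.feature_selection import SelectKBest, mutual_info_classif

    # Hypothetical questionnaire responses (Likert-scale items) and an apathy label
    rng = np.random.default_rng(3)
    X = rng.integers(1, 6, size=(300, 20))
    y = rng.integers(0, 2, size=300)

    selector = SelectKBest(mutual_info_classif, k=5).fit(X, y)
    print(selector.get_support(indices=True))   # indices of the most informative questions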

Publication Date
Sun Jan 01 2023
Journal Name
Petroleum And Coal
Analyzing Production Data Using a Combination of Empirical Methods and Advanced Analytical Techniques

Publication Date
Sun Jan 01 2017
Journal Name
Iraqi Journal Of Science
Strong Triple Data Encryption Standard Algorithm using Nth Degree Truncated Polynomial Ring Unit

Cryptography is the process of transforming a message to prevent unauthorized access to data. One of the main problems, and an important part of cryptography with secret-key algorithms, is the key. The key plays an important role in achieving a higher level of secure communication. To increase the level of security in any communication, both parties must have a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weak due to its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong. An enhanced encryption key improves the security of the Triple Data Encryption Standard algorithm. This paper proposes a combination of two efficient encryption algorithms to …
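A minimal sketch of the key-strengthening idea, assuming a shared secret negotiated by a lattice-based scheme such as NTRU is hashed down to a 24-byte three-key Triple DES key; this is an illustrative key-derivation step, not the paper's exact construction.

    import hashlib

    def derive_3des_key(shared_secret: bytes) -> bytes:
        # Derive a 24-byte (three-key) Triple DES key from a shared secret,
        # e.g. one negotiated with a lattice-based scheme such as NTRU.
        digest = hashlib.sha256(shared_secret).digest()   # 32 bytes
        return digest[:24]                                # 3 x 8-byte DES keys

    # Hypothetical shared secret standing in for an NTRU-agreed value
    key = derive_3des_key(b"example NTRU shared secret")
    print(len(key), key.hex())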

Publication Date
Wed Jan 01 2020
Journal Name
Advances In Science, Technology And Engineering Systems Journal
Bayes Classification and Entropy Discretization of Large Datasets using Multi-Resolution Data Aggregation

Big data analysis has important applications in many areas such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain a summarization of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms such as …
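For illustration, a minimal entropy-discretization sketch showing how a single cut point on a numeric attribute can be chosen by maximizing information gain (in the spirit of Fayyad-Irani splitting); the attribute values and labels are made up, and the paper's multi-resolution summarization structure is not reproduced.

    import numpy as np

    def entropy(labels):
        # Shannon entropy of a label vector
        _, counts = np.unique(labels, return_counts=True)
        p = counts / counts.sum()
        return -(p * np.log2(p)).sum()

    def best_entropy_cut(values, labels):
        # Pick the cut point on a numeric attribute that maximizes information gain
        order = np.argsort(values)
        v, y = values[order], labels[order]
        base = entropy(y)
        best_gain, best_cut = -1.0, None
        for i in range(1, len(v)):
            if v[i] == v[i - 1]:
                continue
            cut = (v[i] + v[i - 1]) / 2
            left, right = y[:i], y[i:]
            cond = (len(left) * entropy(left) + len(right) * entropy(right)) / len(y)
            gain = base - cond
            if gain > best_gain:
                best_gain, best_cut = gain, cut
        return best_cut, best_gain

    # Hypothetical numeric attribute and class labels (not data from the paper)
    vals = np.array([1.0, 1.2, 1.1, 3.8, 4.0, 4.2, 3.9, 1.3])
    labs = np.array([0, 0, 0, 1, 1, 1, 1, 0])
    print(best_entropy_cut(vals, labs))   # cut near 2.55 with gain 1.0 for these values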
