Subsurface Structural Image of Galabat Field, North East of Iraq Using 2D Seismic Data

This research was carried out to image the subsurface structure of the Tertiary period in the Galabat Field, northeast of Iraq, using 2D seismic survey measurements. Synthetic seismograms of the Galabat-3 well were generated in order to identify and pick the reflectors in the seismic sections. Structural images were drawn in the time domain and then converted to the depth domain using average velocities. Structurally, the seismic sections show that these reflectors are affected by two reverse faults cutting the Jeribe Formation and the layers below, with the density of reverse faulting increasing in the northern division. The structural maps show that the Galabat field consists of a narrow, longitudinal, asymmetrical anticline at the Fatha and Jeribe formations, whose southeastern limb is steeper than its northeastern limb. The seismic interpretation shows that the Galabat Field is a positive inverted structure: an anticline at the level of the Tertiary period. The anticline axis and the major reverse faults trend northwest-southeast. It is concluded that the reverse faults, which originated from the Zagros tectonism widespread in the area, are a major conduit that channeled petroleum flow from source to Miocene traps. In addition, these faults were caused by salt accumulation within the Fatha Formation and led to high variation in thickness between the crest and limbs of the Galabat structure.
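The time-to-depth conversion mentioned above follows from depth = v_avg × TWT / 2, where TWT is the two-way travel time. A minimal sketch of the arithmetic, with hypothetical picks and velocities rather than values from the Galabat survey:

```python
# Convert two-way travel time (TWT) picks to depth with average velocities.
# depth = v_avg * twt / 2, since TWT covers the down-and-back ray path.

def twt_to_depth(twt_s: float, v_avg_m_s: float) -> float:
    """Depth in metres from two-way time (s) and average velocity (m/s)."""
    return v_avg_m_s * twt_s / 2.0

picks = {"Fatha": 0.85, "Jeribe": 1.10}      # hypothetical TWT picks (s)
v_avg = {"Fatha": 2400.0, "Jeribe": 2700.0}  # hypothetical velocities (m/s)

for horizon, twt in picks.items():
    print(f"{horizon}: {twt_to_depth(twt, v_avg[horizon]):.0f} m")
```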

Publication Date
Fri Mar 28 2008
Journal Name
AMA, Agricultural Mechanization in Asia, Africa & Latin America
The Effect of Two Types of Plows With Four Speeds on the Field Capacity and Bulk Density

Publication Date
Sat May 31 2025
Journal Name
Iraqi Journal For Computers And Informatics
Discussion on techniques of data cleaning, user identification, and session identification phases of web usage mining from 2000 to 2022

The data preprocessing step is important in web usage mining because of the nature of log data, which are heterogeneous, unstructured, and noisy. To preserve the scalability and efficiency of pattern-discovery algorithms, a preprocessing step must be applied first. This study comprehensively evaluates the sequential methodologies used to preprocess web server log data, with emphasis on the sub-phases of data cleaning, user identification, and session identification.
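A common heuristic for the session-identification sub-phase (a standard technique in this literature, not necessarily the exact one surveyed here) splits each user's requests into sessions whenever the gap between consecutive requests exceeds an inactivity timeout, often 30 minutes. A minimal sketch:

```python
from datetime import datetime, timedelta

# Split one user's timestamped requests into sessions using a 30-minute
# inactivity timeout, a common heuristic in web usage mining.
TIMEOUT = timedelta(minutes=30)

def sessionize(timestamps: list[datetime]) -> list[list[datetime]]:
    sessions, current = [], []
    for ts in sorted(timestamps):
        if current and ts - current[-1] > TIMEOUT:
            sessions.append(current)
            current = []
        current.append(ts)
    if current:
        sessions.append(current)
    return sessions

hits = [datetime(2022, 1, 1, 9, 0), datetime(2022, 1, 1, 9, 10),
        datetime(2022, 1, 1, 11, 0)]  # hypothetical log entries for one user
print(len(sessionize(hits)))           # -> 2 sessions
```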

Publication Date
Wed Aug 01 2018
Journal Name
Journal Of Economics And Administrative Sciences
The impact of governmental consumer spending on the development of the current account balance in Iraq for the period (1990-2014) using an ARDL model

Because domestic production is inflexible in response to increases in government consumption expenditure, such increases lead to more imports to meet the resulting rise in domestic demand. The Iraqi economy is a unilateral, single-yield economy that depends on oil revenues to finance spending, and government consumer spending is highly elastic upward when overall revenues increase yet very inelastic downward when public revenues fall; this asymmetry therefore produces a deficit in the current account position. What caused this deficit and imbalance is the disruption of the …
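For readers unfamiliar with the method, an ARDL (autoregressive distributed lag) regression of the current account on government consumption spending can be set up roughly as below; the series are random placeholders, not the paper's 1990-2014 data:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.ardl import ARDL

# Hypothetical annual series standing in for the study's 1990-2014 data.
rng = np.random.default_rng(0)
years = pd.period_range("1990", "2014", freq="Y")
gov_spend = pd.Series(rng.normal(size=len(years)).cumsum(), index=years)
current_account = pd.Series(-0.5 * gov_spend + rng.normal(size=len(years)),
                            index=years)

# ARDL(p, q): p lags of the dependent variable, q lags of the regressor.
model = ARDL(current_account, lags=1,
             exog=gov_spend.to_frame("gov_spend"), order=1)
print(model.fit().summary())
```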

Publication Date
Tue Aug 27 2024
Journal Name
TEM Journal
Preparing the Electrical Signal Data of the Heart by Performing Segmentation Based on the Neural Network U-Net

Research on the automated extraction of essential data from an electrocardiography (ECG) recording has long been a significant topic. The main focus of digital processing is to locate the fiducial points that mark the beginning and end of the P, QRS, and T waves based on their waveform properties. The presence of unavoidable noise during ECG data collection and inherent physiological differences among individuals make it challenging to accurately identify these reference points, resulting in suboptimal performance. This is done through several primary stages built on preliminary processing of the ECG electrical signal through a set of steps (preparing the raw data and converting them into files tha…
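As an illustration of the kind of preliminary signal preparation described, the sketch below band-pass filters a raw ECG trace, normalizes it, and slices it into fixed-length windows suitable as input to a segmentation network; the filter band, sampling rate, and window length are assumptions, not the paper's settings:

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 360  # assumed sampling rate in Hz

def preprocess(ecg: np.ndarray, win: int = 512) -> np.ndarray:
    """Band-pass filter 0.5-40 Hz, z-normalize, and cut into windows."""
    b, a = butter(3, [0.5 / (FS / 2), 40 / (FS / 2)], btype="band")
    clean = filtfilt(b, a, ecg)
    clean = (clean - clean.mean()) / (clean.std() + 1e-8)
    n = len(clean) // win
    return clean[: n * win].reshape(n, win)  # (n_windows, win) for the network

signal = np.random.randn(FS * 10)            # stand-in for a real recording
print(preprocess(signal).shape)              # -> (7, 512)
```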

Publication Date
Thu Dec 31 2020
Journal Name
Journal Of Accounting And Financial Studies (JAFS)
Application of the data envelopment analysis (DEA) technique to evaluate performance efficiency: applied research in the General Tax Authority

The aim of the research is to use the data envelopment analysis (DEA) technique to evaluate the performance efficiency of the eight branches of the General Tax Authority located in Baghdad: Karrada, Karkh parties, Karkh Center, Dora, Bayaa, Kadhimiya, New Baghdad, and Rusafa. The inputs were defined as the number of non-accountable taxpayers according to the categories of professions and commercial business, deduction, transfer of property ownership, real estate, and tenders. The outputs were determined according to a checklist containing nine dimensions for assessing how efficiently the investigated branches invest their available resources T…
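The efficiency evaluation described is conventionally computed with the input-oriented CCR model of DEA, solved as one linear program per branch (DMU). A minimal sketch with invented input/output figures, not the Authority's data:

```python
import numpy as np
from scipy.optimize import linprog

# X: inputs per DMU (e.g. taxpayers to process); Y: outputs (e.g. checklist
# scores). Four hypothetical branches, one input, one output.
X = np.array([[40.0], [55.0], [30.0], [70.0]])
Y = np.array([[90.0], [100.0], [80.0], [110.0]])

def ccr_efficiency(o: int) -> float:
    """Input-oriented CCR efficiency of DMU o:
    min theta  s.t.  X'lam <= theta * x_o,  Y'lam >= y_o,  lam >= 0."""
    n = len(X)
    c = np.r_[1.0, np.zeros(n)]                     # minimize theta
    A_in = np.c_[-X[o].reshape(-1, 1), X.T]         # X'lam - theta*x_o <= 0
    A_out = np.c_[np.zeros((Y.shape[1], 1)), -Y.T]  # -Y'lam <= -y_o
    b = np.r_[np.zeros(X.shape[1]), -Y[o]]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=b,
                  bounds=[(0, None)] * (n + 1))
    return res.fun

for o in range(len(X)):
    print(f"Branch {o}: efficiency = {ccr_efficiency(o):.3f}")
```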

Publication Date
Fri Mar 31 2017
Journal Name
Iraqi Journal Of Biotechnology
Reliable Reference Gene for Normalization of RT-qPCR Data in Human Cancer Cell Lines Subjected to Gene Knockdown

Quantitative real-time polymerase chain reaction (RT-qPCR) has become a valuable molecular technique in biomedical research. The selection of suitable endogenous reference genes is necessary for normalization of target gene expression in RT-qPCR experiments. The aim of this study was to determine the suitability of each of 18S rRNA and ACTB as internal control genes for normalization of RT-qPCR data in some human cell lines transfected with small interfering RNA (siRNA). Four cancer cell lines, MCF-7, T47D, MDA-MB-231, and HeLa, along with HEK293 representing an embryonic cell line, were depleted of E2F6 using siRNA specific for E2F6, compared to negative control cells transfected with siRNA not specific for any gene. Us…
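Expression normalized to a reference gene such as 18S rRNA or ACTB is typically reported with the 2^(-ΔΔCt) method; a minimal sketch with made-up Ct values, not this study's measurements:

```python
# Relative expression by the 2^(-ddCt) method (Livak & Schmittgen).
# All Ct values below are illustrative placeholders.

def relative_expression(ct_target_kd: float, ct_ref_kd: float,
                        ct_target_ctrl: float, ct_ref_ctrl: float) -> float:
    d_ct_kd = ct_target_kd - ct_ref_kd        # normalize to reference gene
    d_ct_ctrl = ct_target_ctrl - ct_ref_ctrl
    return 2.0 ** (-(d_ct_kd - d_ct_ctrl))

# E2F6 knockdown vs negative control, normalized to ACTB (hypothetical Cts):
fold = relative_expression(28.4, 17.1, 25.9, 17.0)
print(f"E2F6 expression after knockdown: {fold:.2f}-fold of control")
```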

Publication Date
Thu Jun 01 2023
Journal Name
Bulletin Of Electrical Engineering And Informatics
A missing data imputation method based on salp swarm algorithm for diabetes disease

Most medical datasets suffer from missing data, due to the expense of some tests or human error while recording them. This issue degrades the performance of machine learning models because the values of some features will be missing. Therefore, specific methods are needed for imputing these missing data. In this research, the salp swarm algorithm (SSA) is used for generating and imputing the missing values in the Pima Indian diabetes disease (PIDD) dataset; the proposed algorithm is called ISSA. The obtained results showed that the classification performance of three different classifiers, which are support vector machine (SVM), K-nearest neighbour (KNN), and Naïve Bayes …
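The ISSA variant itself is not reproduced here, but the evaluation pipeline it plugs into can be sketched: fill in the missing values (a simple mean-imputation baseline below, where ISSA would instead search for better fill-ins), then score a classifier on the completed data:

```python
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Hypothetical stand-in for the PIDD features, with ~10% values missing.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 8))
X[rng.random(X.shape) < 0.1] = np.nan
y = (rng.random(200) > 0.5).astype(int)

# Baseline imputation; an SSA-based method would optimize these fill-ins.
X_filled = SimpleImputer(strategy="mean").fit_transform(X)
print(cross_val_score(SVC(), X_filled, y, cv=5).mean())
```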

Publication Date
Mon Apr 11 2011
Journal Name
ICGST
Employing Neural Network and Naive Bayesian Classifier in Mining Data for Car Evaluation

In data mining, classification is a form of data analysis that can be used to extract models describing important data classes. Two well-known algorithms used in data mining classification are the Backpropagation Neural Network (BNN) and Naïve Bayesian (NB) classifiers. This paper investigates the performance of these two classification methods on the Car Evaluation dataset. A model was built for each algorithm and the results were compared. Our experimental results indicated that the BNN classifier yields higher accuracy than the NB classifier but is less efficient, because it is time-consuming and difficult to analyze due to its black-box implementation.
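A comparable experiment is easy to set up with scikit-learn, using an MLP as a stand-in for the backpropagation network; the categorical data below are random placeholders shaped like the UCI Car Evaluation set, not the set itself:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import CategoricalNB
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import OrdinalEncoder

# Random data shaped like Car Evaluation (6 attributes, 4 classes); swap in
# the real UCI dataset to reproduce the comparison.
rng = np.random.default_rng(2)
X_raw = rng.integers(0, 4, size=(500, 6)).astype(str)
y = rng.integers(0, 4, size=500)

X = OrdinalEncoder().fit_transform(X_raw)
for name, clf in [("NB", CategoricalNB()),
                  ("BNN", MLPClassifier(max_iter=500, random_state=0))]:
    print(name, cross_val_score(clf, X, y, cv=5).mean())
```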

Publication Date
Fri Jan 01 2021
Journal Name
Journal Of Intelligent Systems
Void-hole aware and reliable data forwarding strategy for underwater wireless sensor networks
Reliable data transfer and energy efficiency are essential considerations for network performance in resource-constrained underwater environments. One efficient approach to data routing in underwater wireless sensor networks (UWSNs) is clustering, in which data packets are transferred from sensor nodes to the cluster head (CH). Data packets are then forwarded to a sink node in a single-hop or multi-hop manner, which can increase energy depletion of the CH compared to other nodes. While several mechanisms have been proposed for cluster formation and CH selection to ensure efficient delivery of data packets, less attention has been given to massive data co…
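To make the CH energy-depletion point concrete, the first-order radio model common in clustering papers (an assumption here, not necessarily this paper's model) compares what a CH pays to receive, aggregate, and relay its members' packets against an ordinary node's cost:

```python
# First-order radio model: E_tx = k*E_elec + k*eps*d^2, E_rx = k*E_elec.
# Textbook LEACH-style parameters, used purely as illustrative assumptions.
E_ELEC = 50e-9   # J/bit, transceiver electronics
EPS = 100e-12    # J/bit/m^2, amplifier coefficient

def e_tx(bits: int, d: float) -> float:
    return bits * E_ELEC + bits * EPS * d ** 2

def e_rx(bits: int) -> float:
    return bits * E_ELEC

members, packet, d_sink = 10, 4000, 120.0  # hypothetical cluster geometry
ch_cost = members * e_rx(packet) + e_tx(members * packet, d_sink)
node_cost = e_tx(packet, 30.0)             # member sends 30 m to its CH
print(f"CH spends {ch_cost / node_cost:.0f}x the energy of a member node")
```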
Publication Date
Sun Jun 30 2024
Journal Name
International Journal Of Intelligent Engineering And Systems
Eco-friendly and Secure Data Center to Detect Compromised Devices Utilizing a Swarm Approach

Modern civilization increasingly relies on sustainable and eco-friendly data centers as the core hubs of intelligent computing. However, these data centers, while vital, are also highly vulnerable to hacking because they serve as the convergence points of numerous network connection nodes. Recognizing and addressing this vulnerability, particularly within the confines of green data centers, is a pressing concern. This paper proposes a novel approach to mitigating this threat by leveraging swarm intelligence techniques to detect prospective and hidden compromised devices within the data center environment. The core objective is to ensure sustainable intelligent computing through a colony strategy. The research primarily focuses on the …
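The colony strategy is described only at a high level, so the sketch below is one plausible reading (an assumption, not the paper's algorithm): scout agents repeatedly sample device metrics and deposit "pheromone" on devices whose behaviour deviates from the fleet baseline, and the most-marked devices are flagged:

```python
import numpy as np

# Hypothetical per-device metric, e.g. outbound connections per minute.
rng = np.random.default_rng(3)
metrics = rng.normal(100, 10, size=50)
metrics[[7, 23]] += 80                 # plant two "compromised" devices

pheromone = np.zeros_like(metrics)
for _ in range(500):                   # scout iterations
    i = rng.integers(len(metrics))     # each scout samples one device
    z = abs(metrics[i] - metrics.mean()) / metrics.std()
    pheromone[i] += z                  # deposit grows with deviation
    pheromone *= 0.995                 # evaporation keeps scores current

print("Flagged devices:", np.argsort(pheromone)[-2:])
```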
