Inverting Gravity Data to Density and Velocity Models for Selected Area in Southwestern Iraq

The gravity method measures variations in the Earth's gravitational field caused by lateral variations in rock density. In the current research, a new technique is applied to the Bouguer map compiled from earlier gravity surveys (conducted during 1940–1950), over selected areas of the south-western desert of Iraq within the administrative boundaries of the Najaf and Anbar provinces. The work rests on the theory of gravity inversion, in which gravity values can be related to density-contrast variations with depth; gravity data inversion can therefore be used to calculate density and velocity models at four selected depth slices: 9.63 km, 1.1 km, 0.682 km, and 0.407 km. The depths were selected using power spectrum analysis of the gravity data. The gravity data are inverted from the gravitational anomalies at each depth slice, together with equivalent-depth data extracted from the available wells, using a density–velocity connection curve that proved largely compatible with the standard Nafe and Drake curve. The inverted images highlight the behavior of anomalies and structures in the density/velocity domain, which can be used in processing recorded seismic data and in time-to-depth conversion, alongside the available well information within the study area of south-western Iraq.
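The two computational steps named in this abstract, picking depth slices from the power spectrum and translating inverted densities into velocities, can be sketched in Python. This is a minimal illustration under stated assumptions, not the authors' code: the grid and wavenumber band are hypothetical, and Brocher's (2005) polynomial fit to the Nafe–Drake curve stands in for the paper's own density–velocity connection curve.

import numpy as np

def radial_power_spectrum(grav, dx_km, n_bins=40):
    """Radially averaged power spectrum of a gridded Bouguer anomaly map."""
    ny, nx = grav.shape
    power = np.abs(np.fft.fftshift(np.fft.fft2(grav))) ** 2
    fy = np.fft.fftshift(np.fft.fftfreq(ny, d=dx_km))   # cycles/km
    fx = np.fft.fftshift(np.fft.fftfreq(nx, d=dx_km))
    ky, kx = np.meshgrid(fy, fx, indexing="ij")
    k = 2 * np.pi * np.hypot(ky, kx)                    # radial wavenumber, rad/km
    bins = np.linspace(0.0, k.max(), n_bins)
    which = np.digitize(k.ravel(), bins)
    ks, ps = [], []
    for i in range(1, n_bins):
        sel = which == i
        if sel.any():
            ks.append(k.ravel()[sel].mean())
            ps.append(power.ravel()[sel].mean())
    return np.array(ks), np.log(np.array(ps))

def depth_from_segment(k, log_p, k_lo, k_hi):
    """Spector-Grant style estimate: a linear segment of ln P(k) with slope s
    corresponds to an ensemble source depth z = -s/2 (k in rad/km)."""
    m = (k >= k_lo) & (k <= k_hi)
    return -np.polyfit(k[m], log_p[m], 1)[0] / 2.0

def density_to_velocity(rho):
    """Nafe-Drake-type conversion using Brocher's (2005) polynomial fit;
    rho in g/cm^3 (roughly 2.0-3.5), Vp returned in km/s."""
    return (39.128 * rho - 63.064 * rho**2 + 37.083 * rho**3
            - 9.1819 * rho**4 + 0.8228 * rho**5)

Fitting successive linear segments of the spectrum, from low to high wavenumber, yields the stack of ensemble depths from which slices such as 9.63 km down to 0.407 km would be chosen.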

Publication Date
Tue Dec 01 2020
Journal Name
Al-khwarizmi Engineering Journal
Studying the Radial and Tangential Velocity Components of the Epithelization Healing Post Photorefractive Keratectomy Surgery of the Human Eye

Photorefractive keratectomy (PRK) is a refractive technique that begins with mechanical scraping of the corneal epithelial layer, followed by laser treatment. Within about 48 hours after this procedure, the removed epithelial layer regenerates to protect the eye again. The regeneration process (called re-epithelization) starts from the limbus of the cornea and proceeds toward its central part. The re-epithelization mechanism consists of a change in cell density (mitosis) and cell concentration (migration) with a velocity in two directions, radial and tangential. In the present study, an estimation is made for both the radial (responsible for the layers overlapping toward the outward direction of the cornea) and the tangential components …
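The radial/tangential split described here is a short vector computation: project each cell's migration velocity onto the local radial unit vector (from the corneal apex outward) and onto its perpendicular. A minimal sketch, with hypothetical position/velocity arrays and the apex taken as the origin:

import numpy as np

def decompose_velocity(x, y, vx, vy, cx=0.0, cy=0.0):
    """Split a 2-D epithelial velocity field into radial and tangential
    components about a centre (cx, cy), e.g. the corneal apex.
    Re-epithelization from the limbus toward the centre shows up as a
    negative (inward) radial component."""
    rx, ry = x - cx, y - cy
    r = np.hypot(rx, ry)
    r = np.where(r == 0, np.finfo(float).eps, r)  # avoid division at the apex
    ur_x, ur_y = rx / r, ry / r                   # unit radial vector
    v_rad = vx * ur_x + vy * ur_y                 # outward positive
    v_tan = vy * ur_x - vx * ur_y                 # counter-clockwise positive
    return v_rad, v_tan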

Publication Date
Wed Feb 20 2019
Journal Name
Iraqi Journal Of Physics
The measurements of neutron Fermi Age for selected Nuclear Reactor shielding materials using the Indium foil technique

The neutron Fermi age, t, and the neutron slowing-down density, q(r, t), have been measured for materials such as graphite and iron using a UCS-30 gamma spectrometry system with a NaI(Tl) detector. The technique was applied to graphite and iron using indium foils covered with cadmium, and the measurements were made at the 1.46 eV indium resonance. The materials were exposed to a plane 241Am/Be neutron source with a recent activity of 38 mCi. The Fermi age was found to be t = 297 ± 21 cm² for graphite and t = 400 ± 28 cm² for iron. The neutron slowing-down density was also calculated from the measured value of t and the source distance.
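Numerically, the Fermi age follows from the second spatial moment of the measured resonance activities. A minimal sketch assuming the plane-source relation of slowing-down theory, q(z, t) ∝ exp(-z²/4t), so that t = ⟨z²⟩/2; the foil distances and activities below are illustrative placeholders, not the paper's measurements.

import numpy as np

# Hypothetical cadmium-covered indium foil activities measured at the
# 1.46 eV resonance, at distances z (cm) from the plane Am/Be source.
z = np.array([4.0, 8.0, 12.0, 16.0, 20.0, 24.0, 28.0])       # cm
A = np.array([910.0, 640.0, 370.0, 175.0, 70.0, 24.0, 7.0])  # counts/s

# Plane source: q(z, t) ~ exp(-z^2 / 4t), hence <z^2> = 2t and t = <z^2>/2,
# with the moments taken over the measured activity distribution.
z2_mean = np.trapz(z**2 * A, z) / np.trapz(A, z)
tau = z2_mean / 2.0
print(f"Fermi age t = {tau:.0f} cm^2")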

Publication Date
Mon Jun 05 2023
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of Poisson Regression and Conway Maxwell Poisson Models Using Simulation

Regression models are among the most important models used in modern studies, especially research and health studies, because of the important results they achieve. Two regression models were used: the Poisson regression model and the Conway–Maxwell–Poisson model. This study aimed to compare the two models and choose the better of them by simulation, at different sample sizes (n = 25, 50, 100) and with r = 1000 replications; the Matlab program was used to conduct the simulation experiment. The results showed the superiority of the Poisson model under both the mean squared error (MSE) criterion and the Akaike information criterion (AIC) for the same distribution.
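The Poisson half of this comparison is straightforward to reproduce. The study used Matlab; the sketch below does the same bookkeeping in Python with statsmodels (the Conway–Maxwell–Poisson fit requires a specialized routine and is omitted, and the true coefficients and covariate design are hypothetical).

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
beta_true = np.array([0.5, 0.3])                     # hypothetical coefficients

def one_replication(n):
    X = sm.add_constant(rng.uniform(0, 1, n))        # intercept + one covariate
    y = rng.poisson(np.exp(X @ beta_true))           # Poisson counts, log link
    fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
    return np.mean((fit.params - beta_true) ** 2), fit.aic

for n in (25, 50, 100):                              # sample sizes from the study
    out = np.array([one_replication(n) for _ in range(1000)])   # r = 1000
    print(f"n={n:>3}: mean MSE = {out[:, 0].mean():.4f}, mean AIC = {out[:, 1].mean():.1f}")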

Publication Date
Mon May 11 2020
Journal Name
Baghdad Science Journal
Proposing Robust LAD-Atan Penalty of Regression Model Estimation for High Dimensional Data

The penalized regression model has received considerable attention for variable selection, since it plays an essential role in dealing with high-dimensional data. The arctangent (Atan) penalty has recently been used as an efficient method for both estimation and variable selection. However, the Atan penalty is very sensitive to outliers in the response variable or to heavy-tailed error distributions, whereas least absolute deviation (LAD) is a good way to obtain robustness in regression estimation. The specific objective of this research is to propose a robust Atan estimator that combines these two ideas. Simulation experiments and real-data applications show that the proposed LAD-Atan estimator …
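The combination can be written down directly: replace the squared-error loss with the absolute (LAD) loss and add the Atan penalty, commonly written p(β) = λ(γ + 2/π)·arctan(|β|/γ). The sketch below minimizes this objective numerically on synthetic heavy-tailed data; the authors' actual algorithm and tuning rules may well differ.

import numpy as np
from scipy.optimize import minimize

def atan_penalty(beta, lam, gam):
    # Atan penalty: lam * (gam + 2/pi) * arctan(|beta_j| / gam), summed over j
    return lam * (gam + 2 / np.pi) * np.sum(np.arctan(np.abs(beta) / gam))

def lad_atan(X, y, lam=0.1, gam=0.05):
    """Sketch of a LAD-Atan fit: robust L1 loss plus Atan penalty."""
    obj = lambda b: np.sum(np.abs(y - X @ b)) + atan_penalty(b, lam, gam)
    b0 = np.linalg.lstsq(X, y, rcond=None)[0]        # least-squares start
    return minimize(obj, b0, method="Nelder-Mead").x

# Toy data with heavy-tailed errors, the setting where LAD should pay off.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
beta_true = np.array([2.0, 0.0, 0.0, -1.5, 0.0])     # sparse truth
y = X @ beta_true + rng.standard_t(df=2, size=100)
print(np.round(lad_atan(X, y), 2))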

Publication Date
Sun Apr 30 2023
Journal Name
Iraqi Geological Journal
Evaluating Machine Learning Techniques for Carbonate Formation Permeability Prediction Using Well Log Data

Machine learning offers significant advantages for many problems in the oil and gas industry, especially in resolving complex challenges in reservoir characterization. Permeability is one of the most difficult petrophysical parameters to predict using conventional logging techniques. This study presents the workflow methodology alongside comprehensive models. Its purpose is to provide a more robust technique for predicting permeability; previous studies on the Bazirgan field have attempted this, but their estimates were vague and their methods are outdated and poorly suited to the permeability computation. …
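The abstract is cut off before naming the learners it evaluates, so the sketch below uses a random-forest regressor as one illustrative stand-in, mapping common log curves to core permeability; the file name, curve mnemonics, and target column are hypothetical.

import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

logs = pd.read_csv("bazirgan_logs.csv")              # assumed input file
features = ["GR", "RHOB", "NPHI", "DT", "RT"]        # common log mnemonics
X = logs[features].values
y = np.log10(logs["CORE_PERM_MD"].values)            # permeability spans decades

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print("held-out R^2:", r2_score(y_te, model.predict(X_te)))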

Publication Date
Sat Aug 01 2015
Journal Name
Journal Of Engineering
Analytical Approach for Load Capacity of Large Diameter Bored Piles Using Field Data

An analytical approach based on field data was used to determine the strength capacity of large-diameter bored piles. Deformations and settlements were also evaluated for both vertical and lateral loading. The analytical predictions were compared with field data obtained from a prototype test pile used at the Tharthar–Tigris canal bridge and were found to be in acceptable agreement, with a deviation of 12%.

Following ASTM standard D1143M-07e1 (2010), a test schedule of five loading cycles was proposed for vertical loads, together with a series of cyclic loads to simulate horizontal loading. The load test results and analytical data of the 1.95 …

Publication Date
Thu Jun 01 2023
Journal Name
Bulletin Of Electrical Engineering And Informatics
A missing data imputation method based on salp swarm algorithm for diabetes disease

Most medical datasets suffer from missing data, due to the expense of some tests or to human error while recording them. This issue degrades the performance of machine learning models because the values of some features will be missing. Therefore, a specific class of methods is needed for imputing these missing data. In this research, the salp swarm algorithm (SSA) is used for generating and imputing the missing values in the Pima Indian diabetes disease (PIDD) dataset; the proposed algorithm is called ISSA. The obtained results showed that the classification performance of three different classifiers, namely support vector machine (SVM), K-nearest neighbour (KNN), and Naïve Bayes, …
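A compact sketch of the idea, using the standard salp swarm update rules (a leader drawn toward the best solution with coefficient c1 = 2·exp(-(4l/L)²), followers averaging with their predecessor). Each salp encodes candidate values for the missing cells; the fitness callback is left open and in the paper's setting would wrap the cross-validated error of SVM, KNN, or Naïve Bayes. Any ISSA-specific modifications are not reproduced here.

import numpy as np

def ssa_impute(X_missing, fitness, n_salps=20, iters=100, lb=0.0, ub=1.0):
    """Each salp is one candidate vector for the NaN cells of X_missing;
    `fitness` scores a fully imputed matrix (lower is better)."""
    mask = np.isnan(X_missing)
    dim = int(mask.sum())
    rng = np.random.default_rng(0)
    pop = rng.uniform(lb, ub, (n_salps, dim))

    def score(v):
        X = X_missing.copy()
        X[mask] = v
        return fitness(X)

    fits = np.array([score(v) for v in pop])
    food, best = pop[fits.argmin()].copy(), fits.min()   # best solution so far

    for l in range(1, iters + 1):
        c1 = 2 * np.exp(-(4 * l / iters) ** 2)           # exploration schedule
        for i in range(n_salps):
            if i == 0:                                   # leader circles the food
                c2, c3 = rng.uniform(0, 1, dim), rng.uniform(0, 1, dim)
                step = c1 * ((ub - lb) * c2 + lb)
                pop[i] = np.where(c3 < 0.5, food + step, food - step)
            else:                                        # followers trail the salp ahead
                pop[i] = (pop[i] + pop[i - 1]) / 2
            pop[i] = np.clip(pop[i], lb, ub)
        fits = np.array([score(v) for v in pop])
        if fits.min() < best:
            food, best = pop[fits.argmin()].copy(), fits.min()

    out = X_missing.copy()
    out[mask] = food
    return out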

Publication Date
Sun Mar 01 2015
Journal Name
Journal Of Engineering
Multi-Sites Multi-Variables Forecasting Model for Hydrological Data using Genetic Algorithm Modeling

A two-time-step stochastic multi-variable, multi-site hydrological data forecasting model was developed and verified using a case study. The philosophy of this model is to use the cross-variable correlations, the cross-site correlations, and the two-step time-lag correlations simultaneously to estimate the parameters of the model, which are then modified using the mutation process of a genetic algorithm optimization model. The objective function to be minimized is the Akaike information criterion (AIC) value. The case study covers four variables and three sites. The variables are the monthly air temperature, humidity, precipitation, and evaporation; the sites are Sulaimania, Chwarta, and Penjwin, located in northern Iraq. The model performance was …
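A simplified sketch of the structure described here: one two-lag linear relation across all site-variable series, fitted by least squares and then refined by a mutation-only search that keeps coefficient perturbations which lower the AIC. This stands in for the full genetic algorithm; the data matrix is hypothetical (12 columns for 4 variables at 3 sites).

import numpy as np

def fit_two_lag(Y):
    """Least-squares fit of Y_t = [Y_{t-1}, Y_{t-2}] B + e for a (T x m)
    matrix holding all site-variable monthly series."""
    X = np.hstack([Y[1:-1], Y[:-2]])     # lag-1 and lag-2 predictors
    tgt = Y[2:]
    B, *_ = np.linalg.lstsq(X, tgt, rcond=None)
    return B, X, tgt

def aic(B, X, tgt):
    # Gaussian-error AIC: n*ln(RSS/n) + 2k, k = number of coefficients
    resid = tgt - X @ B
    n, k = resid.size, B.size
    return n * np.log(np.sum(resid**2) / n) + 2 * k

def mutate_search(Y, iters=2000, scale=0.01, seed=0):
    """Mutation step of the GA: perturb one coefficient at random and keep
    the change whenever the AIC improves."""
    rng = np.random.default_rng(seed)
    B, X, tgt = fit_two_lag(Y)
    best = aic(B, X, tgt)
    for _ in range(iters):
        cand = B.copy()
        i, j = rng.integers(B.shape[0]), rng.integers(B.shape[1])
        cand[i, j] += rng.normal(0.0, scale)
        a = aic(cand, X, tgt)
        if a < best:
            B, best = cand, a
    return B, best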

Publication Date
Sat Dec 30 2023
Journal Name
Journal Of Economics And Administrative Sciences
The Cluster Analysis by Using Nonparametric Cubic B-Spline Modeling for Longitudinal Data

Longitudinal data are becoming increasingly common, especially in the medical and economic fields, and various methods have been developed to analyze this type of data.

In this research, the focus was on compiling and analyzing such data. Cluster analysis plays an important role in identifying and grouping co-expressed profiles over time, which are then fitted with the nonparametric smoothing cubic B-spline model. Cubic B-splines provide continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope; the model is also more flexible and can capture more complex patterns and fluctuations in the data.

The balanced longitudinal data profiles were compiled into subgroups …
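A minimal sketch of the pipeline: fit a cubic (k = 3) smoothing B-spline to each subject's balanced profile with scipy, then cluster the smoothed curves. The two-group synthetic data, the smoothing level, and the use of k-means are illustrative assumptions, not the paper's exact settings.

import numpy as np
from scipy.interpolate import splrep, splev
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 12)                       # common observation times
group = rng.integers(0, 2, 60)                  # latent membership, 60 subjects
means = np.vstack([np.sin(2 * np.pi * t), 1.5 * t**2])
Y = means[group] + rng.normal(0, 0.15, (60, t.size))   # balanced profiles

# Cubic smoothing splines give continuous first and second derivatives,
# the property the abstract highlights; s controls the smoothing level.
fitted = np.vstack([splev(t, splrep(t, y, k=3, s=0.2)) for y in Y])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(fitted)
agree = max((labels == group).mean(), (labels != group).mean())  # label-swap safe
print(f"cluster/group agreement: {agree:.2f}")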
