Interpretation of Mud Losses in Carbonates Based on Cuttings Description, Well-Logging, Seismic and Coherency Data

    The Hartha Formation is an overburden horizon in the X oilfield that generates substantial Non-Productive Time (NPT) associated with drilling mud losses. This study investigates the loss events in this formation and provides geological interpretations based on datasets from nine wells in the field of interest. The interpretation drew on several analyses, including wireline logs, cuttings descriptions, image logs, and analog data. Seismic and coherency data were also used to formulate the geological interpretations and calibrate them against the loss events of the Hartha Fm.

   The results revealed that the upper part of the Hartha Fm. is an interval capable of producing severe mud losses and hence high NPT, owing to diagenetic features such as sucrosic dolomites and vuggy zones that can act as thief zones. Seismic data were used to predict the geology-related non-productive drilling time in the Hartha interval; the seismic data quality over this interval was good, and geological observations had already been made. A detailed interpretation and analysis of the Hartha interval was performed and integrated with the existing seismic interpretation, rock properties, and NPT database to calibrate the wells with loss events against the seismic observations.

 

Publication Date
Tue Mar 03 2009
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of Repetitive Estimation Methods for Self-Correlated Data

In this study, we briefly review the ARIMA(p, d, q), EWMA, and DLM (dynamic linear modelling) procedures in order to accommodate the autocorrelation (AC) structure of the data. We consider recursive estimation and prediction algorithms based on Bayes and Kalman filtering (KF) techniques for correlated observations. We investigate the effect of these procedures on the MSE and compare them using generated data.
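As an illustration of the recursive Kalman-filter updating the abstract describes, the sketch below filters a one-dimensional local-level model. The noise variances and the short synthetic series are illustrative assumptions, not values from the paper.

```python
# Recursive (Kalman-filter) estimation for a local-level model: the state
# is a random walk observed with noise. q and r are assumed variances.

def kalman_filter(ys, q=0.1, r=1.0, m0=0.0, p0=10.0):
    """Return filtered state means and one-step-ahead predictions."""
    m, p = m0, p0
    means, preds = [], []
    for y in ys:
        # Predict: for a random-walk state the mean carries over.
        p_pred = p + q
        preds.append(m)
        # Update with the new observation.
        k = p_pred / (p_pred + r)   # Kalman gain
        m = m + k * (y - m)
        p = (1.0 - k) * p_pred
        means.append(m)
    return means, preds

ys = [1.0, 1.2, 0.9, 1.1, 1.3]
means, preds = kalman_filter(ys)
```

With a large initial variance p0, the first update leans heavily on the data, which is the usual diffuse-prior choice in this setting.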

Publication Date
Sun Mar 17 2019
Journal Name
Baghdad Science Journal
A Study on the Accuracy of Prediction in Recommendation System Based on Similarity Measures

Recommender systems are tools for making sense of the huge amount of data available on the internet. Collaborative filtering (CF) is one of the knowledge-discovery methods used most successfully in recommendation systems. Memory-based collaborative filtering focuses on using facts about existing users to predict new items for the target user. Similarity measures are the core operations in collaborative filtering, and prediction accuracy depends largely on the similarity calculations. In this study, a combination of weighted parameters and traditional similarity measures is used to calculate relationships among users over the MovieLens rating matrix, and the advantages and disadvantages of each measure are identified. …
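As a minimal sketch of the user-based collaborative filtering the abstract describes: cosine similarity between users, then a similarity-weighted rating prediction. The tiny rating matrix is illustrative, not the MovieLens data used in the study.

```python
import math

def cosine(u, v):
    """Cosine similarity over the items two users have both rated."""
    common = [i for i in u if i in v]
    if not common:
        return 0.0
    num = sum(u[i] * v[i] for i in common)
    den = (math.sqrt(sum(u[i] ** 2 for i in common))
           * math.sqrt(sum(v[i] ** 2 for i in common)))
    return num / den if den else 0.0

def predict(target, others, item):
    """Similarity-weighted average of other users' ratings for `item`."""
    num = den = 0.0
    for other in others:
        if item in other:
            s = cosine(target, other)
            num += s * other[item]
            den += abs(s)
    return num / den if den else None

ratings = [
    {"m1": 5, "m2": 3, "m3": 4},
    {"m1": 4, "m2": 3, "m3": 5},
]
target = {"m1": 5, "m2": 4}
print(predict(target, ratings, "m3"))
```

The weighted-parameter variants studied in the paper would replace the plain cosine here with an adjusted similarity score.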

Publication Date
Wed Mar 10 2021
Journal Name
Baghdad Science Journal
A Comparison Between the Theoretical Cross Section Based on the Partial Level Density Formulae Calculated by the Exciton Model with the Experimental Data for the ¹⁹⁷Au Nucleus

In this paper, the theoretical cross section of a pre-equilibrium nuclear reaction is studied for the reaction at an energy of 22.4 MeV. Ericson's formula for the partial level density (PLD) and its corrections (William's correction and the spin correction) have been substituted into the theoretical cross section and compared with the experimental data for the nucleus. It has been found that the theoretical cross section with the one-component PLD from Ericson's formula does not agree with the experimental value; there is some agreement with the experimental cross section only at the high end of the energy range. The theoretical cross section that depends on the one-component William's formula and the one-component spin-corrected …
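For reference, Ericson's one-component partial level density, the quantity substituted into the pre-equilibrium cross section above, can be evaluated directly. The single-particle level density g below is an illustrative value, not the one fitted for ¹⁹⁷Au in the paper.

```python
from math import factorial

def ericson_pld(p, h, E, g):
    """Ericson's one-component partial level density:
    rho(p, h, E) = g * (g*E)**(n-1) / (p! * h! * (n-1)!),  n = p + h,
    for p particles and h holes at excitation energy E (MeV)."""
    n = p + h
    return g * (g * E) ** (n - 1) / (factorial(p) * factorial(h) * factorial(n - 1))

# Example: 2-particle 1-hole states at E = 22.4 MeV with an assumed
# single-particle level density g = 13 MeV^-1.
print(ericson_pld(2, 1, 22.4, 13.0))
```

William's correction subtracts a Pauli-blocking term from g*E inside the power; the sketch above is the uncorrected baseline only.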

Publication Date
Thu Jan 30 2020
Journal Name
Journal Of Engineering
Design and Analysis of a WiMAX Network Based on Coverage Planning

In this paper, a wireless network is planned; the network is based on the IEEE 802.16e standard (WiMAX). The targets of this paper are maximized coverage, good service, and low operational cost. The WiMAX network is planned through three approaches. In the first, network coverage is maximized by extending cell coverage and selecting the best sites (with a bandwidth (BW) of 5 MHz, 20 MHz per sector, and four sectors per cell). In the second, interference is analyzed in CNIR mode. In the third, Quality of Service (QoS) is tested and evaluated. The ATDI ICS (Interference Cancellation System) software was used to perform the planning; the results show that the planned area covers 90.49% of Baghdad City and used 1000 mobile …
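As a rough coverage-planning sketch in the spirit of the first approach: solve the free-space path-loss model for the maximum cell radius allowed by a link budget. The transmit power, receiver sensitivity, and 2.5 GHz carrier are illustrative assumptions; the ATDI ICS planning in the paper uses far more detailed propagation models.

```python
import math

def fspl_db(d_km, f_mhz):
    """Free-space path loss in dB at distance d_km, frequency f_mhz."""
    return 20 * math.log10(d_km) + 20 * math.log10(f_mhz) + 32.44

def max_radius_km(tx_dbm, rx_sens_dbm, f_mhz):
    """Largest distance at which received power still meets sensitivity."""
    budget = tx_dbm - rx_sens_dbm            # allowable path loss in dB
    return 10 ** ((budget - 32.44 - 20 * math.log10(f_mhz)) / 20)

# Assumed 43 dBm EIRP, -90 dBm sensitivity, 2500 MHz carrier.
print(round(max_radius_km(43, -90, 2500), 2))  # cell radius in km
```

Free space gives an optimistic upper bound; urban models such as COST-231 would shrink this radius considerably, which is why site counts in real plans are much higher.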

Publication Date
Mon Feb 04 2019
Journal Name
Journal Of The College Of Education For Women
Image Watermarking based on Huffman Coding and Laplace Sharpening

In this paper, an algorithm is introduced through which we can embed more data than regular spatial-domain methods. The secret data is compressed using Huffman coding, and the compressed data is then embedded using a Laplacian-sharpening method. Laplace filters are used to determine effective hiding places; based on a threshold value, the places with the highest filter responses are selected for embedding the watermark. The aim of this work is to increase the capacity of the embedded information by using Huffman coding while at the same time increasing the security of the algorithm by hiding the data in the places with the highest edge values, where it is less noticeable. …

Publication Date
Tue Oct 18 2022
Journal Name
Ieee Access
Plain, Edge, and Texture Detection Based on Orthogonal Moment

Image pattern classification is a significant step in image and video processing. Although various image-pattern algorithms proposed to date achieve adequate classification, achieving higher accuracy while reducing the computation time remains challenging. A robust image-pattern classification method is essential to obtain the desired accuracy: one that can accurately classify image blocks into plain, edge, and texture (PET) classes using an efficient feature-extraction mechanism. Moreover, most existing studies evaluate their methods on specific orthogonal moments, which limits the understanding of their potential application to various Discrete Orthogonal Moments (DOMs). …
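As a simplified stand-in for the plain/edge/texture labelling described above: instead of orthogonal moments, the sketch uses block variance and gradient-direction dominance, since an edge block concentrates its gradient energy in one direction while texture spreads it. The two thresholds are illustrative assumptions, not the paper's.

```python
def classify_block(block, var_thresh=50.0, edge_thresh=0.7):
    """Label a grayscale block 'plain', 'edge', or 'texture'."""
    n = len(block) * len(block[0])
    mean = sum(sum(row) for row in block) / n
    var = sum((v - mean) ** 2 for row in block for v in row) / n
    if var < var_thresh:
        return "plain"              # low contrast: uniform region
    # Compare horizontal vs vertical gradient energy.
    gx = sum(abs(row[i + 1] - row[i])
             for row in block for i in range(len(row) - 1))
    gy = sum(abs(block[j + 1][i] - block[j][i])
             for j in range(len(block) - 1) for i in range(len(block[0])))
    dominance = max(gx, gy) / (gx + gy) if gx + gy else 0.0
    return "edge" if dominance >= edge_thresh else "texture"

flat = [[10] * 4 for _ in range(4)]
edge = [[0, 0, 255, 255] for _ in range(4)]  # one vertical boundary
print(classify_block(flat), classify_block(edge))
```

The paper's approach replaces these hand-picked statistics with DOM coefficients as the feature vector, which is what makes it generalize across moment families.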

Publication Date
Sun Jun 01 2014
Journal Name
Baghdad Science Journal
Multifocus Image Fusion Based on Homogeneity and Edge Measures

Image fusion is one of the most important techniques in digital image processing. It involves developing software to integrate multiple sets of data for the same location, and it is one of the newer approaches adopted to solve digital-image problems, producing high-quality images that contain more information for interpretation, classification, segmentation, compression, and so on. In this research, problems faced by different digital images, such as multifocus images, are addressed through a simulation process using a camera to fuse various digital images based on previously adopted fusion techniques such as arithmetic techniques (BT, CNT, and MLT) and statistical techniques (LMM, …
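A minimal multifocus-fusion sketch in the spirit of the techniques the abstract lists: for each pixel, take the value from whichever source image has the higher local variance (i.e. is more in focus) in a 3x3 window. This choose-max rule is a generic illustration, not one of the paper's named methods, and real fusion pipelines add consistency checks.

```python
def local_variance(img, x, y):
    """Variance of the 3x3 neighborhood around (x, y), clamped at borders."""
    vals = [img[j][i]
            for j in range(max(0, y - 1), min(len(img), y + 2))
            for i in range(max(0, x - 1), min(len(img[0]), x + 2))]
    m = sum(vals) / len(vals)
    return sum((v - m) ** 2 for v in vals) / len(vals)

def fuse(a, b):
    """Per-pixel choose-max fusion of two same-sized grayscale images."""
    return [[a[y][x] if local_variance(a, x, y) >= local_variance(b, x, y)
             else b[y][x]
             for x in range(len(a[0]))]
            for y in range(len(a))]

sharp = [[0, 255, 0], [255, 0, 255], [0, 255, 0]]       # high local contrast
blurry = [[120, 130, 120], [130, 120, 130], [120, 130, 120]]
print(fuse(sharp, blurry) == sharp)
```

Arithmetic techniques such as simple averaging would instead blend the two sources, trading sharpness for noise suppression.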

Publication Date
Fri Jan 01 2021
Journal Name
International Journal Of Agricultural And Statistical Sciences
A Novel SVR Estimation of the FIGARCH Model and Forecasting for White Oil Data in Iraq

The purpose of this paper is to model and forecast white oil over the period 2012-2019 using GARCH-class volatility models. After showing that the squared returns of white oil have significant long memory in the volatility, fractional GARCH models are estimated on the return series, and the mean and volatility are forecast by quasi-maximum likelihood (QML) as the traditional method, while the competing approach is a machine-learning method, Support Vector Regression (SVR). The most appropriate model among the candidates for forecasting the volatility was selected by the lowest Akaike and Schwarz information criteria, with the additional requirement that the parameters be significant. In addition, the residuals …
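As a simplified stand-in for the models above, the sketch below runs the plain GARCH(1,1) conditional-variance recursion and a one-step forecast (FIGARCH adds long memory via fractional differencing, which this omits). The parameter values and return series are illustrative, not the white-oil estimates.

```python
def garch11_variances(returns, omega, alpha, beta):
    """Conditional variances h_t = omega + alpha*r_{t-1}^2 + beta*h_{t-1},
    started from the unconditional variance omega / (1 - alpha - beta)."""
    h = [omega / (1.0 - alpha - beta)]
    for r in returns[:-1]:
        h.append(omega + alpha * r ** 2 + beta * h[-1])
    return h

def forecast_next(returns, omega, alpha, beta):
    """One-step-ahead conditional variance forecast."""
    h = garch11_variances(returns, omega, alpha, beta)
    return omega + alpha * returns[-1] ** 2 + beta * h[-1]

rets = [0.01, -0.02, 0.015, -0.01]
print(forecast_next(rets, omega=1e-5, alpha=0.1, beta=0.85))
```

In the SVR alternative the paper compares against, a kernel regression is trained to map lagged squared returns to realized volatility instead of imposing this parametric recursion.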

Publication Date
Wed Mar 01 2023
Journal Name
International Journal Of Research In Social Sciences And Humanities
The Role of Value Chain Analysis as well as Programs and Performance Budget in Reducing Waste of Public Money (Applied Study)

A programs-and-performance budget represents a sophisticated approach to public budgeting in which all allocations are determined for each job or activity within a government entity and analyzed according to its needs and costs. This method can be applied using one of the cost-accounting techniques, value-chain analysis, which reduces costs by avoiding activities that do not add value and enhancing activities that add value to the economic entity. The current research aims to develop the budget system in a government entity by using the programs-and-performance budget as a tool for planning and monitoring events and activities, thereby reducing the waste of public money by reducing unnecessary …
