IMPLEMENTATION OF THE SKIP LIST DATA STRUCTURE WITH ITS UPDATE OPERATIONS

A skip list data structure is essentially a probabilistic simulation of a binary search tree. Skip list algorithms are simpler, faster and use less space. Conceptually, this data structure uses parallel sorted linked lists. Searching in a skip list is more involved than searching in a regular sorted linked list. Because a skip list is a two-dimensional data structure, it is implemented using a two-dimensional network of nodes with four pointers. The search, insert and delete operations each take up to O(log n) expected time. The skip list can be modified to implement the order-statistic operations RANK and SEARCH BY RANK while maintaining the same expected time. Keywords: skip list, parallel linked list, randomized algorithm, rank.
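The search and insert operations summarized above are easy to see in code. The following is a minimal illustrative sketch in Python, not the paper's implementation (which uses four pointers per node); this version keeps only forward pointers and a randomized level generator:

```python
import random

class Node:
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * (level + 1)  # one forward pointer per level

class SkipList:
    MAX_LEVEL = 16
    P = 0.5  # coin-flip probability for promoting a node one level up

    def __init__(self):
        self.header = Node(None, self.MAX_LEVEL)
        self.level = 0

    def _random_level(self):
        # Flip coins: each success raises the new node's level by one.
        lvl = 0
        while random.random() < self.P and lvl < self.MAX_LEVEL:
            lvl += 1
        return lvl

    def insert(self, key):
        # Record, per level, the last node preceding the insertion point.
        update = [self.header] * (self.MAX_LEVEL + 1)
        x = self.header
        for i in range(self.level, -1, -1):
            while x.forward[i] is not None and x.forward[i].key < key:
                x = x.forward[i]
            update[i] = x
        lvl = self._random_level()
        if lvl > self.level:
            self.level = lvl
        node = Node(key, lvl)
        for i in range(lvl + 1):  # splice the node in at every level it spans
            node.forward[i] = update[i].forward[i]
            update[i].forward[i] = node

    def search(self, key):
        # Start at the top level and drop down each time we would overshoot.
        x = self.header
        for i in range(self.level, -1, -1):
            while x.forward[i] is not None and x.forward[i].key < key:
                x = x.forward[i]
        x = x.forward[0]
        return x is not None and x.key == key
```

Because levels are assigned by coin flips, the expected number of comparisons per search is logarithmic in the number of stored keys, matching the expected time quoted in the abstract.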

Publication Date
Fri Jul 21 2023
Journal Name
Journal Of Engineering
A Modified 2D-Checksum Error Detecting Method for Data Transmission in Noisy Media

In data transmission, a change in a single bit of the received data may lead to a misunderstanding or a disaster. Each bit of the transmitted information has high priority, especially information such as the address of the receiver. Detecting every single-bit change is therefore a key issue in the data transmission field.
The ordinary single-parity detection method can detect an odd number of errors efficiently, but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, showed better results but still failed to cope with an increasing number of errors.
Two novel methods were suggested to detect binary bit-change errors when transmitting data in a noisy medium. Those methods were: 2D-Checksum me
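As background to the methods named above, the classic two-dimensional parity idea can be sketched in a few lines. This is a generic textbook example for illustration, not the paper's proposed 2D-Checksum method:

```python
def parity_bits_2d(block):
    """Row and column parities for a 2D block of bits.

    The sender transmits the block plus both parity vectors; the receiver
    recomputes them and compares. A single flipped bit shows up as exactly
    one mismatched row parity and one mismatched column parity, which also
    locates the error.
    """
    rows = [sum(r) % 2 for r in block]
    cols = [sum(c) % 2 for c in zip(*block)]  # zip(*block) transposes
    return rows, cols

def detect_error(block, sent_rows, sent_cols):
    """Return the indices of rows and columns whose parity no longer matches."""
    rows, cols = parity_bits_2d(block)
    bad_rows = [i for i, (a, b) in enumerate(zip(sent_rows, rows)) if a != b]
    bad_cols = [j for j, (a, b) in enumerate(zip(sent_cols, cols)) if a != b]
    return bad_rows, bad_cols
```

Note the failure mode the abstract alludes to: four flipped bits arranged in a rectangle cancel out in both row and column parities, which is why stronger schemes are needed as error counts grow.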

Publication Date
Mon May 15 2017
Journal Name
Journal Of Theoretical And Applied Information Technology
Anomaly detection in text data that represented as a graph using dbscan algorithm

Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all of the data into the graph concept frame (CFG). As is well known, DBSCAN groups data points of the same kind into clusters, while points that fall outside the behavior of any cluster are treated as noise or anomalies. The DBSCAN algorithm can thus detect abnormal points that lie farther than a set threshold from any cluster (extreme values). However, anomalies are not only those cases that are unusual or far from a specific group; there is also data that does not occur repeatedly but is still considered abnormal with respect to the known group. The analysis showed DBSCAN using the
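The labeling behaviour described above, clustered points versus noise points treated as anomalies, can be illustrated with a compact generic DBSCAN. This is a plain textbook version for intuition, not the paper's graph-based variant:

```python
def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: label -1 marks noise, i.e. the anomaly candidates."""
    def neighbors(i):
        # All points within distance eps of point i (including i itself).
        return [j for j, q in enumerate(points)
                if sum((a - b) ** 2 for a, b in zip(points[i], q)) <= eps ** 2]

    labels = [None] * len(points)
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:
            labels[i] = -1  # provisional noise; may later join as border point
            continue
        cluster += 1          # i is a core point: start a new cluster
        labels[i] = cluster
        seeds = list(nbrs)
        while seeds:          # grow the cluster from its core points
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster  # reclaim a border point marked as noise
            if labels[j] is not None:
                continue
            labels[j] = cluster
            jn = neighbors(j)
            if len(jn) >= min_pts:
                seeds.extend(jn)     # j is also core: expand through it
    return labels
```

Anything still labeled -1 after the pass sits outside every dense region, which is exactly the "far from any cluster" notion of anomaly in the abstract.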

Publication Date
Mon Aug 01 2022
Journal Name
Baghdad Science Journal
A Novel Technique for Secure Data Cryptosystem Based on Chaotic Key Image Generation

The advancements in Information and Communication Technology (ICT) within the previous decades have significantly changed the way people transmit or store their information over the Internet or networks. One of the main challenges, therefore, is to keep this information safe against attacks. Many researchers and institutions have realized the importance and benefits of cryptography in achieving efficient and effective secure communication. This work adopts a novel technique for a secure data cryptosystem based on chaos theory. The proposed algorithm generates a 2-dimensional key matrix having the same dimensions as the original image, filled with random numbers obtained from the 1-dimensional logistic chaotic map for given con
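The key-generation idea described above can be illustrated with the standard logistic map. This is a generic sketch, not the paper's exact algorithm; the seed and control parameter shown are arbitrary placeholders for the secret key:

```python
def logistic_key_matrix(rows, cols, x0=0.61, r=3.99):
    """Fill a rows x cols key matrix with bytes from the 1-D logistic map.

    The map is x -> r*x*(1-x); with r close to 4 it is in its chaotic
    regime, so the sequence is highly sensitive to the seed x0. The seed
    and r together act as the secret key. Each state is scaled to 0..255
    so it can be XOR-ed with an image pixel.
    """
    x = x0
    matrix = []
    for _ in range(rows):
        row = []
        for _ in range(cols):
            x = r * x * (1 - x)             # iterate the chaotic map
            row.append(int(x * 256) % 256)  # quantize state to a byte
        matrix.append(row)
    return matrix

def xor_image(image, key):
    """XOR each pixel with the matching key byte; applying it twice decrypts."""
    return [[p ^ k for p, k in zip(irow, krow)]
            for irow, krow in zip(image, key)]
```

Because XOR is its own inverse, the same key matrix both encrypts and decrypts, which is the usual design choice for stream-cipher-style image schemes.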

Publication Date
Sun Mar 01 2015
Journal Name
Journal Of Engineering
Multi-Sites Multi-Variables Forecasting Model for Hydrological Data using Genetic Algorithm Modeling

A two-time-step stochastic multi-variable multi-site hydrological data forecasting model was developed and verified using a case study. The philosophy of this model is to use the cross-variable correlations, cross-site correlations and two-step time-lag correlations simultaneously to estimate the parameters of the model, which are then modified using the mutation process of a genetic algorithm optimization model. The objective function to be minimized is the Akaike test value. The case study involves four variables and three sites. The variables are the monthly air temperature, humidity, precipitation, and evaporation; the sites are Sulaimania, Chwarta, and Penjwin, located in northern Iraq. The model performance was
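For reference, the Akaike criterion minimized above has, for a least-squares model, the common form AIC = n·ln(RSS/n) + 2k, where RSS is the residual sum of squares and k the number of fitted parameters. A small generic sketch, not the paper's implementation:

```python
import math

def aic_least_squares(residuals, n_params):
    """Akaike Information Criterion for a least-squares fit.

    AIC = n * ln(RSS / n) + 2 * k. Lower values are better: the first
    term rewards goodness of fit, the 2k term penalizes extra parameters,
    which is what makes AIC a sensible objective for a GA to minimize.
    """
    n = len(residuals)
    rss = sum(e * e for e in residuals)
    return n * math.log(rss / n) + 2 * n_params
```

A genetic algorithm would evaluate this value for each candidate parameter set and keep mutating toward lower scores.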

Publication Date
Sun Dec 01 2019
Journal Name
Journal Of Economics And Administrative Sciences
Contemporary Challenges for Cloud Computing Data Governance in Information Centers: An analytical study

Purpose – Cloud computing (CC) and its services have enabled the information centers of organizations to adapt their informatic and technological infrastructure, making it more suitable for developing flexible information systems that respond to the informational and knowledge needs of their users. In this context, cloud-data governance has become more complex and dynamic, requiring an in-depth understanding of the data management strategy at these centers in terms of: organizational structure and regulations, people, technology, processes, and roles and responsibilities. Therefore, our paper discusses these dimensions as challenges facing information centers with respect to their data governance and the impa

Publication Date
Sun Apr 30 2023
Journal Name
Iraqi Geological Journal
Evaluating Machine Learning Techniques for Carbonate Formation Permeability Prediction Using Well Log Data

Machine learning offers a significant advantage for many difficulties in the oil and gas industry, especially when it comes to resolving complex challenges in reservoir characterization. Permeability is one of the most difficult petrophysical parameters to predict using conventional logging techniques. Clarifications of the workflow methodology are presented alongside comprehensive models in this study. The purpose of this study is to provide a more robust technique for predicting permeability; previous studies on the Bazirgan field have attempted to do so, but their estimates have been vague, and the methods they used are obsolete and make no concession to real or rigid conditions when solving the permeability computation. To

Publication Date
Wed Nov 30 2022
Journal Name
Iraqi Geological Journal
A Predictive Model for Estimating Unconfined Compressive Strength from Petrophysical Properties in the Buzurgan Oilfield, Khasib Formation, Using Log Data

Unconfined compressive strength (UCS) of rock is the most critical geomechanical property, widely used as an input parameter for designing fractures, analyzing wellbore stability, programming drilling and carrying out various petroleum engineering projects. UCS characterizes rock deformation by measuring the rock's strength and load-bearing capacity. Determining UCS in the laboratory is a time-consuming and costly process. The current study aims to develop empirical equations to predict UCS using regression analysis in JMP software for the Khasib Formation in the Buzurgan oil fields, southeastern Iraq, using well-log data. The proposed equations' accuracy was tested using the coefficient of determination (R²), the average absolute
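The coefficient of determination used above to test the proposed equations can be computed directly from observed and predicted values. A generic sketch (the variable names are illustrative, not from the paper):

```python
def r_squared(observed, predicted):
    """Coefficient of determination R^2 = 1 - SS_res / SS_tot.

    R^2 compares the model's residual error (SS_res) against the error of
    simply predicting the mean (SS_tot); 1.0 means a perfect fit.
    """
    mean = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean) ** 2 for o in observed)
    return 1 - ss_res / ss_tot
```

For a UCS correlation, `observed` would hold laboratory UCS measurements and `predicted` the values produced by the empirical equation from the well logs.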

Publication Date
Mon Mar 18 2019
Journal Name
Al-khwarizmi Engineering Journal
Best Level of Parameters for a Critical Buckling Load for Circular Thin-Walled Structure Subjected to Bending

Circular thin-walled structures have a wide range of applications. This type of structure is generally exposed to different types of loads, one of the most important of which is buckling. In this work, the phenomenon of buckling was studied using finite element analysis. The circular thin-walled structure in this study consists of a cylindrical thin shell strengthened by longitudinal stringers, subjected to pure bending in one plane. In addition, the Taguchi method was used to identify the optimum combination of parameters for enhancing the critical buckling load value, as well as to investigate the most effective parameter. The parameters analyzed were: cylinder shell thickness, shape of the stiffener section an

Publication Date
Sun Sep 08 2019
Journal Name
Applied Organometallic Chemistry
Phosphorus‐based Schiff bases and their complexes as nontoxic antioxidants: Structure–activity relationship and mechanism of action

A phosphorus‐based Schiff base was synthesized by treating bis{3‐[2‐(4‐amino‐1,5‐dimethyl‐2‐phenyl‐pyrazol‐3‐ylideneamino)ethyl]‐indol‐1‐ylmethyl}‐phosphinic acid with paraformaldehyde and characterized as a novel antioxidant. Its corresponding complexes [(VO)2L(SO4)2], [Ni2LCl4], [Co2LCl4], [Cu2LCl4], [Zn2LCl4], [Cd2LCl4], [Hg2LCl4], [Pd2LCl4], and [PtL

Publication Date
Sat Dec 01 2018
Journal Name
Indian Journal Of Natural Sciences
Geology and Structure Analysis of Shaqlawa – Merawa Area, Northern Iraq Using Remote Sensing, GIS and Field Observations

This study shows that GIS techniques and remote sensing data, matched with field observations, can identify structural features such as fault segments in urban areas such as the Merawa and Shaqlawa cities. Different types of data, such as fault systems, drainage patterns (previously mapped), lineaments, and lithological contacts with a spatial resolution of 30 m, were combined through integration and an index-overlay modeling technique to produce the susceptibility map of fault segments in the study area. A GIS spatial-overlay technique was used to determine the spatial relationships of all the criteria (factors) and subcriteria (classes) within layers (maps) to classify and map the potential ar
