IMPLEMENTATION OF THE SKIP LIST DATA STRUCTURE WITH ITS UPDATE OPERATIONS

A skip list is a probabilistic data structure that simulates a balanced binary search tree. Skip list algorithms are simpler, faster, and use less space. Conceptually, the structure consists of several parallel sorted linked lists. Searching a skip list is faster than searching a regular sorted linked list. Because a skip list is a two-dimensional data structure, it is implemented as a two-dimensional network of nodes, each with four pointers. The search, insert, and delete operations each run in O(log n) expected time. The skip list can also be modified to implement the order-statistic operations RANK and SEARCH BY RANK while maintaining the same expected time. Keywords: skip list, parallel linked list, randomized algorithm, rank.
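The paper implements a two-dimensional network of nodes with four pointers; the sketch below is a minimal, conventional one-forward-pointer-per-level skip list in Python, offered only as a hedged illustration of the idea (the promotion probability P = 0.5 and all names are illustrative, not the paper's implementation):

    import random

    class Node:
        def __init__(self, key, level):
            self.key = key
            self.forward = [None] * (level + 1)  # one forward pointer per level

    class SkipList:
        MAX_LEVEL = 16
        P = 0.5  # probability of promoting a node one level up

        def __init__(self):
            self.level = 0
            self.head = Node(None, self.MAX_LEVEL)  # sentinel head

        def _random_level(self):
            # coin flips decide how tall the new node's tower is
            lvl = 0
            while random.random() < self.P and lvl < self.MAX_LEVEL:
                lvl += 1
            return lvl

        def search(self, key):
            node = self.head
            # descend level by level, moving right while the next key is smaller
            for i in range(self.level, -1, -1):
                while node.forward[i] and node.forward[i].key < key:
                    node = node.forward[i]
            node = node.forward[0]
            return node is not None and node.key == key

        def insert(self, key):
            # remember the rightmost node visited on each level
            update = [self.head] * (self.MAX_LEVEL + 1)
            node = self.head
            for i in range(self.level, -1, -1):
                while node.forward[i] and node.forward[i].key < key:
                    node = node.forward[i]
                update[i] = node
            lvl = self._random_level()
            self.level = max(self.level, lvl)
            new = Node(key, lvl)
            for i in range(lvl + 1):
                new.forward[i] = update[i].forward[i]
                update[i].forward[i] = new

Delete follows the same pattern as insert: collect the update array, then splice the target node out of every level it appears in. Each operation visits O(log n) nodes in expectation.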

Publication Date
Thu May 18 2023
Journal Name
Journal Of Engineering
Implementation of Digital Image Processing in Calculating Normal Approach for Spherical Indenter Considering Elastic/Plastic Contact

In this work, the normal approach between two bodies, a sphere and a rough flat surface, was studied and calculated with the aid of an image-processing technique. Four metals with different work-hardening indices were used as surface specimens, and by capturing images at a resolution of 0.006565 mm/pixel a good estimate of the normal approach could be obtained. The compression tests were carried out in the strength-of-materials laboratory of the mechanical engineering department, and a Monsanto tensometer was used to conduct the indentation tests.
A light-section measuring microscope (BK 70x50) was used to calculate the surface texture parameters, such as the standard deviation of asperity peak heights and the centre line average …
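As a minimal sketch of the scale conversion implied by the quoted resolution (the helper name and inputs are hypothetical, not the authors' procedure):

    MM_PER_PIXEL = 0.006565  # image resolution quoted in the abstract

    def normal_approach_mm(gap_pixels_before: float, gap_pixels_after: float) -> float:
        """Normal approach estimated from the change in a measured pixel distance."""
        return abs(gap_pixels_before - gap_pixels_after) * MM_PER_PIXEL

    # e.g. a 152-pixel change corresponds to roughly 1.0 mm of approach
    print(normal_approach_mm(980, 828))  # ~0.998 mm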

Publication Date
Thu Jan 01 2015
Journal Name
Journal Of Engineering
A Visual Interface Design for Evaluating the Quality of Google Map Data for some Engineering Applications

Today, large amounts of geospatial data are available on the web from services such as Google Map (GM), OpenStreetMap (OSM), Flickr, Wikimapia, and others; all of these services are called open-source geospatial data. Geospatial data from different sources often has variable accuracy due to different data-collection methods, so data accuracy may not meet user requirements across organizations. This paper aims to develop a tool for assessing the quality of GM data by comparing it with formal data, such as spatial data from the Mayoralty of Baghdad (MB). The tool was developed in the Visual Basic language and validated on two study areas in Baghdad, Iraq (Al-Karada and Al-Kadhumiyah). The positional accuracy was assessed …
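The paper's tool was written in Visual Basic and is not reproduced in the excerpt; as a hedged Python sketch, one standard positional-accuracy measure such a tool might compute is the RMSE between matched point pairs from the two datasets:

    import math

    def positional_rmse(gm_points, ref_points):
        """Root-mean-square positional error between matched point pairs.

        gm_points, ref_points: lists of (easting, northing) tuples for the
        same features, one from Google Map, one from the reference dataset.
        """
        sq = [(gx - rx) ** 2 + (gy - ry) ** 2
              for (gx, gy), (rx, ry) in zip(gm_points, ref_points)]
        return math.sqrt(sum(sq) / len(sq))

    # hypothetical matched points (metres, projected coordinates)
    gm  = [(445120.2, 3685410.5), (445300.9, 3685500.1)]
    ref = [(445118.7, 3685412.0), (445302.4, 3685498.6)]
    print(positional_rmse(gm, ref))  # ~2.12 m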

Publication Date
Sat Feb 01 2020
Journal Name
IOP Conference Series: Materials Science And Engineering
Revealing the potentials of 3D modelling techniques; a comparison study towards data fusion from hybrid sensors
The vast advantages of the 3D modelling industry have urged competitors to improve capturing techniques and processing pipelines, minimizing labour requirements, saving time, and reducing project risk. In digital 3D documentation and conservation projects, laser scanning and photogrammetry are compared in order to choose between the two. Since both techniques have pros and cons, this paper examines the potential issues of each technique in terms of time, budget, accuracy, density, methodology, and ease of use. A terrestrial laser scanner and close-range photogrammetry are tested to document a unique, invaluable artefact (the Lady of Hatra) located in Iraq for future data fusion sc…
Publication Date
Sun Mar 30 2014
Journal Name
Iraqi Journal Of Chemical And Petroleum Engineering
Estimation of Liquid Permeability Using Air Permeability Laboratory Data

Permeability data is of major importance in all reservoir simulation studies. Its importance increases in mature oil and gas fields because of its sensitivity to the requirements of certain improved-recovery methods. However, the industry holds a huge store of air permeability measurements against only a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study suggests a correlation for converting air permeability data, conventionally measured during laboratory core analysis, into liquid permeability. This correlation provides a feasible estimate in cases of data loss and poorly consolidated formations, or in cas…
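The paper's own correlation is not given in the excerpt; as a hedged illustration, correlations of this kind typically build on the classic Klinkenberg gas-slippage relation, sketched below (the slippage factor b must come from lab data or a published correlation and is left as an input):

    def liquid_perm_klinkenberg(k_air_md: float, b_psi: float, p_mean_psi: float) -> float:
        """Equivalent liquid permeability from an air measurement via Klinkenberg:
        k_air = k_liquid * (1 + b / p_mean)  =>  k_liquid = k_air / (1 + b / p_mean).
        """
        return k_air_md / (1.0 + b_psi / p_mean_psi)

    # e.g. 10 md air permeability, b = 5 psi, mean test pressure 25 psi -> ~8.33 md
    print(liquid_perm_klinkenberg(10.0, 5.0, 25.0))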

Publication Date
Fri Apr 01 2022
Journal Name
Baghdad Science Journal
Data Mining Techniques for Iraqi Biochemical Dataset Analysis

This research aims to analyze and simulate real biochemical test data to uncover the relationships among the tests and how each of them affects the others. The data were acquired from an Iraqi private biochemical laboratory; however, they have many dimensions, a high rate of null values, and a large number of patients. Several experiments were then applied to these data, beginning with unsupervised techniques such as hierarchical clustering and k-means, but the results were not clear. A preprocessing step was then performed to make the dataset analyzable by supervised techniques such as Linear Discriminant Analysis (LDA), Classification and Regression Tree (CART), Logistic Regression (LR), K-Nearest Neighbor (K-NN), and Naïve Bayes (NB)…
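As a hedged sketch of the workflow the abstract outlines (imputing the null values, then comparing the named supervised techniques) using scikit-learn; the file and column names are hypothetical, and DecisionTreeClassifier stands in for CART:

    import pandas as pd
    from sklearn.impute import SimpleImputer
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.naive_bayes import GaussianNB

    df = pd.read_csv("biochemical_tests.csv")  # hypothetical file name
    X, y = df.drop(columns="target"), df["target"]

    models = {
        "LDA":  LinearDiscriminantAnalysis(),
        "CART": DecisionTreeClassifier(),
        "LR":   LogisticRegression(max_iter=1000),
        "K-NN": KNeighborsClassifier(),
        "NB":   GaussianNB(),
    }
    for name, model in models.items():
        # impute the many null values, scale, then cross-validate each classifier
        pipe = make_pipeline(SimpleImputer(strategy="median"), StandardScaler(), model)
        scores = cross_val_score(pipe, X, y, cv=5)
        print(f"{name}: mean accuracy {scores.mean():.3f}")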

Publication Date
Sat Sep 08 2018
Journal Name
Proceedings Of The 2018 International Conference On Computing And Big Data
3D Parallel Coordinates for Multidimensional Data Cube Exploration

Visual analytics has become an important approach for discovering patterns in big data. Since visualization struggles with the high dimensionality of data, issues such as a concept hierarchy on each dimension add further difficulty and make visualization a prohibitive task. A data cube offers multi-perspective aggregated views of large data sets and has important applications in business and many other areas. It has high dimensionality, concept hierarchies, a vast number of cells, and comes with special exploration operations such as roll-up, drill-down, slicing, and dicing. All of these issues make data cubes very difficult to explore visually. Most existing approaches visualize a data cube in 2D space and require preprocessing steps. In this paper, we propose a visu…
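As a hedged illustration of the cube operations named in the abstract (not the paper's 3D parallel-coordinates visualization), the following sketch expresses roll-up, drill-down, slice, and dice over a toy fact table with pandas; all column names and values are hypothetical:

    import pandas as pd

    # toy fact table: one row per sale
    sales = pd.DataFrame({
        "year":    [2017, 2017, 2018, 2018],
        "city":    ["Basra", "Baghdad", "Basra", "Baghdad"],
        "product": ["A", "B", "A", "B"],
        "amount":  [10, 20, 15, 25],
    })

    # roll-up: aggregate away the 'product' dimension
    rollup = sales.groupby(["year", "city"])["amount"].sum()

    # drill-down: reintroduce 'product' for finer granularity
    drilldown = sales.groupby(["year", "city", "product"])["amount"].sum()

    # slice: fix one dimension to a single value
    slice_2018 = sales[sales["year"] == 2018]

    # dice: restrict several dimensions to value subsets
    dice = sales[(sales["year"] == 2018) & (sales["city"].isin(["Basra"]))]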

Publication Date
Thu Oct 01 2020
Journal Name
Bulletin Of Electrical Engineering And Informatics
Traffic management inside software-defined data centre networking

In recent years, data centre (DC) networks have improved their rapid-exchange capabilities. Software-defined networking (SDN) is presented as an alternative to conventional networks, segregating the control plane from the data plane. SDN overcomes the limitations of traditional DC networks caused by the rapidly increasing number of applications, websites, data storage needs, etc. Software-defined networking data centres (SDN-DC), based on the OpenFlow (OF) protocol, are used to achieve superior behaviour when executing traffic load-balancing (LB) jobs. The LB function divides the traffic-flow demands between the end devices to avoid link congestion. In short, SDN is proposed to manage more operative configur…
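The paper's LB algorithm is not given in the excerpt; as a hedged sketch, one simple policy an SDN controller might apply is to steer each new flow onto the path whose most-loaded link is lightest (all names and load figures below are hypothetical):

    def pick_least_loaded_path(paths, link_load):
        """Pick the path whose most-loaded link is lightest (min-max load).

        paths:     list of candidate paths, each a list of link ids
        link_load: dict mapping link id -> current utilisation (0.0 - 1.0)
        """
        return min(paths, key=lambda p: max(link_load[l] for l in p))

    # the controller would then install flow rules along the chosen path
    paths = [["s1-s2", "s2-s4"], ["s1-s3", "s3-s4"]]
    load = {"s1-s2": 0.7, "s2-s4": 0.2, "s1-s3": 0.3, "s3-s4": 0.4}
    best = pick_least_loaded_path(paths, load)  # -> ["s1-s3", "s3-s4"]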

Publication Date
Mon Sep 01 2008
Journal Name
Al-Khwarizmi Engineering Journal
New Adaptive Data Transmission Scheme Over HF Radio

An acceptable bit error rate can be maintained by adapting design parameters such as modulation, symbol rate, constellation size, and transmit power according to the channel state.

An estimate of HF propagation effects can be used to design an adaptive data transmission system over an HF link. The proposed system combines the well-known Automatic Link Establishment (ALE) with a variable-rate transmission system. The standard ALE is modified to suit the goal of selecting the best carrier frequency (channel) for a given transmission. This is based on measuring SINAD (Signal-plus-Noise-plus-Distortion to Noise-plus-Distortion ratio), RSL (Received Signal Level), multipath phase distortion, and BER (Bit Error Rate) fo…
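As a hedged sketch of the adaptation step the abstract describes, a transmitter might map the measured channel metrics onto a modulation scheme; the thresholds below are illustrative assumptions, not the paper's values:

    def choose_modulation(sinad_db: float, ber: float) -> str:
        """Pick a modulation scheme from measured channel quality.

        Thresholds are illustrative only; a real system derives them from
        the BER required of each scheme over the measured channel.
        """
        if sinad_db > 20 and ber < 1e-5:
            return "16-QAM"   # good channel: higher-order modulation, more bits/symbol
        if sinad_db > 12 and ber < 1e-3:
            return "QPSK"
        return "BPSK"         # poor channel: fall back to the most robust scheme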

Publication Date
Tue Mar 30 2021
Journal Name
Wasit Journal Of Computer And Mathematics Science
Dynamic Data Replication for Higher Availability and Security

Data security is a key concern in the current era, in which data is transmitted over multiple channels from multiple sources. Data leakage and security loopholes are enormous, and there is a need to enforce higher levels of security, privacy, and integrity. The affected sectors include e-governance, social networking, e-commerce, transportation, logistics, professional communications, and many others. Work on security and trustworthiness is very prominent in both network-based and private environments. This research manuscript demonstrates the effective use of a security-based methodology in an implementation with blockchain…
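As a hedged sketch of hash-verified replication in the spirit of the title (the node put/get API is a hypothetical stand-in, and this is not the manuscript's blockchain design):

    import hashlib

    def replicate(data: bytes, nodes, copies: int = 3) -> str:
        """Write `copies` replicas and return the block's SHA-256 digest,
        so each replica can be re-verified for integrity later."""
        digest = hashlib.sha256(data).hexdigest()
        for node in nodes[:copies]:
            node.put(digest, data)  # hypothetical store API on each node
        return digest

    def verify_replica(node, digest: str) -> bool:
        """A replica is healthy if its content still hashes to the recorded digest."""
        return hashlib.sha256(node.get(digest)).hexdigest() == digest  # hypothetical get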

Publication Date
Wed Jul 06 2022
Journal Name
Journal Of Asian Multicultural Research For Social Sciences Study
Remote Data Auditing in a Cloud Computing Environment

In current information technology paradigms, cloud computing is the most essential kind of computing service. It satisfies the needs of high-volume customers, provides flexible computing capabilities for a range of applications such as database archiving and business analytics, and supplies the extra computing resources that give cloud providers financial value. The purpose of this investigation is to assess the viability of performing data audits remotely within a cloud computing setting. The theory behind cloud computing and distributed storage systems is discussed, as well as the method of remote data auditing. This research addresses how to safeguard data that is outsourced and stored in cloud serv…
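As a hedged sketch of the challenge-response idea behind remote data auditing (a naive hash-based illustration; practical schemes use homomorphic tags so the auditor need not hold the full data):

    import hashlib, os

    def make_challenge() -> bytes:
        """Auditor sends a fresh random nonce so proofs cannot be precomputed."""
        return os.urandom(16)

    def prove_possession(stored: bytes, nonce: bytes) -> str:
        """Cloud server: producing this hash requires actually holding the data."""
        return hashlib.sha256(nonce + stored).hexdigest()

    def audit(local_copy: bytes, nonce: bytes, proof: str) -> bool:
        """Auditor checks the server's proof against its own copy of the data."""
        return proof == hashlib.sha256(nonce + local_copy).hexdigest()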
