A skip list is a data structure that essentially simulates a balanced binary search tree. Skip list algorithms are simpler and faster and use less space. Conceptually, the structure consists of several parallel sorted linked lists. Searching in a skip list is more involved than searching in an ordinary sorted linked list because a skip list is a two-dimensional data structure: it is implemented as a two-dimensional network of nodes, each holding four pointers. The search, insert, and delete operations take expected O(log n) time. The skip list can also be modified to support the order-statistic operations RANK and SEARCH BY RANK while maintaining the same expected time. Keywords: skip list, parallel linked list, randomized algorithm, rank.
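To make the search procedure concrete, the following is a minimal Python sketch of a basic skip list with randomized node levels; the node layout, level cap, and method names are illustrative assumptions and not the authors' implementation (which additionally supports the rank operations).

```python
import random

class Node:
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * (level + 1)  # one forward pointer per level

class SkipList:
    MAX_LEVEL = 16
    P = 0.5  # probability of promoting a node to the next level

    def __init__(self):
        self.head = Node(None, self.MAX_LEVEL)
        self.level = 0

    def _random_level(self):
        lvl = 0
        while random.random() < self.P and lvl < self.MAX_LEVEL:
            lvl += 1
        return lvl

    def insert(self, key):
        update = [self.head] * (self.MAX_LEVEL + 1)
        node = self.head
        for i in range(self.level, -1, -1):      # walk down from the top level
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
            update[i] = node
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        new = Node(key, lvl)
        for i in range(lvl + 1):                 # splice the new node in at each of its levels
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new

    def search(self, key):
        node = self.head
        for i in range(self.level, -1, -1):      # drop a level when the next key is too large
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
        node = node.forward[0]
        return node is not None and node.key == key

sl = SkipList()
for k in (3, 7, 9, 12):
    sl.insert(k)
print(sl.search(9), sl.search(10))   # True False
```

Because each node is promoted to the next level with probability 1/2, the expected number of levels is logarithmic in the number of keys, which is what gives the expected O(log n) bound quoted above.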
In data transmission, a change in a single bit of the received data may lead to misunderstanding or even to disaster. Every bit of the transmitted information has high priority, especially bits carrying information such as the address of the receiver. Detecting every single-bit change is therefore a key issue in the data transmission field.
The ordinary single-parity detection method can detect an odd number of errors efficiently but fails when the number of errors is even. Other detection methods, such as two-dimensional parity and checksum, showed better results but still failed to cope with an increasing number of errors.
Two novel methods were suggested to detect binary bit-change errors when transmitting data over noisy media. Those methods were: the 2D-Checksum method ...
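As background for the detection methods mentioned above, the short Python sketch below illustrates why a single parity bit misses an even number of bit flips while two-dimensional parity still catches many such cases; it is an illustration of the classical schemes only, not of the paper's proposed 2D-Checksum method.

```python
def parity_bit(bits):
    """Even-parity bit: 1 if the number of 1s is odd, so the total count becomes even."""
    return sum(bits) % 2

def two_d_parity(block):
    """Row and column parity bits for a 2-D block of bits (list of equal-length rows)."""
    row_par = [parity_bit(row) for row in block]
    col_par = [parity_bit(col) for col in zip(*block)]
    return row_par, col_par

data = [1, 0, 1, 1, 0, 0, 1, 0]
p = parity_bit(data)

# Two bit flips cancel out in the parity sum: single parity cannot see them.
corrupted = data[:]
corrupted[1] ^= 1
corrupted[5] ^= 1
print(parity_bit(corrupted) == p)                  # True -> even-count error goes undetected

# The same double flip inside one row still disturbs the column parities.
block = [data[:4], data[4:]]
bad = [row[:] for row in block]
bad[0][1] ^= 1
bad[0][3] ^= 1
print(two_d_parity(bad) == two_d_parity(block))    # False -> 2-D parity detects it
```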
Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all data into a graph concept frame (CFG). As is well known, DBSCAN groups data points of the same kind into clusters, while points that fall outside the clusters are treated as noise or anomalies. In this way DBSCAN can detect abnormal points that lie farther than a set threshold from the clusters (extreme values). However, not all anomalies are of this kind: some data points are not far from a specific group, yet they do not occur repeatedly and are still considered abnormal with respect to the known group. The analysis showed that DBSCAN using the ...
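For reference, the baseline behavior described above (points outside any dense cluster labeled as noise) can be reproduced with scikit-learn's DBSCAN, as in this minimal sketch; the toy data, eps, and min_samples values are assumptions for illustration and the sketch does not include the CFG-based extension proposed in the abstract.

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)

# One dense cluster plus a few distant points that DBSCAN should mark as noise.
cluster = rng.normal(loc=0.0, scale=0.3, size=(100, 2))
outliers = np.array([[4.0, 4.0], [-5.0, 3.5], [6.0, -4.0]])
X = np.vstack([cluster, outliers])

# DBSCAN assigns the label -1 to points that belong to no cluster.
labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(X)
anomalies = X[labels == -1]
print(f"{len(anomalies)} points flagged as noise/anomalies")
```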
The advancements in Information and Communication Technology (ICT) within the previous decades have significantly changed how people transmit or store their information over the Internet and networks, so one of the main challenges is to keep this information safe against attacks. Many researchers and institutions have realized the importance and benefits of cryptography in achieving efficient and effective secure communication. This work adopts a novel technique for a secure data cryptosystem based on chaos theory. The proposed algorithm generates a 2-dimensional key matrix with the same dimensions as the original image, containing random numbers obtained from the 1-dimensional logistic chaotic map for given con ...
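The following sketch shows one common way such a key matrix can be built from the 1-dimensional logistic map x_{n+1} = r·x_n·(1 - x_n) and applied to an image; the control parameter, seed, byte scaling, and XOR step are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def logistic_key(shape, x0=0.7, r=3.99):
    """Key matrix with the same shape as the image, driven by the 1-D logistic map."""
    h, w = shape
    x = x0
    stream = np.empty(h * w)
    for i in range(h * w):
        x = r * x * (1.0 - x)          # logistic map iteration
        stream[i] = x
    # Map the chaotic values in (0, 1) to byte values 0..255 and reshape to the image size.
    return (stream * 255).astype(np.uint8).reshape(h, w)

image = np.random.randint(0, 256, size=(4, 4), dtype=np.uint8)   # stand-in for the original image
key = logistic_key(image.shape)
cipher = image ^ key          # XOR with the key matrix (illustrative encryption step)
restored = cipher ^ key       # XOR again with the same key recovers the image
assert np.array_equal(restored, image)
```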
A two-time-step stochastic multi-variable, multi-site hydrological data forecasting model was developed and verified using a case study. The philosophy of this model is to use the cross-variable correlations, the cross-site correlations, and the two-step time-lag correlations simultaneously to estimate the model parameters, which are then modified using the mutation process of a genetic algorithm optimization model. The objective function to be minimized is the Akaike test value. The case study covers four variables and three sites. The variables are the monthly air temperature, humidity, precipitation, and evaporation; the sites are Sulaimania, Chwarta, and Penjwin, which are located in northern Iraq. The model performance was ...
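For reference, the Akaike criterion minimized as the objective function has the standard textbook form below (stated generically; the abstract does not give the paper's exact formulation), where k is the number of estimated parameters, L̂ the maximized likelihood, n the number of observations, and RSS the residual sum of squares in the least-squares case.

```latex
\mathrm{AIC} = 2k - 2\ln\hat{L}
\qquad\text{or, for least-squares fitting,}\qquad
\mathrm{AIC} = n\ln\!\left(\frac{\mathrm{RSS}}{n}\right) + 2k
```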
Purpose – Cloud computing (CC) and its services have enabled the information centers of organizations to adapt their informatic and technological infrastructure, making it more appropriate for developing flexible information systems that respond to the informational and knowledge needs of their users. In this context, cloud data governance has become more complex and dynamic, requiring an in-depth understanding of the data management strategy at these centers in terms of organizational structure and regulations, people, technology, processes, and roles and responsibilities. Therefore, our paper discusses these dimensions as challenges facing information centers with respect to their data governance and the impact ...
Machine learning offers a significant advantage for many difficult problems in the oil and gas industry, especially when it comes to resolving complex challenges in reservoir characterization. Permeability is one of the most difficult petrophysical parameters to predict using conventional logging techniques. Clarifications of the workflow methodology are presented alongside comprehensive models in this study. The purpose of this study is to provide a more robust technique for predicting permeability; previous studies on the Bazirgan field have attempted to do so, but their estimates have been vague, and the methods they give are obsolete and do not adapt to real or rigid conditions in solving the permeability computation. To ...
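A minimal sketch of the general idea (a supervised regressor trained on well-log features to predict permeability) is shown below; the synthetic data, feature names, and the random forest model are assumptions for illustration and do not reflect the study's actual workflow or dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)

# Synthetic stand-in for well-log features: e.g. gamma ray, neutron porosity, bulk density.
n = 500
logs = rng.normal(size=(n, 3))
perm = np.exp(0.8 * logs[:, 1] - 0.5 * logs[:, 2] + 0.1 * rng.normal(size=n))  # toy permeability

X_train, X_test, y_train, y_test = train_test_split(logs, perm, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
print("R^2 on held-out samples:", round(r2_score(y_test, model.predict(X_test)), 3))
```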
Unconfined compressive strength (UCS) of rock is the most critical geomechanical property, widely used as an input parameter for designing fractures, analyzing wellbore stability, planning drilling programs, and carrying out various petroleum engineering projects. UCS governs rock deformation, as it measures the rock's strength and load-bearing capacity. Determining UCS in the laboratory is a time-consuming and costly process. The current study aims to develop empirical equations to predict UCS from well-log data using regression analysis in JMP software for the Khasib Formation in the Buzurgan oil fields in southeastern Iraq. The accuracy of the proposed equation was tested using the coefficient of determination (R²) and the average absolute ...
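For clarity, the coefficient of determination used here as the accuracy measure has the standard definition below (a general textbook expression, written in terms of measured and predicted UCS values; the exact error metric truncated above is not reproduced).

```latex
R^{2} = 1 -
\frac{\sum_{i=1}^{n}\left(\mathrm{UCS}_{i}^{\mathrm{meas}} - \mathrm{UCS}_{i}^{\mathrm{pred}}\right)^{2}}
     {\sum_{i=1}^{n}\left(\mathrm{UCS}_{i}^{\mathrm{meas}} - \overline{\mathrm{UCS}^{\mathrm{meas}}}\right)^{2}}
```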
Circular thin-walled structures have a wide range of applications. This type of structure is generally exposed to different types of loads, and one of the most important is buckling. In this work, the buckling phenomenon was studied using finite element analysis. The circular thin-walled structure in this study consists of a cylindrical thin shell strengthened by longitudinal stringers and subjected to pure bending in one plane. In addition, the Taguchi method was used to identify the optimum combination of parameters for enhancing the critical buckling load value, as well as to investigate the most effective parameter. The parameters analyzed were the cylinder shell thickness, the shape of the stiffener section, and ...
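Taguchi analyses of this kind typically rank parameter effects by a signal-to-noise ratio; for a larger-the-better response such as the critical buckling load, the standard form is the textbook expression below (not necessarily the exact formulation used in this work), where the y_i are the n observed response values for a given parameter setting.

```latex
\mathrm{S/N} = -10 \log_{10}\!\left(\frac{1}{n}\sum_{i=1}^{n}\frac{1}{y_i^{2}}\right)
```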
A phosphorus-based Schiff base was synthesized by treating bis{3-[2-(4-amino-1,5-dimethyl-2-phenyl-pyrazol-3-ylideneamino)ethyl]-indol-1-ylmethyl}-phosphinic acid with paraformaldehyde and characterized as a novel antioxidant. Its corresponding complexes [(VO)2L(SO4)2], [Ni2LCl4], [Co2LCl4], [Cu2LCl4], [Zn2LCl4], [Cd2LCl4], [Hg2LCl4], [Pd2LCl4], and [PtL ...
This study shows how GIS techniques and remote sensing data are matched with field observations to identify structural features, such as fault segments, in urban areas such as the Merawa and Shaqlawa cities. Different types of data, such as fault systems, drainage patterns (previously mapped), lineaments, and lithological contacts, all at a spatial resolution of 30 m, were combined through integration and an index overlay modeling technique to produce a susceptibility map of fault segments in the study area. A GIS spatial overlay technique was used to determine the spatial relationships of all the criteria (factors) and sub-criteria (classes) within the layers (maps) in order to classify and map the potential areas ...
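In its general textbook form, an index overlay model combines the class score of each input layer with a layer weight, as below (stated generically; the study's specific weights and class scores are not given in the abstract), where W_i is the weight of layer i and S_{ij} is the score of the class of layer i at the map cell in question.

```latex
S = \frac{\sum_{i=1}^{m} W_i \, S_{ij}}{\sum_{i=1}^{m} W_i}
```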