IMPLEMENTATION OF THE SKIP LIST DATA STRUCTURE WITH ITS UPDATE OPERATIONS

A skip list is essentially a probabilistic simulation of a balanced binary search tree. Skip list algorithms are simpler and faster, and they use less space. Conceptually, the data structure consists of several parallel sorted linked lists. Searching in a skip list is much faster than searching in a single sorted linked list, because the upper lists let the search skip over long runs of nodes. Because a skip list is a two-dimensional data structure, it is implemented as a two-dimensional network of nodes, each with four pointers. The search, insert, and delete operations take O(log n) expected time. The skip list can also be modified to support the order-statistic operations RANK and SEARCH BY RANK while maintaining the same expected time. Keywords: skip list, parallel linked list, randomized algorithm, rank.
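As an illustration of the structure described in the abstract, here is a minimal sketch — not the paper's own implementation; it uses one forward-pointer array per node rather than the four-pointer network mentioned above:

```python
import random

class Node:
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * (level + 1)  # one successor pointer per level

class SkipList:
    MAX_LEVEL = 16
    P = 0.5  # probability of promoting a node one level up

    def __init__(self):
        self.head = Node(None, self.MAX_LEVEL)  # sentinel head node
        self.level = 0

    def _random_level(self):
        lvl = 0
        while random.random() < self.P and lvl < self.MAX_LEVEL:
            lvl += 1
        return lvl

    def search(self, key):
        x = self.head
        for i in range(self.level, -1, -1):  # descend level by level
            while x.forward[i] and x.forward[i].key < key:
                x = x.forward[i]
        x = x.forward[0]
        return x is not None and x.key == key

    def insert(self, key):
        update = [self.head] * (self.MAX_LEVEL + 1)  # rightmost node per level
        x = self.head
        for i in range(self.level, -1, -1):
            while x.forward[i] and x.forward[i].key < key:
                x = x.forward[i]
            update[i] = x
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        node = Node(key, lvl)
        for i in range(lvl + 1):  # splice node in at every level it occupies
            node.forward[i] = update[i].forward[i]
            update[i].forward[i] = node

    def delete(self, key):
        update = [self.head] * (self.MAX_LEVEL + 1)
        x = self.head
        for i in range(self.level, -1, -1):
            while x.forward[i] and x.forward[i].key < key:
                x = x.forward[i]
            update[i] = x
        x = x.forward[0]
        if x and x.key == key:
            for i in range(self.level + 1):  # unlink at every level
                if update[i].forward[i] is x:
                    update[i].forward[i] = x.forward[i]
```

Search, insert, and delete all walk the list top level down, which is where the O(log n) expected time comes from.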

Publication Date: Mon Apr 03 2023
Journal Name: Journal Of Electronics, Computer Networking And Applied Mathematics
Comparison of Some Estimator Methods of Regression Mixed Model for the Multicollinearity Problem and High-Dimensional Data

To obtain a mixed model with high significance and accuracy, it is necessary to search for a method that selects the most important variables to include in the model, especially when the data under study suffer from multicollinearity as well as high dimensionality. This research compares, by simulation, some methods for choosing the explanatory variables and estimating the parameters of the regression model, namely (unbiased) Bayesian Ridge Regression and the adaptive Lasso regression model. Mean squared error (MSE) was used to compare the methods.
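A toy version of such a comparison can be sketched as follows — illustrative only; the paper's actual simulation design, priors, and adaptive Lasso estimator are not reproduced here. This contrasts plain ridge regression with OLS on deliberately collinear data (NumPy assumed):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, lam = 100, 5, 1.0

# Simulate multicollinear predictors: column 1 is nearly a copy of column 0.
X = rng.normal(size=(n, p))
X[:, 1] = X[:, 0] + 0.01 * rng.normal(size=n)
beta = np.array([1.0, 1.0, 0.0, 0.0, 2.0])  # true coefficients
y = X @ beta + rng.normal(size=n)

# Closed-form estimates: OLS vs ridge with penalty lam.
ols = np.linalg.solve(X.T @ X, X.T @ y)
ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# Compare estimators by the MSE of the estimated coefficients.
mse_ols = np.mean((ols - beta) ** 2)
mse_ridge = np.mean((ridge - beta) ** 2)
```

Under strong collinearity the OLS coefficients become unstable, while the ridge penalty shrinks them toward zero, which is the trade-off such simulation studies measure with MSE.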

Publication Date: Fri Apr 12 2019
Journal Name: Journal Of Economics And Administrative Sciences
Accounting Mining Data Using Neural Networks (Case study)

Business organizations have faced many challenges in recent times, the most important of which is information technology, because it is widespread and easy to use. Its use has led to an increase in the amount of data that business organizations deal with in an unprecedented manner. The amount of data available through the internet is a problem for which many parties seek solutions: why is so much of it available there, at random? Many forecasts suggested that by 2017 the number of devices connected to the internet would be an estimated three times the population of the Earth, and that in 2015 more than one and a half billion gigabytes of data were transferred every minute globally. Thus, so-called data mining emerged as a …

Publication Date: Wed Jul 06 2022
Journal Name: Journal Of Asian Multicultural Research For Social Sciences Study
Remote Data Auditing in a Cloud Computing Environment

In current information-technology paradigms, cloud computing is the most essential kind of computing service. It satisfies the needs of high-volume customers, provides flexible computing capacity for a range of applications such as database archiving and business analytics, and supplies the extra computing resources that give cloud providers their financial value. The purpose of this investigation is to assess the viability of auditing data remotely within a cloud computing setting. The theory behind cloud computing and distributed storage systems is discussed, as well as the method of remote data auditing. This research addresses safeguarding the data that is outsourced and stored in cloud serv…

Publication Date: Sun Mar 30 2014
Journal Name: Iraqi Journal Of Chemical And Petroleum Engineering
Estimation of Liquid Permeability Using Air Permeability Laboratory Data

Permeability data are of major importance and must be handled in all reservoir simulation studies. Their importance increases in mature oil and gas fields because of their sensitivity to the requirements of certain improved-recovery methods. However, the industry holds a huge store of air permeability measurements against only a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study suggests a correlation to convert the air permeability data conventionally measured during laboratory core analysis into liquid permeability. This correlation provides a feasible estimate for loose and poorly consolidated formations, or in cas…
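The paper's own correlation is not given in this excerpt. A standard route from air to liquid permeability is Klinkenberg extrapolation, sketched below with invented measurements:

```python
# Klinkenberg extrapolation: gas (air) permeability exceeds liquid
# permeability due to gas slippage and follows
#     k_gas = k_liquid * (1 + b / p_mean)
# so a straight-line fit of k_gas against 1/p_mean has the liquid
# permeability as its intercept.

def klinkenberg_liquid_perm(p_means, k_gas):
    """Least-squares line k_gas = k_liq + slope * (1/p_mean); return intercept."""
    xs = [1.0 / p for p in p_means]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(k_gas) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, k_gas))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx  # intercept = liquid permeability

# Synthetic measurements: true k_liq = 50 mD, slip factor b = 2
# (pressure units arbitrary; values invented for illustration).
p = [1.0, 2.0, 4.0, 8.0]
k = [50.0 * (1 + 2.0 / pi) for pi in p]
```

Since the synthetic points lie exactly on a Klinkenberg line, the fit recovers the 50 mD liquid permeability exactly; real core data would scatter around the line.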

Publication Date: Tue Mar 30 2021
Journal Name: Wasit Journal Of Computer And Mathematics Science
Dynamic Data Replication for Higher Availability and Security

Data security is a central concern in the current era, in which data is transmitted over multiple channels from multiple sources. Data leakage and security loopholes are enormous, and there is a need to enforce higher levels of security, privacy, and integrity. The affected sectors include e-governance, social networking, e-commerce, transportation, logistics, professional communications, and many others. Work on security and integrity is very prominent in both network-based and private environments. This research manuscript demonstrates the effective use of a security-based methodology implemented with blockchain …

Publication Date: Thu Aug 01 2019
Journal Name: Journal Of Economics And Administrative Sciences
Some Nonparametric Estimators for Right-Censored Survival Data

The use of parametric models and their estimation methods requires that the models satisfy many primary conditions in order to represent the population under study adequately. This has prompted researchers to search for more flexible models, namely the nonparametric ones. Many researchers are interested in the study of the survival (permanence) function and its estimation methods, among them these nonparametric methods.

For the purpose of statistical inference on the parameters of the statistical distribution of lifetimes with censored data, the experimental section of this thesis compares nonparametric methods of estimating the survival function, given the existence …
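The excerpt does not name the estimators being compared; the classic nonparametric estimator of the survival function under right censoring is the Kaplan-Meier (product-limit) estimator, sketched here for illustration:

```python
def kaplan_meier(times, events):
    """Product-limit estimate of the survival function S(t).

    times:  observed times (event or censoring)
    events: 1 = event (death) observed, 0 = right-censored
    Returns a list of (time, S(t)) pairs at each distinct event time."""
    data = sorted(zip(times, events))
    n = len(data)
    at_risk = n
    s = 1.0
    curve = []
    i = 0
    while i < n:
        t = data[i][0]
        ties = sum(1 for tt, _ in data if tt == t)          # all leaving at t
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        if deaths:
            s *= 1 - deaths / at_risk   # multiply in the conditional survival
            curve.append((t, s))
        at_risk -= ties                 # censored subjects also leave risk set
        i += ties
    return curve
```

Censored observations reduce the risk set without producing a step, which is exactly how the estimator uses incomplete lifetimes.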

Publication Date: Thu Jul 01 2021
Journal Name: Journal Of Engineering
Proposed Security Framework for Mobile Data Management System

Portable devices such as smartphones, tablet PCs, and PDAs are a useful combination of hardware and software aimed at mobile workers. While they offer the ability to review documents, communicate via electronic mail, and manage appointments and meetings, they usually lack a variety of essential security features. To address concerns about sensitive data, many individuals and organizations, aware of the associated threats, mitigate them by improving user authentication, encrypting content, protecting against malware, and deploying firewalls, intrusion prevention, and so on. However, no standards have yet been developed to determine whether such mobile data management systems adequately provide the fu…

Publication Date: Fri Apr 01 2022
Journal Name: Baghdad Science Journal
Data Mining Techniques for Iraqi Biochemical Dataset Analysis

This research aims to analyze and simulate real biochemical test data to uncover the relationships among the tests and how each of them affects the others. The data were acquired from a private Iraqi biochemical laboratory. However, these data have many dimensions, a high rate of null values, and a large number of patients. Several experiments were applied to these data, beginning with unsupervised techniques such as hierarchical clustering and k-means, but the results were not clear. A preprocessing step was then performed to make the dataset analyzable by supervised techniques such as Linear Discriminant Analysis (LDA), Classification and Regression Tree (CART), Logistic Regression (LR), K-Nearest Neighbour (K-NN), and Naïve Bayes (NB)…
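As one illustration of the supervised techniques listed, here is a minimal k-nearest-neighbour classifier, stdlib only; the feature names and values are invented, not taken from the dataset described:

```python
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """train: list of (feature_vector, label) pairs.
    Returns the majority label among the k nearest training points."""
    # Sort training points by Euclidean distance to the query.
    nearest = sorted(train, key=lambda fv_lbl: math.dist(fv_lbl[0], query))[:k]
    # Majority vote over the k nearest labels.
    return Counter(lbl for _, lbl in nearest).most_common(1)[0][0]

# Hypothetical biochemical test vectors: (glucose, urea) -> status.
train = [((90, 30), "normal"), ((95, 28), "normal"),
         ((180, 60), "abnormal"), ((200, 55), "abnormal")]
```

In practice one would standardize the features first, since biochemical tests live on very different scales and k-NN is distance-based.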

Publication Date: Sat Sep 08 2018
Journal Name: Proceedings Of The 2018 International Conference On Computing And Big Data
3D Parallel Coordinates for Multidimensional Data Cube Exploration

Visual analytics has become an important approach for discovering patterns in big data. Visualization already struggles with the high dimensionality of data, and issues such as a concept hierarchy on each dimension add more difficulty and make visualization a prohibitive task. A data cube offers multi-perspective aggregated views of large data sets and has important applications in business and many other areas. It has high dimensionality, concept hierarchies, a vast number of cells, and comes with special exploration operations such as roll-up, drill-down, slicing, and dicing. All these issues make data cubes very difficult to explore visually. Most existing approaches visualize a data cube in 2D space and require preprocessing steps. In this paper, we propose a visu…
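The roll-up operation mentioned above can be sketched as aggregation up a concept hierarchy; a minimal dictionary-based example, with dimension names and values invented:

```python
from collections import defaultdict

# Fact table rows: (city, quarter, sales). We roll up city -> country.
facts = [("Baghdad", "Q1", 10), ("Basra", "Q1", 7),
         ("Paris", "Q1", 5), ("Baghdad", "Q2", 4)]
city_to_country = {"Baghdad": "Iraq", "Basra": "Iraq", "Paris": "France"}

def roll_up(facts, hierarchy):
    """Aggregate the sales measure from the city level up to the country level."""
    cube = defaultdict(int)
    for city, quarter, sales in facts:
        cube[(hierarchy[city], quarter)] += sales  # climb one hierarchy level
    return dict(cube)
```

Drill-down is the inverse operation: it re-expands an aggregated cell back into its finer-grained rows, which is why cube exploration needs both directions.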

Publication Date: Thu Oct 01 2020
Journal Name: Bulletin Of Electrical Engineering And Informatics
Traffic Management Inside Software-Defined Data Centre Networking

In recent years, data centre (DC) networks have improved their rapid-exchange abilities. Software-defined networking (SDN) was introduced to change the design of conventional networks by segregating the control plane from the data plane. SDN overcomes the limitations of traditional DC networks caused by the rapidly growing number of applications, websites, data storage needs, etc. Software-defined networking data centres (SDN-DC), based on the OpenFlow (OF) protocol, are used to achieve superior behaviour when executing traffic load-balancing (LB) jobs. The LB function divides the traffic-flow demands among the end devices to avoid link congestion. In short, SDN is proposed to manage more operative configur…
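The load-balancing function described, dividing flow demands across links to avoid congestion, can be sketched as a greedy least-loaded assignment; the flow and link names below are hypothetical, not from the paper:

```python
def assign_flows(flows, links):
    """Greedily place each flow on the currently least-loaded link."""
    load = {link: 0 for link in links}
    placement = {}
    for flow, demand in flows:
        link = min(load, key=load.get)  # pick the least-loaded link so far
        placement[flow] = link
        load[link] += demand
    return placement, load

# Hypothetical flow demands (arbitrary bandwidth units) and two links.
flows = [("f1", 5), ("f2", 3), ("f3", 4), ("f4", 2)]
links = ["link-a", "link-b"]
```

A real SDN-DC controller would make this decision per flow from its global view of link utilisation and install the matching OpenFlow rules, but the balancing criterion is the same.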
