A skip list is a probabilistic data structure that behaves like a balanced binary search tree. Skip list algorithms are simpler, faster, and use less space. Conceptually, the structure consists of several parallel sorted linked lists. Searching a skip list is only slightly more involved than searching a single sorted linked list, yet far faster on average. Because a skip list is a two-dimensional structure, it is implemented as a two-dimensional network of nodes with four pointers each. The search, insert, and delete operations each take expected O(log n) time. The skip list can also be modified to implement the order-statistic operations RANK and SEARCH-BY-RANK while maintaining the same expected time.
Keywords: skip list, parallel linked lists, randomized algorithm, rank.
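As a minimal sketch of the structure the abstract describes (not the paper's own implementation), the following Python skip list supports search and insert; node levels are chosen by coin flips, which is what yields the expected O(log n) bounds:

```python
import random

class Node:
    def __init__(self, key, level):
        self.key = key
        # forward[i] is the successor of this node in the level-i list
        self.forward = [None] * (level + 1)

class SkipList:
    def __init__(self, max_level=16, p=0.5):
        self.max_level = max_level
        self.p = p            # probability a node is promoted one level up
        self.level = 0        # highest level currently in use
        self.head = Node(None, max_level)

    def _random_level(self):
        lvl = 0
        while random.random() < self.p and lvl < self.max_level:
            lvl += 1
        return lvl

    def search(self, key):
        node = self.head
        # descend from the top level, moving right while keys are smaller
        for i in range(self.level, -1, -1):
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
        node = node.forward[0]
        return node is not None and node.key == key

    def insert(self, key):
        # record, per level, the last node before the insertion point
        update = [self.head] * (self.max_level + 1)
        node = self.head
        for i in range(self.level, -1, -1):
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
            update[i] = node
        lvl = self._random_level()
        if lvl > self.level:
            self.level = lvl
        new = Node(key, lvl)
        for i in range(lvl + 1):
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new
```

Extending each node with a width (span) counter on every forward pointer is the standard way to obtain the RANK and SEARCH-BY-RANK operations mentioned above without changing the expected time.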
In this research, a nonparametric technique is presented for estimating the time-varying coefficient functions of balanced longitudinal data, i.e., observations obtained from (n) independent subjects, each measured repeatedly at a set of (m) specific time points. Although measurements are independent across different subjects, they are mostly correlated within each subject. The applied technique is the local linear kernel (LLPK) technique. To avoid the problems of dimensionality and heavy computation, a two-step method is used to estimate the coefficient functions by means of the aforementioned technique. Since the two-
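The core building block of the LLPK technique can be sketched as follows (an illustrative implementation, not the paper's code): at each target time t0, a line is fitted by kernel-weighted least squares, and the intercept of that fit estimates the coefficient function at t0.

```python
import math

def local_linear(t, y, t0, h):
    """Local linear kernel estimate of m(t0) using a Gaussian kernel
    with bandwidth h: minimize sum K_i * (y_i - a - b*(t_i - t0))^2
    and return the fitted intercept a."""
    w = [math.exp(-0.5 * ((ti - t0) / h) ** 2) for ti in t]  # kernel weights
    # weighted moments of the centered design
    S0 = sum(w)
    S1 = sum(wi * (ti - t0) for wi, ti in zip(w, t))
    S2 = sum(wi * (ti - t0) ** 2 for wi, ti in zip(w, t))
    T0 = sum(wi * yi for wi, yi in zip(w, y))
    T1 = sum(wi * (ti - t0) * yi for wi, ti, yi in zip(w, t, y))
    # closed-form solution of the 2x2 weighted normal equations
    denom = S0 * S2 - S1 ** 2
    return (S2 * T0 - S1 * T1) / denom
```

A useful property when checking an implementation: the local linear estimator reproduces exactly linear trends without bias, so feeding it y = 2t + 1 returns 2*t0 + 1 up to floating-point error.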
Since the beginning of the last century, competition for water resources has intensified dramatically, especially between countries that have no agreements in place for the water resources they share. Such is the situation with the Euphrates River, which flows through three countries (Turkey, Syria, and Iraq) and represents the main water resource for these countries. Therefore, the comprehensive hydrologic investigation needed to derive optimal operations requires reliable forecasts. This study aims to analyze and create a forecasting model for data generation from the Turkish perspective, using the recorded inflow data of the Ataturk reservoir for the period (Oct. 1961 - Sep. 2009). Based on 49 years of real inflow data
In this paper, a procedure is proposed to establish the various performance measures, in terms of crisp values, for queueing models with two classes of arrivals and multiple channels, where both the arrival rate and the service rate are fuzzy numbers. The main idea is to convert the fuzzy arrival and service rates into crisp values using the graded mean integration approach, which can be represented as a median-rule number, thereby reducing fuzzy queues to crisp queues. The crisp values obtained are then used to establish the performance measures of conventional multiple-channel queueing models. This procedure has shown its effectiveness when incorporated with many types of membership functions in solving queueing problems. Two numerical illustrations are presented to determine the validity of the
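To make the defuzzification step concrete, here is a small sketch (illustrative values, not the paper's data, and an M/M/1 queue rather than the paper's multiple-channel model): the graded mean integration representation of a triangular fuzzy number (a, b, c) is (a + 4b + c)/6, and the resulting crisp rates feed the conventional queueing formulas.

```python
def graded_mean(a, b, c):
    """Graded mean integration representation of a
    triangular fuzzy number (a, b, c)."""
    return (a + 4 * b + c) / 6

# Fuzzy arrival and service rates (hypothetical triangular numbers)
lam = graded_mean(3, 4, 5)   # crisp arrival rate -> 4.0
mu = graded_mean(7, 8, 9)    # crisp service rate -> 8.0

# Conventional crisp M/M/1 performance measures
rho = lam / mu               # server utilisation
L = rho / (1 - rho)          # mean number of customers in system
W = 1 / (mu - lam)           # mean time a customer spends in system
```

The same crisp rates can be plugged into the multiple-channel (M/M/c) formulas; only the defuzzification step above is specific to the fuzzy setting.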
The contemporary business environment is surrounded by many rapid and continuous variables and changes that affect economic units. These variables and changes, such as intense competition, require many tools to help economic units survive and achieve critical success. To this end, economic units adopt competitive strategies such as the cost-leadership strategy, the differentiation strategy, and the focus strategy.
The budget is regarded as one of the main tools for executing the objectives, policies, and programs of economic units, besides showing how economic units have employed the available economic resources.
Activity-based budgeting is regarded as one of the modern techniques in the m
The estimation of the reliability function depends on the accuracy of the data used to estimate the parameters of the probability distribution. Because some datasets are skewed, estimating the parameters and calculating the reliability function in the presence of skewness requires a distribution flexible enough to handle such data. This is the case with the data of Diyala Company for Electrical Industries: a positive skewness was observed in the data collected from the Power and Machinery Department, which required a distribution suited to those data and a search for methods that accommodate this problem and lead to accurate estimates of the reliability function,
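The abstract is cut off before naming the distribution used. As one common illustration for right-skewed lifetime data (an assumption here, not necessarily the paper's choice), the Weibull distribution gives a closed-form reliability function:

```python
import math

def weibull_reliability(t, beta, eta):
    """R(t) = exp(-(t/eta)^beta): reliability (survival) function of a
    Weibull distribution with shape beta and scale eta. The shape
    parameter lets the model accommodate positively skewed data;
    beta = 1 reduces to the exponential distribution."""
    return math.exp(-((t / eta) ** beta))
```

Whatever distribution is fitted, the reliability function is obtained the same way, as R(t) = 1 - F(t) evaluated at the estimated parameters.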
Abstract:
Research topic: the ruling on the sale of big data.
Objectives: a statement of its nature, importance, sources, and governing rules.
Methodology: inductive, comparative, and critical.
Among the most important results: big data may not be infringed upon, as it constitutes valuable property, and its sale is permissible so long as it does not include data of users who have not consented to its sale.
Recommendation: follow-up studies dealing with the rulings on this issue.
Subject Terms
Judgment, Sale, Data, Mega, Sayings, Jurists
Abstract
Zigbee is considered one of the wireless sensor network (WSN) technologies designed for short-range communication applications. It follows the IEEE 802.15.4 specification, which aims at networks with the lowest possible cost and power consumption in addition to the minimum possible data rate. In this paper, a Zigbee transmitter system is designed based on the PHY-layer specifications of this standard. The modulation technique applied in this design is offset quadrature phase-shift keying (OQPSK) with half-sine pulse shaping, which achieves the minimum possible number of phase transitions. In addition, the applied spreading technique is direct sequence spread spectrum (DSSS), which has
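A simplified baseband model of the modulation step can be sketched as follows (an illustrative sketch, not the paper's design; `period`, the samples per chip, is an assumed parameter): even-indexed chips drive the in-phase arm, odd-indexed chips the quadrature arm, each chip is shaped by a half-sine pulse, and the quadrature arm is delayed by half a chip period, which is the "offset" in OQPSK.

```python
import math

def oqpsk_halfsine(chips, period=8):
    """Baseband I/Q waveforms for OQPSK with half-sine pulse shaping.
    chips: list of 0/1 chip values. period: samples per chip slot."""
    sym = [2 * c - 1 for c in chips]          # map 0/1 -> -1/+1
    i_sym = sym[0::2]                          # even chips -> I arm
    q_sym = sym[1::2]                          # odd chips  -> Q arm
    off = period // 2                          # half-chip offset on Q
    n_samp = period * len(i_sym) + off
    i_wave = [0.0] * n_samp
    q_wave = [0.0] * n_samp
    # half-sine shaping pulse over one chip slot
    pulse = [math.sin(math.pi * n / period) for n in range(period)]
    for k, s in enumerate(i_sym):
        for n in range(period):
            i_wave[k * period + n] = s * pulse[n]
    for k, s in enumerate(q_sym):
        for n in range(period):
            q_wave[k * period + n + off] = s * pulse[n]
    return i_wave, q_wave
```

Because the two arms never switch sign at the same instant, the combined I/Q phase changes in steps of at most 90 degrees, which is the property the abstract refers to as minimizing phase transitions.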
The working of the secondary event structure in embodying the film's unity concerns the ability to produce a film whose controlled events strengthen one another. The researchers divided the topic into an introduction and two sections, as follows. The first section covers the event and the action in dramatic construction, studying the relationship of the dramatic act to events in general and to the secondary event in particular, since it contributes to the synergistic building of the film's unity.
The second section covers the patterns of the secondary event in film, wherein the researchers dealt with its patterns, types, and functions
Cloud storage provides scalable, low-cost resources featuring economies of scale based on a cross-user architecture. As the amount of outsourced data grows explosively, data deduplication, a technique that eliminates data redundancy, becomes essential. The most important cloud service is data storage. To protect the privacy of data owners, data are stored in the cloud in encrypted form. However, encrypted data introduce new challenges for cloud data deduplication, which is crucial for data storage. Traditional deduplication schemes cannot operate on encrypted data, and existing solutions for encrypted-data deduplication suffer from security weaknesses. This paper proposes combining compressive sensing and video deduplication to maximize
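To illustrate why ordinary encryption defeats deduplication and how the classic workaround operates, here is a sketch of convergent encryption (a standard technique in this literature, not necessarily the scheme this paper proposes; the XOR keystream is for illustration only and is not a secure cipher): the key is derived from the content itself, so identical plaintexts from different users produce identical ciphertexts, which the server can deduplicate without reading the data.

```python
import hashlib

def convergent_encrypt(plaintext: bytes):
    """Convergent encryption sketch: key = H(plaintext), so equal
    plaintexts yield equal ciphertexts and an equal dedup tag.
    Returns (ciphertext, tag); the tag is what the server compares."""
    key = hashlib.sha256(plaintext).digest()
    # derive a keystream by hashing key || counter blocks (illustrative)
    stream = b""
    counter = 0
    while len(stream) < len(plaintext):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    cipher = bytes(p ^ s for p, s in zip(plaintext, stream))
    tag = hashlib.sha256(cipher).hexdigest()  # dedup label visible to server
    return cipher, tag
```

With random per-user keys the two ciphertexts would differ and the redundancy would be invisible to the server; the security weaknesses the abstract alludes to stem from this very determinism (e.g., confirmation-of-file attacks on predictable content).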