Secure Video Data Deduplication in the Cloud Storage Using Compressive Sensing

Cloud storage provides scalable, low-cost resources that achieve economies of scale through a cross-user architecture. As the amount of outsourced data grows explosively, data deduplication, a technique that eliminates data redundancy, becomes essential. Data storage is the most important cloud service, and to protect the privacy of data owners, data are stored in the cloud in encrypted form. Encrypted data, however, introduce new challenges for cloud data deduplication: traditional deduplication schemes cannot work on encrypted data, and existing solutions for encrypted-data deduplication suffer from security weaknesses. This paper proposes a scheme that combines compressive sensing with video deduplication to maximize the deduplication ratio. Our approach uses data deduplication to remove identical copies of a video. Our experimental results show significant storage savings while providing a strong level of security.
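The abstract gives no implementation details; as a minimal sketch of the deduplication step such schemes build on, the following Python fragment removes identical copies via cryptographic fingerprints. The chunk size and the in-memory chunk store are illustrative assumptions, and the paper's compressive-sensing and encryption stages are omitted.

```python
import hashlib

CHUNK_SIZE = 4 * 1024 * 1024  # hypothetical 4 MiB chunk size, not from the paper

def dedup_store(video_path, store):
    """Keep one physical copy per unique chunk; return the logical file."""
    refs = []
    with open(video_path, "rb") as f:
        while chunk := f.read(CHUNK_SIZE):
            digest = hashlib.sha256(chunk).hexdigest()
            if digest not in store:   # unseen chunk: store it once
                store[digest] = chunk
            refs.append(digest)       # logical file = ordered fingerprint list
    return refs
```

Identical videos map to identical fingerprint lists, so cross-user copies cost no extra storage; over encrypted data this simple scheme fails, which is the challenge the paper targets.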

Publication Date
Tue Jan 01 2008
Journal Name
Lecture Notes In Computer Science
IRPS – An Efficient Test Data Generation Strategy for Pairwise Testing

Publication Date
Sat Jul 22 2023
Journal Name
Journal Of Engineering
Data Acquisition System for Wind Speed, Direction and Temperature Measurements

This paper describes the use of a microcomputer as a laboratory instrument system. The system focuses on measuring three weather variables: temperature, wind speed, and wind direction. The instrument is a type of data acquisition system; this paper deals with the design and implementation of a data acquisition system based on a personal computer (Pentium) using the Industry Standard Architecture (ISA) bus. The design involves mainly a hardware implementation, together with the software programs used for testing, measurement, and control. The system can be used to display the required information that is transferred and processed from the external field to the system. A Visual Basic language with Microsoft Foundation Classes …
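The original software was written in Visual Basic against ISA-bus hardware whose register map is not given in the abstract; the sketch below only illustrates the generic poll-and-scale pattern of such a data acquisition loop. The channel assignments, 12-bit ADC resolution, and sensor spans are assumptions for illustration.

```python
# Hypothetical channel map; the real ISA-bus addresses are not available.
CHANNELS = {"temperature": 0, "wind_speed": 1, "wind_direction": 2}

def read_adc(channel: int) -> int:
    """Placeholder for the ISA-bus register read (0..4095 counts)."""
    raise NotImplementedError("depends on the DAQ card")

def to_engineering(name: str, counts: int) -> float:
    # Assumed linear scaling of each sensor span onto a 12-bit ADC.
    spans = {"temperature": (-40.0, 60.0),    # degrees C
             "wind_speed": (0.0, 50.0),       # m/s
             "wind_direction": (0.0, 360.0)}  # degrees
    lo, hi = spans[name]
    return lo + (hi - lo) * counts / 4095.0

def poll_once() -> dict:
    """One acquisition cycle over all three weather variables."""
    return {name: to_engineering(name, read_adc(ch))
            for name, ch in CHANNELS.items()}
```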

Publication Date
Fri Apr 01 2022
Journal Name
Baghdad Science Journal
Improved Firefly Algorithm with Variable Neighborhood Search for Data Clustering

Among metaheuristic algorithms, population-based algorithms are explorative search algorithms that are superior to local search algorithms at exploring the search space for globally optimal solutions. However, the primary downside of such algorithms is their low exploitative capability, which prevents the neighborhood of a promising solution from being searched more thoroughly for better solutions. The firefly algorithm (FA) is a population-based algorithm that has been widely used in clustering problems. However, FA suffers from premature convergence when no neighborhood search strategies are employed to improve the quality of clustering solutions in the neighborhood region while still exploring the global regions of the search space. …
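The variable neighborhood search extension is cut off in the truncated abstract; the sketch below shows only the standard firefly movement step that the improved algorithm builds on. The parameter values (beta0, gamma, alpha) are illustrative defaults, not the paper's settings.

```python
import numpy as np

def firefly_step(positions, fitness, beta0=1.0, gamma=1.0, alpha=0.2, rng=None):
    """One iteration of the standard FA update (minimization)."""
    rng = rng or np.random.default_rng()
    n, d = positions.shape
    new_pos = positions.copy()
    for i in range(n):
        for j in range(n):
            if fitness[j] < fitness[i]:              # firefly j is brighter
                r2 = np.sum((positions[i] - positions[j]) ** 2)
                beta = beta0 * np.exp(-gamma * r2)   # attractiveness decays with distance
                new_pos[i] += (beta * (positions[j] - positions[i])
                               + alpha * (rng.random(d) - 0.5))  # random walk term
    return new_pos
```

For clustering, each position typically encodes a set of cluster centroids and the fitness is an intra-cluster distance measure; the paper's contribution is to refine such solutions with a variable neighborhood search.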

Publication Date
Wed Oct 01 2008
Journal Name
2008 First International Conference On Distributed Framework And Applications
A strategy for Grid based t-way test data generation

Publication Date
Tue Oct 08 2002
Journal Name
Iraqi Journal Of Laser
Design Considerations of Laser Source in a Ring Network Based on Fiber Distributed Data Interface (FDDI)

This work presents the use of a laser diode in fiber distributed data interface (FDDI) networks. FDDI uses optical fiber as its transmission medium, which solves the problems resulting from EMI and noise and, in addition, increases the security of transmission. A network with a ring topology consisting of three computers was designed and implemented. The timed-token protocol was used to achieve and control the communication process over the ring. Non-return-to-zero inversion (NRZI) modulation was carried out as a part of the physical (PHY) sublayer. The optical system consists of a laser diode with a wavelength of 820 nm and a maximum output power of 2.5 mW as the source, an optical fiber as the channel, and a positive-intrinsic-negative (PIN) photodiode as the detector …
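Since NRZI is named as the PHY-sublayer line code, a minimal sketch of NRZI encoding and decoding may be useful; FDDI's 4B/5B block coding, which precedes NRZI in the real PHY, is omitted here.

```python
def nrzi_encode(bits, initial_level=0):
    """NRZI: a 1 is sent as a level transition, a 0 as no transition."""
    level, out = initial_level, []
    for b in bits:
        if b == 1:
            level ^= 1          # toggle the line level to encode a 1
        out.append(level)
    return out

def nrzi_decode(levels, initial_level=0):
    """Recover bits by comparing each level with the previous one."""
    prev, out = initial_level, []
    for lv in levels:
        out.append(1 if lv != prev else 0)
        prev = lv
    return out

assert nrzi_decode(nrzi_encode([1, 0, 1, 1, 0])) == [1, 0, 1, 1, 0]
```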

Publication Date
Fri Dec 30 2022
Journal Name
Iraqi Journal Of Chemical And Petroleum Engineering
Normalize and De-Normalize of Relative Permeability Data for Mishrif Formation in WQ1: An Experimental Work

In many oil-recovery systems, relative permeabilities (kr) are essential flow factors that affect fluid dispersion and output from petroleum resources. Traditionally, obtaining these crucial reservoir properties requires taking rock samples from the reservoir and performing suitable laboratory studies. Although kr is a function of fluid saturation, it is now well established that pore shape and distribution, absolute permeability, wettability, interfacial tension (IFT), and saturation history all influence kr values. These rock/fluid characteristics vary greatly from one reservoir region to the next, and it would be impossible to make kr measurements in all of them. The unsteady-state approach was used to calculate the relative permeability …
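The paper's exact procedure is cut off in the abstract; the sketch below shows the commonly used endpoint normalization of saturation and relative permeability curves, which lets curves from different core samples be compared and then de-normalized to a target sample's endpoints. The endpoint inputs (Swc, Sor, endpoint kr) are assumed known per sample; this is a generic formulation, not necessarily the paper's exact method.

```python
def normalize_sw(sw, swc, sor):
    """Map water saturation onto the 0..1 mobile-saturation interval."""
    return (sw - swc) / (1.0 - swc - sor)

def denormalize_sw(sw_star, swc, sor):
    """Invert the mapping using a target sample's endpoints."""
    return swc + sw_star * (1.0 - swc - sor)

def normalize_kr(kr, kr_endpoint):
    """Scale a relative permeability curve by its endpoint value."""
    return kr / kr_endpoint
```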

Publication Date
Fri Aug 05 2016
Journal Name
Wireless Communications And Mobile Computing
A comparison study on node clustering techniques used in target tracking WSNs for efficient data aggregation

Wireless sensor applications are subject to tight energy constraints, and most of the energy is consumed in communication between wireless nodes. Clustering and data aggregation are two widely used strategies for reducing energy usage and increasing the lifetime of wireless sensor networks. In target tracking applications, a large amount of redundant data is produced regularly, so the deployment of effective data aggregation schemes is vital to eliminate this redundancy. This work conducts a comparative study of research approaches that employ clustering techniques to aggregate data efficiently in target tracking applications, since the selection of an appropriate clustering algorithm can yield positive results in the data aggregation …
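As a concrete illustration of why in-cluster aggregation saves energy, the sketch below collapses redundant member readings at a cluster head into a single summary packet; the data layout is an illustrative assumption, not taken from any surveyed scheme.

```python
from statistics import mean

def aggregate_at_cluster_head(readings):
    """Replace many raw member readings with one summary toward the sink."""
    return {"count": len(readings),
            "mean": mean(readings),
            "min": min(readings),
            "max": max(readings)}

# Four raw packets from cluster members become one summary packet.
summary = aggregate_at_cluster_head([21.4, 21.6, 21.5, 21.5])
```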

Publication Date
Sat Dec 30 2023
Journal Name
Iraqi Journal Of Chemical And Petroleum Engineering
Interpretation of Mud Losses in Carbonates Based on Cuttings Description, Well-Logging, Seismic and Coherency Data

The Hartha Formation is an overburden horizon in the X-oilfield that generates a great deal of non-productive time (NPT) associated with drilling mud losses. This study was conducted to investigate the loss events in this formation and to provide geological interpretations based on datasets from nine wells in the field of interest. The interpretation drew on several analyses, including wireline logs, cuttings descriptions, image logs, and analog data. Seismic and coherency data were also used to formulate the geological interpretations and to calibrate them against the loss events of the Hartha Fm.

The results revealed that the upper part of the Hartha Fm. was identified as an interval capable of creating potential …

Publication Date
Wed Oct 17 2018
Journal Name
Journal Of Economics And Administrative Sciences
New Robust Estimation in Compound Exponential Weibull-Poisson Distribution for both contaminated and non-contaminated Data

Abstract

This research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed contaminated data. Simulation experiments were conducted for different sample sizes, different initial parameter values, and different levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both clean and contaminated data.
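The four-parameter compound density is not reproduced in the abstract; the sketch below shows the general recipe described: minimizing a negative log-likelihood with the Downhill Simplex (Nelder-Mead) method. A plain two-parameter Weibull stands in for the compound exponential Weibull-Poisson density, which would replace neg_log_likelihood.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

def neg_log_likelihood(theta, data):
    shape, scale = theta
    if shape <= 0 or scale <= 0:
        return np.inf            # keep the simplex inside the valid region
    return -np.sum(weibull_min.logpdf(data, c=shape, scale=scale))

# Synthetic data for illustration; the paper used simulated samples with
# and without contamination.
data = weibull_min.rvs(c=1.5, scale=2.0, size=200,
                       random_state=np.random.default_rng(0))
result = minimize(neg_log_likelihood, x0=[1.0, 1.0], args=(data,),
                  method="Nelder-Mead")
print(result.x)                  # estimated (shape, scale)
```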

 

Publication Date
Wed Aug 01 2018
Journal Name
Journal Of Economics And Administrative Sciences
A Study on Transportation Models in Their Minimum and Maximum Values with Applications of Real Data

The purpose of this paper is to apply different transportation models in their minimum and maximum values by finding a starting basic feasible solution and then the optimal solution. The requirements of transportation models are presented, together with an application in the minimization case that uses real data collected by the researcher over one month in 2015 at a poultry farm for the production of eggs.
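As an illustration of the minimization case, the sketch below solves a small balanced transportation problem as a linear program; the supplies, demands, and unit costs are invented for the example, not the paper's poultry-farm data. The maximization case can be handled by negating the cost matrix.

```python
import numpy as np
from scipy.optimize import linprog

cost = np.array([[4.0, 6.0, 8.0],
                 [5.0, 3.0, 7.0]])  # cost[i][j]: source i -> destination j
supply = [60, 40]
demand = [30, 50, 20]               # balanced: total supply == total demand

m, n = cost.shape
A_eq, b_eq = [], []
for i in range(m):                  # each source ships exactly its supply
    row = np.zeros(m * n); row[i * n:(i + 1) * n] = 1
    A_eq.append(row); b_eq.append(supply[i])
for j in range(n):                  # each destination receives its demand
    row = np.zeros(m * n); row[j::n] = 1
    A_eq.append(row); b_eq.append(demand[j])

res = linprog(cost.ravel(), A_eq=np.array(A_eq), b_eq=b_eq, method="highs")
print(res.x.reshape(m, n))          # optimal shipment plan
print(res.fun)                      # minimum total cost
```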
