Secure Video Data Deduplication in the Cloud Storage Using Compressive Sensing

Cloud storage provides scalable, low-cost resources with economies of scale based on a cross-user architecture, and data storage is its most important service. As the amount of outsourced data grows explosively, data deduplication, a technique that eliminates data redundancy, becomes essential. To protect the privacy of data owners, data are stored in the cloud in encrypted form. However, encrypted data introduce new challenges for cloud deduplication: traditional deduplication schemes cannot work on encrypted data, and existing solutions for encrypted-data deduplication suffer from security weaknesses. This paper proposes a scheme that combines compressive sensing with video deduplication to maximize the deduplication ratio; our approach uses data deduplication to remove identical copies of a video. Our experimental results show significant storage savings while providing a strong level of security.
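The abstract does not give implementation details, but the chunk-level deduplication step it describes (keeping one physical copy of identical content) can be sketched as follows. This is a minimal illustration only; the chunk size, SHA-256 fingerprint, and in-memory store are assumptions, not the paper's design.

```python
import hashlib

CHUNK_SIZE = 4 * 1024 * 1024  # assumed 4 MiB chunks; the paper does not specify a size

def deduplicate(video_bytes: bytes, store: dict) -> list:
    """Split a (compressed) video into fixed-size chunks, keep one physical copy
    of each identical chunk, and return the recipe (list of digests) needed to
    reassemble the file."""
    recipe = []
    for offset in range(0, len(video_bytes), CHUNK_SIZE):
        chunk = video_bytes[offset:offset + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:          # unseen content -> store it once
            store[digest] = chunk
        recipe.append(digest)            # duplicates only add a reference
    return recipe

def reassemble(recipe: list, store: dict) -> bytes:
    return b"".join(store[d] for d in recipe)

if __name__ == "__main__":
    store = {}
    video = b"frame-data" * 1_000_000
    r1 = deduplicate(video, store)
    r2 = deduplicate(video, store)       # identical upload: no new chunks stored
    assert reassemble(r1, store) == video and r1 == r2
    print(f"chunks stored: {len(store)}, references held: {len(r1) + len(r2)}")
```

Identical uploads add only references to existing chunks, which is what drives the storage savings reported above.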

Publication Date
Sat Jan 12 2013
Journal Name
PIER B
RADAR SENSING FEATURING BICONICAL ANTENNA AND ENHANCED DELAY AND SUM ALGORITHM FOR EARLY-STAGE BREAST CANCER DETECTION

A biconical antenna has been developed for ultra-wideband sensing. A wide impedance bandwidth of around 115% over 3.73-14 GHz is achieved, which shows that the proposed antenna is a fairly sensitive sensor for microwave medical imaging applications. The sensor and instrumentation are used together with an improved version of the delay-and-sum image reconstruction algorithm on both fatty and glandular breast phantoms. The relatively new imaging set-up provides robust reconstruction of complex permittivity profiles, especially in glandular phantoms, producing results that are well matched to the geometries and composition of the tissues. The signal-to-clutter and signal-to-mean ratios of the improved method are, respectively, consis…
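For readers unfamiliar with the reconstruction step, a bare-bones delay-and-sum beamformer is sketched below. It is the classical algorithm, not the paper's enhanced version, and the array geometry, propagation speed, and sampling rate are illustrative assumptions.

```python
import numpy as np

def delay_and_sum(signals, antennas, pixels, c, fs):
    """Classical delay-and-sum: for each image pixel, delay each recorded
    backscatter signal by the round-trip travel time antenna -> pixel -> antenna
    and sum the aligned samples.
    signals: (n_antennas, n_samples); antennas, pixels: arrays of (x, y) in metres."""
    image = np.zeros(len(pixels))
    n_samples = signals.shape[1]
    for p, pix in enumerate(pixels):
        acc = 0.0
        for a, ant in enumerate(antennas):
            delay = 2.0 * np.linalg.norm(pix - ant) / c   # round-trip time [s]
            idx = int(round(delay * fs))                  # nearest sample index
            if idx < n_samples:
                acc += signals[a, idx]
        image[p] = acc ** 2                               # coherent energy at the pixel
    return image

# Illustrative use with synthetic numbers (not the paper's set-up):
fs, c = 50e9, 2.0e8                                       # 50 GS/s sampling, ~wave speed in tissue
antennas = np.array([[0.0, 0.05], [0.05, 0.0], [0.0, -0.05], [-0.05, 0.0]])
signals = np.random.default_rng(0).normal(0, 0.01, (4, 2048))
pixels = np.array([[x, y] for x in np.linspace(-0.04, 0.04, 9)
                          for y in np.linspace(-0.04, 0.04, 9)])
print(delay_and_sum(signals, antennas, pixels, c, fs).shape)
```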

Publication Date
Thu Aug 01 2024
Journal Name
The American Journal Of Management And Economics Innovations
THE ROLE OF ECONOMIC DATA ANALYSIS IN MANAGING MEDIUM AND SMALL COMPANIES TO MAKE STRATEGIC DECISIONS AND IMPROVE PERFORMANCE: AN ANALYTICAL STUDY

Economic analysis plays a pivotal role in managerial decision-making. It is predicated on a deep understanding of the economic forces and market factors that influence corporate strategies and decisions. This paper delves into the role of economic data analysis in managing small and medium-sized enterprises (SMEs) to make strategic decisions and enhance performance, underscoring the significance of this approach and its impact on corporate outcomes. The research analyzes annual reports from three companies: Al-Mahfaza for Mobile and Internet Financial Payment and Settlement Services Company Limited, Al-Arab for Electronic Payment Company, and Iraq Electronic Gateway for Financial Services Company. The paper concl…

Publication Date
Tue Dec 01 2015
Journal Name
Journal Of Economics And Administrative Sciences
A Comparison Between Some Estimator Methods of Linear Regression Model With Auto-Correlated Errors With Application Data for the Wheat in Iraq

This research studies the linear regression model when the normally distributed random errors are autocorrelated. Linear regression analysis models the relationship between variables so that the value of one variable can be predicted from the values of the others. Four estimation methods (the least squares method, the unweighted average method, the Theil method, and the Laplace method) were compared using the mean square error (MSE) criterion in a simulation study covering four sample sizes (15, 30, 60, 100). The results showed that the least squares method is best. The four methods were then applied to wheat production data and the cultivated area of the provinces of Iraq for the years 2010, 2011, and 2012, …
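As a hedged illustration of this kind of comparison (not the study's actual code or data), the sketch below simulates a regression with AR(1)-autocorrelated normal errors and compares the OLS slope with the Theil (median of pairwise slopes) estimator by mean square error over the same sample sizes.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(n, beta0=2.0, beta1=0.5, rho=0.6):
    """y = beta0 + beta1*x + u, with AR(1) errors u_t = rho*u_{t-1} + e_t."""
    x = rng.uniform(0, 10, n)
    e = rng.normal(0, 1, n)
    u = np.zeros(n)
    for t in range(1, n):
        u[t] = rho * u[t - 1] + e[t]
    return x, beta0 + beta1 * x + u

def ols_slope(x, y):
    return np.polyfit(x, y, 1)[0]          # slope of the least-squares line

def theil_slope(x, y):
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i in range(len(x)) for j in range(i + 1, len(x))
              if x[j] != x[i]]
    return np.median(slopes)               # Theil: median of all pairwise slopes

true_b1, reps = 0.5, 200
for n in (15, 30, 60, 100):                # sample sizes used in the study
    est = []
    for _ in range(reps):
        x, y = simulate(n)
        est.append([ols_slope(x, y), theil_slope(x, y)])
    mse = ((np.array(est) - true_b1) ** 2).mean(axis=0)
    print(f"n={n:3d}  MSE(OLS)={mse[0]:.4f}  MSE(Theil)={mse[1]:.4f}")
```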

Publication Date
Wed Aug 01 2018
Journal Name
Journal Of Economics And Administrative Sciences
A Comparative Study of Some Methods of Estimating Robust Variance Covariance Matrix of the Parameters Estimated by (OLS) in Cross-Sectional Data

 

Abstract

The classical normal linear regression model rests on several assumptions, one of which is homoscedasticity. As is known, when heteroscedasticity is present, the ordinary least squares (OLS) estimators lose their desirable properties and the resulting statistical inference becomes unacceptable. Accordingly, we put forward two alternatives: the first is generalized least squares (GLS), and the second is robust estimation of the variance-covariance matrix of the parameters estimated by OLS. The GLS method is sound and reliable, since its estimators are efficient and the statistical inference based on it rests on an acceptable …
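A minimal sketch of the second alternative, a White-type heteroscedasticity-consistent (robust) variance-covariance matrix for OLS estimates, is given below. It illustrates the general sandwich formula, not the paper's own implementation, and the simulated cross-sectional data are assumptions for demonstration.

```python
import numpy as np

def ols_with_robust_cov(X, y):
    """OLS coefficients with both the classical and the White (HC0)
    heteroscedasticity-consistent variance-covariance matrices."""
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    resid = y - X @ beta
    n, k = X.shape
    classical = resid @ resid / (n - k) * XtX_inv        # valid only under homoscedasticity
    meat = X.T @ np.diag(resid ** 2) @ X                 # sum_i e_i^2 * x_i x_i'
    robust = XtX_inv @ meat @ XtX_inv                    # sandwich estimator
    return beta, classical, robust

# Illustrative cross-sectional data with error variance growing in x (heteroscedastic):
rng = np.random.default_rng(1)
x = rng.uniform(1, 10, 200)
y = 1.0 + 0.8 * x + rng.normal(0, 0.3 * x)               # error s.d. proportional to x
X = np.column_stack([np.ones_like(x), x])
beta, cls, rob = ols_with_robust_cov(X, y)
print("beta:         ", beta.round(3))
print("classical s.e.:", np.sqrt(np.diag(cls)).round(3))
print("robust s.e.:   ", np.sqrt(np.diag(rob)).round(3))
```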

Publication Date
Tue Mar 08 2022
Journal Name
Multimedia Tools And Applications
Comparison study on the performance of the multi classifiers with hybrid optimal features selection method for medical data diagnosis

Publication Date
Wed Oct 17 2018
Journal Name
Journal Of Economics And Administrative Sciences
New Robust Estimation in Compound Exponential Weibull-Poisson Distribution for both contaminated and non-contaminated Data

Abstract

The research compares two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases are considered: the first assumes the original (uncontaminated) data, while the second assumes data contamination. Simulation experiments were conducted for different sample sizes, initial parameter values, and levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both natural and contaminated data.
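The Downhill Simplex step can be illustrated as below. For brevity the negative log-likelihood of a plain two-parameter Weibull stands in for the four-parameter compound exponential Weibull-Poisson density, which would replace it in the actual study; the sample and starting values are assumptions for demonstration.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)

def neg_log_likelihood(params, data):
    """Stand-in objective: -log L of a two-parameter Weibull.  The compound
    exponential Weibull-Poisson density would be substituted here."""
    shape, scale = params
    if shape <= 0 or scale <= 0:
        return np.inf                      # keep the simplex inside the valid region
    z = data / scale
    logpdf = np.log(shape / scale) + (shape - 1) * np.log(z) - z ** shape
    return -logpdf.sum()

scale_true = 2.0
data = rng.weibull(1.5, size=200) * scale_true           # shape 1.5, scale 2.0

# Downhill Simplex (Nelder-Mead) needs only function values, no derivatives,
# which is one reason it is attractive for awkward or contaminated likelihoods.
res = minimize(neg_log_likelihood, x0=[1.0, 1.0], args=(data,), method="Nelder-Mead")
print("estimated shape, scale:", res.x.round(3))
```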

 

Publication Date
Tue Oct 08 2002
Journal Name
Iraqi Journal Of Laser
Design Considerations of Laser Source in a Ring Network Based on Fiber Distributed Data Interface (FDDI)

This work presents the use of a laser diode in fiber distributed data interface (FDDI) networks. FDDI uses optical fiber as the transmission medium, which eliminates the problems resulting from EMI and noise and, in addition, increases the security of transmission. A ring-topology network consisting of three computers was designed and implemented. The timed token protocol was used to establish and control communication over the ring. Non-return-to-zero inverted (NRZI) modulation was carried out as part of the physical (PHY) sublayer. The optical system consists of a laser diode with a wavelength of 820 nm and a maximum output power of 2.5 mW as the source, optical fiber as the channel, and a positive-intrinsic-negative (PIN) photodiode…
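As a small, generic illustration of the NRZI modulation mentioned for the PHY sublayer (not the implemented hardware), a 1 bit is encoded as a transition of the line level and a 0 bit as no transition:

```python
def nrzi_encode(bits, initial_level=0):
    """NRZI: invert the line level on every 1 bit, hold it on every 0 bit."""
    level, out = initial_level, []
    for b in bits:
        if b:
            level ^= 1            # transition encodes a 1
        out.append(level)         # no transition encodes a 0
    return out

def nrzi_decode(levels, initial_level=0):
    """A change between successive levels decodes to 1, no change to 0."""
    prev, bits = initial_level, []
    for lvl in levels:
        bits.append(1 if lvl != prev else 0)
        prev = lvl
    return bits

data = [1, 0, 1, 1, 0, 0, 1]
encoded = nrzi_encode(data)
assert nrzi_decode(encoded) == data
print(data, "->", encoded)
```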

Publication Date
Fri Dec 30 2022
Journal Name
Iraqi Journal Of Chemical And Petroleum Engineering
Normalize and De-Normalize of Relative Permeability Data for Mishrif Formation in WQ1: An Experimental Work

In many oil-recovery systems, relative permeabilities (kr) are essential flow factors that affect fluid dispersion and output from petroleum resources. Traditionally, taking rock samples from the reservoir and performing suitable laboratory studies is required to get these crucial reservoir properties. Although kr is primarily a function of fluid saturation, it is now well established that pore shape and distribution, absolute permeability, wettability, interfacial tension (IFT), and saturation history all influence kr values. These rock/fluid characteristics vary greatly from one reservoir region to the next, and it would be impossible to make kr measurements in all of them. The unsteady-state approach was used to calculate the relat…
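The normalization referred to in the title commonly rescales each curve to the mobile-saturation range and its end-point value. A generic sketch of that step (standard formulas, not necessarily the exact procedure applied to the Mishrif data) is shown below; the core-sample numbers are made up for illustration.

```python
import numpy as np

def normalize_kr(sw, kr, swc, sor, kr_endpoint):
    """Normalized saturation  Sw* = (Sw - Swc) / (1 - Swc - Sor)
    and normalized relative permeability  kr* = kr / kr_endpoint."""
    sw_star = (np.asarray(sw) - swc) / (1.0 - swc - sor)
    return sw_star, np.asarray(kr) / kr_endpoint

def denormalize_kr(sw_star, kr_star, swc, sor, kr_endpoint):
    """Inverse transform, used to map an averaged curve back to a given sample."""
    sw = np.asarray(sw_star) * (1.0 - swc - sor) + swc
    return sw, np.asarray(kr_star) * kr_endpoint

# Illustrative core-sample values (not measured Mishrif data):
sw  = [0.25, 0.40, 0.55, 0.70]
krw = [0.00, 0.05, 0.15, 0.30]
sw_star, krw_star = normalize_kr(sw, krw, swc=0.25, sor=0.30, kr_endpoint=0.30)
print(sw_star.round(3), krw_star.round(3))
```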

Publication Date
Sat Dec 30 2023
Journal Name
Iraqi Journal Of Chemical And Petroleum Engineering
Interpretation of Mud Losses in Carbonates Based on Cuttings Description, Well-Logging, Seismic and Coherency Data

Hartha Formation is an overburden horizon in the X-oilfield which generates a lot of non-productive time (NPT) associated with drilling mud losses. This study was conducted to investigate the loss events in this formation and to provide geological interpretations based on datasets from nine wells in the field of interest. The interpretation drew on different analyses, including wireline logs, cuttings descriptions, image logs, and analog data. Seismic and coherency data were also used to formulate the geological interpretations and to calibrate them against the loss events of the Hartha Fm.

The results revealed that the upper part of the Hartha Fm. was identified as an interval capable of creating potentia…

Publication Date
Fri Aug 05 2016
Journal Name
Wireless Communications And Mobile Computing
A comparison study on node clustering techniques used in target tracking WSNs for efficient data aggregation

Wireless sensor applications are susceptible to energy constraints, and most of the energy is consumed in communication between wireless nodes. Clustering and data aggregation are the two widely used strategies for reducing energy usage and increasing the lifetime of wireless sensor networks. In target tracking applications, a large amount of redundant data is produced regularly, so the deployment of effective data aggregation schemes is vital to eliminate this redundancy. This work conducts a comparative study of research approaches that employ clustering techniques for efficiently aggregating data in target tracking applications, as the selection of an appropriate clustering algorithm may yield positive results in the data aggregati…
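As a toy illustration of the clustering-plus-aggregation idea (not any specific protocol from the surveyed literature), the sketch below groups redundant sensor readings by cluster and forwards one aggregated value per cluster instead of every raw reading.

```python
from collections import defaultdict
from statistics import mean

def aggregate_by_cluster(readings):
    """readings: iterable of (cluster_id, node_id, value).
    Each cluster head forwards a single mean value, so the sink receives
    one message per cluster instead of one per node."""
    clusters = defaultdict(list)
    for cluster_id, _node_id, value in readings:
        clusters[cluster_id].append(value)
    return {cid: mean(vals) for cid, vals in clusters.items()}

readings = [
    ("c1", "n1", 21.4), ("c1", "n2", 21.6), ("c1", "n3", 21.5),  # redundant target reports
    ("c2", "n4", 30.1), ("c2", "n5", 30.3),
]
print(aggregate_by_cluster(readings))   # 2 messages to the sink instead of 5
```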
