Remote Data Auditing in a Cloud Computing Environment

In current information technology paradigms, cloud computing is one of the most essential kinds of computing service. It serves high-volume customers, offers flexible computing capacity for applications such as database archiving and business analytics, and supplies extra computing resources in a way that creates financial value for cloud providers. The purpose of this investigation is to assess the viability of auditing data remotely within a cloud computing environment. The theory behind cloud computing and distributed storage systems is discussed, along with the method of remote data auditing, whose aim is to safeguard data that is outsourced and stored on cloud servers. Four techniques of remote data auditing for distributed cloud services are presented. Data auditing methods face several difficulties, which may be overcome with a variety of techniques, such as the Boneh-Lynn-Shacham (BLS) signature or the automated blocker protocol. Further challenges associated with distributed remote data auditing solutions are also discussed, and a variety of approaches could be researched further to address these open problems.
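As a rough illustration of the challenge-response idea behind remote data auditing, the sketch below precomputes audit challenges over data blocks before outsourcing and later checks a server's hash-based proof. This is a toy hash construction, not the BLS-signature or automated-blocker schemes named in the paper; the block size, sample count, and function names are all assumptions.

```python
# Toy sentinel-style remote data auditing sketch (hashlib only).
# NOT the paper's protocol; block size, sample count, and names are illustrative.
import hashlib, hmac, os, random

BLOCK_SIZE = 4096          # assumed block size
PRECOMPUTED_CHALLENGES = 8 # assumed number of audit rounds prepared up front

def split_blocks(data: bytes):
    """Split the file into fixed-size blocks before outsourcing."""
    return [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]

def digest(indices, nonce, blocks):
    """Hash a nonce together with the challenged blocks, in index order."""
    h = hashlib.sha256(nonce)
    for i in indices:
        h.update(blocks[i])
    return h.digest()

def prepare_challenges(blocks, rounds=PRECOMPUTED_CHALLENGES, sample=4):
    """Owner precomputes (challenge, expected answer) pairs, then may delete its copy."""
    challenges = []
    for _ in range(rounds):
        nonce = os.urandom(16)
        indices = sorted(random.sample(range(len(blocks)), sample))
        challenges.append((indices, nonce, digest(indices, nonce, blocks)))
    return challenges

def server_respond(stored_blocks, indices, nonce):
    """Cloud server proves possession by recomputing the digest over its copy."""
    return digest(indices, nonce, stored_blocks)

# One audit round
blocks = split_blocks(os.urandom(BLOCK_SIZE * 32))
challenges = prepare_challenges(blocks)
indices, nonce, expected = challenges.pop()
proof = server_respond(blocks, indices, nonce)   # honest server uses the intact data
print("audit passed:", hmac.compare_digest(proof, expected))
```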

Publication Date: Thu Sep 29 2022
Journal Name: World Journal of Clinical Infectious Diseases
Five-year retrospective hospital-based study on epidemiological data regarding human leishmaniasis in West Kordofan state, Sudan

Publication Date: Thu Feb 01 2018
Journal Name: Journal of Economics and Administrative Sciences
Comparison of Slice inverse regression with the principal components in reducing high-dimensions data by using simulation

This research studies dimension-reduction methods that overcome the curse of dimensionality, which arises when traditional methods fail to provide good parameter estimates, so the problem must be dealt with directly. Two approaches were used to handle high-dimensional data: the non-classical Sliced Inverse Regression (SIR) method together with a proposed weighted standard SIR (WSIR) method, and Principal Component Analysis (PCA), the general method used for reducing dimensions. Both SIR and PCA are based on forming linear combinations of a subset of the original explanatory variables, which may suffer from the problem of heterogeneity and the problem of linear...
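For readers unfamiliar with the two reduction methods being compared, the following NumPy sketch computes PCA directions and classical SIR directions (slice the response, average the whitened predictors within each slice, and take the top eigenvectors of the between-slice covariance). It illustrates standard SIR and PCA only, not the proposed WSIR estimator, and the simulated data are an assumption.

```python
# Minimal NumPy sketch of PCA and sliced inverse regression (SIR).
import numpy as np

def pca_directions(X, k=2):
    """Top-k principal directions of the centered data matrix X (n x p)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:k].T                                  # p x k loading matrix

def sir_directions(X, y, k=2, n_slices=10):
    """Top-k SIR directions: eigenvectors of the between-slice covariance."""
    n, p = X.shape
    mu, cov = X.mean(axis=0), np.cov(X, rowvar=False)
    # Whiten X so slice means can be compared on a common scale.
    evals, evecs = np.linalg.eigh(cov)
    cov_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - mu) @ cov_inv_sqrt
    # Slice the response and average the whitened predictors within each slice.
    order = np.argsort(y)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    _, vecs = np.linalg.eigh(M)
    return cov_inv_sqrt @ vecs[:, ::-1][:, :k]       # back to original scale

# Simulated high-dimensional example
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))
y = X[:, 0] + 0.5 * X[:, 1] ** 2 + rng.normal(scale=0.1, size=500)
print(pca_directions(X).shape, sir_directions(X, y).shape)   # (20, 2) (20, 2)
```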

Publication Date: Wed Sep 01 2021
Journal Name: Baghdad Science Journal
The Vertical variations of Atmospheric Methane (CH4) concentrations over selected cities in Iraq based on AIRS data

The Atmospheric Infrared Sounder (AIRS) on the EOS/Aqua satellite provides diverse measurements of the distribution of methane (CH4) at different pressure levels in the Earth's atmosphere. This research analyzes the vertical variations of the CH4 volume mixing ratio (VMR) time series at four standard pressure levels (925, 850, 600, and 300 hPa) in the troposphere above six cities in Iraq from January 2003 to September 2016. The analysis of the monthly average CH4 VMR time series shows a significant increase between 2003 and 2016, especially from 2009 to 2016; the minimum values of CH4 occurred in 2003, while the maximum values occurred in 2016. The vertical distribution of CH4...
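A hedged sketch of the kind of processing described (monthly averaging of a VMR time series per pressure level plus a simple linear trend) is shown below. It uses synthetic values in place of real AIRS retrievals; the magnitudes, units, and column layout are purely illustrative.

```python
# Hypothetical sketch: monthly-mean CH4 VMR per pressure level and a linear trend.
import numpy as np
import pandas as pd

levels = [925, 850, 600, 300]                      # standard pressure levels (hPa)
dates = pd.date_range("2003-01-01", "2016-09-30", freq="D")
rng = np.random.default_rng(1)

# Synthetic daily VMR values (ppb) with an upward drift, one column per level.
daily = pd.DataFrame(
    {p: 1750 + 0.01 * np.arange(len(dates)) + rng.normal(0, 5, len(dates))
     for p in levels},
    index=dates,
)

monthly = daily.resample("MS").mean()              # monthly averages per level

# Linear trend (ppb per year) at each pressure level.
months = np.arange(len(monthly))
for p in levels:
    slope = np.polyfit(months, monthly[p], 1)[0] * 12
    print(f"{p} hPa: {slope:.2f} ppb/yr")
```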

Publication Date: Tue Oct 23 2018
Journal Name: Journal of Economics and Administrative Sciences
Processing of missing values in survey data using Principal Component Analysis and probabilistic Principal Component Analysis methods

The idea of carrying out research on incomplete data arose from the circumstances of our dear country and the horrors of war, which caused many important data, in all aspects of economic, natural, health, and scientific life, to go missing. The reasons for missingness vary: some are beyond the control of those concerned, while others are planned, for instance because of the cost or risk of inspection or the lack of means to carry it out. The missing data in this study were processed using Principal Component Analysis and self-organizing map methods, using simulation. Variables of child health and variables affecting children's health were taken into account: breastfeed...
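As an illustration of PCA-based treatment of missing values, the sketch below iteratively fills missing entries with a low-rank (truncated SVD) reconstruction. The rank, tolerance, and simulated data are assumptions; this is not the study's self-organizing-map procedure or its survey data.

```python
# Illustrative sketch of PCA (low-rank SVD) imputation of missing values.
import numpy as np

def pca_impute(X, rank=2, n_iter=100, tol=1e-6):
    """Iteratively fill NaNs with a rank-`rank` reconstruction of the data."""
    X = X.astype(float)
    missing = np.isnan(X)
    filled = np.where(missing, np.nanmean(X, axis=0), X)    # start from column means
    for _ in range(n_iter):
        mu = filled.mean(axis=0)
        U, s, Vt = np.linalg.svd(filled - mu, full_matrices=False)
        approx = (U[:, :rank] * s[:rank]) @ Vt[:rank] + mu   # low-rank reconstruction
        new_vals = np.where(missing, approx, X)              # keep observed entries
        delta = np.max(np.abs(new_vals - filled))
        filled = new_vals
        if delta < tol:
            break
    return filled

# Simulated data with roughly 15% of entries missing
rng = np.random.default_rng(2)
latent = rng.normal(size=(200, 2))
data = latent @ rng.normal(size=(2, 6)) + rng.normal(0, 0.1, size=(200, 6))
mask = rng.random(data.shape) < 0.15
completed = pca_impute(np.where(mask, np.nan, data), rank=2)
print("max abs error on imputed cells:", np.abs(completed - data)[mask].max())
```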

Publication Date: Tue Jan 01 2019
Journal Name: IEEE Access
Implementation of Univariate Paradigm for Streamflow Simulation Using Hybrid Data-Driven Model: Case Study in Tropical Region

Publication Date: Wed Dec 01 2021
Journal Name: Baghdad Science Journal
Advanced Intelligent Data Hiding Using Video Stego and Convolutional Neural Networks

Steganography is a technique of concealing secret data within other everyday files of the same or a different type. Hiding data has become essential to digital information security. This work aims to design a stego method that can effectively hide a message inside the images of a video file. A video steganography model is proposed in which a model is trained to hide a video (or images) within another video using convolutional neural networks (CNN). Using a CNN in this approach serves two main goals of any steganographic method. The first is increased security (it is harder for a steganalysis program to detect or break the hiding), which is achieved in this work because the weights and architecture are randomized. Thus,...
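A minimal PyTorch sketch of a hide/reveal CNN pair for image-in-image steganography is given below. The layer sizes, loss weighting, and frame shapes are illustrative assumptions, not the architecture trained in this work.

```python
# Minimal hide/reveal CNN pair for image-in-image steganography (PyTorch).
import torch
import torch.nn as nn

class HideNet(nn.Module):
    """Takes a cover frame and a secret frame, outputs a stego frame."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(6, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, cover, secret):
        return self.net(torch.cat([cover, secret], dim=1))

class RevealNet(nn.Module):
    """Recovers the secret frame from the stego frame alone."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, stego):
        return self.net(stego)

hide, reveal = HideNet(), RevealNet()
cover = torch.rand(1, 3, 64, 64)     # one video frame (batch, channels, H, W)
secret = torch.rand(1, 3, 64, 64)
stego = hide(cover, secret)
recovered = reveal(stego)
# Joint training objective: stego should look like the cover, and the
# recovered frame should match the secret.
loss = nn.functional.mse_loss(stego, cover) + nn.functional.mse_loss(recovered, secret)
loss.backward()
print(stego.shape, recovered.shape, float(loss))
```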

Publication Date: Sun Feb 10 2019
Journal Name: Journal of the College of Education for Women
IMPLEMENTATION OF THE SKIP LIST DATA STRUCTURE WITH IT'S UPDATE OPERATIONS

A skip list data structure is essentially a probabilistic simulation of a binary search tree. Skip list algorithms are simpler, faster, and use less space; conceptually, the data structure uses parallel sorted linked lists. Although searching in a skip list is more involved than searching in a regular sorted linked list, it is faster on average. Because a skip list is a two-dimensional data structure, it is implemented as a two-dimensional network of nodes, each with four pointers. The search, insert, and delete operations take expected time of up to O(log n). The skip list can be modified to implement the order-statistic operations RANK and SEARCH BY RANK while maintaining the same expected time. Keywords: skip list, parallel linked list, randomized algorithm, rank.
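A short Python sketch of a randomized skip list with search and insert is shown below. The node layout, the 0.5 promotion probability, and the MAX_LEVEL cap are assumptions rather than the implementation described in the paper.

```python
# Illustrative randomized skip list with search and insert.
import random

MAX_LEVEL = 8

class Node:
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * (level + 1)   # one forward pointer per level

class SkipList:
    def __init__(self):
        self.level = 0
        self.head = Node(None, MAX_LEVEL)

    def _random_level(self):
        lvl = 0
        while random.random() < 0.5 and lvl < MAX_LEVEL:
            lvl += 1
        return lvl

    def search(self, key):
        """Expected O(log n): drop down a level whenever the next key overshoots."""
        node = self.head
        for lvl in range(self.level, -1, -1):
            while node.forward[lvl] and node.forward[lvl].key < key:
                node = node.forward[lvl]
        node = node.forward[0]
        return node is not None and node.key == key

    def insert(self, key):
        update = [self.head] * (MAX_LEVEL + 1)   # last node visited at each level
        node = self.head
        for lvl in range(self.level, -1, -1):
            while node.forward[lvl] and node.forward[lvl].key < key:
                node = node.forward[lvl]
            update[lvl] = node
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        new = Node(key, lvl)
        for i in range(lvl + 1):
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new

sl = SkipList()
for k in (3, 1, 7, 5):
    sl.insert(k)
print(sl.search(5), sl.search(4))   # True False
```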

Publication Date: Mon Oct 09 2023
Journal Name: 2023 IEEE 34th International Symposium on Software Reliability Engineering Workshops (ISSREW)
Semantics-Based, Automated Preparation of Exploratory Data Analysis for Complex Systems

Publication Date: Tue Mar 01 2022
Journal Name: Asian Journal of Applied Sciences
Comparison between Expert Systems, Machine Learning, and Big Data: An Overview

Today, artificial intelligence has become one of the most important sciences for creating intelligent computer programs that simulate the human mind. The goal of artificial intelligence in the medical field is to assist doctors and health-care workers in diagnosing diseases and in clinical treatment, reducing the rate of medical error and saving lives. The main and widely used technologies are expert systems, machine learning, and big data. This article provides a brief overview of these three techniques to make it easier for readers to understand them and their importance.

Publication Date: Sat Mar 01 2008
Journal Name: Iraqi Journal of Physics
Comparison between Different Data Image Compression Techniques Applied on SAR Images

In this paper, an image compression technique based on the zonal transform method is presented. The DCT, Walsh, and Hadamard transform techniques are also implemented. These transforms are applied to SAR images using different block sizes, and the effects of implementing them are investigated. The main shortcoming of this radar imagery system is the presence of speckle noise, which affects the compression results.
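To illustrate zonal coding with a block transform, the sketch below applies an 8x8 block DCT, keeps only a low-frequency zone of coefficients, and inverts the transform. The block size, zone shape, and synthetic speckle-like image are assumptions, and the Walsh and Hadamard variants are not shown.

```python
# Illustrative zonal coding with a block DCT (SciPy/NumPy).
import numpy as np
from scipy.fft import dctn, idctn

BLOCK = 8          # assumed block size
KEEP = 3           # keep coefficients with row + col index < KEEP (the "zone")

def zonal_compress(image):
    """Block DCT, zero out high-frequency coefficients, inverse DCT."""
    h, w = (d - d % BLOCK for d in image.shape)      # crop to whole blocks
    image = image[:h, :w].astype(float)
    zone = np.add.outer(np.arange(BLOCK), np.arange(BLOCK)) < KEEP
    out = np.empty_like(image)
    for r in range(0, h, BLOCK):
        for c in range(0, w, BLOCK):
            coeffs = dctn(image[r:r+BLOCK, c:c+BLOCK], norm="ortho")
            out[r:r+BLOCK, c:c+BLOCK] = idctn(coeffs * zone, norm="ortho")
    return out

rng = np.random.default_rng(3)
sar_like = rng.gamma(shape=4.0, scale=32.0, size=(128, 128))  # speckle-like texture
reconstructed = zonal_compress(sar_like)
mse = np.mean((sar_like - reconstructed) ** 2)
kept_fraction = (np.add.outer(np.arange(BLOCK), np.arange(BLOCK)) < KEEP).mean()
print(f"kept {kept_fraction:.0%} of coefficients, MSE = {mse:.1f}")
```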
