Articles
Remote Data Auditing in a Cloud Computing Environment

In current information technology paradigms, cloud computing is an essential class of computing service. It meets the needs of high-volume customers, provides flexible computing capacity for applications such as database archiving and business analytics, and supplies the additional computing resources that create financial value for cloud providers. The purpose of this investigation is to assess the viability of performing data audits remotely in a cloud computing setting. The theory behind cloud computing and distributed storage systems is discussed, along with the method of remote data auditing, and the work addresses how to safeguard data that is outsourced and stored on cloud servers. Four techniques for remote data auditing in distributed cloud services are presented. Data audit methods face several difficulties, but these can be overcome with techniques such as the Boneh-Lynn-Shacham (BLS) signature or the automated blocker protocol. Further difficulties associated with distributed remote data auditing solutions are also discussed, and a variety of approaches that could be researched further to address these open problems are identified.
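To make the challenge-response idea behind remote data auditing concrete, the following is a minimal, hypothetical sketch: the data owner keeps per-block HMAC tags, challenges random block indices, and verifies the server's responses. It illustrates spot-check auditing in general and is not the BLS-signature scheme or blocker protocol discussed in the paper; block size, sample size, and key handling are assumptions.

import hmac, hashlib, os, random

BLOCK_SIZE = 4096

def split_blocks(data: bytes):
    return [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]

def make_tags(key: bytes, blocks):
    # Tag_i = HMAC(key, index || block_i); only the tags stay with the owner.
    return [hmac.new(key, str(i).encode() + b, hashlib.sha256).digest()
            for i, b in enumerate(blocks)]

def audit(key: bytes, tags, fetch_block, sample=10):
    # Challenge a random subset of block indices and verify each response.
    indices = random.sample(range(len(tags)), min(sample, len(tags)))
    for i in indices:
        block = fetch_block(i)                      # served by the cloud provider
        expected = hmac.new(key, str(i).encode() + block, hashlib.sha256).digest()
        if not hmac.compare_digest(expected, tags[i]):
            return False                            # block was lost or modified
    return True

if __name__ == "__main__":
    key = os.urandom(32)
    data = os.urandom(10 * BLOCK_SIZE)              # stand-in for outsourced data
    blocks = split_blocks(data)
    tags = make_tags(key, blocks)
    print(audit(key, tags, lambda i: blocks[i]))    # True while data is intact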

Publication Date: Fri Jan 01 2021
Journal Name: International Journal of Agricultural and Statistical Sciences
A novel SVR estimation of FIGARCH model and forecasting for white oil data in Iraq

The purpose of this paper is to model and forecast white oil over the period 2012-2019 using GARCH-class volatility models. After showing that the squared returns of white oil exhibit significant long memory in volatility, fractional GARCH models are fitted to the return series and the mean and volatility are forecast by quasi-maximum likelihood (QML) as the traditional method, while the competing approach uses machine learning in the form of Support Vector Regression (SVR). Results showed that the most appropriate model among the candidates for forecasting volatility is selected by the lowest Akaike information criterion and Schwarz information criterion, provided the parameters are significant. In addition, the residuals …
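As a rough illustration of the SVR side of the comparison, the sketch below regresses squared returns (a volatility proxy) on their own lags with scikit-learn's SVR and produces a one-step-ahead forecast. The lag order, kernel settings, and simulated return series are assumptions for illustration, not the paper's specification.

import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def svr_volatility_forecast(returns: np.ndarray, lags: int = 5):
    vol_proxy = returns ** 2                      # squared returns as volatility proxy
    X = np.column_stack([vol_proxy[i:len(vol_proxy) - lags + i] for i in range(lags)])
    y = vol_proxy[lags:]
    model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.001))
    model.fit(X, y)
    # one-step-ahead forecast from the most recent window of squared returns
    return model.predict(vol_proxy[-lags:].reshape(1, -1))[0]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    simulated_returns = rng.normal(0, 0.02, size=500)   # stand-in for white oil returns
    print(svr_volatility_forecast(simulated_returns))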
Publication Date: Sun Oct 01 2023
Journal Name: Bulletin of Electrical Engineering and Informatics
A novel data offloading scheme for QoS optimization in 5G based internet of medical things

The internet of medical things (IoMT) is expected to become one of the most widely deployed technologies worldwide. With 5th generation (5G) transmission, the market opportunities and hazards related to IoMT are both amplified and easier to detect. This framework describes a strategy for proactively addressing such concerns and for offering a forum to promote development, change attitudes, and maintain people's confidence in the broader healthcare system without compromising security. It is combined with a data offloading system to speed up the transmission of medical data and improve the quality of service (QoS). Building on this, we propose the enriched energy efficient fuzzy (EEEF) data offloading technique to enhance the delivery of data …
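To indicate what a fuzzy offloading decision can look like, here is a minimal, hypothetical sketch in the spirit of EEEF: triangular membership functions over battery level, link quality, and data urgency are aggregated into an offloading score. The membership breakpoints, rule weights, and threshold are illustrative assumptions, not the paper's design.

def tri(x, a, b, c):
    """Triangular membership function with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def offload_score(battery: float, link_quality: float, urgency: float) -> float:
    # Fuzzify each input on a 0..1 universe.
    low_battery  = tri(battery, -0.01, 0.0, 0.5)
    good_link    = tri(link_quality, 0.4, 1.0, 1.01)
    high_urgency = tri(urgency, 0.4, 1.0, 1.01)
    # Weighted rule aggregation: offload when battery is low,
    # the link is good, or the data is urgent.
    return min(1.0, 0.4 * low_battery + 0.35 * good_link + 0.25 * high_urgency)

def should_offload(battery, link_quality, urgency, threshold=0.5) -> bool:
    return offload_score(battery, link_quality, urgency) >= threshold

if __name__ == "__main__":
    print(should_offload(battery=0.15, link_quality=0.9, urgency=0.8))  # True
    print(should_offload(battery=0.95, link_quality=0.3, urgency=0.1))  # False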
Publication Date: Fri Jan 01 2016
Journal Name: Journal of Sensors
WDARS: A Weighted Data Aggregation Routing Strategy with Minimum Link Cost in Event-Driven WSNs

Realizing the full potential of wireless sensor networks (WSNs) raises many design issues, particularly the trade-offs among conflicting objectives such as maximizing route overlap for efficient data aggregation and minimizing total link cost. While data aggregation routing protocols and link cost functions in WSNs have been considered comprehensively in the literature, a trade-off between the two has not yet been addressed. In this paper, a comprehensive weight for trading off these objectives is employed in the so-called weighted data aggregation routing strategy (WDARS), which aims to maximize the overlapping routes for efficient data aggregation and link cost …
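The sketch below illustrates the weighted trade-off idea in a generic way: links already carrying an established aggregation route are discounted so new routes prefer to overlap them, while plain link cost is still minimized with a Dijkstra-style search. The weights alpha and beta and the toy graph are illustrative assumptions, not the WDARS cost function from the paper.

import heapq

def weighted_route(graph, src, sink, used_edges, alpha=1.0, beta=0.5):
    """graph: {node: {neighbor: link_cost}}; used_edges: set of (u, v) pairs."""
    def edge_cost(u, v, cost):
        overlap = (u, v) in used_edges or (v, u) in used_edges
        return max(0.0, alpha * cost - (beta * cost if overlap else 0.0))

    dist, prev, heap = {src: 0.0}, {}, [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == sink:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, cost in graph[u].items():
            nd = d + edge_cost(u, v, cost)
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    # reconstruct the path from sink back to src
    path, node = [sink], sink
    while node != src:
        node = prev[node]
        path.append(node)
    return list(reversed(path)), dist[sink]

if __name__ == "__main__":
    g = {"A": {"B": 2, "C": 2}, "B": {"D": 2}, "C": {"D": 2}, "D": {}}
    # the A-B-D corridor is already carrying an aggregation route
    print(weighted_route(g, "A", "D", used_edges={("A", "B"), ("B", "D")}))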
Publication Date: Tue Mar 08 2022
Journal Name: International Journal of Online and Biomedical Engineering (iJOE)
Data Hiding in 3D-Medical Image

Information hiding strategies have recently gained popularity in a variety of fields. Digital audio, video, and images are increasingly being labelled with distinct but undetectable marks that may contain a hidden copyright notice or serial number, or even directly help prevent unauthorized duplication. This approach is extended to medical images by hiding secret information in them using the structure of a different file format; the hidden information may relate to the patient. In this paper, a method for hiding secret information in DICOM images is proposed based on the Discrete Wavelet Transform (DWT). First, all slices of the 3D image are segmented into blocks of a specific size, and the host image is assembled according to a generated key …
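To show the DWT embedding step in miniature, here is a hypothetical sketch standing in for the proposed DICOM scheme: secret bits are written into the diagonal-detail coefficients of a single-level Haar DWT by quantizing their parity. Block segmentation, the key-driven host selection, and DICOM file handling are omitted, and the quantization step size is an assumption.

import numpy as np
import pywt

STEP = 8.0  # quantization step used for embedding

def embed_bits(host: np.ndarray, bits):
    cA, (cH, cV, cD) = pywt.dwt2(host.astype(float), "haar")
    flat = cD.flatten()
    for i, bit in enumerate(bits):
        q = np.round(flat[i] / STEP)
        if int(q) % 2 != bit:          # force coefficient parity to match the bit
            q += 1
        flat[i] = q * STEP
    cD = flat.reshape(cD.shape)
    return pywt.idwt2((cA, (cH, cV, cD)), "haar")

def extract_bits(stego: np.ndarray, n_bits):
    _, (_, _, cD) = pywt.dwt2(stego.astype(float), "haar")
    flat = cD.flatten()
    return [int(np.round(flat[i] / STEP)) % 2 for i in range(n_bits)]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    slice_ = rng.integers(0, 4096, size=(64, 64))   # stand-in for one CT/MR slice
    secret = [1, 0, 1, 1, 0, 0, 1, 0]
    stego = embed_bits(slice_, secret)
    print(extract_bits(stego, len(secret)) == secret)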
Publication Date: Tue Jan 01 2019
Journal Name: Journal of Communications
SDN Implementation in Data Center Network

Publication Date: Fri Jan 20 2023
Journal Name: Ibn Al-Haitham Journal for Pure and Applied Sciences
A Study of the Land Cover of Razzaza Lake during the Past 25 Years Using Remote Sensing Methods

In this study, the land cover of Razzaza Lake was examined over 25 years using remote sensing methods. Images from the Landsat 5 (TM) and Landsat 8 (OLI) satellites were used to identify the components of the land cover. The study covered the years 1995-2021 at five-year intervals; since the region is uninhabited, the land cover changes slowly. The land cover was divided into three main classes and seven subclasses and classified with the maximum likelihood classifier, using training sets collected to represent each class. The detected changes in land cover were assessed with 1995 as the reference year, and it was found that there was a significant reduction in the water mass …
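The following is a compact sketch of the maximum likelihood classification step used for such land-cover maps: each class is modelled as a multivariate Gaussian fitted to its training pixels, and every pixel is assigned to the class with the highest log-likelihood. The band count, class names, and synthetic training spectra are assumptions for illustration only.

import numpy as np

def fit_class_stats(training: dict):
    """training: {class_name: (n_pixels, n_bands) array of training spectra}."""
    stats = {}
    for name, samples in training.items():
        mean = samples.mean(axis=0)
        cov = np.cov(samples, rowvar=False) + 1e-6 * np.eye(samples.shape[1])
        stats[name] = (mean, np.linalg.inv(cov), np.linalg.slogdet(cov)[1])
    return stats

def classify(pixels: np.ndarray, stats):
    """pixels: (n_pixels, n_bands); returns the maximum likelihood label per pixel."""
    names = list(stats)
    scores = []
    for name in names:
        mean, inv_cov, logdet = stats[name]
        d = pixels - mean
        mahalanobis = np.einsum("ij,jk,ik->i", d, inv_cov, d)
        scores.append(-0.5 * (logdet + mahalanobis))   # log-likelihood up to a constant
    return [names[i] for i in np.argmax(np.stack(scores), axis=0)]

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    training = {
        "water":      rng.normal([0.05, 0.04, 0.02], 0.01, size=(50, 3)),
        "vegetation": rng.normal([0.08, 0.30, 0.45], 0.02, size=(50, 3)),
        "bare_soil":  rng.normal([0.25, 0.28, 0.30], 0.02, size=(50, 3)),
    }
    stats = fit_class_stats(training)
    print(classify(np.array([[0.05, 0.05, 0.02], [0.24, 0.27, 0.31]]), stats))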
Publication Date: Thu Aug 30 2018
Journal Name: Journal of Engineering
Monitoring Land Cover Change Using Remote Sensing and GIS Techniques: a Case Study of Al-Dalmaj Marsh, Iraq

The Al-Dalmaj marsh and its surrounding area are very promising for energy resources, tourism, and agricultural and industrial activities. Over the past century, the Al-Dalmaj marsh and its surroundings have undergone a number of changes. The current study highlights the spatial and temporal changes in land cover for the Al-Dalmaj marsh and its surroundings using several analysis methods: supervised maximum likelihood classification, the Normalized Difference Vegetation Index (NDVI), Geographic Information Systems (GIS), and Remote Sensing (RS). Spectral-index techniques were used in this study to determine the change in the wetland and dryland areas and in the other land classes …
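For reference, the NDVI mentioned above is computed as (NIR - Red) / (NIR + Red) per pixel; the short sketch below applies it to two reflectance bands. The band values and the vegetation threshold are illustrative assumptions, not figures from the study.

import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    nir = nir.astype(float)
    red = red.astype(float)
    denom = nir + red
    out = np.zeros_like(denom)
    np.divide(nir - red, denom, out=out, where=denom != 0)   # avoid division by zero
    return out

if __name__ == "__main__":
    nir = np.array([[0.40, 0.05], [0.35, 0.10]])   # stand-in NIR reflectance
    red = np.array([[0.10, 0.04], [0.08, 0.09]])   # stand-in red reflectance
    index = ndvi(nir, red)
    print(index)
    print(index > 0.3)   # a crude vegetation mask for illustration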
Publication Date: Sat Aug 21 2021
Journal Name: Electronics
An Efficient Distributed Elliptic Positioning for Underground Remote Sensing

Remote surveying of unknown bounded geometries, such as the mapping of underground water supplies and tunnels, remains a challenging task. Obstacles and absorption in the medium make long-distance telecommunication and localization inefficient given the power limitations of mobile sensors. This work develops a new short-range sequential localization approach to reduce the required signal transmission power. The algorithm is based on a sequential localization process that can utilize a multitude of randomly distributed wireless sensors while employing only a few anchors. Time-delay elliptic and frequency-range techniques are employed in developing the proposed algebraic closed-form solution.
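As a rough illustration of elliptic positioning, the sketch below treats each time-delay measurement as constraining the sum of distances from a transmitting anchor to the unknown node and back to a receiving anchor, then recovers the node position by nonlinear least squares. The anchor layout and noise level are assumptions, and the iterative solver stands in for, rather than reproduces, the paper's algebraic closed-form solution.

import numpy as np
from scipy.optimize import least_squares

def range_sums(node, tx_anchors, rx_anchors):
    return (np.linalg.norm(tx_anchors - node, axis=1)
            + np.linalg.norm(rx_anchors - node, axis=1))

def locate(tx_anchors, rx_anchors, measured_sums, guess=(0.0, 0.0)):
    residual = lambda p: range_sums(p, tx_anchors, rx_anchors) - measured_sums
    return least_squares(residual, x0=np.asarray(guess)).x

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    tx = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
    rx = np.array([[5.0, 0.0], [10.0, 5.0], [5.0, 10.0], [0.0, 5.0]])
    true_node = np.array([3.0, 7.0])
    sums = range_sums(true_node, tx, rx) + rng.normal(0, 0.01, size=4)
    print(locate(tx, rx, sums))    # close to [3.0, 7.0]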
Publication Date: Fri Aug 01 2025
Journal Name: Radio Science
Integrating Drones With Digital Twins for Aerial Remote Sensing
Drones are highly autonomous, remote-controlled platforms capable of performing a variety of tasks in diverse environments. A digital twin (DT) is a virtual replica of a physical system. Integrating a DT with a drone makes it possible to manipulate the drone during a mission. In this paper, the architecture of a DT is presented in order to explain how the physical environment can be represented. The techniques by which drones collect the necessary information for the DT are then compared, in order to introduce the main methods that have been applied to DT development using drones. The findings of this research indicate that incorporating DTs into drones will result in the advanc…
Publication Date: Mon Dec 20 2021
Journal Name: Baghdad Science Journal
Crucial File Selection Strategy (CFSS) for Enhanced Download Response Time in Cloud Replication Environments

Cloud computing is a mass platform that serves high volumes of data to many devices and technologies. Cloud tenants demand fast access to their data without any disruption, so cloud providers strive to keep every piece of data secure and always accessible. An appropriate replication strategy capable of selecting the essential data is therefore required in cloud replication environments. This paper proposes a Crucial File Selection Strategy (CFSS) to address poor response time in a cloud replication environment. The CloudSim simulator is used to conduct the necessary experiments, and results are presented as evidence of the improvement in replication performance. The obtained …
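To suggest what selecting "crucial" files might look like in code, here is a small, hypothetical sketch: files are scored from their recent access rate and size, and only the highest-scoring ones are queued for replication. The scoring weights and threshold are illustrative assumptions and not the CFSS criteria from the paper.

from dataclasses import dataclass

@dataclass
class CloudFile:
    name: str
    accesses_per_hour: float
    size_mb: float

def crucial_score(f: CloudFile, w_access: float = 0.7, w_size: float = 0.3) -> float:
    # Frequently accessed files score higher; very large files score slightly
    # lower because replicating them costs more bandwidth and storage.
    access_term = min(f.accesses_per_hour / 100.0, 1.0)
    size_term = 1.0 - min(f.size_mb / 1024.0, 1.0)
    return w_access * access_term + w_size * size_term

def select_for_replication(files, threshold: float = 0.5):
    return [f.name for f in files if crucial_score(f) >= threshold]

if __name__ == "__main__":
    catalogue = [
        CloudFile("invoices.db", accesses_per_hour=90, size_mb=200),
        CloudFile("archive.tar", accesses_per_hour=2, size_mb=900),
        CloudFile("index.json", accesses_per_hour=60, size_mb=5),
    ]
    print(select_for_replication(catalogue))   # frequently accessed files are chosen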