Ensure Security of Compressed Data Transmission

Data compression offers an attractive approach to reducing communication costs by using the available bandwidth effectively, so it makes sense to develop algorithms that use the available network capacity as efficiently as possible. It is equally important to consider security, since the transmitted data are vulnerable to attack. The basic aim of this work is to develop a module that combines compression and encryption on the same data set so that the two operations are performed simultaneously. This is achieved by embedding encryption into the compression algorithm, since cryptographic ciphers and entropy coders bear a certain resemblance with respect to secrecy. In the secure compression module, the given text is first preprocessed and transformed into an intermediate form that can be compressed with better efficiency and security. This addresses a shortcoming of common encryption methods, which generally manipulate an entire data set and therefore tend to make the transfer of information more costly in terms of time and sometimes bandwidth.
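As a point of reference for the conventional approach this work improves on, the sketch below shows a compress-then-encrypt pipeline in Python. It uses the standard-library zlib compressor and a toy SHA-256 keystream cipher purely for illustration; the key handling and cipher are hypothetical stand-ins, not the embedded encryption scheme developed in the paper, and a vetted cipher such as AES-GCM should be used in practice.

```python
import hashlib
import os
import zlib


def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream from SHA-256 in counter mode.

    Illustrative only -- use a vetted cipher (e.g. AES-GCM) in real systems.
    """
    out = bytearray()
    counter = 0
    while len(out) < length:
        out.extend(hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(out[:length])


def compress_then_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Conventional pipeline: entropy-code first, then encrypt the result."""
    compressed = zlib.compress(plaintext, level=9)
    nonce = os.urandom(16)
    cipher = bytes(c ^ k for c, k in zip(compressed, keystream(key, nonce, len(compressed))))
    return nonce + cipher


def decrypt_then_decompress(blob: bytes, key: bytes) -> bytes:
    nonce, cipher = blob[:16], blob[16:]
    compressed = bytes(c ^ k for c, k in zip(cipher, keystream(key, nonce, len(cipher))))
    return zlib.decompress(compressed)


if __name__ == "__main__":
    key = os.urandom(32)
    message = b"compress me, then keep me secret " * 10
    blob = compress_then_encrypt(message, key)
    assert decrypt_then_decompress(blob, key) == message
    print(f"{len(message)} bytes -> {len(blob)} bytes on the wire")
```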

Publication Date: 1 October 2010
Journal Name: 2010 IEEE Symposium on Industrial Electronics and Applications (ISIEA)
Distributed t-way test suite data generation using exhaustive search method with map and reduce framework

Publication Date: 8 February 2023
Journal Name: Iraqi Journal of Science
Subsurface 3D Prediction Porosity Model from Converted Seismic and Well Data Using Model Based Inversion Technique

A seismic inversion technique is applied to 3D seismic data to predict porosity for the carbonate Yamama Formation (Early Cretaceous) in an area of southern Iraq. A workflow is designed to guide the manual inversion procedure. The first step is a model-based inversion that converts the 3D seismic data into a 3D acoustic impedance volume using a low-frequency model and well data, with statistical control at each inversion stage. The 3D acoustic impedance volume, the seismic data, and the porosity well data are then trained with multi-attribute transforms to find the statistical attribute best suited to extrapolate the direct point measurements of porosity at the wells into a 3D porosity distribution…
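The multi-attribute step can be pictured as a regression from inversion-derived attributes to well porosity. The sketch below is a minimal NumPy illustration with synthetic stand-in data; the attribute set, array shapes, and the linear least-squares fit are assumptions made for illustration and do not reproduce the inversion workflow used in the study.

```python
import numpy as np

# Hypothetical well-location samples: acoustic impedance (AI) from inversion
# and measured porosity from logs at the same depth samples.
rng = np.random.default_rng(0)
ai_at_wells = rng.uniform(6.0e6, 9.0e6, size=200)          # impedance, (g/cc)*(m/s)
phi_at_wells = 0.35 - 2.5e-8 * ai_at_wells + rng.normal(0, 0.01, 200)

# Simple multi-attribute transform: regress porosity on a few AI-derived
# attributes (AI, 1/AI, log AI) by least squares.
def attributes(ai: np.ndarray) -> np.ndarray:
    return np.column_stack([np.ones_like(ai), ai, 1.0 / ai, np.log(ai)])

weights, *_ = np.linalg.lstsq(attributes(ai_at_wells), phi_at_wells, rcond=None)

# Apply the trained transform to the full impedance volume to get 3D porosity.
ai_volume = rng.uniform(6.0e6, 9.0e6, size=(50, 60, 40))    # stand-in inverted cube
phi_volume = attributes(ai_volume.ravel()) @ weights
phi_volume = phi_volume.reshape(ai_volume.shape)

print("predicted porosity range:", phi_volume.min(), phi_volume.max())
```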
Publication Date: 25 April 2018
Journal Name: Ibn Al-Haitham Journal for Pure and Applied Sciences
Using Approximation Non-Bayesian Computation with Fuzzy Data to Estimation Inverse Weibull Parameters and Reliability Function

In real situations, observations and measurements are rarely exact numbers; they are more or less imprecise, also called fuzzy. In this paper, we therefore use approximate non-Bayesian computational methods to estimate the inverse Weibull parameters and reliability function from fuzzy data. The maximum likelihood and moment estimators are obtained as non-Bayesian estimates. The maximum likelihood estimators are derived numerically using two iterative techniques, namely the Newton-Raphson and Expectation-Maximization methods. In addition, the estimates of the parameters and the reliability function are compared numerically through a Monte Carlo simulation study…
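For the crisp (non-fuzzy) building block, the maximum likelihood estimates of the inverse Weibull shape and scale can be obtained by numerically minimizing the negative log-likelihood. The sketch below uses SciPy's generic BFGS optimizer in place of the paper's hand-coded Newton-Raphson and EM iterations, and simulated rather than fuzzy data; it illustrates the numerical MLE idea only.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import invweibull

# Simulated crisp inverse Weibull data with shape beta and scale lam.
beta_true, lam_true = 2.0, 3.0
x = invweibull.rvs(c=beta_true, scale=lam_true, size=500, random_state=1)

def neg_log_lik(params: np.ndarray) -> float:
    """Negative log-likelihood of the inverse Weibull model."""
    log_beta, log_lam = params            # optimize on the log scale to keep both > 0
    beta, lam = np.exp(log_beta), np.exp(log_lam)
    z = x / lam
    return -np.sum(np.log(beta / lam) - (beta + 1.0) * np.log(z) - z ** (-beta))

res = minimize(neg_log_lik, x0=np.log([1.0, 1.0]), method="BFGS")
beta_hat, lam_hat = np.exp(res.x)

# Reliability function R(t) = 1 - F(t) = 1 - exp(-(t/lam)^(-beta)).
t = 2.5
r_hat = 1.0 - np.exp(-(t / lam_hat) ** (-beta_hat))
print(f"beta_hat={beta_hat:.3f}, lam_hat={lam_hat:.3f}, R({t})={r_hat:.3f}")
```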
Publication Date: 5 August 2016
Journal Name: Wireless Communications and Mobile Computing
A comparison study on node clustering techniques used in target tracking WSNs for efficient data aggregation

Wireless sensor applications are subject to energy constraints, and most of the energy is consumed in communication between nodes. Clustering and data aggregation are the two most widely used strategies for reducing energy usage and extending the lifetime of wireless sensor networks. Target tracking applications regularly produce a large amount of redundant data, so effective data aggregation schemes are vital for eliminating this redundancy. This work conducts a comparative study of research approaches that employ clustering techniques to aggregate data efficiently in target tracking applications, since the selection of an appropriate clustering algorithm may yield positive results in the data aggregation…
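The clustering-plus-aggregation idea can be illustrated in a few lines of Python: nodes join the nearest cluster head, each head averages its members' redundant readings, and only one packet per cluster reaches the sink. The node positions, readings, and chosen heads below are hypothetical, and the nearest-head rule is a simplification of the clustering algorithms surveyed.

```python
import numpy as np

# Hypothetical deployment: node positions, per-node target readings,
# and a handful of pre-selected cluster heads.
rng = np.random.default_rng(7)
positions = rng.uniform(0, 100, size=(60, 2))      # 60 nodes in a 100 m x 100 m field
readings = rng.normal(25.0, 2.0, size=60)          # redundant sensed values near the target
head_ids = np.array([5, 20, 41])                   # indices of cluster-head nodes

# Each node joins the nearest cluster head (simple distance-based clustering).
dists = np.linalg.norm(positions[:, None, :] - positions[head_ids][None, :, :], axis=2)
membership = dists.argmin(axis=1)

# In-cluster aggregation: every head forwards one averaged value to the sink,
# so the sink receives len(head_ids) packets instead of 60.
aggregated = {int(head_ids[k]): float(readings[membership == k].mean())
              for k in range(len(head_ids))}
print("packets to sink:", len(aggregated), "instead of", len(readings))
print("aggregated readings per head:", aggregated)
```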
Publication Date: 1 January 2022
Journal Name: The International Journal of Nonlinear Analysis and Applications
Developing Bulk Arrival Queuing Models with Constant Batch Policy Under Uncertainty Data Using (0-1) Variables

This paper examines some significant performance measures (PMs) of a bulk arrival queueing system with constant batch size b, where the arrival and service rates are fuzzy parameters. In a bulk arrival queueing system, customers arrive in groups of constant size before entering service individually, and generating function methods yield a new analysis tool. The corresponding traditional (crisp) bulk queueing model is more convenient to analyse under an uncertain environment. The α-cut approach is applied together with the conventional Zadeh's extension principle (ZEP) to transform the fuzzy queues with triangular membership functions (Mem. Fs) into a family of conventional bulk…
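A minimal sketch of the α-cut idea, assuming the textbook M^[b]/M/1 expression L = ρ/(1−ρ) + ρ(b−1)/(2(1−ρ)) with ρ = bλ/μ for the mean number in system (λ the batch arrival rate, μ the service rate, b the constant batch size) rather than the generating-function results derived in the paper. The triangular fuzzy rates are hypothetical, and the interval bounds exploit the monotonicity of L in λ and μ.

```python
def mean_number_in_system(lam: float, mu: float, b: int) -> float:
    """L for an M^[b]/M/1 queue with constant batch size b (textbook formula).

    rho = b*lam/mu must be < 1; lam is the batch arrival rate.
    """
    rho = b * lam / mu
    if rho >= 1.0:
        return float("inf")                     # unstable queue
    return rho / (1 - rho) + rho * (b - 1) / (2 * (1 - rho))

def alpha_cut(tri: tuple, alpha: float) -> tuple:
    """Interval of a triangular fuzzy number (l, m, u) at membership level alpha."""
    l, m, u = tri
    return l + alpha * (m - l), u - alpha * (u - m)

# Hypothetical triangular fuzzy rates (batches per hour / customers per hour).
lam_fuzzy = (1.0, 1.5, 2.0)
mu_fuzzy = (9.0, 10.0, 11.0)
b = 4

# L is increasing in lam and decreasing in mu, so the alpha-cut bounds of the
# fuzzy performance measure come from the matching interval endpoints (ZEP).
for alpha in (0.0, 0.5, 1.0):
    (lam_lo, lam_hi), (mu_lo, mu_hi) = alpha_cut(lam_fuzzy, alpha), alpha_cut(mu_fuzzy, alpha)
    L_lo = mean_number_in_system(lam_lo, mu_hi, b)
    L_hi = mean_number_in_system(lam_hi, mu_lo, b)
    print(f"alpha={alpha:.1f}: L in [{L_lo:.3f}, {L_hi:.3f}]")
```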
Publication Date: 20 June 2019
Journal Name: Baghdad Science Journal
An Optimised Method for Fetching and Transforming Survey Data based on SQL and R Programming Language

The development of information systems (IS) in recent years has produced various methods of gathering information to evaluate IS performance, the most common of which is the survey. This method, however, suffers from one major drawback: decision makers spend considerable time transferring data from survey sheets into analytical programs. This paper therefore proposes a method called 'survey algorithm based on the R programming language', or SABR, for transforming survey-sheet data inside the R environment by treating the arrangement of the data as a relational format. R and the relational data format provide an excellent opportunity to manage and analyse the accumulated data. Moreover, a survey system…
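The paper implements the transformation in R; the pandas sketch below only illustrates the underlying idea of reshaping a wide survey sheet into a long, relational (respondent, question, answer) layout that SQL and analytical tools can consume directly. The column names and sample answers are hypothetical.

```python
import pandas as pd

# Hypothetical wide-format survey sheet: one row per respondent,
# one column per questionnaire item.
sheet = pd.DataFrame({
    "respondent": [1, 2, 3],
    "Q1": [5, 4, 3],
    "Q2": [2, 5, 4],
    "Q3": [4, 4, 5],
})

# Reshape to a long, relational layout (respondent, question, answer),
# which is easier to load into SQL and to aggregate.
relational = sheet.melt(id_vars="respondent", var_name="question", value_name="answer")

# Example analysis on the relational form: mean score per question.
print(relational.groupby("question")["answer"].mean())
```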
Publication Date: 17 October 2018
Journal Name: Journal of Economics and Administrative Sciences
New Robust Estimation in Compound Exponential Weibull-Poisson Distribution for both contaminated and non-contaminated Data

This research compares two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the downhill simplex algorithm. Two data cases are considered; the first assumes the original (uncontaminated) data, while the second assumes data contamination. Simulation experiments were conducted for different sample sizes, different initial parameter values, and different levels of contamination. The downhill simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both clean and contaminated data.
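The downhill simplex (Nelder-Mead) estimation step can be sketched with SciPy. Because the four-parameter compound exponential Weibull-Poisson density is lengthy, the example below fits a two-parameter Weibull stand-in to partly contaminated data; the distribution, sample, and contamination level are assumptions chosen only to show how a derivative-free simplex search maximizes a likelihood.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

# Stand-in data: a two-parameter Weibull sample with a small fraction of
# contaminating outliers, loosely mimicking the paper's contaminated case.
rng = np.random.default_rng(3)
clean = weibull_min.rvs(c=1.8, scale=2.0, size=450, random_state=4)
outliers = rng.uniform(10, 20, size=50)
data = np.concatenate([clean, outliers])

def neg_log_lik(params: np.ndarray) -> float:
    """Negative Weibull log-likelihood, parameterized on the log scale."""
    shape, scale = np.exp(params)
    return -np.sum(weibull_min.logpdf(data, c=shape, scale=scale))

# Downhill simplex (Nelder-Mead) needs no derivatives, which is convenient
# when the likelihood surface is awkward or the density has many parameters.
res = minimize(neg_log_lik, x0=np.log([1.0, 1.0]), method="Nelder-Mead")
shape_hat, scale_hat = np.exp(res.x)
print(f"shape={shape_hat:.3f}, scale={scale_hat:.3f}, converged={res.success}")
```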
Publication Date: 1 October 2017
Journal Name: Journal of Economics and Administrative Sciences
Assessing Service Quality using Data Envelopment analysis Case study at the Iraqi Middle East Investment Bank

The data envelopment analysis (DEA) method helps organizations improve their performance by exploiting their resources efficiently in order to raise service quality. The study addresses the Iraqi Middle East Investment Bank's need to assess the performance of its branches according to the quality of service they provide. The importance of the study therefore lies in contributing a scientific and systematic approach by applying the DEA method to assess the service quality provided by the bank's branches. The study focused on determining the efficiency of the service quality provided by the branches in a manner that reflects the extent of utilization of…
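An input-oriented CCR data envelopment analysis model can be written as a small linear program per branch (decision-making unit, DMU). The sketch below solves the envelopment form with SciPy's linprog; the branch inputs and outputs are hypothetical placeholders, not the bank's data, and the study's choice of service-quality inputs and outputs may differ.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical branch data: rows are branches (DMUs).
# Inputs: staff count, operating cost; outputs: customers served, satisfaction score.
inputs = np.array([[8., 120.], [6., 90.], [10., 150.], [7., 100.], [9., 130.]])
outputs = np.array([[300., 4.1], [280., 4.4], [310., 3.9], [290., 4.2], [320., 4.0]])
n_dmu = inputs.shape[0]

def ccr_efficiency(o: int) -> float:
    """Input-oriented CCR efficiency of DMU o via the envelopment LP."""
    # Decision variables: [theta, lambda_1, ..., lambda_n]; minimize theta.
    c = np.zeros(n_dmu + 1)
    c[0] = 1.0
    # Input constraints: sum_j lambda_j * x_ij - theta * x_io <= 0.
    a_in = np.hstack([-inputs[[o]].T, inputs.T])
    b_in = np.zeros(inputs.shape[1])
    # Output constraints: -sum_j lambda_j * y_rj <= -y_ro.
    a_out = np.hstack([np.zeros((outputs.shape[1], 1)), -outputs.T])
    b_out = -outputs[o]
    res = linprog(c, A_ub=np.vstack([a_in, a_out]), b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(None, None)] + [(0, None)] * n_dmu, method="highs")
    return res.x[0]

for o in range(n_dmu):
    print(f"branch {o}: efficiency = {ccr_efficiency(o):.3f}")
```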
Publication Date: 14 April 2023
Journal Name: Journal of Big Data
A survey on deep learning tools dealing with data scarcity: definitions, challenges, solutions, tips, and applications
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance. Unfortunately, many applications have small or inadequate data to train DL frameworks. Usually, manual labeling is needed to provide labeled data, which typically involves human annotators with a vast background of knowledge. This annotation process is costly, time-consuming, and error-prone. Usually, every DL framework is fed a significant amount of labeled data to automatically learn representations. Ultimately, a larger amount of data would generate a better DL model, and its performance is also application dependent. This issue is the main barrier for…
Publication Date: 21 March 2023
Journal Name: International Journal of Interactive Mobile Technologies (iJIM)
Study the Effect of Using Google Classroom on the Academic Performance Under the Covid19 Pandemic Using Data Mining Technique

In light of the pandemic that has swept the world, the use of e-learning in educational institutions has become an urgent necessity for maintaining knowledge communication with students. Educational institutions can benefit from the free tools that Google provides, among them Google Classroom, which is characterized by its ease of use; however, the efficiency of using Google Classroom is affected by several variables not clearly examined in previous studies. This study therefore aimed to investigate the use of Google Classroom as an e-learning management system and the factors affecting the performance of students and lecturers. The data for this study were collected from 219 faculty members and students at the College of Administration…