An Embedded Data Using Slantlet Transform

Data hiding is the process of encoding extra information in an image by making small modifications to its pixels. To be practical, the hidden data must be perceptually invisible yet robust to common signal processing operations. This paper introduces a scheme for hiding a signature image that can be as much as 25% of the host image data, and hence can be used in digital watermarking as well as in image/data hiding. The proposed algorithm uses an orthogonal discrete wavelet transform with two zero moments and improved time localization, the discrete slantlet transform, for both the host and signature images. A scaling factor α in the frequency domain controls the quality of the watermarked images. Experimental results of signature image recovery after applying JPEG coding to the watermarked image are included.
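As a rough illustration of the additive wavelet-domain embedding the abstract describes, here is a minimal sketch using PyWavelets. The library has no slantlet filter bank, so 'db2' (a Daubechies wavelet with two vanishing moments) stands in for the slantlet transform; the function names and the non-blind recovery step are assumptions, not the paper's implementation.

# Sketch: additive wavelet-domain embedding of a signature image.
# 'db2' (two vanishing moments) stands in for the slantlet transform,
# which PyWavelets does not provide.
import numpy as np
import pywt

def embed(host, signature, alpha=0.1):
    """Add the signature's approximation band into the host's, scaled by alpha."""
    cA, details = pywt.dwt2(host.astype(float), 'db2')
    sA, _ = pywt.dwt2(signature.astype(float), 'db2')
    # The signature must be small enough to fit the host's approximation band.
    cA[:sA.shape[0], :sA.shape[1]] += alpha * sA
    return pywt.idwt2((cA, details), 'db2')

def recover(watermarked, host, band_shape, alpha=0.1):
    """Invert the embedding given the original host (non-blind recovery).
    band_shape is the shape of the signature's approximation band."""
    cA_w, _ = pywt.dwt2(watermarked.astype(float), 'db2')
    cA, _ = pywt.dwt2(host.astype(float), 'db2')
    rows, cols = band_shape
    return (cA_w[:rows, :cols] - cA[:rows, :cols]) / alpha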

Publication Date: Sun Dec 01 2013
Journal Name: Diyala Journal Of Engineering Sciences
Design and Simulation of Parallel CDMA System Based on 3D-Hadamard Transform

Future wireless systems aim to provide higher transmission data rates, improved spectral efficiency, and greater capacity. In this paper a spectrally efficient two-dimensional (2-D) parallel code division multiple access (CDMA) system is proposed for generating and transmitting 2-D CDMA symbols through a 2-D inter-symbol interference (ISI) channel to increase the transmission speed. The 3D-Hadamard matrix is used to generate the 2-D spreading codes required to spread each user's two-dimensional data row-wise and column-wise. Quadrature amplitude modulation (QAM) is used as the data mapping technique due to the increased spectral efficiency it offers. The new structure was simulated using MATLAB and a comparison of performance for ser…

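A minimal sketch of the row-and-column spreading idea, assuming a plain 2-D Hadamard spread S = H·D·Hᵀ; the paper derives its per-user codes from a 3D-Hadamard matrix, which is not reproduced here.

# Sketch of row-wise and column-wise Hadamard spreading for 2-D data.
import numpy as np
from scipy.linalg import hadamard

N = 8                                        # spreading factor (power of 2)
H = hadamard(N)                              # rows are orthogonal spreading codes
D = np.random.choice([-1, 1], size=(N, N))   # one user's 2-D symbol block

S = H @ D @ H.T                # spread row-wise and column-wise
D_hat = (H.T @ S @ H) / N**2   # despread; H.T @ H = N * I restores D

assert np.array_equal(D_hat.astype(int), D)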
Publication Date: Mon Sep 25 2017
Journal Name: Ibn Al-haitham Journal For Pure And Applied Sciences
Algorithm to Solve Linear Volterra Fractional Integro-Differential Equation via Elzaki Transform

In this work, the Elzaki transform (ET) introduced by Tarig Elzaki is applied to solve linear Volterra fractional integro-differential equations (LVFIDE). The fractional derivative is considered in the Riemann-Liouville sense. The procedure is based on applying the ET to the LVFIDE and using properties of the ET and its inverse. Finally, some examples are solved to show that the method is computationally efficient and accurate.
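For reference, and assuming the standard definitions rather than anything specific to this paper, the Elzaki transform and the convolution property that turns the Volterra integral term into an algebraic product are:

% Elzaki transform of f(t):
T(v) = E[f(t)](v) = v \int_{0}^{\infty} f(t)\, e^{-t/v}\, dt
% Convolution property, which maps \int_0^t k(t-s)\, u(s)\, ds to a product:
E[(f * g)(t)](v) = \frac{1}{v}\, T(v)\, G(v)
% Useful pair for polynomial terms:
E[t^{n}](v) = n!\, v^{\,n+2}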

Publication Date: Fri Jan 01 2021
Journal Name: International Journal Of Agricultural And Statistical Sciences
Dynamic Modeling for Discrete Survival Data by Using Artificial Neural Networks and Iteratively Weighted Kalman Filter Smoothing with Comparison

Survival analysis is widely applied to data describing the lifetime of an item until the occurrence of an event of interest, such as death or another event under study. The purpose of this paper is to use a dynamic approach within the deep learning neural network method, building a dynamic neural network that suits the nature of discrete survival data and time-varying effects. This neural network is trained with the Levenberg-Marquardt (L-M) algorithm, and the method is called the Proposed Dynamic Artificial Neural Network (PDANN). A comparison was then made with another method that depends entirely on Bayesian methodology, called the Maximum A Posteriori (MAP) method. This method was carried out using numerical algorithms re…

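The abstract does not spell out how discrete survival data feed a network; the sketch below shows the standard person-period expansion commonly used for discrete-time survival models. It is a generic setup under assumed data layouts, not the paper's PDANN, and the L-M training and Kalman-smoothing comparison are omitted.

# Sketch: person-period expansion for discrete-time survival data.
import numpy as np

def person_period(times, events, covariates):
    """Expand (time, event, x) triples into one row per subject per period;
    the binary target is 1 only in the period where the event occurs."""
    rows, targets = [], []
    for t, e, x in zip(times, events, covariates):
        for period in range(1, t + 1):
            # The period index enters as a feature, giving time-varying effects.
            rows.append(np.concatenate(([period], x)))
            targets.append(1 if (e and period == t) else 0)
    return np.array(rows), np.array(targets)

# Example: three subjects observed for 3, 2 and 4 periods.
X, y = person_period([3, 2, 4], [1, 0, 1], np.random.rand(3, 2))
# X, y can now feed any binary classifier whose output models the hazard h(t|x).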
Publication Date: Sat Oct 08 2022
Journal Name: ARO-The Scientific Journal of Koya University
Data Analytics and Techniques

Big data of different types, such as texts and images, are rapidly generated from the internet and other applications. Dealing with these data using traditional methods is not practical, since they come in various sizes and types and with different processing speed requirements. Data analytics has therefore become an important tool for big data applications, because only meaningful information is analyzed and extracted. This paper presents several innovative methods that use data analytics techniques to improve the analysis process and data management. Furthermore, this paper discusses how the revolution of data analytics based on artificial intelligence algorithms might provide…

Publication Date: Sun Dec 01 2013
Journal Name: 2013 Sixth International Conference on Developments in eSystems Engineering
Ensure Security of Compressed Data Transmission

Data compression offers an attractive approach to reducing communication costs by using the available bandwidth effectively, and it makes sense to pursue research on algorithms that use the available network most effectively. It is also important to consider security, since the data being transmitted are vulnerable to attacks. The basic aim of this work is to develop a module that combines compression and encryption on the same set of data, performing the two operations simultaneously. This is achieved by embedding encryption into the compression algorithm, since cryptographic ciphers and entropy coders bear a certain resemblance in the sense of secrecy. First, in the secure compression module, the given text is p…

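For orientation only, here is a minimal compress-then-encrypt pipeline in Python. The paper embeds encryption inside the entropy coder itself, which this sketch does not do; the toy SHA-256 keystream is an assumption for illustration and is not a secure cipher.

# Sketch: compression followed by encryption in one call. Order matters:
# ciphertext has high entropy, so encrypting first would defeat compression.
import hashlib, zlib

def keystream(key: bytes):
    """Toy keystream from iterated SHA-256 (illustration only, not secure)."""
    block = key
    while True:
        block = hashlib.sha256(block).digest()
        yield from block

def compress_encrypt(text: bytes, key: bytes) -> bytes:
    compressed = zlib.compress(text)
    return bytes(b ^ k for b, k in zip(compressed, keystream(key)))

def decrypt_decompress(blob: bytes, key: bytes) -> bytes:
    return zlib.decompress(bytes(b ^ k for b, k in zip(blob, keystream(key))))

msg = b"some text to protect" * 10
assert decrypt_decompress(compress_encrypt(msg, b"secret"), b"secret") == msg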
Publication Date: Mon Apr 01 2019
Journal Name: 2019 International Conference on Automation, Computational and Technology Management (ICACTM)
Multi-Resolution Hierarchical Structure for Efficient Data Aggregation and Mining of Big Data

Big data analysis is essential for modern applications in areas such as healthcare, assistive technology, intelligent transportation, and environment and climate monitoring. Traditional algorithms in data mining and machine learning do not scale well with data size. Mining and learning from big data require time- and memory-efficient techniques, albeit at the cost of a possible loss in accuracy. We have developed a data aggregation structure to summarize data with a large number of instances and data generated from multiple sources. Data are aggregated at multiple resolutions, and the resolution provides a trade-off between efficiency and accuracy. The structure is built once, updated incrementally, and serves as a common data input for multiple mining an…

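A minimal sketch of such a structure in one dimension, assuming simple count-and-sum statistics per bin; the paper's actual structure, update rules, and interface to the mining algorithms are not described in the abstract and are not reproduced here.

# Sketch: counts and sums kept at several grid resolutions. Coarse levels
# answer queries fast, fine levels answer them accurately, giving the
# efficiency/accuracy trade-off the abstract describes. Built once,
# updated incrementally.
import numpy as np

class MultiResAggregate:
    def __init__(self, lo, hi, levels=4, base_bins=8):
        self.lo, self.hi = lo, hi
        self.counts = [np.zeros(base_bins * 2**k) for k in range(levels)]
        self.sums   = [np.zeros(base_bins * 2**k) for k in range(levels)]

    def insert(self, x):
        """Incremental update: route one value into every resolution level."""
        frac = (x - self.lo) / (self.hi - self.lo)
        for c, s in zip(self.counts, self.sums):
            i = min(int(frac * len(c)), len(c) - 1)
            c[i] += 1
            s[i] += x

    def mean_in_bin(self, level, i):
        c = self.counts[level][i]
        return self.sums[level][i] / c if c else None

agg = MultiResAggregate(0.0, 1.0)
for x in np.random.rand(10_000):
    agg.insert(x)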
Publication Date: Thu Sep 28 2023
Journal Name: Iraqi Journal Of Computer, Communication, Control And System Engineering
A Comparative Study Between the Fragile and Robust Watermarking Techniques and Proposing New Fragile Watermarking with Embedded Value Technique

As the Internet has become more widely used and more people have access to multimedia content, copyright hacking and piracy have risen. Through watermarking techniques, security, asset protection, and authentication have all been made possible. In this paper, a comparison between fragile and robust watermarking techniques is presented so that recent studies can benefit from them in increasing the security of critical media. A new technique is suggested that adds an embedded value (129) to each pixel of the cover image and uses it as a key to thwart attackers, increase security, raise imperceptibility, and make the system faster at detecting tampering by unauthorized users. Using the two watermarking ty…

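The only concrete detail the abstract gives is the embedded value 129 added to each cover pixel as a key; the sketch below assumes a mod-256 addition and an exact-match tamper map, which are illustrative guesses rather than the paper's full technique.

# Sketch of the embedded-value idea: add 129 to every cover pixel (mod 256)
# as a keyed transform, then flag any pixel where the result no longer matches.
import numpy as np

EMBED_VALUE = 129  # the key value named in the abstract

def embed(cover: np.ndarray) -> np.ndarray:
    return (cover.astype(np.uint16) + EMBED_VALUE).astype(np.uint8)  # wraps mod 256

def detect_tamper(received: np.ndarray, original: np.ndarray) -> np.ndarray:
    """Boolean map: True wherever the received image differs from embed(original)."""
    return received != embed(original)

cover = np.random.randint(0, 256, (8, 8), dtype=np.uint8)
marked = embed(cover)
marked[2, 3] ^= 0x10                       # simulate an attack on one pixel
assert detect_tamper(marked, cover)[2, 3]  # fragile: the change is flagged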
Publication Date: Tue Oct 23 2018
Journal Name: Journal Of Economics And Administrative Sciences
Processing of missing values in survey data using Principal Component Analysis and probabilistic Principal Component Analysis methods

The idea of carrying out research on incomplete data came from the circumstances of our dear country and the horrors of war, which resulted in the loss of much important data across all aspects of economic, natural, health, and scientific life. The reasons for missingness differ: some are beyond the control of those concerned, while others are planned, owing to cost, risk, or a lack of inspection capability. The missing data in this study were processed using Principal Component Analysis and self-organizing map methods, using simulation. Variables of child health and variables affecting children's health were taken into account: breastfeed…

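As a sketch of PCA-based imputation in general, assuming the common iterative rank-k SVD loop rather than this paper's exact procedure (which also uses self-organizing maps), one might write:

# Sketch: iterative-PCA imputation. Missing entries start at column means,
# then are repeatedly re-estimated from a rank-k SVD reconstruction.
import numpy as np

def pca_impute(X, k=2, iters=50):
    X = X.copy()
    missing = np.isnan(X)
    col_means = np.nanmean(X, axis=0)
    X[missing] = np.take(col_means, np.where(missing)[1])   # initial fill
    for _ in range(iters):
        mu = X.mean(axis=0)
        U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
        X_hat = (U[:, :k] * s[:k]) @ Vt[:k] + mu             # rank-k reconstruction
        X[missing] = X_hat[missing]                          # update only the gaps
    return X

X = np.random.rand(100, 5)
X[np.random.rand(*X.shape) < 0.1] = np.nan   # knock out ~10% of the entries
X_filled = pca_impute(X)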
Publication Date: Wed Jan 01 2020
Journal Name: Technologies and Materials for Renewable Energy, Environment and Sustainability: TMREES20
Change detection of the land cover for three decades using remote sensing data and geographic information system
