Advanced Intelligent Data Hiding Using Video Stego and Convolutional Neural Networks

Steganography is the technique of concealing secret data within other everyday files of the same or a different type, and hiding data has become essential to digital information security. This work designs a stego method that can effectively hide a message inside the frames of a video file. A video steganography model is proposed in which a convolutional neural network (CNN) is trained to hide a video (or images) within another video. Using a CNN achieves the two main goals of any steganographic method. The first is increased security (difficulty of being detected or broken by a steganalysis program): because the weights and architecture are randomized, the exact way the network hides the information cannot be known to anyone who does not have the weights. The second is increased hiding capacity, achieved by using the CNN to decide which areas of the cover are redundant and can therefore carry more hidden data. In the proposed model, the hiding and revealing networks are trained concurrently and are designed to work as a pair. The model learns image patterns that help it decide which parts of the cover image are redundant, so more pixels are hidden there. The CNN is implemented with Keras on a TensorFlow backend, and about 45,000 random RGB images of size 256x256 from the ImageNet dataset are used for training, so the model can work on images from a wide range of sources. By exploiting redundant areas of an image, the quantity of hidden data is raised (improved capacity).
Furthermore, additional block shuffling is incorporated as an encryption step to improve security, and image enhancement methods are used to improve output quality. The results show that the proposed method achieves a high security level and a high embedding capacity, and that it performs well under visibility and attack tests, successfully deceiving both observers and steganalysis programs.
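The paired hide/reveal design described above can be sketched with the Keras functional API. The layer counts, filter sizes, frame size, and layer names below are illustrative assumptions for a minimal sketch, not the authors' architecture.

```python
# Illustrative sketch of a paired hide/reveal CNN (assumed architecture,
# not the paper's): the hide network maps (cover, secret) -> stego frame,
# the reveal network maps stego -> recovered secret; both trained jointly.
import numpy as np
from tensorflow.keras import layers, Model

H = W = 64  # small frames for the sketch; the paper uses 256x256

cover = layers.Input((H, W, 3), name="cover")
secret = layers.Input((H, W, 3), name="secret")

# Hide network: concatenate cover and secret, emit a stego frame.
x = layers.Concatenate()([cover, secret])
for filters in (32, 32):
    x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
stego = layers.Conv2D(3, 3, padding="same", activation="sigmoid",
                      name="stego")(x)

# Reveal network: recover the secret from the stego frame alone.
y = stego
for filters in (32, 32):
    y = layers.Conv2D(filters, 3, padding="same", activation="relu")(y)
revealed = layers.Conv2D(3, 3, padding="same", activation="sigmoid",
                         name="revealed")(y)

model = Model([cover, secret], [stego, revealed])
# Joint loss: stego should look like the cover, revealed should match secret.
model.compile(optimizer="adam", loss={"stego": "mse", "revealed": "mse"})

c = np.random.rand(2, H, W, 3).astype("float32")
s = np.random.rand(2, H, W, 3).astype("float32")
out_stego, out_revealed = model.predict([c, s], verbose=0)
print(out_stego.shape, out_revealed.shape)
```

Because the two losses are minimized together, the hide network learns to place the secret where it perturbs the cover least, which is the capacity mechanism the abstract describes.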

Scopus Clarivate Crossref
Publication Date
Sun Jan 01 2023
Journal Name
Journal Of Engineering
Intelligent Congestion Control of 5G Traffic in SDN using Dual-Spike Neural Network

Software Defined Networking (SDN) with centralized control provides a global view and achieves efficient management of network resources. However, centralized controllers have several limitations related to scalability and performance, especially with the exponential growth of 5G communication. This paper proposes a novel traffic-scheduling algorithm to avoid congestion in the control plane. The Packet-In messages received from different 5G devices are classified into two classes, critical and non-critical 5G communication, by adopting a Dual-Spike Neural Network (DSNN) classifier implemented on a Virtualized Network Function (VNF). Dual spikes identify each class to increase the reliability of the classification

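The DSNN itself is not expanded in the truncated abstract; as a generic illustration of how spiking neurons turn input intensity into a class decision (not the authors' Dual-Spike design), a leaky integrate-and-fire neuron can be simulated in a few lines.

```python
# Minimal leaky integrate-and-fire (LIF) neuron: a generic illustration of
# spike-based classification, not the paper's Dual-Spike Neural Network.
import numpy as np

def lif_spike_count(inputs, threshold=1.0, leak=0.9):
    """Count output spikes produced by a sequence of input currents."""
    v, spikes = 0.0, 0
    for i in inputs:
        v = leak * v + i          # leaky integration of the input current
        if v >= threshold:        # fire and reset when the threshold is hit
            spikes += 1
            v = 0.0
    return spikes

# A stronger input train (standing in for features of critical 5G traffic)
# drives the membrane over threshold repeatedly; a weak train never fires.
critical = lif_spike_count(np.full(50, 0.5))
non_critical = lif_spike_count(np.full(50, 0.05))
label = "critical" if critical > non_critical else "non-critical"
print(critical, non_critical, label)
```

A real spiking classifier would learn input weights so that each traffic class drives its own output neuron hardest; the spike count then plays the role of the class score.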
Crossref (2)
Publication Date
Fri Apr 01 2022
Journal Name
Baghdad Science Journal
Data Mining Techniques for Iraqi Biochemical Dataset Analysis

This research aims to analyze and simulate real biochemical test data to uncover the relationships among the tests and how each impacts the others. The data were acquired from a private Iraqi biochemical laboratory; however, they have many dimensions, a high rate of null values, and a large number of patients. Several experiments were applied to these data, beginning with unsupervised techniques such as hierarchical clustering and k-means, but the results were not clear. A preprocessing step was then performed to make the dataset analyzable by supervised techniques such as Linear Discriminant Analysis (LDA), Classification And Regression Tree (CART), Logistic Regression (LR), K-Nearest Neighbor (K-NN), Naïve Bayes (NB

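The supervised stage described above can be sketched with scikit-learn. Since the laboratory dataset is not public, synthetic stand-in data with injected null values is used here; the pipeline (impute, then compare LDA, CART, LR, K-NN, and NB by cross-validation) mirrors the workflow the abstract lists.

```python
# Sketch of the supervised comparison on synthetic stand-in data
# (the Iraqi laboratory dataset is not public): impute the null values,
# then score each classifier named in the abstract with cross-validation.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.impute import SimpleImputer
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, n_features=10, random_state=0)
rng = np.random.default_rng(0)
X[rng.random(X.shape) < 0.05] = np.nan   # mimic the high rate of null values

models = {
    "LDA": LinearDiscriminantAnalysis(),
    "CART": DecisionTreeClassifier(random_state=0),
    "LR": LogisticRegression(max_iter=1000),
    "K-NN": KNeighborsClassifier(),
    "NB": GaussianNB(),
}
scores = {}
for name, clf in models.items():
    # Mean imputation handles the nulls before each supervised model.
    pipe = make_pipeline(SimpleImputer(strategy="mean"), clf)
    scores[name] = cross_val_score(pipe, X, y, cv=5).mean()
    print(f"{name}: {scores[name]:.3f}")
```

Keeping the imputer inside the pipeline matters: imputing before the cross-validation split would leak test-fold statistics into training.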
Scopus (2)
Crossref (1)
Publication Date
Sun Sep 01 2019
Journal Name
Baghdad Science Journal
PWRR Algorithm for Video Streaming Process Using Fog Computing

Video streaming is the most popular medium used by people on the internet nowadays. Nevertheless, streaming video consumes much of the internet's traffic: video streaming accounts for nearly 70% of internet usage. Some constraints of interactive media, such as increased bandwidth usage and latency, might be removed. The need for real-time transmission of live video streams leads to employing fog computing, an intermediary layer between the cloud and the end user. This technology has been introduced to alleviate those problems by providing high real-time responsiveness and computational resources near to the

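The PWRR algorithm itself is not expanded in the truncated abstract. As a hedged illustration of the family it belongs to, a plain weighted round-robin dispatcher is sketched below; the queue names and weights are invented for the example and are not taken from the paper.

```python
# Generic weighted round-robin dispatcher: an illustrative basis for
# priority-aware stream scheduling (the paper's PWRR details are truncated).
from collections import deque

def weighted_round_robin(queues, weights, rounds):
    """Serve up to weights[name] packets from each queue per round."""
    served = []
    for _ in range(rounds):
        for name, q in queues.items():
            for _ in range(weights[name]):
                if q:
                    served.append(q.popleft())
    return served

queues = {
    "live": deque(f"L{i}" for i in range(6)),   # latency-sensitive streams
    "vod": deque(f"V{i}" for i in range(6)),    # buffered video-on-demand
}
# Live traffic gets twice the service share of buffered traffic per round.
order = weighted_round_robin(queues, {"live": 2, "vod": 1}, rounds=4)
print(order)
```

A priority-weighted variant would adjust the per-queue weights dynamically, which is the natural reading of "PWRR" but is not confirmed by the truncated text.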
Scopus (5)
Crossref (2)
Publication Date
Tue Jan 01 2019
Journal Name
Wireless Communications And Mobile Computing
Corrigendum to “Developing a Video Buffer Framework for Video Streaming in Cellular Networks”

Scopus Clarivate Crossref
Publication Date
Sun Dec 17 2017
Journal Name
Al-khwarizmi Engineering Journal
Experimental and Prediction Using Artificial Neural Network of Bed Porosity and Solid Holdup in Viscous 3-Phase Inverse Fluidization

In the present investigation, bed porosity and solid holdup in a viscous three-phase inverse fluidized bed (TPIFB) are determined for aqueous carboxymethyl cellulose (CMC) solutions, using polyethylene and polypropylene as low-density particles of 5 mm diameter in a vertical Perspex column of 9.2 cm inner diameter and 200 cm height. The effects of gas velocity Ug, liquid velocity UL, liquid viscosity μL, and particle density ρs on bed porosity BP and solid holdup εg were determined. The bed porosity increases with increasing gas velocity, liquid velocity, and liquid viscosity. Solid holdup decreases with increasing gas, liquid

Publication Date
Sun Apr 23 2017
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
Influence Activation Function in Approximate Periodic Functions Using Neural Networks

The aim of this paper is to design fast neural networks to approximate periodic functions, that is, to design fully connected networks containing links between all nodes in adjacent layers, which can speed up approximation times, reduce approximation failures, and increase the possibility of obtaining the globally optimal approximation. The suggested network is trained with the Levenberg-Marquardt algorithm and then accelerated by choosing the activation function (transfer function) with the fastest convergence rate for reasonable-size networks. In all algorithms, the gradient of the performance function (energy function) is used to determine how to

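The Levenberg-Marquardt training named above can be sketched with SciPy's LM least-squares solver applied to a one-hidden-layer tanh network fitted to a periodic target. The layer width, target function, and initialization are illustrative choices, not the paper's setup.

```python
# Sketch of approximating a periodic function with a one-hidden-layer tanh
# network trained by Levenberg-Marquardt (SciPy's method="lm" solver).
import numpy as np
from scipy.optimize import least_squares

x = np.linspace(-np.pi, np.pi, 200)
target = np.sin(x)                 # illustrative periodic target
hidden = 6                         # hidden units (assumed width)

def forward(p, x):
    """One hidden tanh layer; parameters flattened as (w1, b1, w2)."""
    w1, b1, w2 = p[:hidden], p[hidden:2 * hidden], p[2 * hidden:]
    return np.tanh(np.outer(x, w1) + b1) @ w2

def residuals(p):
    # LM minimizes the sum of squared residuals over all sample points.
    return forward(p, x) - target

rng = np.random.default_rng(0)
p0 = rng.normal(0, 0.5, 3 * hidden)
cost0 = 0.5 * np.sum(residuals(p0) ** 2)
fit = least_squares(residuals, p0, method="lm")  # Levenberg-Marquardt
print(f"cost: {cost0:.3f} -> {fit.cost:.5f}")
```

Note that `method="lm"` requires at least as many residuals as parameters (here 200 samples vs. 18 parameters), which is the same over-determined setting LM network training assumes.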
Publication Date
Sun Jan 01 2023
Journal Name
Petroleum And Coal
Analyzing of Production Data Using Combination of empirical Methods and Advanced Analytical Techniques

Scopus (1)
Publication Date
Tue Jan 03 2023
Journal Name
College Of Islamic Sciences
Ruling on Selling Big Data (An Authentic Fiqh Study)

Abstract:

Research topic: the ruling on selling big data.

Objectives: a statement of its nature, importance, sources, and ruling.

Methodology: inductive, comparative, and critical.

Among the most important results: it is impermissible to infringe upon big data, as it constitutes valuable property, and selling it is permissible as long as it contains no data of users who have not consented to its sale.

Recommendation: follow-up of studies dealing with the rulings on this issue.

Subject Terms

Ruling, Sale, Big Data, Sayings of the Jurists

Publication Date
Thu Jun 01 2023
Journal Name
Baghdad Science Journal
In vitro isolation and expansion of neural stem cells (NSCs)

Neural stem cells (NSCs) are progenitor cells with the ability to self-renew and the potential to differentiate into neurons, oligodendrocytes, and astrocytes. In vitro isolation, culturing, identification, and cryopreservation were investigated to produce neural stem cells in culture as reliable sources for further studies before their use in clinical trials. In this study, mouse bone marrow was the source of the neural stem cells. The morphological study and immunocytochemistry of the isolated cells showed that NSCs can be produced successfully, maintain their self-renewal, and form neurospheres over multiple passages. The spheres preserved their morphology in culture and cryopreserved t

Scopus Crossref
Publication Date
Tue Oct 23 2018
Journal Name
Journal Of Economics And Administrative Sciences
Use projection pursuit regression and neural network to overcome curse of dimensionality

Abstract

This research aims to overcome the problem of dimensionality by using non-linear regression methods that reduce the root mean square error (RMSE). The method used is projection pursuit regression (PPR), one of the dimension-reduction methods that address the curse of dimensionality. PPR is a statistical technique that finds the most important projections in multi-dimensional data; with each projection found, the data are reduced by linear combinations along the projection. The process is repeated, producing good projections until the best ones are obtained. The main idea of the PPR is to model

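A single PPR stage, as described above, can be sketched directly: search over projection directions, fit a one-dimensional smoother to the projected data, and keep the direction whose ridge fit explains the response best. The synthetic data and the cubic-polynomial smoother below are illustrative choices.

```python
# Single-stage sketch of projection pursuit regression: find the 1-D
# projection whose ridge-function fit minimizes the residual RMSE.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 2))
theta_true = 1.0                                  # known direction, in radians
w_true = np.array([np.cos(theta_true), np.sin(theta_true)])
# Response depends on the data only through one projection (a ridge function).
y = (X @ w_true) ** 2 + rng.normal(0, 0.05, 400)

def ridge_rmse(theta):
    """Project onto direction theta, smooth with a cubic, return fit RMSE."""
    z = X @ np.array([np.cos(theta), np.sin(theta)])
    fit = np.polyval(np.polyfit(z, y, 3), z)      # simple 1-D smoother
    return np.sqrt(np.mean((y - fit) ** 2))

# Directions are sign-symmetric, so a half-circle grid search suffices.
thetas = np.linspace(0, np.pi, 361)
errors = [ridge_rmse(t) for t in thetas]
best = thetas[int(np.argmin(errors))]
print(f"best direction: {best:.3f} rad (true: {theta_true} rad)")
```

Full PPR repeats this stage on the residuals, accumulating a sum of ridge functions; each stage reduces the multi-dimensional problem to a one-dimensional fit, which is how it sidesteps the curse of dimensionality.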
Crossref (1)