Steganography and Cryptography Techniques Based Secure Data Transferring Through Public Network Channel

Attacks on data transferred over a network happen millions of times a day. To address this problem, a scheme is proposed to secure data transferred over a network. The proposed scheme uses two techniques to guarantee secure transfer of a message: the message is first encrypted and then hidden in a video cover. The encryption technique is the RC4 stream cipher algorithm, used to increase the message's confidentiality, while the least significant bit (LSB) embedding algorithm is improved by adding an additional layer of security. The improvement replaces the usual sequential selection of frames and pixels with a random selection driven by two secret random keys. As a result, the hidden message remains protected even if the stego-object is intercepted, because the attacker cannot determine which frames and pixels hold each bit of the secret message and therefore cannot rebuild it. The results indicate that the proposed scheme performs well on the evaluation metrics used for this purpose when compared with a large number of related previous methods.
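
A minimal sketch of the two-step idea, under stated assumptions: RC4 encrypts the message, and the LSB embedding then writes the cipher bits into frames and pixels chosen pseudo-randomly from two secret keys instead of sequentially. The frames-as-array representation, function names, and key handling are illustrative, not the authors' implementation.

```python
# Minimal sketch: RC4-encrypt the message, then embed its bits into LSBs of
# video frames chosen pseudo-randomly from two secret keys (illustrative only).
import random
import numpy as np

def rc4_keystream(key: bytes):
    """Standard RC4: key-scheduling (KSA) followed by the output loop (PRGA)."""
    S = list(range(256))
    j = 0
    for i in range(256):                      # KSA
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    i = j = 0
    while True:                               # PRGA
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        yield S[(S[i] + S[j]) % 256]

def rc4_encrypt(message: bytes, key: bytes) -> bytes:
    return bytes(m ^ k for m, k in zip(message, rc4_keystream(key)))

def embed(frames: np.ndarray, cipher: bytes, frame_key: int, pixel_key: int) -> np.ndarray:
    """Hide cipher bits in the LSB of randomly chosen (frame, pixel) positions."""
    n_frames, height, width = frames.shape[:3]
    bits = [(byte >> b) & 1 for byte in cipher for b in range(8)]
    # Two secret keys seed the frame and pixel selection, replacing sequential order.
    frame_rng, pixel_rng = random.Random(frame_key), random.Random(pixel_key)
    stego = frames.copy()
    used = set()
    for bit in bits:
        while True:                           # redraw until an unused position is found
            pos = (frame_rng.randrange(n_frames),
                   pixel_rng.randrange(height), pixel_rng.randrange(width))
            if pos not in used:
                used.add(pos)
                break
        f, y, x = pos
        stego[f, y, x, 0] = (stego[f, y, x, 0] & 0xFE) | bit   # overwrite blue-channel LSB
    return stego

# Example: 4 random RGB "frames" of 64x64 pixels stand in for the video cover.
cover = np.random.randint(0, 256, (4, 64, 64, 3), dtype=np.uint8)
cipher = rc4_encrypt(b"secret message", b"rc4-key")
stego = embed(cover, cipher, frame_key=1234, pixel_key=5678)
```

The receiver would regenerate the same frame and pixel sequence from the two keys (plus the message length), read back the LSBs, and decrypt with RC4, since RC4 decryption is the same XOR operation.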

Publication Date
Sat Jan 01 2022
Journal Name
Methods And Objects Of Chemical Analysis
Spectrophotometric Analysis of Quaternary Drug Mixtures using Artificial Neural network model

A novel artificial neural network (ANN) model was constructed to calibrate a multivariate model for the simultaneous quantitative analysis of a quaternary mixture composed of carbamazepine, carvedilol, diazepam, and furosemide. Eighty-four mixing formulas were prepared and analyzed spectrophotometrically. Each analyte was formulated in six samples at different concentrations, so twenty-four samples for the four analytes were tested. A neural network with 10 hidden neurons was able to fit the data 100%. The suggested model can be applied to the quantitative chemical analysis of the proposed quaternary mixture.
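
A minimal sketch of this kind of ANN calibration, assuming scikit-learn's MLPRegressor and synthetic spectra in place of the measured data: a single hidden layer of 10 neurons maps each absorbance spectrum to the four analyte concentrations.

```python
# Minimal sketch of an ANN calibration model with 10 hidden neurons mapping
# absorbance spectra to four analyte concentrations. The regressor choice and
# the synthetic data are illustrative assumptions, not the authors' setup.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_mixtures, n_wavelengths, n_analytes = 84, 200, 4

# Placeholder data: spectra built from random "pure" spectra via Beer's law
# (absorbance roughly linear in concentration), plus measurement noise.
concentrations = rng.uniform(1.0, 20.0, (n_mixtures, n_analytes))
pure_spectra = rng.uniform(0.0, 0.05, (n_analytes, n_wavelengths))
spectra = concentrations @ pure_spectra + rng.normal(0, 0.01, (n_mixtures, n_wavelengths))

model = MLPRegressor(hidden_layer_sizes=(10,), activation="tanh",
                     max_iter=5000, random_state=0)
model.fit(spectra, concentrations)

predicted = model.predict(spectra)
print("max absolute error:", np.abs(predicted - concentrations).max())
```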

Publication Date
Thu Dec 01 2022
Journal Name
IAES International Journal of Artificial Intelligence
Reduced hardware requirements of deep neural network for breast cancer diagnosis

Identifying breast cancer using artificial intelligence technologies is valuable and has a great influence on the early detection of disease; it can also save lives by giving patients a better chance of being treated in the earlier stages of cancer. During the last decade, deep neural networks (DNN) and machine learning (ML) systems have been widely adopted across medical centers due to their accurate identification and recognition of diseases, especially when trained on many datasets/samples. In this paper, a DNN with two hidden layers is proposed with a reduced number of additions and multiplications in each neuron. The number of bits and the binary point of the inputs and weights can be changed using the mask configuration…
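
A minimal sketch of the reduced-precision idea, under stated assumptions: inputs and weights are rounded to a configurable word length and binary-point position (a stand-in for the paper's mask configuration) before each layer's multiply-accumulate, so that narrower fixed-point arithmetic could replace full floating point.

```python
# Minimal sketch of reduced-precision neurons: quantize inputs and weights to a
# configurable word length and binary point before each multiply-accumulate.
# The quantization scheme and layer sizes are illustrative assumptions.
import numpy as np

def quantize(x: np.ndarray, total_bits: int, frac_bits: int) -> np.ndarray:
    """Round to a signed fixed-point grid with `frac_bits` fractional bits."""
    scale = 2 ** frac_bits
    lo = -(2 ** (total_bits - 1)) / scale
    hi = (2 ** (total_bits - 1) - 1) / scale
    return np.clip(np.round(x * scale) / scale, lo, hi)

def quantized_layer(x, w, b, total_bits=8, frac_bits=4):
    """One fully connected layer computed on quantized inputs and weights."""
    xq, wq = quantize(x, total_bits, frac_bits), quantize(w, total_bits, frac_bits)
    return np.maximum(0.0, xq @ wq + b)        # ReLU activation

# Two hidden layers on a placeholder 30-feature input (e.g. one tabular sample).
rng = np.random.default_rng(0)
x = rng.normal(size=(1, 30))
w1, b1 = rng.normal(scale=0.3, size=(30, 16)), np.zeros(16)
w2, b2 = rng.normal(scale=0.3, size=(16, 8)), np.zeros(8)
h = quantized_layer(quantized_layer(x, w1, b1), w2, b2)
print(h.shape)  # (1, 8)
```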

Publication Date
Thu Apr 04 2024
Journal Name
ChemChemTech
Analytical Techniques in Pharmaceutical Pollution of the World’s Rivers: A Review

Recent reports of new pollution issues caused by the presence of medications in the aquatic environment have sparked a great deal of interest in studies aimed at analyzing and mitigating the associated environmental risks, as well as the extent of this contamination. The main sources of pharmaceutical contaminants in natural lakes and rivers include clinic sewage, pharmaceutical production wastewater, and sewage from residences contaminated by drug users' excretions. In evaluating the health of rivers, pharmaceutical pollutants have been identified as one of the emerging classes of pollutants. Previous studies showed that the most widely occurring pharmaceutical contaminants are non-steroidal anti-inflammatory drugs, ant…

Publication Date
Thu Mar 09 2023
Journal Name
Coatings
Nondestructive Evaluation of Fiber-Reinforced Polymer Using Microwave Techniques: A Review

Carbon-fiber-reinforced polymer (CFRP) is widely acknowledged as a leading advanced structural material, offering superior properties compared to traditional materials, and has found diverse applications in several industrial sectors such as automobiles, aircraft, and power plants. However, the production of CFRP composites is prone to fabrication problems, leading to structural defects that arise from cycling and aging processes. Identifying these defects at an early stage is crucial to prevent service issues that could result in catastrophic failures; hence, routine inspection and maintenance are essential to prevent system collapse. To achieve this objective, conventional nondestructive testing (NDT) methods are utilized to i…

Publication Date
Sat Oct 01 2022
Journal Name
Baghdad Science Journal
Offline Signature Biometric Verification with Length Normalization using Convolution Neural Network

An offline handwritten signature is a type of behavioral biometric based on an image. Its main problem is verification accuracy, because an individual seldom signs the same signature twice; this is referred to as intra-user variability. This research aims to improve the recognition accuracy of offline signatures. The proposed method uses both signature length normalization and the histogram of oriented gradients (HOG) to improve accuracy. For verification, a deep-learning technique using a convolutional neural network (CNN) is exploited to build the reference model for future prediction. Experiments are conducted using 4,000 genuine as well as 2,000 skilled forged signatures…
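
A minimal sketch of a pipeline in this spirit, with an assumed image size, architecture, and way of combining the pieces: the signature image is length-normalized by resizing, HOG features are extracted with scikit-image, and a small PyTorch CNN scores the signature as genuine or forged.

```python
# Minimal sketch: length-normalize the signature image, extract HOG features,
# and score with a small CNN. Sizes, architecture, and the HOG/CNN fusion are
# illustrative assumptions, not the authors' exact method.
import numpy as np
import torch
import torch.nn as nn
from skimage.feature import hog
from skimage.transform import resize

def length_normalize(signature: np.ndarray, size=(64, 128)) -> np.ndarray:
    """Resize a grayscale signature so every sample has the same length/height."""
    return resize(signature, size, anti_aliasing=True)

class SignatureVerifier(nn.Module):
    """Tiny CNN over the normalized image, with HOG features joined before the head."""
    def __init__(self, hog_dim: int):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(4),
            nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(4),
        )
        self.head = nn.Linear(16 * 4 * 8 + hog_dim, 2)   # genuine vs forged scores

    def forward(self, image, hog_features):
        x = self.conv(image).flatten(1)
        return self.head(torch.cat([x, hog_features], dim=1))

# Placeholder: a random grayscale image stands in for a scanned signature.
raw = np.random.rand(150, 420)
norm = length_normalize(raw)
feats = hog(norm, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))

model = SignatureVerifier(hog_dim=feats.size)
image_t = torch.tensor(norm, dtype=torch.float32)[None, None]   # (1, 1, 64, 128)
feats_t = torch.tensor(feats, dtype=torch.float32)[None]
logits = model(image_t, feats_t)                                 # (1, 2) class scores
print(logits)
```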

Publication Date
Sun May 01 2011
Journal Name
Information Sciences
Design and implementation of a t-way test data generation strategy with automated execution tool support

Publication Date
Sat Jan 01 2011
Journal Name
International Journal Of Data Analysis Techniques And Strategies
A class of efficient and modified testimators for the mean of normal distribution using complete data

Publication Date
Wed Oct 17 2018
Journal Name
Journal Of Economics And Administrative Sciences
New Robust Estimation in Compound Exponential Weibull-Poisson Distribution for both contaminated and non-contaminated Data

Abstract

The research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (non-contaminated) data, while the second assumed contaminated data. Simulation experiments were conducted for different sample sizes, initial parameter values, and levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both clean and contaminated data.
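
A minimal sketch of the comparison workflow, with a two-parameter Weibull standing in for the four-parameter compound exponential Weibull-Poisson (whose density the abstract does not reproduce): the negative log-likelihood is minimized with SciPy's Nelder-Mead (Downhill Simplex) method on both a clean and a contaminated sample.

```python
# Minimal sketch: maximize a log-likelihood with the Downhill Simplex (Nelder-Mead)
# algorithm on clean and contaminated samples. The Weibull stand-in, contamination
# scheme, and sample sizes are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

rng = np.random.default_rng(0)
clean = weibull_min.rvs(c=1.5, scale=2.0, size=200, random_state=rng)
# Contaminated case: 10% of observations replaced by draws from a heavier-tailed source.
contaminated = clean.copy()
outliers = rng.choice(clean.size, size=20, replace=False)
contaminated[outliers] = weibull_min.rvs(c=1.5, scale=10.0, size=20, random_state=rng)

def neg_log_likelihood(params, data):
    shape, scale = params
    if shape <= 0 or scale <= 0:              # keep the simplex inside the valid region
        return np.inf
    return -np.sum(weibull_min.logpdf(data, c=shape, scale=scale))

for name, data in [("clean", clean), ("contaminated", contaminated)]:
    fit = minimize(neg_log_likelihood, x0=[1.0, 1.0], args=(data,), method="Nelder-Mead")
    print(name, "estimated shape/scale:", np.round(fit.x, 3))
```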

 

Publication Date
Mon Jan 01 2024
Journal Name
Journal Of Business, Communication & Technology
Exploring the Adoption of Big Data Analytics in the Oil and Gas Industry: A Case Study

The oil and gas industry relies heavily on IT innovations to manage business processes, but the exponential growth of generated data has led to concerns about processing big data, generating valuable insights, and making timely decisions. Many companies have adopted Big Data Analytics (BDA) solutions to address these challenges. However, deciding whether to adopt BDA solutions requires a thorough understanding of the contextual factors influencing these decisions. This research explores these factors using a new Technology-Organisation-Environment (TOE) framework, presenting technological, organisational, and environmental factors. The study used a Delphi research method and seven heterogeneous panelists from an Omani oil and gas company…
