IMPROVED STRUCTURE OF DATA ENCRYPTION STANDARD ALGORITHM

The Internet provides vital communications between millions of individuals and is increasingly used as a commerce tool; security is therefore essential for protecting communications and vital information. Cryptography algorithms are central to this field. Brute-force attacks are the major attacks on the Data Encryption Standard (DES), and this is the main reason an improved structure of the algorithm is needed. This paper proposes a new, improved structure for DES to make it more secure and resistant to attacks. The improved structure was built from standard DES with a new method of generating two keys: the key-generation system produces one simple key and a second key encrypted with an improved Caesar algorithm. The encryption algorithm uses simple key 1 in the first 8 rounds and encrypted key 2 from round 9 to round 16. The results show that the improved structure increases DES encryption security, performance, and the complexity of key search compared with standard DES, so differential cryptanalysis cannot be performed on the ciphertext.
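The abstract does not give the exact key schedule, but the two-key idea can be sketched as follows. This is an illustrative sketch only: the position-dependent shift in `improved_caesar` and the key strings are assumptions, not the authors' construction.

```python
# Illustrative sketch (not the authors' exact scheme): key 1 is used as-is,
# key 2 is key 1 passed through a position-dependent ("improved") Caesar
# shift; rounds 1-8 draw on key 1, rounds 9-16 on key 2.

def improved_caesar(text: str, base_shift: int = 3) -> str:
    """Shift each letter by base_shift plus its position (assumed variant)."""
    out = []
    for i, ch in enumerate(text):
        if ch.isalpha():
            a = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - a + base_shift + i) % 26 + a))
        else:
            out.append(ch)
    return ''.join(out)

def round_key_source(key1: str, round_no: int) -> str:
    """Select the key source for a given DES round, as the paper describes."""
    key2 = improved_caesar(key1)
    return key1 if round_no <= 8 else key2

key1 = "SECRETKY"
print(round_key_source(key1, 1))   # SECRETKY (simple key 1)
print(round_key_source(key1, 9))   # shifted variant (encrypted key 2)
```

An attacker who recovers the round keys of the last eight rounds would still face a second, differently derived key, which is the source of the claimed extra search complexity.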

Publication Date
Mon Apr 24 2017
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
Estimate the Parameters and Related Probability Functions for Data of the Patients of Lymph Glands Cancer via Birnbaum-Saunders Model

In this paper, we estimate the parameters and the related probability functions — the survival function, cumulative distribution function, hazard function (failure rate), and failure (death) probability density function (pdf) — for the two-parameter Birnbaum-Saunders distribution, which fits the complete data for patients with lymph gland cancer. The parameters (shape and scale) are estimated using maximum likelihood, regression quantile, and shrinkage methods, and the values of the mentioned probability functions are then computed from a sample of real data describing the survival duration of patients suffering from lymph gland cancer, measured from diagnosis of the disease or the patients' entry into hospital for a perio…
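The functions named in the abstract can be written out directly from the standard two-parameter Birnbaum-Saunders model. A minimal sketch (the parameter values below are illustrative, not the paper's estimates):

```python
# Two-parameter Birnbaum-Saunders distribution, shape alpha and scale beta:
# CDF F(t) = Phi(z), z = (sqrt(t/beta) - sqrt(beta/t)) / alpha,
# plus the survival, density, and hazard functions derived from it.
import math

def bs_cdf(t, alpha, beta):
    z = (math.sqrt(t / beta) - math.sqrt(beta / t)) / alpha
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def bs_pdf(t, alpha, beta):
    z = (math.sqrt(t / beta) - math.sqrt(beta / t)) / alpha
    dz = (1.0 / math.sqrt(t * beta) + math.sqrt(beta) / t**1.5) / (2.0 * alpha)
    return math.exp(-z * z / 2.0) / math.sqrt(2.0 * math.pi) * dz

def bs_survival(t, alpha, beta):
    return 1.0 - bs_cdf(t, alpha, beta)

def bs_hazard(t, alpha, beta):
    return bs_pdf(t, alpha, beta) / bs_survival(t, alpha, beta)

# At t = beta the standardized term z is 0, so F(beta) = 0.5 for any alpha.
print(bs_cdf(2.0, 0.5, 2.0))   # 0.5
```

The scale parameter beta is thus the median survival time, which is one reason this distribution is convenient for lifetime data.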

Publication Date
Sat May 31 2025
Journal Name
Iraqi Journal For Computers And Informatics
Discussion on techniques of data cleaning, user identification, and session identification phases of web usage mining from 2000 to 2022

The data preprocessing step is important in web usage mining because of the nature of log data, which are heterogeneous, unstructured, and noisy. To preserve the scalability and efficiency of pattern-discovery algorithms, a preprocessing step must be applied. In this study, the sequential methodologies used to preprocess web server log data are comprehensively evaluated and meticulously examined, with emphasis on the sub-phases of data cleansing, user identification, and session identification.
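Of the sub-phases surveyed, session identification is the most commonly heuristic one. A minimal sketch of the widespread timeout heuristic (the 30-minute cutoff is the customary default in this literature, not a value taken from this particular survey):

```python
# Session identification by timeout: requests from one user are split into
# sessions whenever the gap between consecutive timestamps exceeds a cutoff.
from datetime import datetime, timedelta

TIMEOUT = timedelta(minutes=30)

def sessionize(requests):
    """requests: iterable of (user_id, datetime), time-sorted per user."""
    sessions = {}            # user_id -> list of sessions (lists of timestamps)
    for user, ts in requests:
        user_sessions = sessions.setdefault(user, [])
        if user_sessions and ts - user_sessions[-1][-1] <= TIMEOUT:
            user_sessions[-1].append(ts)       # continue current session
        else:
            user_sessions.append([ts])         # start a new session
    return sessions

log = [
    ("10.0.0.1", datetime(2022, 1, 1, 9, 0)),
    ("10.0.0.1", datetime(2022, 1, 1, 9, 10)),
    ("10.0.0.1", datetime(2022, 1, 1, 11, 0)),   # > 30 min gap -> new session
]
print(len(sessionize(log)["10.0.0.1"]))   # 2
```

User identification would run before this step, typically combining IP address, user agent, and referrer fields to separate users behind shared addresses.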

Publication Date
Fri Dec 30 2022
Journal Name
Journal Of The College Of Education For Women
The Exploratory and Confirmatory Factorial Structure of Test-Wiseness Scale: A Field Study on a Sample of Students in Hama University

The current research aims to identify the exploratory and confirmatory factorial structure of the test-wiseness scale on a sample of Hama University students, using the descriptive method. The sample consists of (472) male and female students from the faculties of the University of Hama. Abu Hashem's 50-item test-wiseness scale (2008) has been used. The validity and reliability of the scale items have been verified, and six items have been deleted accordingly. The results of the first-degree exploratory factor analysis have shown five acceptable factors: exam preparation, test time management, question-paper handling, answer-sheet handling, and revision. Moreover, …

Publication Date
Tue Dec 01 2015
Journal Name
Journal Of Engineering
Digital Image Authentication Algorithm Based on Fragile Invisible Watermark and MD-5 Function in the DWT Domain

Watermarking techniques and digital signatures can help solve the problems of digital images transmitted over the Internet, such as forgery, tampering, and alteration. In this paper we propose an invisible fragile watermark and MD-5 based algorithm for authenticating digital images and detecting tampering in the Discrete Wavelet Transform (DWT) domain. The digital image is decomposed using a 2-level DWT, and the middle- and high-frequency sub-bands are used for embedding the watermark and digital signature. The authentication data are embedded in a number of the coefficients of these sub-bands according to an adaptive threshold based on the watermark length and the coefficients of each DWT level. These sub-bands are used because they a…
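The embedding pipeline can be sketched in simplified form. This is a hedged illustration, not the paper's algorithm: it uses a 1-level Haar transform on a 1-D signal and quantization-index modulation in place of the paper's 2-level 2-D DWT and adaptive threshold.

```python
# Simplified sketch: hash the data with MD5, take a 1-level Haar DWT, and
# embed hash bits into the detail (high-frequency) coefficients by
# quantization-index modulation (even step = bit 0, odd step = bit 1).
import hashlib

def haar_dwt(signal):
    avg = [(a + b) / 2 for a, b in zip(signal[::2], signal[1::2])]
    det = [(a - b) / 2 for a, b in zip(signal[::2], signal[1::2])]
    return avg, det

def embed_bits(coeffs, bits, q=4.0):
    out = list(coeffs)
    for i, bit in enumerate(bits[:len(out)]):
        k = round(out[i] / q)
        if k % 2 != bit:                  # force quantizer-step parity = bit
            k += 1
        out[i] = k * q
    return out

def extract_bits(coeffs, n, q=4.0):
    return [int(round(c / q)) % 2 for c in coeffs[:n]]

pixels = [52, 55, 61, 66, 70, 61, 64, 73]
digest = hashlib.md5(bytes(pixels)).digest()
bits = [(digest[0] >> i) & 1 for i in range(4)]    # a few hash bits
avg, det = haar_dwt(pixels)
det_marked = embed_bits(det, bits)
print(extract_bits(det_marked, 4) == bits)   # True
```

Because the watermark is fragile, any tampering changes the detail coefficients, the extracted bits no longer match the recomputed hash, and the tampered region is flagged.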

Publication Date
Thu Dec 01 2011
Journal Name
Journal Of Economics And Administrative Sciences
Dynamic Robust Bootstrap for LTS (DRBLTS) and Weighted Bootstrap with Probability (WBP) to Estimate Robust Regression Parameters Using the Bootstrap Technique (A Comparative Study)

Bootstrap is an important re-sampling technique that has recently received the attention of researchers. The presence of outliers in the original data set may cause serious problems for the classical bootstrap when the percentage of outliers is higher than in the original sample. Many methods have been proposed to overcome this problem, such as the Dynamic Robust Bootstrap for LTS (DRBLTS) and Weighted Bootstrap with Probability (WBP). This paper tries to show the accuracy of parameter estimation by comparing the results of both methods. The bias, MSE, and RMSE are considered. The criterion of accuracy is based on the RMSE value, since the method that provides a smaller RMSE value than the other is con…
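The comparison framework the abstract describes can be sketched generically. A minimal illustration (the sample mean stands in for the paper's regression estimators, and the data are made up):

```python
# Bootstrap scoring sketch: resample the data B times with replacement,
# estimate a parameter on each resample, and score the estimator by
# bias, MSE, and RMSE against a reference value.
import math
import random

def bootstrap_scores(data, true_value, B=1000, seed=1):
    rng = random.Random(seed)
    ests = [sum(rng.choices(data, k=len(data))) / len(data) for _ in range(B)]
    bias = sum(ests) / B - true_value
    mse = sum((e - true_value) ** 2 for e in ests) / B
    return bias, mse, math.sqrt(mse)

data = [4.1, 3.9, 4.3, 4.0, 4.2, 3.8, 4.4, 4.1]
bias, mse, rmse = bootstrap_scores(data, true_value=4.1)
# The method with the smaller RMSE is judged more accurate, as in the paper.
```

DRBLTS and WBP differ from this plain scheme in how resamples are drawn (robust weights or inclusion probabilities that downweight outliers), which is precisely what the paper compares.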

Publication Date
Fri May 17 2013
Journal Name
International Journal Of Computer Applications
Applied Minimized Matrix Size Algorithm on the Transformed Images by DCT and DWT used for Image Compression

Publication Date
Tue Jan 22 2019
Journal Name
Horticulturae
Variable Pulsed Irrigation Algorithm (VPIA) to Reduce Runoff Losses under a Low-Pressure Lateral Move Irrigation Machine

Due to restrictions and limitations on agricultural water worldwide, one of the most effective ways to conserve water in this sector is to reduce water losses and improve irrigation uniformity. Nowadays, low-pressure sprinklers are widely used to replace high-pressure impact sprinklers in lateral move sprinkler irrigation systems because of their low operating cost and high efficiency. However, the hazard of surface runoff is the biggest obstacle for low-pressure sprinkler systems. Most researchers have used the pulsing technique to apply variable-rate irrigation that matches crop water needs at an application rate that does not produce runoff. This research introduces a variable pulsed irrigation algorithm…
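The pulsing principle the abstract invokes can be sketched numerically. This is an illustration of the general duty-cycle idea, not the authors' VPIA; the function names, rates, and cycle length are assumptions.

```python
# Pulsing sketch: cycle the sprinkler on/off so the time-averaged
# application rate does not exceed the soil's infiltration capacity,
# which is what prevents surface runoff.

def pulse_duty_cycle(applied_rate_mm_h, max_infiltration_mm_h):
    """Fraction of each cycle the sprinkler must stay ON to avoid runoff."""
    return min(1.0, max_infiltration_mm_h / applied_rate_mm_h)

def on_off_times(cycle_min, applied_rate_mm_h, max_infiltration_mm_h):
    duty = pulse_duty_cycle(applied_rate_mm_h, max_infiltration_mm_h)
    return duty * cycle_min, (1.0 - duty) * cycle_min

# A nozzle applying 40 mm/h over soil that can only absorb 10 mm/h must be
# ON for a quarter of each cycle; with a 10-minute cycle:
on, off = on_off_times(10.0, applied_rate_mm_h=40.0, max_infiltration_mm_h=10.0)
print(on, off)   # 2.5 7.5
```

A *variable* pulsed algorithm such as VPIA would additionally vary the duty cycle along the lateral to deliver variable-rate irrigation while holding this no-runoff constraint.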

Publication Date
Sat Jan 12 2013
Journal Name
Pierb
RADAR SENSING FEATURING BICONICAL ANTENNA AND ENHANCED DELAY AND SUM ALGORITHM FOR EARLY-STAGE BREAST CANCER DETECTION

A biconical antenna has been developed for ultra-wideband sensing. A wide impedance bandwidth of around 115% over 3.73-14 GHz is achieved, showing that the proposed antenna is a fairly sensitive sensor for microwave medical imaging applications. The sensor and instrumentation are used together with an improved version of the delay-and-sum image reconstruction algorithm on both fatty and glandular breast phantoms. The relatively new imaging set-up provides robust reconstruction of complex permittivity profiles, especially in glandular phantoms, producing results that are well matched to the geometries and composition of the tissues. The signal-to-clutter and signal-to-mean ratios of the improved method are consis…
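The baseline delay-and-sum reconstruction can be sketched as follows. This is the textbook form, not the paper's improved variant; the geometry, sampling rate, and the simple free-space propagation speed are assumptions.

```python
# Delay-and-sum sketch: for each image point, time-align the backscattered
# signal at every antenna position by the round-trip delay and sum; echoes
# from a real scatterer add coherently, clutter does not.
import math

def delay_and_sum(signals, antennas, pixel, c=3e8, fs=50e9):
    """signals: per-antenna sample lists; antennas: (x, y) positions in m."""
    total = 0.0
    for sig, (ax, ay) in zip(signals, antennas):
        d = math.hypot(pixel[0] - ax, pixel[1] - ay)
        idx = int(round(2 * d / c * fs))        # round-trip delay in samples
        if idx < len(sig):
            total += sig[idx]
    return total ** 2                           # pixel intensity

# Synthetic check: place a point echo at the correct delay for one target.
n = 128
antennas = [(0.0, 0.0), (0.1, 0.0)]
target = (0.05, 0.05)
signals = []
for ax, ay in antennas:
    sig = [0.0] * n
    d = math.hypot(target[0] - ax, target[1] - ay)
    sig[int(round(2 * d / 3e8 * 50e9))] = 1.0   # echo from the target
    signals.append(sig)
print(delay_and_sum(signals, antennas, target))       # 4.0 (coherent sum)
print(delay_and_sum(signals, antennas, (0.2, 0.2)))   # 0.0
```

Improved variants typically add per-channel weighting or coherence factors before the sum, which is where the reported signal-to-clutter gains come from.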

Publication Date
Fri Jul 30 2021
Journal Name
Iraqi Journal For Electrical And Electronic Engineering
EEG Motor-Imagery BCI System Based on Maximum Overlap Discrete Wavelet Transform (MODWT) and Machine learning algorithm

The ability of the human brain to communicate with its environment has become a reality through Brain-Computer Interface (BCI)-based mechanisms. Electroencephalography (EEG) has gained popularity as a non-invasive means of brain connection. Traditionally, EEG devices were used in clinical settings to detect various brain diseases. As technology advances, however, companies such as Emotiv and NeuroSky are developing low-cost, easily portable, consumer-grade EEG devices that can be used in application domains such as gaming and education. This article discusses the areas in which EEG has been applied and how it has proven beneficial for those with severe motor disorders, in rehabilitation, and as a form of communication…
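The MODWT front end named in the title can be sketched at level 1. A hedged illustration: unlike the ordinary DWT, the MODWT does not downsample, so it yields one coefficient per sample at every level (the Haar filters and the toy signal are illustrative, not from the paper).

```python
# Level-1 Haar MODWT (no downsampling, circular filtering): the detail
# coefficients would then feed band-limited features to an ML classifier.

def modwt_haar_level1(x):
    n = len(x)
    approx = [(x[t] + x[t - 1]) / 2 for t in range(n)]   # scaling coeffs
    detail = [(x[t] - x[t - 1]) / 2 for t in range(n)]   # wavelet coeffs
    return approx, detail                                # x[-1] wraps around

eeg = [1.0, 3.0, 2.0, 6.0]
a, d = modwt_haar_level1(eeg)
print(d)   # [-2.5, 1.0, -0.5, 2.0]
```

With these filter gains the transform preserves signal energy (sum of squared approximation and detail coefficients equals the input energy), which makes per-band energy a natural motor-imagery feature.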

Publication Date
Mon Jul 11 2022
Journal Name
International Journal Of Online And Biomedical Engineering (ijoe)
Dynamic Background Subtraction in Video Surveillance Using Color-Histogram and Fuzzy C-Means Algorithm with Cosine Similarity

Background subtraction is a leading technique for detecting moving objects in video surveillance systems. Various background subtraction models have been applied to tackle different challenges in many surveillance environments. In this paper, we propose a model based on pixel-wise color histograms and Fuzzy C-Means (FCM) to obtain the background model, using cosine similarity (CS) to measure the closeness between the current pixel and the background model and thereby label each pixel as background or foreground according to a tuned threshold. The performance of this model is benchmarked on the CDnet2014 dynamic-scenes dataset using statistical metrics. The results show better performance against the state-of-the-art…
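The decision step described above can be sketched in isolation. A minimal illustration: the FCM clustering that builds the background model is omitted, and the histograms and the 0.9 threshold are made-up values, not the paper's tuned setting.

```python
# Cosine-similarity decision: compare a pixel's current color histogram with
# its background-model histogram; low similarity -> foreground.
import math

def cosine_similarity(h1, h2):
    dot = sum(a * b for a, b in zip(h1, h2))
    n1 = math.sqrt(sum(a * a for a in h1))
    n2 = math.sqrt(sum(b * b for b in h2))
    return dot / (n1 * n2) if n1 and n2 else 0.0

def is_foreground(current_hist, background_hist, threshold=0.9):
    return cosine_similarity(current_hist, background_hist) < threshold

background = [8, 2, 0, 0]      # model histogram for one pixel region
same_scene = [7, 3, 0, 0]      # small color drift -> still background
new_object = [0, 0, 9, 1]      # different colors -> foreground
print(is_foreground(same_scene, background))   # False
print(is_foreground(new_object, background))   # True
```

Because cosine similarity compares histogram shape rather than magnitude, it tolerates illumination-driven intensity changes better than a plain Euclidean distance, which helps on dynamic scenes.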
