Biometrics is widely used in security systems nowadays; each biometric modality has distinctive properties that provide uniqueness and unpredictability for security systems, especially in communication and network technologies. This paper uses biometric features of the fingerprint, called minutiae, to encipher a text message and ensure the safe arrival of data at the receiver's end. Classical cryptosystems (Caesar, Vigenère, etc.) have become obsolete encryption methods because of high-performance machines whose attacks exploit the repetition of the key to break the cipher. Several researchers in cryptography have made efforts to modify and develop the Vigenère cipher by addressing its weaknesses. The proposed method uses a local feature of the fingerprint, represented by minutiae positions, to overcome the problem of the repeated key when encrypting and decrypting a text message, where the message is enciphered by a modified Vigenère method. Unlike the usual method, the key constructed from fingerprint minutiae depends on the instantaneous date and time of ciphertext generation. The Vigenère table consists of 95 elements: case-sensitive letters, numbers, symbols, and punctuation. Simulation results (with MATLAB 2021b) show that the original message cannot be reconstructed without the key, which is a function of the date and time of generation; 720 different keys can be generated per day, which means 1440 distinct ciphertexts can be obtained for the same message daily.
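A minimal Python sketch of the core idea: a Vigenère cipher over the 95 printable ASCII characters with a key stream derived from the generation timestamp. The key-mixing formula, and the use of timestamp digits in place of minutiae coordinates, are assumptions for illustration only, not the paper's actual construction.

```python
import string
from datetime import datetime

# 95 printable ASCII characters: case-sensitive letters, digits,
# symbols/punctuation, and the space character.
ALPHABET = string.ascii_letters + string.digits + string.punctuation + " "

def time_based_key(length: int, when: datetime) -> str:
    """Derive a key stream from the date and time of generation.

    The paper combines fingerprint minutiae positions with the timestamp;
    here the timestamp digits alone are a hypothetical stand-in, since
    the minutiae data are not given.
    """
    seed = when.strftime("%Y%m%d%H%M")  # one key per minute
    digits = [int(d) for d in seed]
    key = []
    i = 0
    while len(key) < length:
        # mix successive digits so the key stream does not simply repeat
        idx = (digits[i % len(digits)] * 7 + i * 13) % len(ALPHABET)
        key.append(ALPHABET[idx])
        i += 1
    return "".join(key)

def vigenere(text: str, key: str, decrypt: bool = False) -> str:
    """Shift each character by the key character's index (mod 95)."""
    sign = -1 if decrypt else 1
    out = []
    for ch, k in zip(text, key):
        p, s = ALPHABET.index(ch), ALPHABET.index(k)
        out.append(ALPHABET[(p + sign * s) % 95])
    return "".join(out)

msg = "Attack at dawn! 07:00"
when = datetime(2022, 5, 1, 9, 30)
key = time_based_key(len(msg), when)
cipher = vigenere(msg, key)
plain = vigenere(cipher, key, decrypt=True)
```

Because the key depends on the minute of generation, the same plaintext yields a different ciphertext at a different time.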
Modern systems based on hash functions are more suitable than conventional systems; however, the complicated algorithms used to generate invertible functions are highly time-consuming. With the use of genetic algorithms (GAs), the key strength is enhanced, which ultimately makes the entire algorithm sufficiently strong. Initially, key generation is performed using the result of the n-queen problem solved by a genetic algorithm, with the use of a random number generator and the application of GA operations. Finally, the data are encrypted using the Modified Reverse Encryption Algorithm (MREA). It was noticed that the …
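The n-queen key-generation step described above can be illustrated with a plain permutation-encoded genetic algorithm. The fitness function, GA operators, and the way a solved board is turned into key material are illustrative assumptions, not MREA's actual key schedule.

```python
import random

def attacks(board):
    """Count attacking queen pairs (permutation encoding: board[row] = column).

    Rows and columns are unique by construction, so only diagonals matter.
    """
    n = len(board)
    return sum(1 for i in range(n) for j in range(i + 1, n)
               if abs(board[i] - board[j]) == j - i)

def solve_n_queens_ga(n=8, pop_size=100, generations=500, seed=0):
    """Minimise attacking pairs with elitism, cut-and-fill crossover, swap mutation."""
    rng = random.Random(seed)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=attacks)
        if attacks(pop[0]) == 0:  # a valid placement found
            break
        next_pop = pop[:pop_size // 5]          # elitism
        while len(next_pop) < pop_size:
            a, b = rng.sample(pop[:pop_size // 2], 2)
            cut = rng.randrange(1, n)
            # cut-and-fill crossover keeps the child a valid permutation
            child = a[:cut] + [c for c in b if c not in a[:cut]]
            if rng.random() < 0.3:              # mutation: swap two columns
                i, j = rng.sample(range(n), 2)
                child[i], child[j] = child[j], child[i]
            next_pop.append(child)
        pop = next_pop
    return min(pop, key=attacks)

solution = solve_n_queens_ga(8)
# hypothetical: flatten the board into material for a key schedule
key_material = "".join(str(c) for c in solution)
```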
User confidentiality protection is a topic of concern in control and monitoring spaces. In images, securing users' faces is extremely significant, given the complexity of image information, potential misuse, distribution over global transmission media, and real-world exposure. To reduce the computational demands of processing large amounts of image data, and to reduce the time needed to process an image computationally, partial encryption of the user's face is chosen. This study focuses on a lightweight technique designed to partially encrypt the user's face. Primarily, dlib is used for user-face detection. SUSAN, one of the best edge detectors, with valuable localization characteristics and well-marked edges, is used to extract …
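The idea of encrypting only the detected face region can be sketched as a keystream XOR confined to a bounding box. The SHA-256-derived keystream and the externally supplied box (standing in for dlib detection and SUSAN edge selection) are assumptions for illustration.

```python
import hashlib

def xor_region(image, box, key):
    """Encrypt or decrypt only the pixels inside `box` = (top, left, height, width).

    `image` is a list of rows of 0-255 grey values. The keystream comes from
    iterated SHA-256 of `key`, and XOR is involutive, so the same call with
    the same key decrypts. Pixels outside the box are left untouched.
    """
    top, left, h, w = box
    stream = hashlib.sha256(key).digest()
    out = [row[:] for row in image]
    k = 0
    for r in range(top, top + h):
        for c in range(left, left + w):
            if k > 0 and k % len(stream) == 0:
                # extend the keystream by rehashing the previous block
                stream = hashlib.sha256(stream).digest()
            out[r][c] ^= stream[k % len(stream)]
            k += 1
    return out

img = [[(r * 7 + c) % 256 for c in range(8)] for r in range(8)]
enc = xor_region(img, (2, 2, 4, 4), b"session-key")   # hypothetical face box
dec = xor_region(enc, (2, 2, 4, 4), b"session-key")
```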
In the field of data security, the critical challenge of preserving sensitive information during its transmission through public channels takes centre stage. Steganography, a method of concealing data within carrier objects such as text, can be proposed to address these security challenges. Text, owing to its extensive usage and constrained bandwidth, stands out as an optimal medium for this purpose. Despite the richness of the Arabic language in its linguistic features, only a small number of studies have explored Arabic text steganography. Arabic text, characterized by its distinctive script and linguistic features, has gained notable attention as a promising domain for steganographic ventures. Arabic text steganography harn…
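One classical Arabic-text technique, kashida (tatweel, U+0640) insertion, can be sketched as follows. The carrier rule used here (every non-space character) is a simplifying assumption; a real scheme inserts kashida only after connectable letter forms, and this is not necessarily the scheme the paper proposes.

```python
KASHIDA = "\u0640"  # Arabic tatweel (elongation) character

def embed(cover: str, bits: str) -> str:
    """Encode bit 1 by inserting a kashida after a carrier character, bit 0 by none."""
    out, it = [], iter(bits)
    for ch in cover:
        out.append(ch)
        if ch != " " and ch != KASHIDA:
            if next(it, None) == "1":
                out.append(KASHIDA)
    return "".join(out)

def extract(stego: str, nbits: int) -> str:
    """Recover bits: a carrier followed by a kashida means 1, otherwise 0."""
    bits = []
    for i, ch in enumerate(stego):
        if len(bits) == nbits:
            break
        if ch != " " and ch != KASHIDA:
            nxt = stego[i + 1] if i + 1 < len(stego) else ""
            bits.append("1" if nxt == KASHIDA else "0")
    return "".join(bits)

cover = "السلام عليكم"
stego = embed(cover, "1011")
```

Kashida only elongates the glyphs visually, so the stego text reads the same; stripping all kashidas recovers the original cover.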
Predicting the network traffic of web pages is one of the areas that has received increased focus in recent years. Modeling traffic helps find strategies for distributing network loads, identifying user behaviors and malicious traffic, and predicting future trends. Many statistical and intelligent methods have been studied for predicting web traffic from network-traffic time series. In this paper, the use of machine learning algorithms to model Wikipedia traffic using Google's time series dataset is studied. Two data sets were used for the time series, data generalization, building a set of machine learning models (XGBoost, Logistic Regression, Linear Regression, and Random Forest), and comparing the performance of the models using SMAPE and …
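The SMAPE comparison metric, in its common formulation (an assumption, since several variants exist), can be computed as:

```python
def smape(actual, forecast):
    """Symmetric Mean Absolute Percentage Error, in percent (range 0..200).

    Each term divides the absolute error by the mean of the absolute values;
    a pair where both values are zero contributes zero error.
    """
    assert len(actual) == len(forecast)
    total = 0.0
    for a, f in zip(actual, forecast):
        denom = (abs(a) + abs(f)) / 2
        total += abs(f - a) / denom if denom else 0.0
    return 100 * total / len(actual)
```

A perfect forecast scores 0; lower is better, which makes SMAPE convenient for ranking the four models on the same series.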
Intrusion detection systems detect attacks inside computers and networks, where attacks must be detected quickly and at a high rate. Various proposed methods have achieved high detection rates, either by improving the algorithm or by hybridizing it with another algorithm. However, they suffer from long processing times, especially after the algorithm has been improved and when dealing with large traffic data. On the other hand, past research has successfully applied DNA-sequence detection approaches to intrusion detection systems; the achieved detection rates were very low, but the processing time was fast. Also, feature selection was used to reduce computation and complexity, leading to a speed-up of the system …
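A DNA-sequence approach typically begins by encoding traffic feature bytes over the nucleotide alphabet so that string-matching techniques apply. A minimal sketch with an assumed two-bits-per-base mapping (the actual encoding used in such systems may differ):

```python
# Assumed mapping of 2-bit groups to nucleotides.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {v: k for k, v in BITS_TO_BASE.items()}

def bytes_to_dna(data: bytes) -> str:
    """Encode each byte as four nucleotides (2 bits per base)."""
    bits = "".join(f"{b:08b}" for b in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def dna_to_bytes(seq: str) -> bytes:
    """Invert the encoding: four bases back into one byte."""
    bits = "".join(BASE_TO_BITS[b] for b in seq)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
```

Once features are in this form, attack signatures become subsequence searches over the DNA strings, which is what makes the approach fast.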
In this article, we define and study a family of modified Baskakov-type operators based on a parameter . This family is a generalization of the classical Baskakov sequence. First, we prove that it converges to the function being approximated. Then, we find a Voronovskaya-type formula and obtain that the order of approximation of this family is . This order is better than the order of the classical Baskakov sequence whenever . Finally, we apply our sequence to approximate two test functions and analyze the numerical results obtained.
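For reference, the classical Baskakov sequence that the modified family generalizes is defined, for suitable $f$ on $[0,\infty)$, by

```latex
V_n(f;x) \;=\; \sum_{k=0}^{\infty} \binom{n+k-1}{k}
\frac{x^{k}}{(1+x)^{n+k}}\, f\!\left(\frac{k}{n}\right),
\qquad x \ge 0 .
```

The modified family of the paper replaces parts of this construction according to its parameter; the display above is the standard classical definition, not the paper's modification.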
The primary objective of this paper is to improve a biometric authentication and classification model using the ear as a distinct part of the face, since it remains unchanged over time and is unaffected by facial expressions. The proposed model is a new scenario for enhancing ear-recognition accuracy by modifying the AdaBoost algorithm to optimize adaptive learning. To overcome the limitations of image illumination, occlusion, and problems of image registration, the Scale-Invariant Feature Transform (SIFT) technique was used to extract features. Several consecutive phases were used to improve classification accuracy: image acquisition, preprocessing, filtering, smoothing, and feature extraction. To assess the proposed …
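The adaptive weighting that the modified AdaBoost builds on can be illustrated with textbook AdaBoost over threshold stumps. Scalar features stand in for SIFT descriptors here, and this sketch is the classical algorithm, not the paper's proposed modification.

```python
import math

def train_adaboost(xs, ys, rounds=10):
    """Classical AdaBoost with threshold stumps on one scalar feature.

    ys are +/-1 labels. Each round picks the stump with the lowest weighted
    error, then increases the weight of misclassified samples so the next
    stump focuses on them (the 'adaptive learning' being optimized).
    """
    n = len(xs)
    w = [1.0 / n] * n                      # sample weights
    ensemble = []                          # (alpha, threshold, polarity)
    for _ in range(rounds):
        best = None
        for t in set(xs):
            for pol in (1, -1):            # stump: pol if x >= t else -pol
                err = sum(wi for wi, x, y in zip(w, xs, ys)
                          if (pol if x >= t else -pol) != y)
                if best is None or err < best[0]:
                    best = (err, t, pol)
        err, t, pol = best
        err = max(err, 1e-12)              # guard against zero error
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, t, pol))
        # re-weight: boost misclassified samples, then normalise
        w = [wi * math.exp(-alpha * y * (pol if x >= t else -pol))
             for wi, x, y in zip(w, xs, ys)]
        z = sum(w)
        w = [wi / z for wi in w]
    return ensemble

def predict(ensemble, x):
    """Weighted vote of the stumps."""
    s = sum(a * (pol if x >= t else -pol) for a, t, pol in ensemble)
    return 1 if s >= 0 else -1

ensemble = train_adaboost([1, 2, 3, 10, 11, 12], [-1, -1, -1, 1, 1, 1])
```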
Multiple applications use offline handwritten signatures for human verification. This fact increases the need for a computerized system for signature recognition and verification to ensure the highest possible level of security against counterfeit signatures. This research is devoted to developing a system for offline signature verification based on a combination of local ridge features and other features obtained by applying a two-level Haar wavelet transform. The proposed system involves many preprocessing steps that include a group of image-processing techniques (several enhancement techniques, region-of-interest allocation, conversion to a binary image, and thinning). In feature extraction and …
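One level of the 2-D Haar wavelet transform used for such feature extraction can be sketched as below (the averaging convention is an assumption; normalisations differ between implementations). The paper's two-level transform applies the same step again to the LL band.

```python
def haar2d(image):
    """One level of the 2-D Haar wavelet transform (average/difference form).

    Returns the four sub-bands LL (approximation) and LH, HL, HH (details).
    `image` is a list of rows with even dimensions.
    """
    rows, cols = len(image), len(image[0])
    # pass 1, along rows: pairwise average (low-pass) and difference (high-pass)
    low, high = [], []
    for row in image:
        low.append([(row[c] + row[c + 1]) / 2 for c in range(0, cols, 2)])
        high.append([(row[c] - row[c + 1]) / 2 for c in range(0, cols, 2)])

    # pass 2, along columns of each half
    def cols_pass(mat):
        lo = [[(mat[r][c] + mat[r + 1][c]) / 2 for c in range(len(mat[0]))]
              for r in range(0, rows, 2)]
        hi = [[(mat[r][c] - mat[r + 1][c]) / 2 for c in range(len(mat[0]))]
              for r in range(0, rows, 2)]
        return lo, hi

    LL, LH = cols_pass(low)
    HL, HH = cols_pass(high)
    return LL, LH, HL, HH
```

The detail bands (LH, HL, HH) capture stroke edges in different orientations, which is why wavelet coefficients pair well with local ridge features for signatures.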