Merging biometrics with cryptography has become increasingly common, and a rich scientific field has grown around it. Biometrics add a distinctive property to security systems because biometric features are unique to each individual. In this study, a new method is presented for enciphering data based on fingerprint features. The plaintext message is addressed into a generated random text file at the positions of minutiae extracted from a fingerprint, regardless of the size of the data. The proposed method can be explained in three scenarios. In the first scenario, the message is placed directly inside the random text at the minutiae positions; in the second, the message is encrypted with a chosen keyword before being hidden inside the random text; in the third, the encryption process ensures correct restoration of the original message. Experimental results show that the proposed cryptosystem is effective and secure, since an attacker would have to try a huge number of fingerprints to attempt message extraction, and every fingerprint but the correct one yields a result that does not represent the original plaintext. The method also ensures that any deliberate tampering or accidental damage is detected, because the proper message can no longer be extracted, even when the correct fingerprint is used.
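The position-based embedding idea can be sketched as follows. This is a minimal illustration only: real minutiae coordinates would come from a fingerprint feature extractor, whereas here the positions are hard-coded, and the cover-text generation is a stand-in for the paper's random text file.

```python
import random
import string

# Illustrative minutiae-derived positions (in a real system these would
# be produced by a fingerprint minutiae extractor).
minutiae_positions = [7, 19, 34, 52, 61, 80, 95, 110, 128, 140]

def embed(message, positions, cover_len=200, seed=42):
    """Place the message characters into random cover text at the given positions."""
    rng = random.Random(seed)
    cover = [rng.choice(string.ascii_lowercase) for _ in range(cover_len)]
    for ch, pos in zip(message, positions):
        cover[pos] = ch
    return "".join(cover)

def extract(stego, positions, length):
    """Recover the message by reading the same positions back."""
    return "".join(stego[p] for p in positions[:length])

stego = embed("secretmsg", minutiae_positions)
print(extract(stego, minutiae_positions, 9))  # -> secretmsg
```

Only a party holding the same fingerprint (and hence the same positions) can read the message; any other position set returns characters of the random cover instead.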
This paper proposes a hybrid of two powerful methods, the differential transform method and the finite difference method, to obtain the solution of the coupled Whitham-Broer-Kaup-Like equations, which arise in shallow-water wave theory. The applicability of the method to such problems is verified using different parameters and initial conditions. The numerical simulations are depicted in 2D and 3D graphs. It is shown that the approach returns accurate solutions for this type of problem in comparison with the analytic ones.
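The finite-difference half of such a hybrid scheme can be illustrated with an explicit time step. This sketch deliberately uses the 1-D heat equation u_t = u_xx rather than the coupled Whitham-Broer-Kaup-Like system itself; the grid sizes and step sizes are illustrative choices, not values from the paper.

```python
import numpy as np

# Explicit finite-difference update for u_t = u_xx on [0, 1] with
# u(0,t) = u(1,t) = 0 and initial condition u(x,0) = sin(pi x).
nx, nt = 21, 200
dx, dt = 1.0 / (nx - 1), 1e-4
x = np.linspace(0.0, 1.0, nx)
u = np.sin(np.pi * x)

r = dt / dx**2                    # explicit scheme is stable for r <= 0.5
for _ in range(nt):
    u[1:-1] = u[1:-1] + r * (u[2:] - 2 * u[1:-1] + u[:-2])

# Exact solution decays as exp(-pi^2 t); compare at t = nt * dt.
exact = np.sin(np.pi * x) * np.exp(-np.pi**2 * nt * dt)
print(float(np.max(np.abs(u - exact))))
```

The same stencil-update pattern, applied to each unknown of the coupled system, is what the finite-difference component of the hybrid contributes.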
Abstract
The non-homogeneous Poisson process is one of the statistical subjects of importance to other sciences, with wide application in areas such as queueing and repairable systems, computer and communication systems, and reliability theory, among many others. It is also used to model phenomena that occur in a non-constant way over time, i.e., events whose rate changes with time.
This research deals with some of the basic concepts related to the non-homogeneous Poisson process. It carries out two models of the non-homogeneous Poisson process, the power-law model and the Musa-Okumoto model, to estimate th
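The power-law model mentioned above can be simulated with a standard thinning scheme. This is a hedged sketch: the intensity form is the usual power-law (Crow/AMSAA) one, and the parameter values are illustrative, not estimates from the research.

```python
import random

# Simulate a non-homogeneous Poisson process with power-law intensity
#   lambda(t) = (beta / theta) * (t / theta) ** (beta - 1)
# by thinning against the maximum intensity on (0, t_end].
def simulate_nhpp_power_law(beta, theta, t_end, seed=0):
    assert beta >= 1, "this thinning bound assumes a non-decreasing intensity"
    rng = random.Random(seed)

    def intensity(t):
        return (beta / theta) * (t / theta) ** (beta - 1)

    lam_max = intensity(t_end)          # upper bound used for thinning
    t, events = 0.0, []
    while True:
        t += rng.expovariate(lam_max)   # candidate inter-arrival time
        if t > t_end:
            return events
        if rng.random() <= intensity(t) / lam_max:
            events.append(t)            # accept with prob lambda(t)/lam_max

events = simulate_nhpp_power_law(beta=2.0, theta=1.0, t_end=5.0)
print(len(events))  # mean event count is (t_end / theta) ** beta = 25
```

Because the intensity grows with t for beta > 1, the simulated event times cluster toward the end of the interval, which is exactly the "rate changing over time" behaviour the process models.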
A simple, easy, and accurate method for the estimation of ciprofloxacin in the presence of cephalexin, or vice versa, in a mixture of the two. The proposed method, applied with the point standard addition technique, successfully estimated ciprofloxacin in the presence of cephalexin as an interferent at the wavelengths 240-272.3 nm and at different ciprofloxacin concentrations of 4-18 µg.mL-1, as well as cephalexin in the presence of ciprofloxacin, which interferes at the wavelengths 262-285.7 nm and at differ
The use of real-time machine learning to optimize passport control procedures at airports can greatly improve both the efficiency and the security of the process. To automate and optimize these procedures, AI techniques such as character recognition, facial recognition, predictive algorithms, and automatic data processing can be applied. The proposed method uses the R-CNN object detection model to detect passport objects in real-time images collected by passport control cameras. This paper describes the step-by-step process of the proposed approach, which includes pre-processing, training and testing the R-CNN model, integrating it into the passport control system, and evaluating its accuracy and speed for efficient passenger flow.
Social networking sites are among the modern communication technologies that have contributed to the expression of public opinion trends toward various events and crises, of which security crises are the most important, being characterized by their ability to influence the communal life of the public. The aim is to recognize their role in shaping the opinions of the educated class of the public, a class characterized by a high level of knowledge and culture and by experience in dealing with the media. Their advantage is an active audience that expresses its views on the situations, events, and news published on these sites, as well as its attitudes and sympathy toward events. Accordingly, a number of questions are included in the ques
The research involved a rapid, automated, and highly accurate CFIA/MZ technique developed for the estimation of phenylephrine hydrochloride (PHE) in pure form, dosage forms, and a biological sample. The method is based on the oxidative coupling reaction of 2,4-dinitrophenylhydrazine (DNPH) with PHE in the presence of sodium periodate as an oxidizing agent in an alkaline medium, forming a red-colored product at λmax (520 nm). At a flow rate of 4.3 mL.min-1, using distilled water as the carrier, the FIA method proved to be a sensitive and economical analytical tool for the estimation of PHE.
Within the concentration range of 5-300 μg.mL-1, the calibration curve was rectilinear, with a detection limit of 3.252 μg.mL-1.
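A rectilinear calibration curve of this kind is typically obtained by least-squares fitting, with the detection limit estimated from the residual scatter and the slope. The sketch below uses synthetic absorbance readings, not data from the paper; the 3.3*sigma/slope rule is one common convention for the detection limit.

```python
import numpy as np

# Synthetic calibration data: concentrations (ug/mL) and absorbances
# generated from a linear response plus small illustrative noise.
conc = np.array([5, 50, 100, 150, 200, 250, 300], dtype=float)
absorb = 0.004 * conc + 0.02 + np.array(
    [0.001, -0.002, 0.002, -0.001, 0.0, 0.001, -0.001])

slope, intercept = np.polyfit(conc, absorb, 1)      # least-squares line
residuals = absorb - (slope * conc + intercept)
sigma = residuals.std(ddof=2)                       # residual std dev
lod = 3.3 * sigma / slope                           # detection limit
print(round(slope, 4), round(lod, 2))
```

With real instrument readings in place of the synthetic values, the same three lines of arithmetic yield the reported slope and detection limit.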
Steganography is a technique for concealing secret data within other everyday files of the same or a different type. Hiding data has become essential to digital information security. This work aims to design a stego method that can effectively hide a message inside the images of a video file. A video steganography model is proposed by training a model to hide a video (or images) within another video using convolutional neural networks (CNNs). Using a CNN in this approach achieves a main goal of any steganographic method, increased security (difficulty of being observed and broken by steganalysis programs), which is achieved in this work because the weights and architecture are randomized. Thus,
Cloud storage provides scalable and low-cost resources featuring economies of scale based on a cross-user architecture. As the amount of outsourced data grows explosively, data deduplication, a technique that eliminates data redundancy, becomes essential. The most important cloud service is data storage. To protect the privacy of the data owner, data are stored in the cloud in encrypted form. However, encrypted data introduce new challenges for cloud data deduplication, which becomes crucial for data storage. Traditional deduplication schemes cannot work on encrypted data, and existing solutions for encrypted-data deduplication suffer from security weaknesses. This paper proposes a combined compressive sensing and video deduplication approach to maximize
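The deduplication bookkeeping itself can be sketched with content-addressed chunks. This illustrates only the dedup layer, under the simplifying assumption of plaintext chunks; it does not model the compressive-sensing or encryption components the paper combines with it.

```python
import hashlib

# Block-level deduplication: identical chunks are stored once, keyed by
# their SHA-256 digest; a manifest of digests lets the file be rebuilt.
def deduplicate(chunks):
    store = {}          # digest -> chunk, stored once per unique chunk
    manifest = []       # ordered digests, one per original chunk
    for chunk in chunks:
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)
        manifest.append(digest)
    return store, manifest

def rebuild(store, manifest):
    return b"".join(store[d] for d in manifest)

chunks = [b"frame-A", b"frame-B", b"frame-A", b"frame-A"]
store, manifest = deduplicate(chunks)
print(len(store), rebuild(store, manifest) == b"".join(chunks))  # 2 True
```

The gap between `len(chunks)` and `len(store)` is exactly the storage saved; the hard part the paper addresses is obtaining such matches when chunks are encrypted by different users.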
A skip list data structure is really just a simulation of a binary search tree. Skip list algorithms are simpler and faster and use less space. Conceptually, this data structure uses parallel sorted linked lists. Searching in a skip list is faster than searching in a regular sorted linked list. Because a skip list is a two-dimensional data structure, it is implemented using a two-dimensional network of nodes with four pointers each. The search, insert, and delete operations each take expected O(log n) time. The skip list can be modified to implement the order-statistic operations RANK and SEARCH BY RANK while maintaining the same expected time. Keywords: skip list, parallel linked list, randomized algorithm, rank.
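The search and insert operations described above can be sketched as follows. This is a minimal illustration using forward pointers only (one per level, rather than the four-pointer nodes of the implementation described in the abstract); node heights are chosen by coin flips, which is what gives the expected O(log n) bounds.

```python
import random

class Node:
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * (level + 1)  # one pointer per level

class SkipList:
    MAX_LEVEL = 16

    def __init__(self):
        self.head = Node(None, self.MAX_LEVEL)  # sentinel head node
        self.level = 0                          # current highest level

    def _random_level(self):
        lvl = 0
        while random.random() < 0.5 and lvl < self.MAX_LEVEL:
            lvl += 1                            # coin flip per extra level
        return lvl

    def search(self, key):
        node = self.head
        for i in range(self.level, -1, -1):     # descend level by level
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
        node = node.forward[0]
        return node is not None and node.key == key

    def insert(self, key):
        update = [self.head] * (self.MAX_LEVEL + 1)
        node = self.head
        for i in range(self.level, -1, -1):     # record rightmost node per level
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
            update[i] = node
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        new = Node(key, lvl)
        for i in range(lvl + 1):                # splice in at each level
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new

sl = SkipList()
for k in [30, 10, 50, 20]:
    sl.insert(k)
print(sl.search(20), sl.search(40))  # True False
```

Each search descends from the sparsest list to the densest, skipping runs of nodes at the upper levels, which is why it outpaces a plain sorted linked list's one-node-at-a-time scan.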