This paper presents a combination of enhancement techniques for fingerprint images affected by different types of noise. These techniques were applied to improve image quality and achieve acceptable image contrast. The proposed method included five enhancement techniques: Normalization, Histogram Equalization, Binarization, Skeletonization, and Fusion. The Normalization process standardized the pixel intensities, which facilitated the subsequent enhancement stages. The Histogram Equalization technique then increased the contrast of the images. Furthermore, the Binarization and Skeletonization techniques were applied to differentiate between the ridge and valley structures and to obtain one-pixel-wide ridge structures …
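A minimal sketch of such an enhancement chain (without the final Fusion stage) is shown below, assuming a grayscale input and scikit-image for the standard operations; the function names and the target mean/variance values are illustrative choices, not parameters taken from the paper.

```python
# Illustrative sketch of the enhancement chain described above (normalization,
# histogram equalization, binarization, skeletonization); the target statistics
# and thresholding choice are assumptions, not values from the paper.
import numpy as np
from skimage import exposure, filters, morphology

def normalize(img, target_mean=100.0, target_var=100.0):
    """Shift pixel intensities to a prescribed mean and variance."""
    img = img.astype(np.float64)
    mean, var = img.mean(), img.var() + 1e-8
    delta = np.sqrt(target_var * (img - mean) ** 2 / var)
    return np.where(img > mean, target_mean + delta, target_mean - delta)

def enhance(fingerprint):
    norm = exposure.rescale_intensity(normalize(fingerprint), out_range=(0.0, 1.0))
    eq = exposure.equalize_hist(norm)              # raise global contrast
    binary = eq > filters.threshold_otsu(eq)       # bright valleys vs. dark ridges
    return morphology.skeletonize(~binary)         # thin the dark ridges to one-pixel width
```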
Statistical learning theory serves as the foundational bedrock of machine learning (ML), which in turn represents the backbone of artificial intelligence, ushering in innovative solutions for real-world challenges. Its origins can be traced to the point where statistics and computing meet, and it has evolved into a distinct scientific discipline. Machine learning can be distinguished by its fundamental branches: supervised learning, unsupervised learning, semi-supervised learning, and reinforcement learning. Within this tapestry, supervised learning takes center stage, divided into two fundamental forms: classification and regression. Regression is tailored for continuous outcomes, while classification specializes in categorical outcomes …
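As a small illustration of that split, the sketch below fits a regressor to a continuous target and a classifier to a discrete one; the data and the scikit-learn model choices are made up for exposition and are not drawn from the text.

```python
# Toy contrast between regression (continuous target) and classification
# (categorical target); the data below are placeholders for illustration.
from sklearn.linear_model import LinearRegression, LogisticRegression

X = [[1.0], [2.0], [3.0], [4.0]]        # one numeric feature
y_continuous = [1.1, 1.9, 3.2, 3.9]     # regression target: real-valued
y_labels = [0, 0, 1, 1]                 # classification target: class labels

print(LinearRegression().fit(X, y_continuous).predict([[2.5]]))   # a real number near 2.5
print(LogisticRegression().fit(X, y_labels).predict([[2.5]]))     # a class label, 0 or 1
```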
Computer systems and networks are used in almost every aspect of our daily life, and the security threats to computers and networks have increased significantly. Usually, password-based user authentication is used to authenticate the legitimate user. However, this method has many weaknesses, such as password sharing, brute-force attacks, dictionary attacks, and guessing. Keystroke dynamics is one of the well-known and inexpensive behavioral biometric technologies; it authenticates a user based on the analysis of his/her typing rhythm. In this way, intrusion becomes more difficult because the password as well as the typing rhythm must match the correct keystroke patterns. This thesis considers static keystroke dynamics as a transparent layer of …
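A minimal sketch of the kind of timing analysis keystroke dynamics relies on is given below; the dwell/flight features and the acceptance threshold are illustrative assumptions, not the thesis's actual model.

```python
# Illustrative static keystroke-dynamics check: extract timing features from one
# typing of the password and compare them to an enrolled template. The feature
# set and the threshold value are assumptions for this sketch.
import numpy as np

def timing_features(press_times, release_times):
    """Dwell time of each key plus flight time between consecutive keys (seconds)."""
    press = np.asarray(press_times, dtype=float)
    release = np.asarray(release_times, dtype=float)
    dwell = release - press                  # how long each key is held down
    flight = press[1:] - release[:-1]        # gap between releasing one key and pressing the next
    return np.concatenate([dwell, flight])

def rhythm_matches(sample, template, threshold=0.35):
    """Accept the attempt only if the typing rhythm is close enough to the enrolled template."""
    distance = np.linalg.norm(sample - template) / (np.linalg.norm(template) + 1e-12)
    return distance < threshold
```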
This paper presents the electrical behavior of a top-contact/bottom-gate organic field-effect transistor (OFET) utilizing Pentacene as the semiconductor layer, with two distinct gate dielectric materials, polyvinylpyrrolidone (PVP) and zirconium oxide (ZrO2). The influence of monolayer and bilayer gate insulators on OFET performance was investigated. MATLAB software was used to simulate and determine the electrical characteristics of the device. The output and transfer characteristics were studied for ZrO2, PVP, and ZrO2/PVP as the organic gate insulator layer. Both characteristics show the highest drain current with the ZrO2/PVP gate dielectric, equal to -0.0031 A and -0.0015 A for the output and transfer characteristics, respectively.
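For context, output and transfer curves of this kind are commonly generated from the gradual-channel field-effect current equations; the sketch below uses that textbook model with a p-type sign convention and placeholder parameters, not the device constants or the MATLAB code from the paper.

```python
# Gradual-channel drain-current model for a p-type (e.g. pentacene) transistor;
# every parameter passed in is a placeholder, not the paper's device data.
def drain_current(v_gs, v_ds, mobility, c_ins, width, length, v_th):
    """Return I_D (negative for a p-type device) in the linear or saturation regime."""
    v_ov = v_gs - v_th                   # overdrive voltage (negative when the device is on)
    if v_ov >= 0:
        return 0.0                       # device is off
    k = mobility * c_ins * width / length
    if v_ds <= v_ov:                     # |V_DS| beyond pinch-off: saturation regime
        return -0.5 * k * v_ov ** 2
    return -k * (v_ov * v_ds - 0.5 * v_ds ** 2)   # linear (triode) regime
```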
This investigation presents an experimental and analytical study on the behavior of reinforced concrete deep beams before and after repair. The original beams were first loaded under two-point loading up to failure, then repaired with epoxy resin and tested again. Three of the test beams contain shear reinforcement and the other two beams have no shear reinforcement. The main variable in these beams was the percentage of longitudinal steel reinforcement (0, 0.707, 1.061, and 1.414%). The main objective of this research is to investigate the possibility of restoring the full load-carrying capacity of reinforced concrete deep beams with and without shear reinforcement by using epoxy resin as the repair material. All beams …
Image compression is one of the data compression types applied to digital images in order to reduce the high cost of their storage and/or transmission. Image compression algorithms may exploit the visual sensitivity and statistical properties of image data to deliver superior results in comparison with generic data compression schemes used for other digital data. In the first approach, the input image is divided into blocks of 16 x 16, 32 x 32, or 64 x 64 pixels. The blocks are first converted into a string and then encoded using a lossless entropy-coding algorithm known as arithmetic coding. Pixel values that occur more frequently are coded in fewer bits than pixel values that occur less frequently …
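A rough sketch of this first approach appears below: the image is cut into fixed-size blocks, each block is flattened to a symbol string, and a toy arithmetic coder narrows its code interval less for frequent pixel values, so they cost fewer bits. The block size and the simplified float-precision coder are assumptions for illustration, not the paper's implementation.

```python
# Toy illustration of block splitting plus arithmetic coding. The float-based
# coder loses precision on long strings, so it only stands in for a real
# renormalizing arithmetic coder.
import numpy as np
from collections import Counter

def blocks(image, size=16):
    """Yield each size x size block of a grayscale image, flattened to a 1-D symbol string."""
    for r in range(0, image.shape[0] - size + 1, size):
        for c in range(0, image.shape[1] - size + 1, size):
            yield image[r:r + size, c:c + size].ravel()

def arithmetic_encode(symbols):
    """Narrow the [0, 1) interval once per symbol; frequent symbols narrow it less."""
    total = len(symbols)
    freq = Counter(symbols)
    intervals, edge = {}, 0.0
    for value, count in sorted(freq.items()):
        intervals[value] = (edge, edge + count / total)
        edge += count / total
    low, high = 0.0, 1.0
    for s in symbols:
        span = high - low
        lo_frac, hi_frac = intervals[s]
        low, high = low + span * lo_frac, low + span * hi_frac
    return (low + high) / 2.0, freq      # any number in [low, high), plus the model, identifies the string
```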