One of the most widely used and legally recognized behavioral biometrics is the individual's signature, which is employed for verification and identification in many industries, including business, law, and finance. The purpose of signature verification is to distinguish genuine from forged signatures, a task complicated by cultural and personal variation. In forensic handwriting analysis, handwriting features are analyzed, compared, and evaluated to establish whether or not the writing was produced by a known writer. Unlike other scripts, Arabic makes use of diacritics, ligatures, and overlaps that are unique to it. The absence of dynamic information in offline Arabic signatures makes it more difficult to attain high verification accuracy. Moreover, the characteristics of Arabic signatures are not very distinct and are subject to a great deal of variation (feature uncertainty). To address this issue, the proposed work offers a novel method of verifying offline Arabic signatures that employs two layers of verification, as opposed to the single level used by prior attempts or the many classifiers based on statistical learning theory. A static set of signature features is used for layer-one verification. The output of a neutrosophic logic module is used for layer-two verification; its accuracy depends on the signature characteristics in the training dataset and on three membership functions, unique to each signer, based on the degrees of truth, indeterminacy, and falsity of the signature features. The three memberships of the neutrosophic set are more expressive for decision-making than those of fuzzy sets. The developed model is intended to account for several kinds of uncertainty in describing Arabic signatures, including ambiguity, inconsistency, redundancy, and incompleteness.
The experimental results show that the verification system works as intended and can successfully reduce the false acceptance rate (FAR) and the false rejection rate (FRR).
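As a rough illustration only (not the authors' trained model), mapping a single signature feature to a neutrosophic triple (truth, indeterminacy, falsity) relative to a signer's genuine samples might be sketched as follows; the membership shapes, the function name, and the parameters are all illustrative assumptions:

```python
# Hypothetical sketch: one signature feature value scored as a
# neutrosophic triple (T, I, F) against a signer's genuine-sample
# statistics. The membership shapes below are illustrative assumptions,
# not the paper's signer-specific trained functions.
import math

def neutrosophic_membership(x, mu, sigma):
    """Return (truth, indeterminacy, falsity) for feature value x,
    given the mean and standard deviation of the signer's genuine samples."""
    d = abs(x - mu) / max(sigma, 1e-9)          # normalized deviation
    T = math.exp(-0.5 * d * d)                  # truth: closeness to genuine mean
    F = 1.0 - math.exp(-0.125 * d * d)          # falsity: grows with distance
    I = min(1.0, 4.0 * T * (1.0 - T))           # indeterminacy: peaks near T = 0.5
    return T, I, F

# A feature right at the genuine mean is fully "true"...
print(neutrosophic_membership(5.0, 5.0, 1.0))   # (1.0, 0.0, 0.0)
# ...while a distant value is mostly "false".
T, I, F = neutrosophic_membership(10.0, 5.0, 1.0)
print(T < 0.01, F > 0.9)                        # True True
```

Note the expressiveness the abstract refers to: unlike a fuzzy membership, the three components need not sum to one, so a feature can simultaneously carry low truth, low falsity, and high indeterminacy, which a two-valued or fuzzy decision rule cannot represent.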
Visible light communication (VLC) is an emerging wireless technology for next-generation high-speed data transmission. It has the potential for capacity enhancement due to its characteristically large bandwidth. Concerning signal processing and suitable transceiver design for VLC applications, an amplification-based optical transceiver is proposed in this article. The transmitter consists of a driver and a laser diode as the light source, while the receiver contains a photodiode and a signal-amplifying circuit. The design model is proposed for its simplicity in replacing the trans-impedance and transconductance circuits of conventional modules with a simple amplification circuit and interface converter. …
PMMA (poly(methyl methacrylate)) is considered one of the most commonly used materials in denture base fabrication due to its ideal properties. However, a major problem with this resin is frequent fracture under heavy chewing forces, which leads to early cracking and fracture in clinical use. In this study, nanoparticles were added as a filler to enhance selected mechanical properties. The nano-additive effect was investigated under normal conditions and at different temperatures during water exposure. Tests were first applied to the prepared samples at room temperature, and then after exposure to a water bath at 20, 40, and 60 °C, respectively. SEM, PSD, and EDX were utilized for sample evaluation in this study. Flexural …
In this research, various 2,5-disubstituted 1,3,4-oxadiazoles (Schiff bases, oxo-thiazolidines, and other compounds) were synthesized from 2,5-di(4,4?-amino-1,3,4-oxadiazole), which was itself synthesized from a mixture of 4-aminobenzoic acid and hydrazine in the presence of polyphosphoric acid. The synthesized compounds were characterized using spectral data (UV, FT-IR, and 1H-NMR).
The aim of the current study was to develop a nanostructured double-layer delivery system for hydrophobic molecules. The developed double layer consisted of a polyethylene glycol (PEG)-based polymer followed by a gelatin sub-coating of the core hydrophobic molecules containing sodium citrate. The polymeric composition ratio of PEG and the amount of the gelatin sub-coating were optimized using the two-level fractional method. The nanoparticles were characterized using AFM and FT-IR techniques. The size of these nanocapsules was in the range of 39-76 nm, depending on the drug-loading concentration. The drug was effectively loaded into the PEG-gelatin nanoparticles (≈47%). The hydrophobic-molecule release characteristics in terms of controlled release …
The widespread use of the Internet of Things (IoT) in different aspects of an individual's life, such as banking, wireless intelligent devices, and smartphones, has led to new security and performance challenges under restricted resources. The Elliptic Curve Digital Signature Algorithm (ECDSA) is the most suitable choice for such environments due to its smaller encryption-key size and changeable security-related parameters. However, major performance metrics such as area, power, latency, and throughput are still customizable, based on the design requirements of the device.
The present paper puts forward an enhancement of the throughput performance metric by …
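To make the key-size advantage concrete, here is a textbook pure-Python ECDSA sketch over the secp256k1 curve (a 256-bit key giving roughly 128-bit security, versus 3072-bit RSA for comparable strength). This is an illustrative assumption, not the paper's hardware design, and it is not constant-time or side-channel safe:

```python
# Hypothetical textbook ECDSA over secp256k1, for illustration only.
# NOT the paper's implementation; not hardened against side channels.
import hashlib
import secrets

# secp256k1 domain parameters
p = 2**256 - 2**32 - 977
n = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141
G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
     0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)

def point_add(P, Q):
    """Add two curve points; None represents the point at infinity."""
    if P is None:
        return Q
    if Q is None:
        return P
    if P[0] == Q[0] and (P[1] + Q[1]) % p == 0:
        return None
    if P == Q:
        lam = 3 * P[0] * P[0] * pow(2 * P[1], -1, p) % p
    else:
        lam = (Q[1] - P[1]) * pow(Q[0] - P[0], -1, p) % p
    x = (lam * lam - P[0] - Q[0]) % p
    return (x, (lam * (P[0] - x) - P[1]) % p)

def scalar_mul(k, P):
    """Double-and-add scalar multiplication."""
    R = None
    while k:
        if k & 1:
            R = point_add(R, P)
        P = point_add(P, P)
        k >>= 1
    return R

def sign(d, msg):
    z = int.from_bytes(hashlib.sha256(msg).digest(), 'big') % n
    while True:
        k = secrets.randbelow(n - 1) + 1        # fresh nonce per signature
        r = scalar_mul(k, G)[0] % n
        if r == 0:
            continue
        s = pow(k, -1, n) * (z + r * d) % n
        if s != 0:
            return (r, s)

def verify(Q, msg, sig):
    r, s = sig
    if not (0 < r < n and 0 < s < n):
        return False
    z = int.from_bytes(hashlib.sha256(msg).digest(), 'big') % n
    w = pow(s, -1, n)
    P = point_add(scalar_mul(z * w % n, G), scalar_mul(r * w % n, Q))
    return P is not None and P[0] % n == r

d = secrets.randbelow(n - 1) + 1                # private key
Q = scalar_mul(d, G)                            # public key
sig = sign(d, b"IoT sensor reading")
print(verify(Q, b"IoT sensor reading", sig))    # True
print(verify(Q, b"tampered reading", sig))      # False
```

Hardware designs such as the one the abstract describes trade off exactly the operations visible here: the scalar multiplications dominate latency and area, which is why throughput is a customizable metric.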
A missed obligatory fast must be made up, whatever the reason for missing it: whether it was left out by mistake, forgetfulness, or deliberately, and whether with or without an excuse. This research concerns the preponderant view on making up Ramadan fasts that were delayed out of laxity.
Whoever delays making up missed Ramadan fasts until another Ramadan arrives should fast the second Ramadan and make up the first afterwards, and no ransom (fidyah) is due, whether the delay was with or without an excuse.
A Multiple System Biometric System Based on ECG Data
This work aims to develop a secure lightweight cipher algorithm for constrained devices. Secure communication among constrained devices is a critical issue during data transmission from client to server devices. Lightweight cipher algorithms are a security solution for constrained devices, requiring only low computational cost and small memory. However, most lightweight algorithms suffer from a trade-off between complexity and speed in producing a robust cipher. The PRESENT cipher has been successfully employed as a lightweight cryptographic algorithm, surpassing other ciphers in that its computational processing requires only low-complexity operations. The mathematical model of …
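The low-complexity structure the abstract mentions can be seen in a pure-Python sketch of PRESENT-80 encryption following the published specification: each of the 31 rounds is just a round-key XOR, a 4-bit S-box layer, and a bit permutation over a 64-bit block. This sketch illustrates the cipher itself, not the paper's model or implementation:

```python
# Hypothetical sketch of PRESENT-80 per the published specification:
# 64-bit block, 80-bit key, 31 rounds of addRoundKey / sBoxLayer / pLayer,
# plus a final key whitening. Illustrative only.
SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
        0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]

def p_layer(state):
    """Bit permutation: bit i moves to position 16*i mod 63 (bit 63 fixed)."""
    out = 0
    for i in range(64):
        j = 63 if i == 63 else (16 * i) % 63
        out |= ((state >> i) & 1) << j
    return out

def round_keys(key80):
    """PRESENT-80 key schedule: derive 32 round keys from an 80-bit key."""
    K = key80
    rks = []
    for r in range(1, 33):
        rks.append(K >> 16)                                  # leftmost 64 bits
        if r == 32:
            break
        K = ((K << 61) | (K >> 19)) & ((1 << 80) - 1)        # rotate left 61
        K = (SBOX[K >> 76] << 76) | (K & ((1 << 76) - 1))    # S-box top nibble
        K ^= r << 15                                         # XOR counter, bits 19..15
    return rks

def encrypt(plain64, key80):
    rks = round_keys(key80)
    state = plain64
    for r in range(31):
        state ^= rks[r]                                      # addRoundKey
        nibbles = 0
        for i in range(16):                                  # sBoxLayer, 16 nibbles
            nibbles |= SBOX[(state >> (4 * i)) & 0xF] << (4 * i)
        state = p_layer(nibbles)                             # pLayer
    return state ^ rks[31]                                   # final key whitening

# All-zero test vector from the PRESENT specification
print(hex(encrypt(0, 0)))  # 0x5579c1387b228445
```

Only XORs, 4-bit table lookups, and bit moves appear, which is exactly what makes the cipher cheap in hardware and attractive for constrained devices.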