One of the most popular and legally recognized behavioral biometrics is the individual's signature, which is used for verification and identification in many industries, including business, law, and finance. The purpose of signature verification is to distinguish genuine from forged signatures, a task complicated by cultural and personal variation. In forensic handwriting analysis, handwriting features are analyzed, compared, and evaluated to establish whether or not the writing was produced by a known writer. In contrast to other languages, Arabic makes use of diacritics, ligatures, and overlaps that are unique to it. The absence of dynamic information in offline Arabic signatures makes it more difficult to attain high verification accuracy. Moreover, the characteristics of Arabic signatures are not very distinct and are subject to a great deal of variation (feature uncertainty). To address this issue, the proposed work offers a novel method for verifying offline Arabic signatures that employs two layers of verification, in contrast to the single layer employed by prior attempts or to the many classifiers based on statistical learning theory. A static set of signature features is used for layer-one verification. The output of a neutrosophic logic module is used for layer-two verification, with the accuracy depending on the signature characteristics used in the training dataset and on three membership functions that are unique to each signer, based on the degree of truth, indeterminacy, and falsity of the signature features. The three memberships of the neutrosophic set are more expressive for decision-making than those of fuzzy sets. The developed model is intended to account for several kinds of uncertainty in describing Arabic signatures, including ambiguity, inconsistency, redundancy, and incompleteness. The experimental results show that the verification system works as intended and successfully reduces the false acceptance rate (FAR) and false rejection rate (FRR).
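As a rough illustration of how such a two-layer decision could be organized, the following Python sketch assigns each feature a (truth, indeterminacy, falsity) triple and thresholds the aggregated memberships. The membership functions, feature names, ranges, and thresholds are illustrative assumptions, not the signer-specific functions trained in the paper.

```python
# Minimal sketch of a two-layer decision using neutrosophic memberships.
# The membership functions, thresholds, and aggregation rule are illustrative
# assumptions, not the paper's trained, signer-specific functions.

def layer1_static_check(features, reference_ranges):
    """Layer 1: accept only if every static feature lies in the signer's reference range."""
    return all(lo <= features[name] <= hi for name, (lo, hi) in reference_ranges.items())

def neutrosophic_memberships(value, mu, sigma):
    """Toy truth/indeterminacy/falsity triple from the distance to a signer's mean feature value."""
    d = abs(value - mu) / (3 * sigma)          # normalized deviation from the signer's mean
    truth = max(0.0, 1.0 - d)                  # close to the mean -> high truth
    falsity = min(1.0, d)                      # far from the mean -> high falsity
    indeterminacy = 1.0 - abs(truth - falsity) # ambiguous region between the two
    return truth, indeterminacy, falsity

def layer2_neutrosophic_verify(features, stats, t_min=0.6, f_max=0.3):
    """Layer 2: average the (T, I, F) triples over all features and threshold them."""
    triples = [neutrosophic_memberships(features[k], *stats[k]) for k in stats]
    truth = sum(t for t, _, _ in triples) / len(triples)
    falsity = sum(f for _, _, f in triples) / len(triples)
    return truth >= t_min and falsity <= f_max
```

A signature would then be accepted only if both layers agree, which is the mechanism the abstract credits with lowering FAR and FRR.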
Organizations make huge yearly investments in the development and maintenance of IT systems. However, it has been reported that most IT projects fail because they are delayed, over budget, or of poor quality. A systematic literature review (SLR) was conducted to identify the critical success factors (CSFs) for IT projects, and nine (9) CSFs were identified from the SLR. An online survey was then conducted among 103 respondents comprising developers and IT managers, and the data were analyzed using the Statistical Package for the Social Sciences (SPSS 22). The findings showed that the highest-ranked CSF for IT projects is commitment and motivation, while project monitoring received the lowest score from respondents.
Spatial and frequency domain techniques have been adopted in this research: the mean-value filter, the median filter, and the Gaussian filter. In addition, an adaptive technique that combines two filters (median and Gaussian) is used to enhance the noisy image. Different filter block sizes as well as different thresholding values have been tried to perform the enhancement process.
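As a rough sketch of the spatial-domain filters and the combined median-Gaussian variant described above, the following Python code uses scipy.ndimage; the block size, sigma, and threshold values are illustrative choices, not the tuned values from the experiments.

```python
# Sketch of the spatial-domain step: mean, median, and Gaussian filters, plus a
# simple combination of median followed by Gaussian. Block size, sigma, and the
# thresholding value are illustrative, not the values tuned in the paper.
import numpy as np
from scipy import ndimage

def enhance(noisy, block_size=3, sigma=1.0, threshold=20.0):
    mean_out   = ndimage.uniform_filter(noisy, size=block_size)   # mean-value filter
    median_out = ndimage.median_filter(noisy, size=block_size)    # median filter
    gauss_out  = ndimage.gaussian_filter(noisy, sigma=sigma)      # Gaussian filter

    # Adaptive variant: apply the Gaussian filter to the median-filtered image, and
    # keep the filtered value only where it differs from the input by more than the threshold.
    combined = ndimage.gaussian_filter(median_out, sigma=sigma)
    adaptive = np.where(np.abs(noisy - combined) > threshold, combined, noisy)
    return mean_out, median_out, gauss_out, adaptive
```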
In this paper, an anti-disturbance compensator, namely the Improved Active Disturbance Rejection Control (IADRC), is suggested for the stabilization of a 6-DoF quadrotor Unmanned Aerial Vehicle (UAV) system. The proposed control scheme rejects the disturbances acting on this system and eliminates the effect of the uncertainties that the quadrotor system exhibits. The complete nonlinear mathematical model of the 6-DoF quadrotor UAV system has been used to design the four ADRC units for attitude and altitude stabilization. Stability analysis has been demonstrated for the Linear Extended State Observer (LESO) of each IADRC unit and for the overall closed-loop system using the Hurwitz stability criterion. A minimization to a …
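For reference, a LESO for a single channel commonly takes a standard third-order form; the following Python sketch shows one Euler update under the usual bandwidth parameterization. The observer bandwidth, input gain b0, and step size are arbitrary illustrative values, not the gains of the paper's IADRC design.

```python
# Minimal discrete (Euler) sketch of a third-order Linear Extended State Observer
# for one quadrotor channel (e.g. altitude). w_o, b0, and dt are illustrative.
def leso_step(z, y, u, dt=0.001, w_o=50.0, b0=1.0):
    """One Euler update of z = [position est., velocity est., total-disturbance est.]."""
    b1, b2, b3 = 3 * w_o, 3 * w_o**2, w_o**3   # gains from bandwidth parameterization
    e = z[0] - y                               # estimation error against the measured output
    dz = [z[1] - b1 * e,
          z[2] - b2 * e + b0 * u,
          -b3 * e]
    return [zi + dt * dzi for zi, dzi in zip(z, dz)]
```

Placing all observer poles at -w_o keeps the estimation-error dynamics Hurwitz, which is the kind of condition the abstract's stability analysis establishes for each LESO and for the closed loop.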
For sparse system identification, recently suggested algorithms are the l0-norm Least Mean Square (l0-LMS), Zero-Attracting LMS (ZA-LMS), Reweighted Zero-Attracting LMS (RZA-LMS), and p-norm LMS (p-LMS) algorithms, which modify the cost function of the conventional LMS algorithm by adding a constraint of coefficient sparsity. Accordingly, the proposed algorithms are named …-ZA-LMS, …
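To make the sparsity constraint concrete, the following Python sketch shows a single ZA-LMS weight update: the conventional LMS step plus the zero-attracting term that an l1 penalty adds to the cost function. The step sizes are illustrative, and the combined algorithms proposed in the paper are not reproduced here.

```python
# One Zero-Attracting LMS (ZA-LMS) update: conventional LMS step plus a
# zero-attracting term induced by an l1 penalty on the weights.
import numpy as np

def za_lms_step(w, x, d, mu=0.01, rho=1e-4):
    e = d - w @ x                 # a-priori estimation error
    w = w + mu * e * x            # conventional LMS gradient step
    w = w - rho * np.sign(w)      # zero attractor from the sparsity constraint
    return w, e
```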
The Artificial Neural Network (ANN) methodology is an important and relatively new approach that builds models for analysis, data evaluation, forecasting, and control without depending on an old model or a classical statistical method to describe the behavior of the statistical phenomenon. The methodology works by simulating the data to reach a robust optimal model that represents the statistical phenomenon, and the resulting model can be used at any time and under any conditions. The Box-Jenkins (ARMAX) approach is used for comparison. This paper depends on the received power to build a robust model for forecasting, analyzing, and controlling this power; the received power comes from …
In this research, the Boltzmann transport equation is solved numerically in order to calculate the transport parameters, such as the drift velocity W, D/μ (the ratio of the diffusion coefficient to the mobility), and the momentum-transfer collision frequency νm, for the purpose of determining the magnetic drift velocity WM and the magnetic deflection coefficient ψ for low-energy electrons that move in an electric field E crossed with a magnetic field B, i.e., E×B, in nitrogen, argon, helium, and their gas mixtures, as a function of E/N (the ratio of electric field strength to the gas number density), E/P300 (the ratio of electric field strength to the gas pressure), and D/μ, covering different ranges of E/P300 at a temperature of 300 K (Kelvin). The results show …
Krawtchouk polynomials (KPs) and their moments are promising techniques for applications in information theory, coding theory, and signal processing, owing to the special capabilities of KPs in feature extraction and classification. The main challenge in existing KP recurrence algorithms is numerical error, which occurs during the computation of the coefficients for large polynomial sizes, particularly when the KP parameter (p) deviates from 0.5 toward 0 or 1. To this end, this paper proposes a new recurrence relation in order to compute the coefficients of KPs at high orders. In particular, this paper discusses the development of a new algorithm and presents a new mathematical model for computing the …
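For context, the classical three-term recurrence in the order n for Krawtchouk polynomials K_n(x; p, N) can be evaluated as in the Python sketch below; this is the textbook relation whose numerical behavior at high orders motivates the work, not the new recurrence proposed in the paper.

```python
# Classical three-term recurrence (in the order n) for Krawtchouk polynomials
# K_n(x; p, N), with K_0 = 1 and K_1 = 1 - x/(p*N). Shown for reference only;
# this is not the paper's proposed numerically stable recurrence.
def krawtchouk(n, x, p, N):
    k_prev, k_curr = 1.0, 1.0 - x / (p * N)
    if n == 0:
        return k_prev
    for m in range(1, n):
        # p(N-m) K_{m+1} = [p(N-m) + m(1-p) - x] K_m - m(1-p) K_{m-1}
        k_next = (((p * (N - m) + m * (1 - p) - x) * k_curr
                   - m * (1 - p) * k_prev) / (p * (N - m)))
        k_prev, k_curr = k_curr, k_next
    return k_curr
```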
The present work involved the synthesis of several new substituted tetrazoles via Schiff bases of the trimethoprim drug in two steps. The first step involved the direct reaction of different ketones and aldehydes with trimethoprim, producing the corresponding Schiff bases (1-10), whereas the second step involved the preparation of new tetrazole derivatives (11-20) through the reaction of the Schiff bases prepared in the first step with sodium azide in dioxane. The prepared compounds were characterized by UV and FT-IR spectroscopy, and some of them by 13C-NMR and 1H-NMR spectroscopy and physical properties.