One of the most popular and legally recognized behavioral biometrics is the individual's signature, which is used for verification and identification in many industries, including business, law, and finance. The purpose of signature verification is to distinguish genuine from forged signatures, a task complicated by cultural and personal variation. In forensic handwriting analysis, handwriting features are analyzed, compared, and evaluated to establish whether or not the writing was produced by a known writer. In contrast to other languages, Arabic makes use of diacritics, ligatures, and overlaps that are unique to it. Because offline Arabic signatures carry no dynamic writing information, achieving high verification accuracy is more difficult. Moreover, the characteristics of Arabic signatures are not well defined and are subject to a great deal of variation (feature uncertainty). To address this issue, the proposed work offers a novel method for verifying offline Arabic signatures that employs two layers of verification, as opposed to the single level employed by prior attempts or the many classifiers based on statistical learning theory. A static set of signature features is used for layer-one verification. The output of a neutrosophic logic module is used for layer-two verification, with accuracy depending on the signature characteristics in the training dataset and on three membership functions, unique to each signer, based on the degrees of truth, indeterminacy, and falsity of the signature features. The three memberships of the neutrosophic set are more expressive for decision-making than those of fuzzy sets. The developed model is intended to account for several kinds of uncertainty in describing Arabic signatures, including ambiguity, inconsistency, redundancy, and incompleteness.
The experimental results show that the verification system works as intended and can successfully reduce both the false acceptance rate (FAR) and the false rejection rate (FRR).
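As background for the layer-two module, the neutrosophic triple can be illustrated with a small sketch. The membership shapes below are illustrative assumptions only (the paper derives signer-specific functions from its training data); the sketch maps one signature feature value to degrees of truth, indeterminacy, and falsity:

```python
import math

def neutrosophic_memberships(value, mean, std, tol=1.0):
    """Map a feature value to a hypothetical neutrosophic triple
    (truth, indeterminacy, falsity), each in [0, 1].

    The Gaussian-style shapes and the `tol` band are assumptions for
    illustration; unlike fuzzy memberships, T + I + F need not sum to 1.
    """
    z = abs(value - mean) / std                    # normalized deviation
    truth = math.exp(-0.5 * z * z)                 # high near the signer's mean
    falsity = 1.0 - math.exp(-0.5 * max(z - tol, 0.0) ** 2)  # high far away
    indeterminacy = 1.0 - abs(truth - falsity)     # high in the ambiguous band
    return truth, indeterminacy, falsity

t, i, f = neutrosophic_memberships(10.5, mean=10.0, std=1.0)
```

A value close to the signer's reference mean yields high truth and low falsity; a distant value reverses this, with indeterminacy peaking in between.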
A novel median filter based on the crow search optimization algorithm (OMF) is suggested to reduce random salt-and-pepper noise and improve the quality of RGB color and grayscale images. The fundamental idea of the approach is that the crow search optimization algorithm first detects noisy pixels and then replaces them with an optimal median value chosen by maximizing a fitness function. Finally, the standard measures peak signal-to-noise ratio (PSNR), structural similarity, absolute square error, and mean square error were used to test the performance of the suggested filters (the original and the improved median filter) in removing noise from images. The simulation was carried out in MATLAB R2019b, and the resul
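The detect-then-replace idea, and the PSNR measure used to score it, can be sketched in a few lines. This is a simplified stand-in, not the paper's method: suspected noise is found here by thresholding at the extreme values 0 and 255, whereas the paper locates noisy pixels with a crow-search optimizer and selects the replacement by maximizing a fitness function.

```python
import math
from statistics import median

def denoise_salt_pepper(img):
    """Conditional median filter on a 2-D grayscale image (list of rows).

    Simplification: pixels equal to 0 or 255 are treated as noise; the
    3x3 median of the remaining neighbors replaces them.
    """
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(w):
            if img[y][x] in (0, 255):                      # suspected noise
                window = [img[j][i]
                          for j in range(max(y - 1, 0), min(y + 2, h))
                          for i in range(max(x - 1, 0), min(x + 2, w))
                          if img[j][i] not in (0, 255)]    # skip other noise
                if window:
                    out[y][x] = int(median(window))
    return out

def psnr(img_a, img_b, max_val=255.0):
    """Peak signal-to-noise ratio (dB) between two flat pixel lists."""
    err = sum((a - b) ** 2 for a, b in zip(img_a, img_b)) / len(img_a)
    return float("inf") if err == 0 else 10.0 * math.log10(max_val ** 2 / err)

restored = denoise_salt_pepper([[100, 100, 100],
                                [100, 255, 100],
                                [100, 100, 100]])
```

Here the single salt pixel (255) is replaced by the median of its clean neighbors; a higher PSNR between the original and the restored image indicates better denoising.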
Authentication is the process of determining whether someone or something is, in fact, who or what it is declared to be. As dependence upon computers and computer networks grows, the need for user authentication has increased. A user's claimed identity can be verified by one of several methods. One of the most popular of these methods is based on something the user knows, such as a password or Personal Identification Number (PIN). Biometrics is the science and technology of authentication by identifying a living individual's physiological or behavioral attributes. Keystroke authentication is a newer behavioral access control approach that identifies legitimate users by their typing behavior. The objective of this paper is to provide user
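Keystroke dynamics is typically built on two timing features: how long each key is held (dwell time) and the gap between releasing one key and pressing the next (flight time). The sketch below, with an assumed event format of `(key, action, timestamp_ms)` tuples, is illustrative and not taken from the paper:

```python
def keystroke_features(events):
    """Extract dwell (hold) and flight times from a stream of
    (key, action, timestamp_ms) tuples, where action is "down" or "up".

    These per-user timing profiles are the raw material that a
    keystroke-authentication classifier would be trained on.
    """
    downs, holds, flights = {}, [], []
    last_up = None
    for key, action, t in events:
        if action == "down":
            downs[key] = t
            if last_up is not None:
                flights.append(t - last_up)   # release-to-next-press gap
        else:  # "up"
            holds.append(t - downs.pop(key))  # how long the key was held
            last_up = t
    return holds, flights

events = [("a", "down", 0), ("a", "up", 90),
          ("b", "down", 150), ("b", "up", 240)]
holds, flights = keystroke_features(events)
```

For this toy sequence each key is held 90 ms and the inter-key flight time is 60 ms; a genuine user's rhythm tends to be consistent across sessions, which is what the verifier exploits.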
Today, with the increasing use of social media, many researchers have become interested in extracting topics from Twitter. Tweets are short, unstructured, and messy texts, which makes it difficult to discover topics in them. Topic modeling algorithms such as Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) were originally designed to derive topics from long documents such as articles and books, and they are often less effective when applied to short texts like tweets. Fortunately, Twitter has many features that represent interaction between users; in particular, tweets carry rich user-generated hashtags that act as keywords. In this paper, we exploit the hashtag feature to improve the topics learned
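One common way to exploit hashtags, sketched below under the assumption that this matches the paper's intent, is hashtag pooling: tweets sharing a hashtag are merged into one pseudo-document, so a standard topic model such as LDA sees longer, more coherent texts than individual tweets. Tokenization and the topic model itself are omitted here:

```python
from collections import defaultdict

def pool_by_hashtag(tweets):
    """Aggregate tweets into one pseudo-document per hashtag.

    A minimal sketch of hashtag pooling; tweets with no hashtag
    fall into a catch-all "#none" pool.
    """
    pools = defaultdict(list)
    for tweet in tweets:
        tags = [w.lower() for w in tweet.split() if w.startswith("#")]
        for tag in tags or ["#none"]:
            pools[tag].append(tweet)
    return {tag: " ".join(texts) for tag, texts in pools.items()}

docs = pool_by_hashtag([
    "new phone camera review #tech",
    "battery life test #tech",
    "match tonight #sports",
])
```

The two `#tech` tweets become a single pseudo-document, giving the topic model enough co-occurring words to learn a coherent "tech" topic.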
This paper considers a new double integral transform called the Double Sumudu-Elzaki Transform (DSET). The DSET is combined with a semi-analytical method, namely the variational iteration method, giving the DSETVIM scheme, to obtain numerical solutions of nonlinear PDEs with fractional-order derivatives. The combined method decreases the number of calculations required and therefore speeds up computation of the solution. The suggested technique is tested on four problems. The results demonstrated that solving these types of equations using the DSETVIM was more advantageous and efficient.
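For reference, the standard single-variable Sumudu and Elzaki transforms are given below. The double transform is sketched only schematically, as one natural combination; the paper's exact definition and normalization of the DSET may differ:

```latex
% Textbook single-variable definitions:
\[
  S[f](u) = \int_0^{\infty} e^{-t}\, f(ut)\, dt
  \quad\text{(Sumudu)},
  \qquad
  E[f](v) = v \int_0^{\infty} f(t)\, e^{-t/v}\, dt
  \quad\text{(Elzaki)}.
\]
% One plausible combination for f(x,t) applies Sumudu in x
% and Elzaki in t:
\[
  \mathcal{SE}\{f(x,t)\}(u,v)
  = v \int_0^{\infty}\!\!\int_0^{\infty}
      e^{-x}\, e^{-t/v}\, f(ux, t)\, dx\, dt .
\]
```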
The main risks arising from the WTO Agreement are the inequality and lack of competitiveness of most pharmaceutical goods, as well as the fact that Iraq is a net importer of medicines that are at the core of consumer needs. A further risk is the subject matter of the Convention on the Protection of Intellectual Property Rights and its implications for the pharmaceutical industry in particular, coinciding with conditions of financial and administrative corruption, all of which has resulted in drug fraud in the Iraqi market and damage to public health. Additional risks include the control of medical technology, the persistence of the technological gap and its effect in keeping price levels high, and the fact that domestic drug producers are obliged to obtain production licenses from
Discriminant analysis is a technique used to distinguish and classify an individual into one of a number of groups based on a linear combination of a set of relevant variables, known as the discriminant function. In this research, discriminant analysis is used to analyze data from a repeated-measurements design. We deal with the problem of discrimination and classification in the case of two groups by assuming a compound-symmetry covariance structure under the assumption of normality for univariate repeated-measures data.
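The two-group linear discriminant rule can be sketched as follows. This toy example uses a 2-dimensional feature vector and an identity pooled covariance for simplicity; in the repeated-measures setting of the paper, a compound-symmetry structure would determine the pooled covariance and its inverse instead.

```python
def linear_discriminant(x, mean1, mean2, pooled_cov_inv):
    """Fisher's linear discriminant score for two groups.

    With equal priors, classify x into group 1 when the score is
    positive and group 2 when it is negative; pooled_cov_inv is the
    inverse of the pooled within-group covariance (2x2 here).
    """
    d = [mean1[i] - mean2[i] for i in range(2)]             # mean difference
    w = [sum(pooled_cov_inv[i][j] * d[j] for j in range(2))
         for i in range(2)]                                 # weight vector
    m = [(mean1[i] + mean2[i]) / 2 for i in range(2)]       # midpoint
    return sum(w[i] * (x[i] - m[i]) for i in range(2))

# Toy example: identity pooled covariance (its own inverse).
cov_inv = [[1.0, 0.0], [0.0, 1.0]]
score = linear_discriminant([2.0, 2.0], mean1=[3.0, 3.0], mean2=[0.0, 0.0],
                            pooled_cov_inv=cov_inv)
```

The observation [2, 2] lies closer to group 1's mean, so the score is positive and the rule assigns it to group 1.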
The Hopfield network is one of the simplest types of neural network; its architecture is such that each neuron connects to every other neuron, so it is called a fully connected neural network. In addition, this type is considered an auto-associative memory, because the network returns a stored pattern immediately upon recognition. The network has many limitations, including memory capacity, discrepancy, orthogonality requirements between patterns, weight symmetry, and local minima. This paper proposes a new strategy for designing Hopfield networks based on the XOR operation; the strategy addresses these limitations through a new algorithm in the Hopfield network design and increases the performance of Hopfield networks by modifying the architecture of t
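The classical baseline that the paper modifies can be sketched in a few lines: Hebbian training builds a symmetric, zero-diagonal weight matrix, and recall iterates a sign-threshold update until the state stops changing. This is the standard textbook Hopfield network, not the paper's XOR-based variant:

```python
def train_hopfield(patterns):
    """Hebbian weights for a fully connected Hopfield network.

    Patterns are lists of +1/-1; weights are symmetric with a
    zeroed diagonal (no self-connections).
    """
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / len(patterns)
    return w

def recall(w, state, steps=10):
    """Synchronous sign-threshold updates until a fixed point."""
    n = len(state)
    for _ in range(steps):
        nxt = [1 if sum(w[i][j] * state[j] for j in range(n)) >= 0 else -1
               for i in range(n)]
        if nxt == state:
            break
        state = nxt
    return state

stored = [[1, 1, 1, 1, -1, -1, -1, -1],
          [1, -1, 1, -1, 1, -1, 1, -1]]
w = train_hopfield(stored)
noisy = [-1, 1, 1, 1, -1, -1, -1, -1]   # first pattern with one bit flipped
result = recall(w, noisy)
```

With these two orthogonal patterns, the corrupted input is repaired back to the first stored pattern in a single update, illustrating the auto-associative recall described above; the limitations listed (capacity, spurious minima, orthogonality) appear as soon as more or correlated patterns are stored.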
Some new 2,5-disubstituted-1,3,4-oxadiazole derivatives bearing an azo group were synthesized by known reaction sequences. The structures of the synthesized compounds were confirmed by physical and spectral means.
Background: The purpose of the current study was to evaluate the efficacy of a new orthodontic bonding system (Beauty Ortho Bond) in terms of shear bond strength in dry and wet environments, and adhesive remnant index (ARI) scores, in comparison with other bonding systems (Heliosit and Resilience orthodontic adhesives). Materials and methods: Sixty defect-free extracted premolars were randomly divided into six groups of 10 teeth each and mounted in acrylic resin: three groups for a dry environment and three for a wet one. The shear bond strength test was performed at a crosshead speed of 0.5 mm/min, while the enamel and bracket-adhesive-enamel surfaces were examined with a stereomicroscope for ARI score evaluation. Data were analyz
Verrucae vulgares (common warts) are frequently encountered. The present work is designed as an attempt to build a systematic procedure for treating warts with the carbon dioxide laser with regard to dose parameters, application parameters, and laser safety.
Patients and Methods: The study was performed in the department of dermatology at Al-Najaf Teaching Hospital in Najaf, Iraq. Forty-two patients completed the study and a follow-up period of 3 months. Recalcitrant and extensive warts were selected for the study. A carbon dioxide laser in continuous mode, in non-contact application, with a 1 mm spot size was used. The patients were divided into two groups. The first group of patients had 60 lesions divided into 6 equal groups, in whom we used different outputs a