One of the most popular and legally recognized behavioral biometrics is the individual's signature, used for verification and identification across many industries, including business, law, and finance. The purpose of signature verification is to distinguish genuine from forged signatures, a task complicated by cultural and personal variation. In forensic handwriting analysis, handwriting features are analyzed, compared, and evaluated to establish whether or not the writing was produced by a known writer. Unlike many other scripts, Arabic makes use of diacritics, ligatures, and overlaps that are unique to it. The absence of dynamic information in offline Arabic signatures makes it harder to attain high verification accuracy. Moreover, the characteristics of Arabic signatures are not well defined and are subject to a great deal of variation (feature uncertainty). To address this issue, the proposed work offers a novel method of verifying offline Arabic signatures that employs two layers of verification, as opposed to the single level employed by prior attempts or the many classifiers based on statistical learning theory. A static set of signature features is used for layer-one verification. The output of a neutrosophic logic module is used for layer-two verification, with accuracy depending on the signature characteristics used in the training dataset and on three membership functions that are unique to each signer, based on the degrees of truth, indeterminacy, and falsity of the signature features. The three memberships of the neutrosophic set are more expressive for decision-making than those of fuzzy sets. The developed model is intended to account for several kinds of uncertainty in describing Arabic signatures, including ambiguity, inconsistency, redundancy, and incompleteness.
The experimental results show that the verification system works as intended and successfully reduces both the false acceptance rate (FAR) and the false rejection rate (FRR).
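As an illustration of how a layer-two decision of this kind could work, the sketch below maps a feature value to truth, indeterminacy, and falsity degrees and aggregates them into an accept/reject decision. The feature names, thresholds, and aggregation rule are illustrative assumptions, not the parameters of the proposed system.

```python
# Sketch of a neutrosophic-membership verification step. The features,
# thresholds, and aggregation rule are illustrative assumptions.

def neutrosophic_memberships(value, mean, spread):
    """Map a feature value to (truth, indeterminacy, falsity) degrees
    relative to a signer's enrolled mean and spread."""
    deviation = abs(value - mean) / spread
    truth = max(0.0, 1.0 - deviation)               # near the enrolled mean
    falsity = min(1.0, max(0.0, deviation - 1.0))   # far outside the spread
    indeterminacy = 1.0 - truth - falsity if truth + falsity < 1.0 else 0.0
    return truth, indeterminacy, falsity

def verify(features, profile, t_min=0.6, f_max=0.2):
    """Layer-two check: average memberships across features and decide."""
    n = len(features)
    t = sum(neutrosophic_memberships(v, *profile[k])[0]
            for k, v in features.items()) / n
    f = sum(neutrosophic_memberships(v, *profile[k])[2]
            for k, v in features.items()) / n
    return t >= t_min and f <= f_max

# Hypothetical enrolled profile: feature -> (mean, spread)
profile = {"aspect_ratio": (1.8, 0.3), "stroke_density": (0.42, 0.08)}

print(verify({"aspect_ratio": 1.75, "stroke_density": 0.45}, profile))  # True
print(verify({"aspect_ratio": 3.0, "stroke_density": 0.9}, profile))    # False
```

The point of the three degrees is that a questioned signature can simultaneously score low on truth and low on falsity, leaving the indeterminacy component to flag features that are genuinely ambiguous rather than forcing a binary fuzzy judgment.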
At the level of both individuals and organizations, Wireless Sensor Networks (WSNs) have a wide range of applications and uses. Sensors are used in many industries, including agriculture, transportation, health, and more. Many technologies, such as wireless communication protocols, the Internet of Things, cloud computing, mobile computing, and other emerging technologies, are connected to the use of sensors. In many circumstances, this interaction involves the transmission of critical data, which must therefore be protected from potential threats. However, as WSN components often have constrained computation and power capabilities, protecting the communication in WSNs comes at a significant performance penalty.
Free-Space Optical (FSO) communication can provide high-speed links when the effect of turbulence is not severe, and Space-Time-Block-Code (STBC) techniques are a good candidate to mitigate that severity. This paper proposes a hybrid of Optical Code Division Multiple Access (OCDMA) and STBC in FSO communication for last-mile solutions, where access to remote areas is complicated. The main weakness affecting an FSO link is atmospheric turbulence, and STBC is employed within OCDMA to mitigate its effects. The current work evaluates the Bit-Error-Rate (BER) performance of OCDMA operating under the scintillation effect, which can be described by the gamma-gamma model. The most obvious finding to emerge from the analysis
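The gamma-gamma model mentioned above treats the received irradiance as the product of two independent gamma-distributed fluctuations (large- and small-scale turbulence eddies). The Monte-Carlo sketch below uses illustrative alpha and beta values, not the paper's link parameters, and checks the empirical scintillation index against its closed form 1/α + 1/β + 1/(αβ).

```python
import random

# Monte-Carlo sketch of the gamma-gamma turbulence model: irradiance I
# is the product of two independent unit-mean gamma variates.
# alpha, beta below are illustrative moderate-turbulence values.

def gamma_gamma_sample(alpha, beta, rng):
    x = rng.gammavariate(alpha, 1.0 / alpha)  # large-scale factor, mean 1
    y = rng.gammavariate(beta, 1.0 / beta)    # small-scale factor, mean 1
    return x * y

rng = random.Random(42)
alpha, beta = 4.0, 2.0
samples = [gamma_gamma_sample(alpha, beta, rng) for _ in range(200_000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)

# Theoretical scintillation index for the gamma-gamma model:
si_theory = 1 / alpha + 1 / beta + 1 / (alpha * beta)
print(f"empirical SI ~ {var / mean**2:.3f}, theory {si_theory:.3f}")
```

A higher scintillation index means deeper irradiance fades, which is exactly the regime where diversity schemes such as STBC pay off.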
Technological advances have yielded new molecular biology-based methods for the diagnosis of infectious diseases. The newest and most powerful molecular diagnostic tests are available at regional and national reference laboratories, as well as at specialized centers that are certified to conduct metagenomic testing. Metagenomic assays utilize advances in DNA extraction technology, DNA sequence library construction, high throughput DNA sequencing and automated data analysis to identify millions of individual strands of DNA extracted from clinical samples. At present, metagenomic assays are only possible at a small number of special research, academic and commercial laboratories. Continued research in human and path
AZ Khalaf, M kassim Haidir, LK Jasim, Iraqi Journal of Science, 2012
This research addresses hardening and insulating petroleum pipes against erosion under different environmental conditions. An epoxy base material was mixed with a ceramic nano-zirconia reinforcement (35 nm particle size) at percentages of (0, 1, 2, 3, 4, 5)%. Sections of broken petroleum pipe served as the painting substrate and were cut into samples of 2 cm × 2 cm and 0.3 cm thickness. After the paint mixtures were prepared at each percentage, the samples were immersed in the paint. The micro-hardness was then measured by the Vickers method, a thermal inspection of the coating (thermal conduction, thermal flux, and thermal diffusivity) was carried out, and the density of the painted samples was calculated
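The quantities named above follow standard formulas: Vickers hardness HV = 1.854·F/d² (load F in kgf, mean indentation diagonal d in mm), and thermal diffusivity α = k/(ρ·c_p). The snippet below evaluates both with made-up input values, not the paper's measurements.

```python
# Standard formulas behind the reported measurements; the numeric
# inputs below are illustrative, not data from the study.

def vickers_hardness(load_kgf, diag_mm):
    """Vickers micro-hardness: HV = 1.854 * F / d^2."""
    return 1.854 * load_kgf / diag_mm ** 2

def thermal_diffusivity(k, rho, cp):
    """Diffusivity alpha = k / (rho * cp), in m^2/s for SI inputs."""
    return k / (rho * cp)

# e.g. a 0.3 kgf load leaving a 0.05 mm mean diagonal
print(f"HV = {vickers_hardness(0.3, 0.05):.0f}")
# e.g. k = 0.2 W/m.K, density 1200 kg/m^3, specific heat 1000 J/kg.K
print(f"alpha = {thermal_diffusivity(0.2, 1200.0, 1000.0):.2e} m^2/s")
```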
Rap songs often feature artists who use explicit language to convey feelings such as happiness, sorrow, and anger, reflecting audience expectations and trends within the music industry. This study conducts a socio-pragmatic analysis of explicit, derogatory, and offensive language in the songs of the American artist Doja Cat, employing Hughes’ (1996) theory of swearing, Jay’s (1996) theory of taboo words, Luhr’s (2002) classification of social factors for the sociolinguistic examination, Salager’s (1997) categories of hedges for the pragmatic assessment, and Austin’s (1965, 1989) theory of speech acts. The researchers collected the data using the AntConc corpus analysis tool. The data shows the singer’s frequent use
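A basic version of the corpus step described above, counting how often target words occur in lyric text, similar to a simple AntConc word-frequency query, can be sketched as follows; the lyric line and word list are invented for illustration.

```python
import re
from collections import Counter

# Minimal word-frequency query over lyric text, in the spirit of a
# basic AntConc word list. The input text and targets are made up.

def word_frequencies(text, targets):
    tokens = re.findall(r"[a-z']+", text.lower())
    counts = Counter(tokens)
    return {w: counts[w] for w in targets}

lyrics = "I said I love it I love it I hate the hate"
print(word_frequencies(lyrics, ["love", "hate", "i"]))
```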
Plagiarism is becoming more of a problem in academia, made worse by the ease with which a wide range of resources can be found on the internet and then copied and pasted. It is academic theft, since the perpetrator has “taken” the work of others and presented it as his or her own. Manual detection of plagiarism by a human being is difficult, imprecise, and time-consuming, because it is impractical for anyone to compare a given work against all existing material. Plagiarism is a big problem in higher education, and it can happen on any topic. Plagiarism detection has been studied in many scientific articles, and methods for recognition have been created utilizing Plagiarism analysis, Authorship identification, and
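One common building block of automated plagiarism analysis is a lexical-overlap score between two texts; the sketch below uses cosine similarity over word counts. Real systems add stemming, n-gram matching, and authorship features; nothing here reflects a specific method from the cited literature.

```python
from collections import Counter
from math import sqrt

# Cosine similarity over raw word counts: a minimal lexical-overlap
# score of the kind plagiarism checkers build on. Example texts are
# invented for illustration.

def cosine_similarity(text_a, text_b):
    a = Counter(text_a.lower().split())
    b = Counter(text_b.lower().split())
    dot = sum(a[w] * b[w] for w in a)   # Counter returns 0 for missing words
    norm = sqrt(sum(v * v for v in a.values())) * \
           sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

original = "plagiarism is academic theft of the work of others"
copied = "plagiarism is academic theft of the ideas of others"
print(round(cosine_similarity(original, copied), 2))  # high overlap
```

A near-copy with one substituted word still scores close to 1.0, which is why lexical overlap alone is a first-pass filter rather than proof of plagiarism.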
Microcontrollers have recently seen increasing use in monitoring and data acquisition. This development has given rise to various architectures for deploying and interfacing microcontrollers in a network environment. Some existing architectures suffer from redundant resources, extra processing, high cost, and delayed response. This paper presents a flexible, concise architecture for building a distributed networked microcontroller system. The system consists of only one server, which works through the internet, and a set of microcontrollers distributed across different sites, each connected to the internet through Ethernet. In this system, a client requesting data from a certain site is served through just one server that is in
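The single-server idea described above can be sketched as a routing table that maps each site named in a client request to its microcontroller's address; the site names and IP addresses below are hypothetical, not part of the paper's deployment.

```python
# Sketch of single-server request routing: the server alone knows the
# site-to-microcontroller mapping, so clients need only one endpoint.
# Site names and addresses are hypothetical.

MICROCONTROLLERS = {
    "greenhouse": "192.168.1.10",
    "pump-station": "192.168.1.11",
}

def route_request(site, resource):
    """Resolve a client's (site, resource) request to the URL served
    by that site's Ethernet-connected microcontroller."""
    addr = MICROCONTROLLERS.get(site)
    if addr is None:
        raise KeyError(f"unknown site: {site}")
    return f"http://{addr}/{resource}"

print(route_request("greenhouse", "temperature"))
```

Keeping the mapping in one place is what lets the architecture avoid per-site gateways: adding a microcontroller means adding one table entry on the server, not reconfiguring clients.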