In information security, fingerprint verification is one of the most widely used approaches for verifying human identity through a distinctive pattern. The verification process works by comparing a pair of fingerprint templates and measuring the similarity between them. Several research studies have utilized different techniques for the matching process, such as the fuzzy vault and image filtering approaches. Yet, these approaches still suffer from imprecise articulation of the distinctive patterns of the biometrics. Deep learning architectures such as the Convolutional Neural Network (CNN) have been extensively used for image processing and object detection tasks and have shown outstanding performance compared to traditional image filtering techniques. This paper aimed to utilize a specific CNN architecture, known as AlexNet, for the fingerprint-matching task. Using such an architecture, this study extracted the significant features of the fingerprint image, generated a key based on those biometric features, and stored it in a reference database. Then, using Cosine similarity and Hamming Distance measures, the testing fingerprints were matched against the reference. Using the FVC2002 database, the proposed method showed a False Acceptance Rate (FAR) of 2.09% and a False Rejection Rate (FRR) of 2.81%. Comparing these results against other studies that utilized traditional approaches such as the Fuzzy Vault demonstrated the efficacy of CNN for fingerprint matching and emphasized the usefulness of Cosine similarity and Hamming Distance for the matching step.
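To illustrate the matching step, the following minimal Python sketch (not the authors' implementation) assumes a feature vector has already been extracted for each fingerprint, for example from an AlexNet fully connected layer; it compares a probe against a stored reference with Cosine similarity and, after a simple binarization, with a normalized Hamming Distance. The 4096-dimensional size, the binarization rule, and the thresholds are illustrative assumptions.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def hamming_distance(a, b):
    """Normalized Hamming distance between two binarized keys (0.0 = identical)."""
    bits_a = (a > np.median(a)).astype(int)   # simple binarization rule (an assumption)
    bits_b = (b > np.median(b)).astype(int)
    return float(np.mean(bits_a != bits_b))

# Hypothetical 4096-dimensional features, e.g. taken from an AlexNet fc layer.
reference = np.random.rand(4096)                       # key stored in the reference database
probe     = reference + 0.01 * np.random.rand(4096)    # features of the testing fingerprint

# Accept the probe if it is close enough to the reference (thresholds are illustrative).
is_match = (cosine_similarity(probe, reference) > 0.95
            and hamming_distance(probe, reference) < 0.1)
print("match" if is_match else "no match")
```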
Recently, the spread of fake news and misinformation in most fields has had a wide resonance in societies. Combating this phenomenon and detecting misleading information manually is tedious, time-consuming, and impractical, so it is necessary to rely on artificial intelligence to solve this problem. As such, this study aims to use deep learning techniques to detect Arabic fake news based on an Arabic dataset called AraNews. This dataset contains news articles covering multiple fields such as politics, economy, culture, sports, and others. A Hybrid Deep Neural Network has been proposed to improve accuracy. This network focuses on the properties of both the Text-Convolution Neural …
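As a hedged illustration only, the sketch below shows a generic Text-CNN branch of the kind named above, assuming the news articles have been tokenized and padded to a fixed length; the vocabulary size, sequence length, and layer sizes are assumptions, and the second branch of the hybrid network is not described in the excerpt, so it is omitted.

```python
import tensorflow as tf

# Generic Text-CNN branch for binary fake/real classification (illustrative sizes).
VOCAB_SIZE, SEQ_LEN = 30000, 300

inputs = tf.keras.Input(shape=(SEQ_LEN,), dtype="int32")              # padded token ids
x = tf.keras.layers.Embedding(VOCAB_SIZE, 128)(inputs)                # word embeddings
x = tf.keras.layers.Conv1D(64, kernel_size=5, activation="relu")(x)   # n-gram feature maps
x = tf.keras.layers.GlobalMaxPooling1D()(x)                           # strongest response per filter
outputs = tf.keras.layers.Dense(1, activation="sigmoid")(x)           # fake vs. real
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```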
The experiment was conducted to test the sorting and grading of agricultural crops using image analysis technology. A locally factory-made studio, a cube shape with dimensions 50 * 75 * 75, and a camera with a charge-coupled device sensor were used. The studio was equipped with triple lighting (red - green - blue), and photos were taken in the studio to study the external characteristics of the fruits, namely damage, quality, and maturity, using image processing technology and an artificial neural network. The artificial neural network was used to predict damage; the regression value was 0.92062, the regression for quality was 0.97981, and the regression for maturity was 0.98654, by means of a regression scheme using the network. The Marr algorithm …
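A minimal sketch of the regression idea, not the study's actual network: an artificial neural network regressor is fit on hypothetical image-derived features to predict a graded score, and its R^2 plays the role of the regression values quoted above. The feature names, sizes, and data here are placeholders.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical training data: each row holds image-derived features of one fruit
# (e.g. mean R, G, B values and a texture measure); each target is a graded score
# for damage, quality, or maturity.  Real features and labels come from the studio images.
rng = np.random.default_rng(0)
X = rng.random((200, 4))                  # 200 fruits x 4 image features
y = 0.3 * X[:, 0] + 0.7 * X[:, 2]         # synthetic stand-in for the graded score

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X, y)

# R^2 on the training data plays the role of the regression value reported above.
print("regression score:", model.score(X, y))
```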
A Boltzmann machine neural network has been used to recognize Arabic speech. A fast Fourier transformation algorithm has been used to extract spectral features from an acoustic signal. The spectral feature size is reduced by a series of operations in order to make it suitable as input for the Boltzmann Machine Neural network, which is used as a recognizer for phonemes. A training set consisting of a number of Arabic phoneme representations is used to train the neural network. The neural network recognized Arabic. After training the Boltzmann Machine Neural network, the system …
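A small sketch of the spectral feature extraction described above, under stated assumptions: one speech frame is windowed, passed through the FFT, and its magnitude spectrum is averaged into a small number of bands to reduce the feature size before it is fed to the recognizer. The frame length, sampling rate, and band count are illustrative.

```python
import numpy as np

def spectral_features(frame, n_features=32):
    """Reduce one windowed speech frame to a small spectral feature vector via the FFT."""
    windowed = frame * np.hamming(len(frame))     # taper the frame edges
    magnitude = np.abs(np.fft.rfft(windowed))     # magnitude spectrum
    # Reduce the size by averaging neighbouring bins into n_features bands.
    bands = np.array_split(magnitude, n_features)
    return np.array([band.mean() for band in bands])

# Hypothetical 30 ms frame of speech sampled at 8 kHz (240 samples).
frame = np.random.randn(240)
features = spectral_features(frame)
print(features.shape)   # (32,) -> compact input vector for the recognizer network
```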
Improved Merging Multi Convolutional Neural Networks Framework of Image Indexing and Retrieval
The dependable and efficient identification of Qin seal script characters is pivotal in the discovery, preservation, and inheritance of the distinctive cultural values embodied by these artifacts. This paper uses histogram of oriented gradients (HOG) image features and an SVM model to develop a character recognition model for identifying partial and blurred Qin seal script characters. The model achieves accurate recognition on a small, imbalanced dataset. Firstly, a dataset of Qin seal script image samples is established, and Gaussian filtering is employed to remove image noise. Subsequently, the gamma transformation algorithm adjusts the image brightness and enhances the contrast between font structures and image backgrounds. …
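The HOG-plus-SVM pipeline can be sketched as follows, as a hypothetical illustration rather than the paper's implementation: each character image is denoised with a Gaussian filter, brightened with a gamma transform, described with HOG features, and classified with an SVM. The image size, filter parameters, and class-balancing choice are assumptions.

```python
import numpy as np
from skimage.filters import gaussian
from skimage.exposure import adjust_gamma
from skimage.feature import hog
from sklearn.svm import SVC

def preprocess_and_describe(image):
    """Denoise, adjust brightness, and extract a HOG descriptor from one character image."""
    denoised = gaussian(image, sigma=1.0)          # Gaussian filtering to remove noise
    adjusted = adjust_gamma(denoised, gamma=0.8)   # gamma transform to raise contrast
    return hog(adjusted, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))

# Hypothetical 64x64 grayscale character images with integer class labels.
rng = np.random.default_rng(0)
images = rng.random((40, 64, 64))
labels = rng.integers(0, 4, size=40)

X = np.array([preprocess_and_describe(img) for img in images])
clf = SVC(kernel="rbf", class_weight="balanced")   # balanced weights help a small, imbalanced set
clf.fit(X, labels)
print(clf.predict(X[:5]))
```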
Wireless Body Area Network (WBAN) is a technology that improves real-time patient health observation in hospitals, asylums, and especially at home. WBAN has grown in popularity in recent years due to its critical role and vast range of medical applications. Due to the sensitive nature of the patient information transmitted through the WBAN network, security is of paramount importance, and a high level of security is required to guarantee the safe movement of data between sensor nodes and the various WBAN networks. This research introduces a novel technique named Integrated Grasshopper Optimization Algorithm with Artificial Neural Network (IGO-ANN) for distinguishing trusted nodes in WBAN networks by means of a classification …
The present article examines groundwater quality for drinking purposes in Baghdad City based on the Water Quality Index (WQI). The data was collected from the Ministry of Water Resources of Baghdad and represents water samples drawn from 114 wells on the Al-Karkh and Al-Rusafa sides of Baghdad city. To determine the WQI, four water parameters were taken into consideration: (i) pH, (ii) chloride (Cl), (iii) sulfate (SO4), and (iv) total dissolved solids (TDS). According to the computed WQI, the distribution of the groundwater samples with respect to their quality classes, such as excellent, good, poor, very poor, and unfit for human drinking purposes, was found to be …
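For illustration, the sketch below computes a weighted arithmetic WQI from the four parameters listed above; the permissible limits, ideal values, and class boundaries are common textbook values and are assumptions, since the exact scheme used in the study is not reproduced in this excerpt.

```python
# Weighted arithmetic WQI sketch.  Standard limits, ideal values, and class
# boundaries below are common textbook values (assumptions, not the study's own).
standards = {"pH": 8.5, "Cl": 250.0, "SO4": 250.0, "TDS": 500.0}   # permissible limits (mg/L; pH unitless)
ideal     = {"pH": 7.0, "Cl": 0.0,   "SO4": 0.0,   "TDS": 0.0}     # ideal values

def wqi(sample):
    """Weighted arithmetic water quality index for one well sample."""
    k = 1.0 / sum(1.0 / s for s in standards.values())              # proportionality constant
    index = 0.0
    for p, measured in sample.items():
        w = k / standards[p]                                         # unit weight of parameter p
        q = 100.0 * (measured - ideal[p]) / (standards[p] - ideal[p])  # quality rating
        index += w * q                                               # sum of w*q (sum of w equals 1)
    return index

sample = {"pH": 7.6, "Cl": 180.0, "SO4": 210.0, "TDS": 640.0}        # hypothetical well
value = wqi(sample)
grade = ("excellent" if value < 50 else "good" if value < 100 else
         "poor" if value < 200 else "very poor" if value < 300 else "unfit for drinking")
print(round(value, 1), grade)
```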
The paper proposes a methodology for predicting packet flow at the data plane in a smart SDN based on an intelligent controller using spiking neural networks (SNN). This methodology is applied to predict the subsequent step of the packet flow, consequently reducing the congestion that might occur. In the proposed model, the centralized controller acts as a reactive controller for managing the cluster-head process in the Software Defined Network data layer. The simulation results show the capability of the Spiking Neural Network controller in the SDN control layer to improve the quality of service (QoS) of the whole network in terms of minimizing the packet loss ratio and increasing the buffer utilization ratio.
Recently, new concepts such as free data or Volunteered Geographic Information (VGI) have emerged with Web 2.0 technologies. OpenStreetMap (OSM) is one of the most representative projects of this trend. Geospatial data from different sources often have variable accuracy levels due to different data collection methods; therefore, the most concerning problem with OSM is its unknown quality. This study aims to develop a specific tool, written in the Matlab programming language, which can analyze and assess the possible matching of OSM road features with a reference dataset. This tool was applied to two different study areas in Iraq (Baghdad and Karbala) in order to verify whether the OSM data has the same quality in both study areas. This program, in general, consists …
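Although the original tool is written in Matlab, the idea of checking whether an OSM road matches a reference road can be sketched in Python with a simple buffer-overlap test; the tolerance, geometries, and acceptance threshold below are illustrative assumptions, not the tool's actual logic.

```python
from shapely.geometry import LineString

def buffer_match_ratio(osm_road, ref_road, tolerance=10.0):
    """Fraction of the OSM road lying inside a buffer around the reference road."""
    buffered_ref = ref_road.buffer(tolerance)                # tolerance in map units (e.g. metres)
    return osm_road.intersection(buffered_ref).length / osm_road.length

# Hypothetical road centrelines in a projected coordinate system.
reference = LineString([(0, 0), (100, 0), (200, 5)])
osm       = LineString([(0, 3), (100, 2), (200, 8)])

ratio = buffer_match_ratio(osm, reference)
print("matched" if ratio > 0.9 else "unmatched", round(ratio, 3))
```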
In data mining, classification is a form of data analysis that can be used to extract models describing important data classes. Two well-known algorithms used in data mining classification are the Backpropagation Neural Network (BNN) and Naïve Bayesian (NB) classifiers. This paper investigates the performance of these two classification methods using the Car Evaluation dataset. Two models were built, one for each algorithm, and the results were compared. Our experimental results indicated that the BNN classifier yields higher accuracy than the NB classifier but is less efficient, as it is time-consuming and difficult to analyze due to its black-box nature.
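A minimal sketch of such a comparison, using stand-in data rather than the actual Car Evaluation records: a backpropagation-style multilayer perceptron and a Naïve Bayesian classifier are trained on the same ordinal-encoded attributes and scored on a held-out split. The feature encoding, layer sizes, and synthetic label rule are assumptions.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.naive_bayes import CategoricalNB
from sklearn.metrics import accuracy_score

# Hypothetical stand-in for the Car Evaluation data: 6 ordinal-encoded categorical
# attributes (buying, maint, doors, persons, lug_boot, safety) and a class label.
rng = np.random.default_rng(0)
X = rng.integers(0, 4, size=(1000, 6))
y = (X.sum(axis=1) > 9).astype(int)                 # synthetic acceptability label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

bnn = MLPClassifier(hidden_layer_sizes=(20,), max_iter=1000, random_state=0)  # backpropagation network
nb  = CategoricalNB()                                                          # Naive Bayesian classifier

for name, model in [("BNN", bnn), ("NB", nb)]:
    model.fit(X_tr, y_tr)
    print(name, "accuracy:", round(accuracy_score(y_te, model.predict(X_te)), 3))
```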