In information security, fingerprint verification is one of the most common approaches for verifying human identity through a distinctive pattern. The verification process works by comparing a pair of fingerprint templates and measuring the similarity between them. Several studies have applied different techniques to the matching process, such as fuzzy vault and image filtering approaches, yet these approaches still suffer from imprecise articulation of the biometric's distinctive patterns. Deep learning architectures such as the Convolutional Neural Network (CNN) have been used extensively for image processing and object detection tasks and have shown outstanding performance compared with traditional image filtering techniques. This paper utilizes a specific CNN architecture, AlexNet, for the fingerprint-matching task. Using this architecture, the study extracts the significant features of the fingerprint image, generates a key based on those biometric features, and stores it in a reference database. Testing fingerprints are then matched against a reference using Cosine similarity and Hamming Distance measures. On the FVC2002 database, the proposed method achieved a False Acceptance Rate (FAR) of 2.09% and a False Rejection Rate (FRR) of 2.81%. Comparing these results against studies that used traditional approaches such as the fuzzy vault demonstrates the efficacy of CNN features for fingerprint matching, and underlines the usefulness of Cosine similarity and Hamming Distance for the matching step.
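The matching step can be sketched with toy feature vectors. The `match` routine, the thresholds, and the median-based key binarisation below are illustrative assumptions, not the paper's exact pipeline (which extracts its features with AlexNet):

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity between two feature vectors, in [-1, 1]
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def binary_key(features):
    # Binarise a feature vector around its median to form a key,
    # a simplified stand-in for the paper's key-generation step
    return (features > np.median(features)).astype(np.uint8)

def hamming_distance(k1, k2):
    # Fraction of differing bits between two binary keys
    return float(np.mean(k1 != k2))

def match(features, reference, cos_thresh=0.9, ham_thresh=0.1):
    # Accept when the vectors are similar AND their keys differ little
    ok_cos = cosine_similarity(features, reference) >= cos_thresh
    ok_ham = hamming_distance(binary_key(features), binary_key(reference)) <= ham_thresh
    return ok_cos and ok_ham

rng = np.random.default_rng(0)
ref = rng.normal(size=256)                      # stand-in for a stored feature vector
probe = ref + rng.normal(scale=0.05, size=256)  # same finger, slight sensor noise
impostor = rng.normal(size=256)                 # unrelated finger
print(match(probe, ref))       # genuine pair
print(match(impostor, ref))    # impostor pair
```

In this toy setup the genuine pair passes both tests while the impostor fails the cosine test; the FAR/FRR trade-off reported in the paper corresponds to tuning the two thresholds.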
Two modes of electrochemical harvesting for microalgae were investigated in the current work. A sacrificial anode (aluminum) was used to study the electrocoagulation-flotation process, and a non-sacrificial anode (graphite) was used to investigate the electroflotation process. The study examined the effect of chloride ion concentration and interelectrode distance on the performance of the electrochemical harvesting processes. The results demonstrated that both electrodes achieved maximum harvesting efficiency at a 2 g/L NaCl concentration. Interestingly, increasing the NaCl concentration to 5 g/L reduced the harvesting efficiency dramatically to its lowest value. In general, the energy consumption decreased with increasing
Cyber-attacks keep growing, and stronger methods of protecting images are needed. This paper presents DGEN, a Dynamic Generative Encryption Network that combines Generative Adversarial Networks with a context-adaptive key system. Unlike a fixed scheme such as AES, the method can potentially adjust itself as new threats appear, aiming to resist brute-force, statistical, and quantum attacks. The design introduces randomness, uses learning, and derives keys that depend on each image, which provides strong security and flexibility while keeping computational cost low. Tests were run on several public image datasets, and the results show that DGEN outperforms AES, chaos-based schemes, and other GAN-based approaches. Entropy reached 7.99 bits per pixel
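The entropy figure cited above can be reproduced for any 8-bit image with a short Shannon-entropy computation; the images below are synthetic stand-ins for real ciphertexts, not data from the paper:

```python
import numpy as np

def image_entropy(img):
    # Shannon entropy of an 8-bit image in bits per pixel;
    # a well-encrypted image approaches the maximum of 8 bits
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]                       # ignore empty histogram bins
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(1)
cipher_like = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)  # uniform noise
flat = np.full((256, 256), 128, dtype=np.uint8)                      # constant image
print(image_entropy(cipher_like))   # close to 8 bits per pixel
print(image_entropy(flat))          # 0 for a constant image
```

An entropy of 7.99 bits per pixel therefore means the ciphertext histogram is nearly uniform, leaving little statistical structure for an attacker to exploit.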
This article proposes a new technique for determining the rate of contamination. First, a generative adversarial network (GAN) parallel-processing technique is constructed and trained using real and secret images. Then, after the model stabilizes, the real image is passed to the generator. Finally, the generator creates an image that is visually similar to the secret image, achieving the same effect as transmitting the secret image. Experimental results show that this technique improves the security of secret-information transmission and increases the information-hiding capacity. The signal-to-noise metric and the structural similarity index measure were used to assess the success of colour image-hiding techniques.
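As a sketch of how such quality metrics are computed, the snippet below implements PSNR and a single-window SSIM (the standard SSIM averages this over small local windows instead); the test images are synthetic assumptions, not the article's data:

```python
import numpy as np

def psnr(a, b, peak=255.0):
    # Peak signal-to-noise ratio in dB between two 8-bit images
    mse = np.mean((a.astype(float) - b.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(peak ** 2 / mse)

def ssim_global(a, b, peak=255.0):
    # SSIM evaluated over the whole image as one window; the
    # standard metric averages this statistic over local windows
    a, b = a.astype(float), b.astype(float)
    c1, c2 = (0.01 * peak) ** 2, (0.03 * peak) ** 2   # usual stabilising constants
    mu_a, mu_b = a.mean(), b.mean()
    va, vb = a.var(), b.var()
    cov = ((a - mu_a) * (b - mu_b)).mean()
    return ((2 * mu_a * mu_b + c1) * (2 * cov + c2)) / \
           ((mu_a ** 2 + mu_b ** 2 + c1) * (va + vb + c2))

rng = np.random.default_rng(2)
img = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
noisy = np.clip(img.astype(int) + rng.integers(-5, 6, size=img.shape),
                0, 255).astype(np.uint8)    # mild distortion
print(psnr(img, img))            # identical images -> infinite PSNR
print(psnr(img, noisy))          # mild distortion -> high PSNR (dB)
print(ssim_global(img, noisy))   # close to 1 for similar images
```

Higher PSNR and an SSIM near 1 between the cover and stego images indicate that the hiding step left the carrier visually unchanged.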
Research on the automated extraction of essential data from electrocardiography (ECG) recordings has long been a significant topic. The main focus of digital processing is to measure the fiducial points that determine the beginning and end of the P, QRS, and T waves based on their waveform properties. Unavoidable noise during ECG data collection and inherent physiological differences among individuals make it challenging to identify these reference points accurately, resulting in suboptimal performance. This is addressed through several primary stages based on preliminary processing of the ECG electrical signal through a set of steps (preparing raw data and converting them into files that
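A minimal sketch of one fiducial-point step, R-peak detection, is shown below on a synthetic signal; the sampling rate, signal model, threshold, and refractory period are illustrative assumptions, not the paper's actual pipeline:

```python
import numpy as np

fs = 250                        # assumed sampling rate in Hz
t = np.arange(0, 10, 1 / fs)    # ten seconds of synthetic signal
ecg = 0.1 * np.sin(2 * np.pi * 0.3 * t)            # baseline wander
beat_times = np.arange(0.5, 10, 0.8)               # regular 75 bpm rhythm
for bt in beat_times:
    ecg += np.exp(-((t - bt) ** 2) / (2 * 0.01 ** 2))   # narrow R-like spikes
ecg += 0.02 * np.random.default_rng(3).normal(size=t.size)  # measurement noise

def detect_r_peaks(sig, fs, thresh=0.5, refractory=0.3):
    # Mark samples above threshold that are local maxima, then enforce
    # a refractory period so each QRS complex yields one fiducial point
    peaks, last = [], -np.inf
    for i in range(1, sig.size - 1):
        if sig[i] > thresh and sig[i] >= sig[i - 1] and sig[i] >= sig[i + 1]:
            if i / fs - last >= refractory:
                peaks.append(i)
                last = i / fs
    return np.array(peaks)

peaks = detect_r_peaks(ecg, fs)
print(len(peaks), len(beat_times))   # one detection per simulated beat
```

Once the R peaks are fixed, the onsets and offsets of the P, QRS, and T waves are typically searched for in windows positioned relative to each R peak.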
Research on the role of organizational change in easing organizational conflict is an important and relatively modern topic with a significant impact on the future of organizations. This study therefore set out to identify the relationship and the impact of the dimensions of organizational change (technological, organizational structure, human resources, and change in tasks) on organizational conflict at the Earthlink Iraq company. To reach the goals of the research, a questionnaire was developed and distributed to a random sample of (100) employees comprising managers, heads of departments and divisions, and staff at the Earthlink Iraq company. The study found: the
The continuous advancement of IoT use has greatly transformed industries, but at the same time it has made IoT networks vulnerable to highly advanced cybercrime. Traditional security measures for IoT have several limitations, and protecting distributed, adaptive IoT systems requires new approaches. This research presents novel threat intelligence for IoT networks based on deep learning that maintains compliance with IEEE standards. The goal of the study is to interweave artificial intelligence with standardization frameworks and thereby improve the identification of, protection against, and mitigation of cyber threats affecting IoT environments. The study is systematic and begins by examining IoT-specific threats
For several applications, it is very important to have an edge detection technique that matches human visual contour perception and is less sensitive to noise. The edge detection algorithm described in this paper is based on the results obtained by Maximum a Posteriori (MAP) and Maximum Entropy (ME) deblurring algorithms. The technique makes a trade-off between sharpening and smoothing the noisy image. One advantage of the described algorithm is that it is less sensitive to noise than the Marr and Geuen techniques, which are considered among the best edge detection algorithms in terms of matching human visual contour perception.
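The sharpen/smooth trade-off can be illustrated with a generic smooth-then-differentiate edge detector; this is a simplified stand-in, not the MAP/ME-based algorithm itself:

```python
import numpy as np

def gaussian_kernel(sigma):
    # 1-D Gaussian kernel, applied separably for smoothing
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    return k / k.sum()

def smooth(img, sigma):
    # Separable Gaussian smoothing: rows, then columns
    k = gaussian_kernel(sigma)
    out = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, out)

def gradient_magnitude(img):
    # Central-difference gradient magnitude (Sobel-like response)
    gx = np.zeros_like(img); gy = np.zeros_like(img)
    gx[:, 1:-1] = img[:, 2:] - img[:, :-2]
    gy[1:-1, :] = img[2:, :] - img[:-2, :]
    return np.hypot(gx, gy)

# A noisy vertical step edge: smoothing first suppresses the noise
# response while the broader true edge remains detectable
rng = np.random.default_rng(4)
img = np.zeros((64, 64)); img[:, 32:] = 1.0
noisy = img + 0.2 * rng.normal(size=img.shape)
edges = gradient_magnitude(smooth(noisy, sigma=1.5))
print(edges[:, 30:34].mean(), edges[:, :20].mean())  # edge response vs. noise floor
```

Raising `sigma` suppresses noise further but blurs (widens) the detected edge, which is exactly the trade-off the described algorithm negotiates.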
This study aimed at an analytical comparison of the Internal Auditing Standards issued by the Institute of Internal Auditors (IIA) and the Guidance Manual for Audit Units issued by the Federal Audit Bureau, to show the compatibility and differences between them and the possibility of applying the IIA standards to economic units in Iraq. The guideline was generally not adopted by all the internal audit units. There is a lack of keeping pace with changes in internal auditing at the international level, and a need to strengthen the Guideline with the Internal Auditing Standards (IIA), which are characterized by the preparation of an internal document containing the objectives, powers, and responsibilities of internal audit work, as well as
Most recent studies have focused on using modern intelligent techniques, especially those developed in intrusion detection systems (IDS). Such techniques are built on modern artificial-intelligence modules that act like a human brain and should therefore be able to learn and to recognize what they have learned. The importance of developing such systems grew out of the requests of customers and establishments to protect their property and avoid damage by intruders, which an intelligent module that raises the correct alarm would provide. Thus, an interior visual intruder detection module based on Multi-Connect Architecture Associative Memory (MCA) is proposed.
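As a simplified stand-in for the MCA module's learn-and-recall behaviour, the classical Hopfield associative memory below stores bipolar patterns and recovers a stored pattern from a corrupted probe; the sizes, data, and update rule are all illustrative assumptions, not the MCA design:

```python
import numpy as np

def train_hopfield(patterns):
    # Hebbian weight matrix storing bipolar (+1/-1) patterns
    n = patterns.shape[1]
    w = np.zeros((n, n))
    for p in patterns:
        w += np.outer(p, p)
    np.fill_diagonal(w, 0)          # no self-connections
    return w / patterns.shape[0]

def recall(w, probe, steps=10):
    # Synchronous updates until the state settles on a stored pattern
    s = probe.copy()
    for _ in range(steps):
        s = np.where(w @ s >= 0, 1, -1)
    return s

rng = np.random.default_rng(5)
stored = rng.choice([-1, 1], size=(3, 100))     # three learned "scene signatures"
noisy = stored[0].copy()
flip = rng.choice(100, size=10, replace=False)  # corrupt 10% of the bits
noisy[flip] *= -1
recalled = recall(train_hopfield(stored), noisy)
print((recalled == stored[0]).mean())   # overlap with the stored pattern
```

The point of the analogy: an associative memory recognizes a learned scene even from a degraded observation, which is what lets the visual module raise an alarm only for genuine intrusions.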
The traditional centralized approach to network management presents severe efficiency and scalability limitations in large-scale networks. Data collection and analysis typically involve huge transfers of management data to the manager, which consume considerable network throughput and create bottlenecks at the manager side. These problems are addressed here using agent technology to distribute the management functionality over the network elements. The proposed system consists of a server agent that works together with client agents to monitor the logging (off, on) of the client computers and which user is working on each. A file-system-watcher mechanism is used to indicate any change in files. The results were presented
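A minimal sketch of a file-system-watcher mechanism is shown below, implemented here as snapshot polling with the Python standard library (an assumption for illustration; the actual system may rely on OS change notifications instead):

```python
import os, tempfile

def snapshot(path):
    # Map every file under `path` to its last-modified time
    state = {}
    for root, _, files in os.walk(path):
        for name in files:
            fp = os.path.join(root, name)
            state[fp] = os.stat(fp).st_mtime
    return state

def diff(old, new):
    # Classify changes between two snapshots
    created = [p for p in new if p not in old]
    deleted = [p for p in old if p not in new]
    modified = [p for p in new if p in old and new[p] != old[p]]
    return created, deleted, modified

with tempfile.TemporaryDirectory() as d:
    before = snapshot(d)                      # baseline: empty directory
    target = os.path.join(d, "audit.log")
    with open(target, "w") as f:
        f.write("client01 logged on\n")       # a client agent records an event
    created, deleted, modified = diff(before, snapshot(d))
    print(created, deleted, modified)
```

A client agent running such a loop can report created, deleted, or modified files to the server agent instead of shipping raw management data across the network.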