Face recognition and identity verification are now critical components of modern security and verification technology. The main objective of this review is to identify the deep learning techniques that have most improved the accuracy and reliability of facial recognition systems, and to highlight open problems and promising directions for future research. An extensive literature review was conducted using leading scientific databases, including IEEE Xplore, ScienceDirect, and SpringerLink, covering studies published between 2015 and 2024. The studies of interest apply deep neural networks, such as CNN-, Siamese-, and Transformer-based models, to face recognition and identity verification systems. Across these studies, deep learning-based approaches consistently improve recognition accuracy under diverse environmental and demographic conditions. Anti-spoofing and liveness-detection features integrated into such systems have likewise strengthened security against advanced attacks, including 3D masks, forged images and videos, and deepfakes. Future trends point to the need for multimodal and interpretable deep learning models and for learning strategies that cope with limited data, while adhering to legal and ethical frameworks that ensure fairness and transparency.
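Since the review centers on Siamese-style verification networks, a minimal sketch may help fix ideas. The architecture, embedding size, and decision threshold below are illustrative assumptions, not a system from the surveyed literature:

```python
# Minimal sketch of Siamese-style face verification in PyTorch (hypothetical
# architecture and threshold; not any specific system from the reviewed studies).
import torch
import torch.nn as nn
import torch.nn.functional as F

class EmbeddingNet(nn.Module):
    """Small CNN that maps a face crop to a unit-length embedding."""
    def __init__(self, dim=128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(64, dim)

    def forward(self, x):
        z = self.features(x).flatten(1)
        return F.normalize(self.fc(z), dim=1)  # L2-normalize for cosine similarity

def verify(net, img_a, img_b, threshold=0.5):
    """Declare 'same identity' when embedding cosine similarity exceeds the threshold."""
    with torch.no_grad():
        sim = F.cosine_similarity(net(img_a), net(img_b))
    return sim > threshold

net = EmbeddingNet()
a, b = torch.randn(1, 3, 112, 112), torch.randn(1, 3, 112, 112)
print(verify(net, a, b))
```

Both inputs pass through the same shared network, so verification reduces to comparing two embeddings against a tuned threshold.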
Document clustering is the process of organizing an electronic corpus of documents into subgroups with similar text features. Conventional algorithms were long the standard approach; recent efforts instead enhance clustering performance with evolutionary algorithms, making this an emerging topic that has attracted growing attention. The aim of this paper is to present an up-to-date, self-contained review devoted entirely to document clustering via evolutionary algorithms. It first provides a comprehensive inspection of the document clustering model, revealing its various components and related concepts. It then presents and analyzes the principal research work …
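To make the evolutionary approach concrete, here is a minimal sketch of a genetic algorithm clustering TF-IDF document vectors. The toy corpus, fitness function, and GA settings are illustrative assumptions, not any specific surveyed method:

```python
# Minimal sketch of evolutionary document clustering: a genetic algorithm searches
# over cluster assignments of TF-IDF vectors (toy corpus and parameters).
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer

docs = ["cats purr and meow", "dogs bark loudly", "kittens and cats nap",
        "puppies and dogs play", "stock markets fell", "shares and markets rose"]
X = TfidfVectorizer().fit_transform(docs).toarray()
k, pop_size, gens = 3, 30, 200
rng = np.random.default_rng(0)

def fitness(assign):
    # Higher is better: negative sum of distances to each cluster centroid.
    total = 0.0
    for c in range(k):
        members = X[assign == c]
        if len(members):
            total -= np.linalg.norm(members - members.mean(axis=0), axis=1).sum()
    return total

pop = rng.integers(0, k, size=(pop_size, len(docs)))
for _ in range(gens):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-pop_size // 2:]]   # truncation selection
    children = parents.copy()
    mask = rng.random(children.shape) < 0.1              # mutate ~10% of genes
    children[mask] = rng.integers(0, k, size=mask.sum())
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print(best)  # cluster label per document
```

Real systems in this line of work add crossover operators and more elaborate fitness measures, but the assignment-encode/select/mutate loop is the core idea.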
Today, with the increasing use of social media, many researchers have become interested in topic extraction from Twitter. Tweets are short, unstructured, and noisy, which makes deriving topics from them difficult. Topic modeling algorithms such as Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) were originally designed to derive topics from long documents such as articles and books, and they are often less effective when applied to short texts like tweets. Fortunately, Twitter offers many features that capture interaction between users; in particular, tweets carry rich user-generated hashtags that act as keywords. In this paper, we exploit the hashtag feature to improve the topics learned …
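A minimal sketch of the hashtag-pooling idea: tweets sharing a hashtag are merged into one longer pseudo-document before running LDA, so the model sees more co-occurrence context. The tweets and parameters below are made up for illustration and do not reproduce the paper's exact pipeline:

```python
# Hashtag pooling before LDA: merge tweets sharing a hashtag into pseudo-documents.
import re
from collections import defaultdict
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

tweets = ["great match tonight #football", "what a goal #football",
          "new phone looks sleek #tech", "battery life is amazing #tech"]

# Pool tweets by hashtag into pseudo-documents.
pools = defaultdict(list)
for t in tweets:
    for tag in re.findall(r"#(\w+)", t):
        pools[tag].append(re.sub(r"#\w+", "", t).strip())
pseudo_docs = [" ".join(v) for v in pools.values()]

X = CountVectorizer(stop_words="english").fit_transform(pseudo_docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
print(lda.transform(X))  # topic mixture of each hashtag pool
```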
Progress in computer networks and the emergence of new technologies in this field have produced new protocols and frameworks that deliver network-based services. E-government services, a modernized version of conventional government, are created through the steady evolution of technology and the growing need of societies for numerous services. Government services are deeply tied to citizens' daily lives; it is therefore important to keep pace with technological developments and to move from traditional methods of managing government work to cutting-edge technical approaches that improve the effectiveness of government systems in serving citizens. Blockchain technology is among …
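For readers unfamiliar with the mechanism, a toy sketch of the hash-chaining idea underlying blockchain-backed records follows. The record fields are hypothetical and this is nothing like a production e-government design; it only shows why tampering is detectable:

```python
# Toy hash chain: each block commits to the previous block's hash, so altering
# any earlier record invalidates everything after it. Fields are hypothetical.
import hashlib, json, time

def make_block(data, prev_hash):
    block = {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}
    payload = {k: block[k] for k in ("timestamp", "data", "prev_hash")}
    block["hash"] = hashlib.sha256(
        json.dumps(payload, sort_keys=True).encode()).hexdigest()
    return block

chain = [make_block("genesis", "0" * 64)]
chain.append(make_block({"citizen_id": "A-1001", "service": "license renewal"},
                        chain[-1]["hash"]))

def valid(chain):
    return all(chain[i]["prev_hash"] == chain[i - 1]["hash"]
               for i in range(1, len(chain)))

print(valid(chain))  # True; flipping any field breaks verification
```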
The coming era demands moving beyond manual work toward machine-assisted operation, and the Brain-Computer Interface (BCI) provides the necessary pathway. As the article's title suggests, a BCI is a channel between the signals produced by the thinking human brain and a computer, which translates the transmitted signals into actions. Brain activity for BCI is typically measured using electroencephalography (EEG). This article aims to provide an accessible and up-to-date review of EEG-based BCI, concentrating on its technical aspects. Specifically, we present the essential neuroscience background for building an EEG-based BCI, including how to evaluate which signal processing, software, and hardware techniques to use. Individu…
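As one concrete example of the signal-processing step, the sketch below computes a standard EEG-based BCI feature, band power in the 8-12 Hz mu/alpha band, from a synthetic signal. The sampling rate and band edges are illustrative assumptions; real BCIs add filtering, artifact removal, and a classifier:

```python
# Band-power feature extraction from a synthetic EEG trace via Welch's PSD.
import numpy as np
from scipy.signal import welch

fs = 250                      # assumed sampling rate in Hz
t = np.arange(0, 4, 1 / fs)   # 4-second analysis window
# Synthetic "EEG": a 10 Hz rhythm buried in noise.
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.default_rng(0).standard_normal(t.size)

freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)
band = (freqs >= 8) & (freqs <= 12)
mu_power = np.trapz(psd[band], freqs[band])   # integrate PSD over the band
print(f"8-12 Hz band power: {mu_power:.3f}")
```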
In this paper, a cognitive system is proposed, based on a nonlinear neural controller and intelligent algorithms, that guides an autonomous mobile robot through continuous path tracking and navigation with avoidance of solid obstacles. The goal of the proposed structure is to plan and track the reference path equation for the autonomous mobile robot in a mining environment, avoiding obstacles and reaching the target position using intelligent optimization algorithms. Particle Swarm Optimization (PSO) and Artificial Bee Colony (ABC) algorithms are used to solve the robot's navigation problems in the mine by searching for optimal paths and deriving the reference path equation of the optimal …
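A minimal PSO sketch of the kind used for such path searches follows. The cost function, obstacle layout, and swarm constants are illustrative assumptions, not the paper's actual formulation:

```python
# PSO toy example: particles search for a 2-D waypoint close to the goal while
# a penalty term keeps them away from an obstacle. All constants are illustrative.
import numpy as np

rng = np.random.default_rng(1)
goal, obstacle = np.array([9.0, 9.0]), np.array([5.0, 5.0])

def cost(p):
    # Distance to goal, plus a large penalty when too close to the obstacle.
    penalty = 50.0 if np.linalg.norm(p - obstacle) < 1.0 else 0.0
    return np.linalg.norm(p - goal) + penalty

n, iters, w, c1, c2 = 20, 100, 0.7, 1.5, 1.5   # swarm size, iterations, inertia, pulls
pos = rng.uniform(0, 10, (n, 2))
vel = np.zeros((n, 2))
pbest, pbest_cost = pos.copy(), np.array([cost(p) for p in pos])
gbest = pbest[pbest_cost.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((n, 2)), rng.random((n, 2))
    # Velocity update: inertia + pull toward personal best + pull toward global best.
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    costs = np.array([cost(p) for p in pos])
    improved = costs < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
    gbest = pbest[pbest_cost.argmin()].copy()

print(gbest)  # best waypoint found, steering clear of the obstacle
```

Full path planning repeats this search over a sequence of waypoints or over the coefficients of a path equation; ABC follows the same evaluate-and-improve pattern with a different update rule.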
Nanoparticles are organic or inorganic structures of matter with at least one dimension smaller than 100 nm. They have proven effective in many fields thanks to their unique physicochemical properties. In the energy field, nanoparticles help reduce environmental pollution, making them environmentally friendly materials. They can be used in many parts of a battery, including the anode, cathode, and electrolyte. This study reviews the different types of nanoparticles used in lithium-ion batteries, collecting the advanced techniques for applying nanotechnology in batteries. In addition, the review outlines the advantages and disadvantages …