Data scarcity is a major challenge when training deep learning (DL) models. DL demands large amounts of data to achieve exceptional performance, yet many applications have too little or inadequate data to train DL frameworks. Manual labeling is usually needed to provide labeled data, and it typically involves human annotators with extensive domain knowledge; this annotation process is costly, time-consuming, and error-prone. Every DL framework must be fed a significant amount of labeled data to learn representations automatically, and, in general, more data yields a better DL model, although performance also depends on the application. This issue is the main barrier that leads many applications to forgo DL, since having sufficient data is the first step toward any successful and trustworthy DL application. This paper presents a holistic survey of state-of-the-art techniques for training DL models under three challenges: small datasets, imbalanced datasets, and lack of generalization. The survey starts by listing the learning techniques, then introduces the types of DL architectures. After that, state-of-the-art solutions to the lack of training data are presented, including Transfer Learning (TL), Self-Supervised Learning (SSL), Generative Adversarial Networks (GANs), Model Architecture (MA), Physics-Informed Neural Networks (PINNs), and the Deep Synthetic Minority Oversampling Technique (DeepSMOTE). These solutions are followed by practical tips on data acquisition prior to training, as well as recommendations for ensuring the trustworthiness of the training dataset. The survey ends with a list of applications that suffer from data scarcity, and for each one it proposes several alternatives for generating more data, covering Electromagnetic Imaging (EMI), Civil Structural Health Monitoring, Medical Imaging, Meteorology, Wireless Communications, Fluid Mechanics, Microelectromechanical Systems, and Cybersecurity. To the best of the authors' knowledge, this is the first review that offers a comprehensive overview of strategies for tackling data scarcity in DL.
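As an illustration of the first solution listed above, Transfer Learning, here is a minimal sketch assuming a PyTorch/torchvision environment; the backbone choice, class count, and dummy batch are placeholders for illustration, not details taken from the survey.

```python
# Minimal transfer-learning sketch for a small labeled dataset (illustrative only).
# Assumes torch and torchvision are installed; class count and batch are placeholders.
import torch
import torch.nn as nn
from torchvision import models

num_classes = 5  # hypothetical number of target classes

# Start from a backbone pretrained on a large source dataset (ImageNet).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pretrained feature extractor so the small target dataset
# only has to fit the new classification head.
for param in model.parameters():
    param.requires_grad = False

# Replace the head with a layer sized for the target task.
model.fc = nn.Linear(model.fc.in_features, num_classes)

# Only the new head's parameters are optimized.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One forward/backward pass on a dummy batch standing in for the small dataset.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, num_classes, (8,))
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
```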
Exchanges in various fields, such as economics, science, and culture, have expanded unceasingly among countries around the world in the twenty-first century; as a result, a university graduate who masters only one foreign language no longer meets the needs of the labor market in most countries. Many universities have therefore begun to develop new programs to cultivate students who can use several foreign languages in intercultural communication. At the same time, more scientific research has emerged on the relationship between second and third languages. This research seeks to explain the relevant concepts and analyze real data collected from Shanghai International Studies University in China, to expl…
As technology advances, the need for strong yet simple authentication mechanisms that can help protect data intensifies. A contemporary approach to access control is graphical passwords composed of images, patterns, or graphical items. The objective of this review was to determine the documented security risks related to the use of graphical passwords, together with the measures that have been taken to prevent them. It was intended to present an extensive review of the literature on graphical password protection and to point toward potential future research directions. Many attacks, such as shoulder-surfing attacks, SQL injection attacks, and spyware attacks, can easily ex…
Within the framework of big data, energy issues are highly significant. Despite this significance, theoretical studies focusing primarily on energy within big data analytics in relation to computational intelligence algorithms are scarce. The purpose of this study is to explore the theoretical aspects of energy issues in big data analytics in relation to computational intelligence algorithms, since this is critical for exploring the empirical aspects of big data. In this chapter, we present a theoretical study of energy issues related to applications of computational intelligence algorithms in big data analytics. This work highlights that big data analytics using computational intelligence algorithms generates a very high amo…
Vol. 6, Issue 1 (2025)
This paper proposes a new encryption method that combines two cipher algorithms, DES and AES, to generate hybrid keys. This combination strengthens the proposed W-method by generating highly randomized keys. The reliability of any encryption technique rests on two points. The first is key generation: our approach merges 64 bits from DES with 64 bits from AES to produce a 128-bit root key from which the remaining 15 keys are derived. This complexity raises the strength of the ciphering process; moreover, the operation shifts only one bit to the right. The second is the nature of the encryption process itself: it uses two keys and mixes one round of DES with one round of AES to reduce the processing time. The W-method deals with…
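To make the key-derivation idea concrete, here is a minimal Python sketch under stated assumptions: the 128-bit root key is taken to be the concatenation of a 64-bit DES-derived half and a 64-bit AES-derived half, and each of the 15 remaining keys is a one-bit right rotation of its predecessor. The helper names and the rotation detail are illustrative assumptions, not the W-method's exact construction.

```python
# Illustrative sketch of a hybrid key schedule (not the paper's exact W-method).
# Assumptions: root key = 64 DES-derived bits || 64 AES-derived bits, and each of
# the 15 subsequent keys is a one-bit right rotation of the previous 128-bit key.
import secrets

def rotate_right_128(key: int, bits: int = 1) -> int:
    """Rotate a 128-bit integer right by `bits` positions."""
    mask = (1 << 128) - 1
    return ((key >> bits) | (key << (128 - bits))) & mask

def derive_keys(des_half: int, aes_half: int, rounds: int = 16) -> list[int]:
    """Build a 128-bit root key from two 64-bit halves and derive the round keys."""
    root = ((des_half & ((1 << 64) - 1)) << 64) | (aes_half & ((1 << 64) - 1))
    keys = [root]
    for _ in range(rounds - 1):  # 15 further keys, one-bit right rotation each
        keys.append(rotate_right_128(keys[-1]))
    return keys

if __name__ == "__main__":
    # Stand-ins for the 64-bit outputs of the DES and AES key-generation stages.
    des_bits = secrets.randbits(64)
    aes_bits = secrets.randbits(64)
    for i, k in enumerate(derive_keys(des_bits, aes_bits)):
        print(f"K{i:02d} = {k:032x}")
```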
One of the most serious health disasters in recent memory is the COVID-19 epidemic. Several restriction rules have been enforced to reduce the spread of the virus. Masks that are properly fitted can help prevent the virus from spreading from the person wearing the mask to others; masks alone will not protect against COVID-19 and must be used in conjunction with physical distancing and avoidance of direct contact. The fast spread of this disease, as well as the growing use of prevention measures, underscores the critical need for a shift in biometrics-based authentication schemes. Biometric systems are affected differently depending on whether they are used as one of the preventive techniques under COVID-19 pandemic rules. This study provides an…