Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance, yet many applications have too little data to train DL frameworks. Labeled data is usually produced by manual annotation, which typically requires human annotators with extensive domain knowledge and is costly, time-consuming, and error-prone. Every DL framework needs a significant amount of labeled data to learn representations automatically; in general, more data yields a better DL model, although the amount required is application dependent. This issue is the main barrier that leads many applications to dismiss DL, since having sufficient data is the first step toward any successful and trustworthy DL application. This paper presents a holistic survey of state-of-the-art techniques for training DL models under three challenges: small datasets, imbalanced datasets, and lack of generalization. The survey starts by listing the learning techniques, then introduces the types of DL architectures. After that, state-of-the-art solutions to the lack of training data are reviewed, such as Transfer Learning (TL), Self-Supervised Learning (SSL), Generative Adversarial Networks (GANs), Model Architecture (MA), Physics-Informed Neural Networks (PINNs), and the Deep Synthetic Minority Oversampling Technique (DeepSMOTE). These solutions are followed by practical tips on data acquisition prior to training, as well as recommendations for ensuring the trustworthiness of the training dataset. The survey ends with a list of applications that suffer from data scarcity and proposes several alternatives for generating more data in each application, including Electromagnetic Imaging (EMI), Civil Structural Health Monitoring, Medical Imaging, Meteorology, Wireless Communications, Fluid Mechanics, Microelectromechanical Systems, and Cybersecurity. To the best of the authors' knowledge, this is the first review that offers a comprehensive overview of strategies to tackle data scarcity in DL.
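As a hedged, minimal illustration of one of the techniques named above, the NumPy sketch below performs classical SMOTE-style interpolation between minority-class samples; DeepSMOTE applies the same interpolation idea in the latent space of a trained encoder/decoder. The function name, toy data, and parameter values are illustrative assumptions rather than material from the surveyed works.

    import numpy as np

    def smote_oversample(X_minority, n_new, k=5, seed=0):
        """Create synthetic minority samples by interpolating between a chosen
        sample and one of its k nearest minority-class neighbours (classical SMOTE);
        DeepSMOTE performs the same interpolation in a learned latent space."""
        rng = np.random.default_rng(seed)
        n = len(X_minority)
        synthetic = []
        for _ in range(n_new):
            i = rng.integers(n)
            # distances from sample i to every other minority sample
            dist = np.linalg.norm(X_minority - X_minority[i], axis=1)
            neighbours = np.argsort(dist)[1:k + 1]    # skip the sample itself
            j = rng.choice(neighbours)
            lam = rng.random()                        # interpolation factor in [0, 1)
            synthetic.append(X_minority[i] + lam * (X_minority[j] - X_minority[i]))
        return np.array(synthetic)

    # toy usage: 20 minority samples with 4 features, generate 40 synthetic ones
    X_min = np.random.rand(20, 4)
    print(smote_oversample(X_min, n_new=40).shape)    # (40, 4)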
The study aimed to examine how the French international news agency AFP treated press images of the Great Return Marches by identifying the most important issues, their orientation, and the degree of interest given to them. The study is descriptive research that used the survey method within the framework of content analysis; the researcher relied on a content analysis form and interviews to collect data. The study population consists of the photos published by the French news agency about the Great Return Marches during the period from the end of March 2018 until the end of November 2019. The researcher chose an intentional sample using the complete census method. The study material represented
The research aims to identify the impact of job analysis and evaluation on supporting employee performance. Job analysis and evaluation is one of the core functions of human resource management in an organization and has a major effect on the characteristics and performance of its people, and therefore on the organization's success. The research problem lies in upper management's neglect of the role of job analysis and evaluation in supporting employee performance. Polls were adopted as tools for obtaining data and the Depart
This work analyzes the effectiveness of an artificial intelligence (AI) community-building workshop designed for high school teachers, focusing on contemporary issues related to AI concepts and applications. A group of high school teachers from local education districts attended a one-day, hands-on AI workshop at our university. The workshop covered several AI-related topics and included hands-on examples and exercises aimed at introducing AI concepts and tools relevant to pre-college education. The participating teachers were expected to become part of a collaborative network created to design, develop, and implement novel AI learning modules for high school students. Initial and post-training surveys were used to measure the
The maximization of the net present value of the investment in oil field improvements is greatly aided by the optimization of well location, which plays a significant role in oil production. However, applying optimization methods to well placement is exceedingly difficult, since the well placement problem involves a large number of decision variables, objective functions, and constraints. In addition, a wide variety of computational approaches, both conventional and unconventional, have been applied to maximize the efficiency of well placement operations. This research demonstrates how the optimization approaches used in well placement have progressed since they were last reviewed. Fol
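As a hedged, toy illustration of the kind of formulation described above (decision variables, an objective, and constraints), the sketch below uses plain random search to pick a new well location that maximizes a synthetic per-cell NPV proxy subject to a minimum spacing constraint; the grid size, productivity map, spacing limit, and existing well coordinates are all invented for illustration and do not come from the reviewed studies.

    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical 50x50 reservoir grid; each cell holds a proxy NPV for drilling there.
    productivity = rng.random((50, 50))
    existing_wells = np.array([[10, 12], [35, 40]])   # assumed existing well locations
    MIN_SPACING = 8                                    # assumed minimum inter-well distance (cells)

    def objective(loc):
        """Proxy NPV of drilling at integer grid location loc, with a hard spacing constraint."""
        if np.any(np.linalg.norm(existing_wells - loc, axis=1) < MIN_SPACING):
            return -np.inf                             # constraint violated: infeasible location
        return productivity[loc[0], loc[1]]

    # Plain random search: evaluate many candidate locations and keep the best feasible one.
    best_loc, best_val = None, -np.inf
    for _ in range(2000):
        cand = rng.integers(0, 50, size=2)
        val = objective(cand)
        if val > best_val:
            best_loc, best_val = cand, val

    print("best location:", best_loc, "proxy NPV:", round(float(best_val), 3))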
The research aims to identify the relationship between seven lean production tools (continuous improvement, just-in-time (JIT), production smoothing, quality at the source, standardized work, visual management, and the 5S activities) and the Mass Customization strategy of the Pine & Gilmore (1997) model (collaborative, adaptive, cosmetic, transparent). It also provides a conceptual and applied framework for the research variables to clarify how a Mass Customization strategy can be chosen through the lean production tools, and to recognize the actual practices of Iraqi industries in this field. Moreover, it aims to highlight the positive aspects that accrue to companies a
The Environmental Data Acquisition Telemetry System is a versatile, flexible, and economical means of accumulating data from multiple sensors at remote locations over an extended period of time; the data is normally transferred to the final destination and saved for further analysis.
This paper introduces the design and implementation of a simplified, economical, and practical telemetry system to collect and transfer environmental parameters (humidity, temperature, pressure, etc.) from a remote location (a rural area) to the processing and display unit.
To obtain a flexible and practical system, three data transfer methods (three systems) were proposed, designed, and implemented for rural-area services; the fi
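As a hedged sketch of what one such data transfer path might look like (not the authors' actual implementation), the Python snippet below packages the environmental parameters named above into a JSON record at the remote station and pushes it over UDP to a processing and display unit; the sensor values, destination address, and port are placeholder assumptions.

    import json
    import socket
    import time

    DESTINATION = ("192.168.1.50", 9000)   # assumed address of the processing/display unit

    def read_sensors():
        """Placeholder for the real sensor drivers at the remote (rural) station."""
        return {"humidity_pct": 48.2, "temperature_c": 27.5, "pressure_hpa": 1011.6}

    def send_reading(sock):
        record = {"timestamp": time.time(), **read_sensors()}
        sock.sendto(json.dumps(record).encode("utf-8"), DESTINATION)

    if __name__ == "__main__":
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        for _ in range(3):          # send a few readings, one per second
            send_reading(sock)
            time.sleep(1)
        sock.close()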
At present, drug resistance is a major emerging problem in the healthcare sector. Novel antibiotics are in considerable need because currently effective treatments have repeatedly failed. Antimicrobial peptides are biologically active secondary metabolites produced by a variety of microorganisms, such as bacteria, fungi, and algae; they reduce surface activity and also possess antimicrobial, antifungal, antioxidant, and antibiofilm activity. Antimicrobial peptides include a wide variety of bioactive compounds such as bacteriocins, glycolipids, lipopeptides, polysaccharide-protein complexes, phospholipids, fatty acids, and neutral lipids. Bioactive peptides derived from various natural sources like bacte