Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance, yet many applications have only small or inadequate datasets for training DL frameworks. Usually, manual labeling is needed to provide labeled data, which typically involves human annotators with extensive background knowledge. This annotation process is costly, time-consuming, and error-prone. Every DL framework is fed a significant amount of labeled data in order to learn representations automatically. In general, a larger amount of data yields a better DL model, although performance is also application dependent. This issue is the main barrier that keeps many applications from adopting DL, since having sufficient data is the first step toward any successful and trustworthy DL application. This paper presents a holistic survey of state-of-the-art techniques for training DL models under three challenges: small datasets, imbalanced datasets, and lack of generalization. The survey starts by listing the learning techniques. Next, the types of DL architectures are introduced. After that, state-of-the-art solutions to the lack of training data are listed, such as Transfer Learning (TL), Self-Supervised Learning (SSL), Generative Adversarial Networks (GANs), Model Architecture (MA), Physics-Informed Neural Networks (PINNs), and the Deep Synthetic Minority Oversampling Technique (DeepSMOTE). These solutions are followed by practical tips on data acquisition prior to training, as well as recommendations for ensuring the trustworthiness of the training dataset. The survey ends with a list of applications that suffer from data scarcity, including Electromagnetic Imaging (EMI), Civil Structural Health Monitoring, Medical Imaging, Meteorology, Wireless Communications, Fluid Mechanics, Microelectromechanical Systems, and Cybersecurity, and several alternatives are proposed to generate more data for each of these applications. To the best of the authors' knowledge, this is the first review that offers a comprehensive overview of strategies to tackle data scarcity in DL.
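As a concrete illustration of the oversampling family that DeepSMOTE extends, the following is a minimal sketch of the classical SMOTE interpolation step in plain NumPy; the array names, the single-nearest-neighbor choice, and the toy data are illustrative assumptions, not part of the surveyed method, which performs the same idea in a learned deep feature space.

import numpy as np

def smote_like_oversample(minority, n_new, seed=None):
    """Generate synthetic minority-class samples by interpolating between
    each selected sample and its nearest minority neighbor (classical SMOTE
    idea; DeepSMOTE applies it in an encoder's feature space)."""
    rng = np.random.default_rng(seed)
    # Pairwise squared distances within the minority class.
    d2 = ((minority[:, None, :] - minority[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)          # exclude self-matches
    nearest = d2.argmin(axis=1)           # nearest-neighbor index per sample

    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(minority))   # pick a minority sample at random
        lam = rng.random()                # interpolation weight in [0, 1)
        synthetic.append(minority[i] + lam * (minority[nearest[i]] - minority[i]))
    return np.vstack(synthetic)

# Toy usage: 5 minority samples in a 3-D feature space, request 10 synthetic ones.
minority = np.random.default_rng(0).normal(size=(5, 3))
print(smote_like_oversample(minority, 10, seed=1).shape)   # (10, 3)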
Generally, fossil-based fuels are used in internal combustion engines as an energy source. Excessive use of fossil-based fuels diminishes present reserves and increases air pollution in urban areas. This enhances the importance of using present reserves effectively and/or developing new alternative fuels that are environmentally friendly. The use of alternative fuels is one way of controlling emissions. The term "Alternative Gaseous Fuels" refers to a wide range of fuels that are in the gaseous state at ambient conditions, whether used on their own or as components of mixtures with other fuels.
In this study, a single-cylinder diesel engine was modified to use LPG in dual-fuel mode in order to study its performance and emissions.
This study investigates the linguistic and conceptual equivalence of Conner's Revised Scales when applied to a Sudanese sample. Sudanese parents and teachers completed behavior-rating scales on a stratified sample of 200 children. These instruments were based on Conner's Parent-48 and Teacher-28 questionnaires. Following a reliable translation into Sudanese Arabic, the test-retest reliability of the items and the internal consistency of the original Conner's revised scales were explored. The associations between scale scores and between parents' and teachers' scores were also examined. Both instruments displayed good reliability, and the original Conner's scales had satisfactory internal consistency. The inter-correlation sugg
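For readers unfamiliar with the internal-consistency statistic referred to above, the following is a minimal sketch of Cronbach's alpha in NumPy; the item scores and the scale composition are hypothetical and are not taken from the Sudanese dataset.

import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    x = np.asarray(item_scores, dtype=float)
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()     # sum of per-item variances
    total_var = x.sum(axis=1).var(ddof=1)      # variance of respondents' totals
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical ratings from 6 respondents on 4 items of one subscale.
ratings = [[2, 3, 3, 2],
           [1, 1, 2, 1],
           [3, 3, 3, 3],
           [2, 2, 3, 2],
           [0, 1, 1, 0],
           [3, 2, 3, 3]]
print(round(cronbach_alpha(ratings), 3))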
A nonlinear filter for smoothing color and gray images corrupted by Gaussian noise is presented in this paper. The proposed filter is designed to reduce the noise in the R, G, and B bands of color images while preserving edges. The filter is applied in order to prepare images for further processing, such as edge detection and image segmentation. The results of computer simulations show that the proposed filter gives satisfactory results when compared with conventional filters, such as the Gaussian low-pass filter and the median filter, using the cross-correlation coefficient (CCC) criterion.
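As a hedged illustration of the comparison pipeline described above, the sketch below smooths a noisy RGB image band by band with conventional Gaussian and median filters (stand-ins for the baselines mentioned, not the proposed nonlinear filter itself) and scores each result with a normalized cross-correlation coefficient against the clean image; all array names and parameter values are assumptions.

import numpy as np
from scipy.ndimage import gaussian_filter, median_filter

def ccc(a, b):
    """Normalized cross-correlation coefficient between two images."""
    a, b = a.ravel().astype(float), b.ravel().astype(float)
    a -= a.mean(); b -= b.mean()
    return float((a @ b) / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
clean = rng.uniform(0, 1, size=(64, 64, 3))            # stand-in RGB image
noisy = clean + rng.normal(0, 0.1, clean.shape)        # add Gaussian noise

# Filter each of the R, G, and B bands independently, as described above.
gauss = np.dstack([gaussian_filter(noisy[..., c], sigma=1.0) for c in range(3)])
medi  = np.dstack([median_filter(noisy[..., c], size=3)      for c in range(3)])

print("CCC noisy   :", round(ccc(clean, noisy), 4))
print("CCC Gaussian:", round(ccc(clean, gauss), 4))
print("CCC median  :", round(ccc(clean, medi), 4))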
This study aims to suggest flow as a strategy for training female EFL student-teachers in the teaching-training course and to find out the effect of this strategy on their performance and their flow state. The training course syllabuses will be constructed according to the nine flow factors and the teaching skills. The measurement tools are the student-teacher performance checklist already used by the Department of English Language and the Short Flow State Scale (S FSS-2). The study population consists of the (60) fourth-stage female student-teachers in the evening studies at the English Department, College of Education for Women, University of Baghdad. The study used an experimental design in which (30) of the student-
The current study presents an experimental investigation of the heat transfer and flow characteristics of subcooled flow boiling of deionized water in a microchannel heat sink. The test section consisted of a single microchannel with nominal dimensions of 300 μm width and 300 μm height (hydraulic diameter of 300 μm). The test section was formed of oxygen-free copper, 72 mm long and 12 mm wide. Experimental operating conditions spanned heat fluxes of 78-800 kW/m², mass fluxes of 1700 and 2100 kg/m²·s, and a subcooled inlet temperature of 31 °C. The boiling heat transfer coefficient is measured and compared with existing correlations. Also, the experimental pressure drop is measured and compared with microscale p
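As a minimal, hypothetical sketch of how a local boiling heat transfer coefficient can be backed out of the measured quantities mentioned above (this is not the authors' data-reduction procedure), one can use h = q'' / (T_wall - T_fluid); the numerical values below are chosen only for illustration, within the reported heat-flux range.

# Hypothetical data-reduction step: local heat transfer coefficient from the
# applied heat flux and the wall-to-fluid temperature difference.
def heat_transfer_coefficient(q_flux_w_m2, t_wall_c, t_fluid_c):
    """h = q'' / (T_wall - T_fluid), in W/(m^2.K)."""
    return q_flux_w_m2 / (t_wall_c - t_fluid_c)

# Illustrative values: heat flux 400 kW/m^2, wall at 112 C, local fluid at 100 C.
h = heat_transfer_coefficient(400e3, 112.0, 100.0)
print(f"h = {h:.0f} W/m^2.K")   # 33333 W/m^2.K for these assumed values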
Approaches involving photons are among the most common means of parallel optical computation due to their recent development, ease of production, and low cost. As a result, most researchers have concentrated their efforts on them. The Basic Arithmetic Unit (BAU) is built using a three-step approach that uses optical gates with three states to configure the circuits for addition, subtraction, and multiplication. This is a new optical computing method based on the use of radix-2 binary signed-digit (BSD) numbers, whose digits are -1, 0, and 1. Light with horizontal polarization (LHP) (↔), light with no intensity (LNI) (⥀), and light with vertical polarization (LVP) (↨) are used to represent the three digits.
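To make the binary signed-digit (BSD) idea mentioned above concrete, here is a small sketch that converts a positive integer into a radix-2 signed-digit sequence over {-1, 0, 1} (using the non-adjacent form as one valid choice) and verifies the value; the mapping of digits to polarization states is the paper's, while this particular conversion routine is only an illustrative assumption.

def to_signed_digit(n):
    """Radix-2 signed-digit (non-adjacent form) digits of a positive integer n,
    least-significant digit first, using digits -1, 0, 1."""
    digits = []
    while n != 0:
        if n % 2:                    # odd: pick +1 or -1 so (n - d) is divisible by 4
            d = 2 - (n % 4)          # +1 if n % 4 == 1, -1 if n % 4 == 3
            digits.append(d)
            n -= d
        else:
            digits.append(0)
        n //= 2
    return digits

def from_signed_digit(digits):
    """Reconstruct the integer value: sum of d_i * 2**i."""
    return sum(d * (1 << i) for i, d in enumerate(digits))

# Round-trip check for small values, plus one worked example.
for n in range(1, 16):
    assert from_signed_digit(to_signed_digit(n)) == n
print(to_signed_digit(13))   # [1, 0, -1, 0, 1] since 1 - 4 + 16 = 13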
The determiner phrase is a syntactic category that appears inside the noun phrase and makes it definite or indefinite or quantifies it. The present study has found wide parametric differences between the English and Arabic determiner phrases in terms of inflectional features, the syntactic distribution of determiners, and the word order of the determiner phrase itself. In English, the determiner phrase generally precedes the head noun or its premodifying adjectival phrase, with very few exceptions where some determiners may appear after the head noun. In Arabic, parts of the determiner phrase precede the head noun and parts of it must appear after the head noun or after its postmodifying adjectival phrase, creating a discontinuous determiner phrase.
Introduction: Diabetic foot infections are one of the most severe complications of diabetes. This study aimed to determine the common bacterial isolates of diabetic foot infections and their in vitro antibiotic susceptibility, and then to treat the patients accordingly.
Methods: A swab was taken from the foot ulcer, and the aerobic bacteria were isolated and identified by cultural, microscopic, and biochemical tests, and then by the API-20E system. After that, their antibiotic susceptibility patterns were determined. Then, local and systemic treatment was used to treat the diabetic foot patients.
Results: Bacterial isolates belonging to twelve species were obtained from diabetic foot patients. Gram-negative bacteria were the predominant pathogens in the diabetic foot infections.