Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance, yet many applications have too little data to train DL frameworks. Labeled data are usually produced by manual annotation, which typically requires human annotators with extensive background knowledge; this annotation process is costly, time-consuming, and error-prone. Every DL framework must be fed a significant amount of labeled data to learn representations automatically, and, in general, more data yields a better DL model, although performance is also application dependent. This issue is the main barrier that leads many applications to dismiss the use of DL, since having sufficient data is the first step toward any successful and trustworthy DL application. This paper presents a holistic survey of state-of-the-art techniques for training DL models under three challenges: small datasets, imbalanced datasets, and lack of generalization. The survey starts by listing the learning techniques, then introduces the types of DL architectures. After that, state-of-the-art solutions to the lack of training data are reviewed, including Transfer Learning (TL), Self-Supervised Learning (SSL), Generative Adversarial Networks (GANs), Model Architecture (MA), Physics-Informed Neural Networks (PINNs), and the Deep Synthetic Minority Oversampling Technique (DeepSMOTE). These solutions are followed by practical tips on data acquisition prior to training, as well as recommendations for ensuring the trustworthiness of the training dataset. The survey ends with a list of applications that suffer from data scarcity, and for each application several alternatives are proposed to generate more data, covering Electromagnetic Imaging (EMI), Civil Structural Health Monitoring, Medical Imaging, Meteorology, Wireless Communications, Fluid Mechanics, Microelectromechanical Systems, and Cybersecurity. To the best of the authors' knowledge, this is the first review that offers a comprehensive overview of strategies to tackle data scarcity in DL.
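As a concrete illustration of the transfer-learning route listed above, the sketch below fine-tunes an ImageNet-pretrained ResNet-18 on a hypothetical small labelled dataset by freezing the backbone and retraining only the classification head. The class count, optimizer settings, and `train_step` helper are illustrative assumptions, not details taken from the surveyed works; the snippet assumes PyTorch with torchvision's newer `weights=` API.

```python
# Minimal transfer-learning sketch (illustrative): reuse an ImageNet-pretrained
# ResNet-18 and retrain only its final layer on a small target dataset.
import torch
import torch.nn as nn
from torchvision import models

num_classes = 5  # hypothetical number of classes in the small target dataset

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False                    # freeze the pretrained backbone
model.fc = nn.Linear(model.fc.in_features, num_classes)  # new task-specific head

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)

def train_step(images, labels):
    """One optimisation step on a mini-batch of the small labelled dataset."""
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```

Freezing the backbone keeps the number of trainable parameters small, which is precisely what makes transfer learning attractive when labelled data are scarce.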
Assessment of the annual wind energy potential for three selected sites in Iraq is presented in this work. Wind velocity data from August 2014 to July 2015 were collected from the website of the Weather Underground Organization (WUO) at station elevations of 35 m, 32 m, and 17 m for Baghdad, Najaf, and Kut Al-Hai, respectively. The station-level measurements were extrapolated to estimate wind velocities at 60 m, 90 m, and 120 m. The objectives are to analyze the wind speed data and assess the wind energy potential for wind energy applications. A MATLAB code was developed to solve the mathematical model. The results are presented as monthly and annual averages of wind velocity, standard deviation …
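Hub-height extrapolation of the kind described above is commonly done with a power-law wind profile; the sketch below shows that calculation for the three target heights. The shear exponent, the example speed, and the function name are illustrative assumptions and are not taken from the paper, whose MATLAB implementation is not reproduced here.

```python
# Power-law wind-profile extrapolation (illustrative sketch, not the paper's model):
# v(h) = v_ref * (h / h_ref)**alpha
def extrapolate_wind_speed(v_ref, h_ref, h_target, alpha=1/7):
    """Estimate the wind speed at height h_target (m) from a reading at h_ref (m).

    alpha is the wind-shear exponent; 1/7 is a common open-terrain assumption.
    """
    return v_ref * (h_target / h_ref) ** alpha

# Example: a hypothetical 4.5 m/s reading at the 35 m Baghdad station,
# extrapolated to the three hub heights considered in the study.
for h in (60, 90, 120):
    print(f"{h} m: {extrapolate_wind_speed(4.5, 35, h):.2f} m/s")
```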
In this work, silicon nitride (Si3N4) thin films were deposited on metallic substrates (aluminium and titanium sheets) by the DC reactive sputtering technique using two different silicon targets (n-type and p-type Si wafers) as well as two Ar:N2 gas mixing ratios (50:50 and 70:30). The electrical conductivity of the metallic (aluminium and titanium) substrates was measured before and after the deposition of silicon nitride thin films on both surfaces of the substrates. The results obtained from this work showed that the deposited films, in general, reduced the electrical conductivity of the substrates, and the thin films prepared from n-type silicon targets using a 50:50 mixing ratio and deposited on both …
In this study, tungsten oxide and graphene oxide (GO-WO2.89) were successfully combined using the ultra-sonication method and embedded with polyphenylsulfone (PPSU) to prepare novel low-fouling membranes for ultrafiltration applications. The properties and performance of the modified membranes were investigated using Fourier-transform infrared spectroscopy (FT-IR), scanning electron microscopy (SEM), contact angle (CA), water permeation flux, and bovine serum albumin (BSA) rejection. It was found that the modified PPSU membrane fabricated from 0.1 wt.% of GO-WO2.89 possessed the best characteristics, with a 40.82° contact angle and 92.94% porosity. The permeation flux of this membrane was the highest. The pure water permeation flux …
Economic analysis plays a pivotal role in managerial decision-making processes. This analysis is predicated on a deep understanding of the economic forces and market factors that influence corporate strategies and decisions. This paper delves into the role of economic data analysis in managing small and medium-sized enterprises (SMEs) to make strategic decisions and enhance performance. The study underscores the significance of this approach and its impact on corporate outcomes. The research analyzes annual reports from three companies: Al-Mahfaza for Mobile and Internet Financial Payment and Settlement Services Company Limited, Al-Arab for Electronic Payment Company, and Iraq Electronic Gateway for Financial Services Company. The paper concludes …
In hybrid cooling solar thermal systems, a solar collector is used to convert solar energy into heat in order to superheat the refrigerant leaving the compressor. This helps the refrigerant change from the gaseous state to the liquid state in the upper two-thirds of the condenser instead of the lower two-thirds, as in traditional air-conditioning systems, and thus reduces the energy needed to run the cooling process. In this research, two systems with a capacity of 2 tons each were used: a hybrid air-conditioning system with an evacuated-tube solar collector and a traditional air-conditioning system. The refrigerant in each was R22. The comparison was in the amount …
The issue of penalized regression models has received considerable attention for variable selection, which plays an essential role in dealing with high-dimensional data. The arctangent (Atan) penalty has recently been used as an efficient method for both estimation and variable selection. However, the Atan penalty is very sensitive to outliers in the response variable or to heavy-tailed error distributions, whereas least absolute deviation (LAD) is a good way to obtain robustness in regression estimation. The specific objective of this research is to propose a robust Atan estimator by combining these two ideas. Simulation experiments and real data applications show that the proposed LAD-Atan estimator …
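A minimal sketch of the LAD-Atan idea is given below: the least-absolute-deviation loss is combined with an arctangent penalty and minimized with a generic derivative-free solver. The penalty form follows the Atan penalty of the penalized-regression literature, and the simulated data, tuning constants, and optimizer are illustrative assumptions rather than the authors' algorithm.

```python
# Illustrative LAD-Atan objective: robust LAD loss + arctangent (Atan) penalty.
import numpy as np
from scipy.optimize import minimize

def lad_atan_objective(beta, X, y, lam=0.5, gamma=0.05):
    lad = np.sum(np.abs(y - X @ beta))                            # robust to outliers
    atan_pen = lam * (gamma + 2 / np.pi) * np.sum(np.arctan(np.abs(beta) / gamma))
    return lad + atan_pen

# Toy data with a sparse coefficient vector and heavy-tailed (t-distributed) errors.
rng = np.random.default_rng(0)
n, p = 100, 5
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, 0.0, -1.5, 0.0, 0.0])
y = X @ beta_true + rng.standard_t(df=2, size=n)

# Derivative-free solver, since the objective is non-smooth and non-convex.
result = minimize(lad_atan_objective, x0=np.zeros(p), args=(X, y), method="Powell")
print(np.round(result.x, 2))   # estimates should be close to beta_true
```

In practice, dedicated coordinate-descent or local-approximation algorithms would replace the general-purpose solver, but the combined objective above captures the idea of pairing the LAD loss with the Atan penalty.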
... Show MoreLongitudinal data is becoming increasingly common, especially in the medical and economic fields, and various methods have been analyzed and developed to analyze this type of data.
In this research, the focus was on compiling and analyzing such data: cluster analysis plays an important role in identifying and grouping co-expressed profiles over time, and the grouped profiles are then fitted with a nonparametric smoothing cubic B-spline model (a minimal fitting sketch follows this abstract). The cubic B-spline provides continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope; it is also more flexible and can capture more complex patterns and fluctuations in the data.
The longitudinal balanced data profile was compiled into subgroups …
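As a minimal illustration of the smoothing step sketched in this abstract, the example below fits a cubic smoothing spline to one simulated longitudinal profile with SciPy; the simulated data, smoothing parameter, and choice of `UnivariateSpline` are assumptions for illustration, not the authors' implementation.

```python
# Cubic smoothing spline for one longitudinal profile (illustrative sketch).
# Cubic splines are C2-continuous, i.e. they have continuous first and second
# derivatives, which is the smoothness property described above.
import numpy as np
from scipy.interpolate import UnivariateSpline

t = np.linspace(0, 10, 30)                      # observation times for one subject
y = np.sin(t) + np.random.default_rng(1).normal(scale=0.15, size=t.size)

spline = UnivariateSpline(t, y, k=3, s=0.5)     # k=3 -> cubic; s controls smoothing
t_fine = np.linspace(0, 10, 200)
fitted = spline(t_fine)                         # smoothed profile
slope = spline.derivative(1)(t_fine)            # continuous first derivative
curvature = spline.derivative(2)(t_fine)        # continuous second derivative
```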
It has increasingly been recognised that future developments in geospatial data handling will centre on geospatial data on the web: Volunteered Geographic Information (VGI). The evaluation of VGI data quality, including positional and shape similarity, has become a recurrent subject in the scientific literature over the last ten years. The OpenStreetMap (OSM) project is the most popular of the leading VGI platforms. It is an online geospatial database that produces and supplies free, editable geospatial datasets worldwide. The goal of this paper is to present a comprehensive overview of the quality assurance of OSM data. In addition, the credibility of open-source geospatial data is discussed, highlighting the diff…