Data scarcity is a major challenge when training deep learning (DL) models. DL demands large amounts of data to achieve exceptional performance, yet many applications have too little data to train a DL framework. Labeled data usually have to be produced by manual annotation, which requires human annotators with extensive background knowledge and is costly, time-consuming, and error-prone. Since every DL framework must be fed a significant amount of labeled data to learn representations automatically, and since (to an application-dependent extent) more data generally yields a better model, this issue is the main barrier preventing many applications from adopting DL. Having sufficient data is the first step toward any successful and trustworthy DL application. This paper presents a holistic survey of state-of-the-art techniques for training DL models under three challenges: small datasets, imbalanced datasets, and lack of generalization. The survey starts by listing the learning techniques, then introduces the types of DL architectures. After that, state-of-the-art solutions to the lack of training data are reviewed, including Transfer Learning (TL), Self-Supervised Learning (SSL), Generative Adversarial Networks (GANs), Model Architecture (MA), Physics-Informed Neural Networks (PINNs), and the Deep Synthetic Minority Oversampling Technique (DeepSMOTE). These solutions are followed by practical tips on data acquisition prior to training, as well as recommendations for ensuring the trustworthiness of the training dataset.
The survey ends with a list of applications that suffer from data scarcity, and for each it proposes several alternatives for generating more data, including Electromagnetic Imaging (EMI), Civil Structural Health Monitoring, Medical Imaging, Meteorology, Wireless Communications, Fluid Mechanics, Microelectromechanical Systems (MEMS), and Cybersecurity. To the best of the authors’ knowledge, this is the first review that offers a comprehensive overview of strategies to tackle data scarcity in DL.
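Of the oversampling techniques named above, DeepSMOTE is the most compact to illustrate: at its core it interpolates between a minority sample and one of its nearest minority neighbours (DeepSMOTE does this inside an encoder's latent space; plain SMOTE does it directly in feature space). The sketch below shows the interpolation step only; the function name, parameters, and toy data are illustrative assumptions, not the survey's implementation.

```python
import numpy as np

def smote_oversample(X_min, n_new, k=3, rng=None):
    """SMOTE-style oversampling sketch: pick a minority point, pick one
    of its k nearest minority neighbours, and generate a synthetic sample
    on the line segment between them."""
    rng = np.random.default_rng(rng)
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        # Euclidean distance from sample i to every minority sample
        d = np.linalg.norm(X_min - X_min[i], axis=1)
        neighbours = np.argsort(d)[1:k + 1]   # skip the point itself
        j = rng.choice(neighbours)
        gap = rng.random()                    # interpolation factor in [0, 1)
        synthetic.append(X_min[i] + gap * (X_min[j] - X_min[i]))
    return np.asarray(synthetic)

# Toy minority class: 5 samples in 2-D; request 10 synthetic points.
X_min = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]])
X_new = smote_oversample(X_min, n_new=10, rng=0)
print(X_new.shape)  # (10, 2)
```

Because each synthetic point lies on a segment between two existing minority samples, the new data stay inside the convex hull of the minority class, which is what makes the technique safe to use for rebalancing.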
A food web is an essential tool for studying how energy moves through an ecosystem, and it also helps in understanding and elucidating the relationship between species variety and their placement within the overall trophic dynamics. This article develops a food-web ecological model of prey and two rival predators under fear and wind-flow conditions. The boundedness and positivity of the system’s solutions are established mathematically, the existence and stability constraints of the system’s equilibria are examined, and the persistence conditions of the proposed system are established. Additionally, the bifurcation analysis of every potential equilibrium is carried out using the Sotomayor theorem. To describe the
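The article's exact equations are not reproduced in this excerpt, so the sketch below is only a generic one-prey, two-competing-predator system of the kind described, with a fear term 1/(1 + f·(y + z)) suppressing prey reproduction. All parameter names and values are illustrative assumptions, not the authors' model.

```python
import numpy as np
from scipy.integrate import solve_ivp

def food_web(t, u, r=1.0, f=0.5, K=10.0,
             a1=0.4, a2=0.3, c1=0.5, c2=0.5, d1=0.2, d2=0.15):
    """Generic one-prey (x), two-rival-predator (y, z) model.
    Fear of predators scales the prey's logistic growth by 1/(1 + f*(y+z))."""
    x, y, z = u
    dx = r * x * (1 - x / K) / (1 + f * (y + z)) - a1 * x * y - a2 * x * z
    dy = c1 * a1 * x * y - d1 * y   # predator 1: conversion minus mortality
    dz = c2 * a2 * x * z - d2 * z   # predator 2: conversion minus mortality
    return [dx, dy, dz]

sol = solve_ivp(food_web, (0, 50), [5.0, 1.0, 1.0], rtol=1e-8)
# Trajectories remain (numerically) nonnegative and bounded, consistent
# with the positivity and boundedness results the abstract refers to.
print(sol.y[:, -1])
```

Numerically checking that trajectories stay positive and bounded is a useful sanity test before attempting the analytical stability and bifurcation arguments.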
Sorghum seeds suffer from a low germination ratio, so a factorial experiment was carried out in the Seed Technology Laboratory, Department of Field Crops, College of Agricultural Engineering Sciences, University of Baghdad during 2022, according to a Completely Randomized Design with four replications, to study the effect of stimulating seeds with aqueous extract of banana peels at concentrations of 0, 15, 25, and 35% and citric acid at concentrations of 0, 50, 100, and 200 mg L-1 on seed viability and vigour properties. Seeds soaked in banana peel extract at a concentration of 25% were superior in first count (79.8%), final count (85.0%), radicle length (13.2 cm), plumule length (11.6 cm), and seedling vigour index (2109), noting
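The reported seedling vigour index is consistent with the commonly used Abdul-Baki and Anderson definition, germination percentage times total seedling length; the check below assumes that formula (the abstract does not state which definition was used), reproducing the reported figure up to rounding of the raw means.

```python
# Seedling vigour index (SVI), assuming the common definition
#   SVI = final germination % * (radicle length + plumule length), lengths in cm
final_count = 85.0   # % germination reported for the 25% treatment
radicle_cm = 13.2
plumule_cm = 11.6

svi = final_count * (radicle_cm + plumule_cm)
print(round(svi))  # 2108, matching the reported 2109 up to rounding
```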
This research analyses some indicators, and their classifications, related to the teaching process and the scientific level of graduate studies in the university, using analysis of variance for ranked data with repeated measurements instead of the ordinary analysis of variance. We reach several conclusions about the important classifications of each indicator that affect the teaching process.
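The rank-based analysis of variance for repeated measurements that the abstract refers to is typically carried out with the Friedman test. A minimal sketch with SciPy follows; the data values are invented purely for illustration.

```python
from scipy.stats import friedmanchisquare

# Rows: subjects (e.g. graduate programmes); columns: three repeated
# measurements of one indicator. All values are invented for illustration.
m1 = [7.0, 9.9, 8.5, 5.1, 10.3]
m2 = [5.3, 5.7, 4.7, 3.5, 7.7]
m3 = [4.9, 7.6, 5.5, 2.8, 8.4]

# Friedman replaces raw values by within-subject ranks, so it tolerates
# non-normal data where ordinary repeated-measures ANOVA would not.
stat, p = friedmanchisquare(m1, m2, m3)
print(f"Friedman chi-square = {stat:.3f}, p = {p:.4f}")
```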
Multiple elimination (de-multiple) is one of the seismic processing steps that removes the effects of multiples and delineates the correct primary reflections. Applying normal move-out to flatten the primaries makes it possible to eliminate multiples by transforming the data to the frequency-wavenumber (f-k) domain: the flattened primaries align with the zero-wavenumber axis of the f-k domain, while other reflection types (multiples and random noise) are distributed elsewhere. A dip filter is then applied to pass the aligned data and reject the rest, separating primaries from multiples once the data are transformed back from the f-k domain to the time-distance domain. For that reason, a suggested name for this technique is the normal move-out frequency-wavenumber domain
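The workflow above can be sketched in a few lines of NumPy: after NMO correction the primaries are (ideally) flat, so their energy maps onto the zero-wavenumber axis of the f-k plane, and a dip filter keeps a narrow band around k ≈ 0 while rejecting dipping residual multiples. The gather, event geometry, and pass-band fraction below are illustrative assumptions, not the paper's data.

```python
import numpy as np

def fk_dip_filter(gather, keep_frac=0.1):
    """Pass near-zero-wavenumber (flat) energy, reject dipping events.
    gather: 2-D array (time samples x traces); keep_frac: fraction of the
    wavenumber axis around k = 0 to pass (illustrative threshold)."""
    nt, nx = gather.shape
    fk = np.fft.fft2(gather)                 # time-distance -> f-k domain
    k = np.fft.fftfreq(nx)                   # normalised wavenumber per trace
    mask = np.abs(k) <= keep_frac * np.abs(k).max()
    fk_filtered = fk * mask[np.newaxis, :]   # zero energy away from k = 0
    return np.real(np.fft.ifft2(fk_filtered))

# Synthetic gather: one flat (NMO-corrected primary) event plus one
# dipping (residual multiple) event of one sample per trace.
nt, nx = 128, 64
gather = np.zeros((nt, nx))
gather[40, :] = 1.0                          # flat primary at time index 40
for ix in range(nx):
    gather[(60 + ix) % nt, ix] = 1.0         # dipping event
out = fk_dip_filter(gather)
# The flat event's amplitude is preserved (≈ 1) while the dipping
# spike is strongly attenuated.
print(out[40, :].mean(), abs(out[60, 0]))
```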
This study was accomplished by testing three different models to determine rock types, pore-throat radius, and flow units for the Mishrif Formation in the West Qurna oilfield in southern Iraq, based on Mishrif full-diameter cores from 20 wells. The three models used were the Lucia rock-type classification; the Winland plot, which determines pore-throat radius from the mercury-injection test (r35); and the flow-zone indicator (FZI) concept to identify flow units, which enabled us to recognize the differences between the Mishrif units in these three categories. The study of pore characteristics is very significant in reservoir evaluation, since it controls the storage mechanism and reservoir fluid properties
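The two quantitative classifiers named above have standard published forms: the Winland correlation log r35 = 0.732 + 0.588 log K − 0.864 log φ (K in mD, φ in percent), and FZI = RQI/φz with RQI = 0.0314·√(K/φ) and φz = φ/(1 − φ) (φ as a fraction). The sketch below implements these general formulas on an invented core plug; it is not the study's data or calibration.

```python
import math

def winland_r35(k_md, phi_pct):
    """Winland r35 pore-throat radius in microns.
    k_md: air permeability in millidarcies; phi_pct: porosity in percent."""
    log_r35 = 0.732 + 0.588 * math.log10(k_md) - 0.864 * math.log10(phi_pct)
    return 10 ** log_r35

def fzi(k_md, phi_frac):
    """Flow-zone indicator in microns: RQI = 0.0314*sqrt(k/phi),
    phi_z = phi/(1 - phi), FZI = RQI/phi_z (phi_frac as a fraction)."""
    rqi = 0.0314 * math.sqrt(k_md / phi_frac)
    phi_z = phi_frac / (1 - phi_frac)
    return rqi / phi_z

# Illustrative core plug: 50 mD permeability, 18 % porosity
print(round(winland_r35(50.0, 18.0), 2))  # ≈ 4.43 microns
print(round(fzi(50.0, 0.18), 2))          # ≈ 2.38 microns
```

Samples with similar FZI values are grouped into one hydraulic flow unit, which is how the concept separates the Mishrif units in practice.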
Artificial intelligence (AI) offers significant benefits to biomedical research and academic writing. Nevertheless, the use of AI-powered writing-aid tools has prompted worries about excessive dependence on these tools and their possible influence on writing proficiency. The current study aimed to explore academic staff’s perspectives on the impact of AI on academic writing. This qualitative study incorporated in-person, semi-structured interviews with academic faculty members, using a predetermined interview guide consisting of open-ended questions; the interviews were conducted from May to November 2023. The data were analyzed using thematic analysis. Ten academics aged
The effects of Internet use on university students: a study on a sample of Jordanian university students. This survey aims to identify the most important effects of Internet use on students of Jordanian public and private universities by monitoring and analyzing a set of indicators that show the quality of the effects in specific fields, such as cultural, social, psychological, moral, and political effects. To achieve these goals, the study attempts to answer the following questions: 1. What are the effects of Internet use on students? 2. What is the relationship between these effects and demographic variables such as gender, age, family size an
The aim of this research is to use a robust technique based on trimming, since maximum likelihood (ML) analysis often fails when the studied phenomenon contains outliers: the MLE loses its advantages because of the bad influence of those outliers. To address this problem, new statistical methods that are not affected by outliers have been developed; these methods have robustness or resistance. The maximum trimmed likelihood (MTL) is therefore a good alternative for achieving more acceptable and comparable results, and weights can be used to increase the efficiency of the resulting estimates and the strength of the estimation, giving the maximum weighted trimmed likelihood (MWTL). In order to perform t
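The trimming idea can be sketched as a concentration iteration: under the current fit, keep only the h observations with the largest likelihood, re-estimate from them, and repeat. The sketch below does this for a normal model (where highest likelihood means smallest |x − μ|); the trimming fraction, data, and contamination level are illustrative assumptions, and the trimmed sigma would need a consistency correction in practice.

```python
import numpy as np

def mtl_normal(x, h=None, n_iter=20):
    """Maximum trimmed likelihood sketch for a normal model: iterate
    between (a) keeping the h observations with the highest likelihood
    under the current fit and (b) re-estimating mu, sigma from them.
    h defaults to 75 % of the sample."""
    x = np.asarray(x, float)
    h = h or int(0.75 * len(x))
    mu, sigma = np.median(x), np.std(x)
    for _ in range(n_iter):
        # For a normal model with common sigma, highest likelihood
        # is equivalent to smallest |x - mu|.
        keep = np.argsort(np.abs(x - mu))[:h]
        mu, sigma = x[keep].mean(), x[keep].std()
    return mu, sigma

rng = np.random.default_rng(1)
clean = rng.normal(10.0, 1.0, 90)
outliers = rng.normal(60.0, 1.0, 10)      # gross contamination
x = np.concatenate([clean, outliers])
print(np.mean(x))        # plain ML mean is dragged toward the outliers
print(mtl_normal(x)[0])  # trimmed estimate stays near the true mean 10
```

Weighting the retained observations instead of keeping them with equal weight is the step that turns MTL into the MWTL variant the abstract mentions.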