Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance, yet many applications have only small or inadequate datasets for training DL frameworks. Producing labeled data usually requires manual annotation by humans with extensive background knowledge, a process that is costly, time-consuming, and error-prone. Since every DL framework learns representations automatically from a significant amount of labeled data, a larger dataset generally yields a better model, although performance is also application dependent. This issue is the main barrier preventing many applications from adopting DL; having sufficient data is the first step toward any successful and trustworthy DL application. This paper presents a holistic survey of state-of-the-art techniques for training DL models under three challenges: small datasets, imbalanced datasets, and lack of generalization. The survey starts by listing the learning techniques, then introduces the types of DL architectures. After that, state-of-the-art solutions to the lack of training data are reviewed, including Transfer Learning (TL), Self-Supervised Learning (SSL), Generative Adversarial Networks (GANs), Model Architecture (MA), Physics-Informed Neural Networks (PINNs), and the Deep Synthetic Minority Oversampling Technique (DeepSMOTE). These solutions are followed by tips on data acquisition prior to training, as well as recommendations for ensuring the trustworthiness of the training dataset. The survey ends with a list of applications that suffer from data scarcity, and for each application several alternatives are proposed to generate more data, including Electromagnetic Imaging (EMI), Civil Structural Health Monitoring, Medical Imaging, Meteorology, Wireless Communications, Fluid Mechanics, Microelectromechanical Systems, and Cybersecurity. To the best of the authors’ knowledge, this is the first review that offers a comprehensive overview of strategies to tackle data scarcity in DL.
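As an illustration of one of the surveyed strategies, the sketch below shows a minimal transfer-learning setup in which an ImageNet-pretrained backbone is reused on a small target dataset. It assumes PyTorch with torchvision ≥ 0.13; the number of classes is a hypothetical placeholder and the data loading and training loop are omitted, so this is a sketch of the idea rather than the survey's own implementation.

```python
# Minimal transfer-learning sketch for a data-scarce task (assumes torchvision >= 0.13).
import torch
import torch.nn as nn
from torchvision import models

num_classes = 5  # hypothetical number of classes in the small target dataset

# Start from an ImageNet-pretrained backbone instead of training from scratch.
model = models.resnet18(weights="DEFAULT")

# Freeze the pretrained feature extractor so only the new head is trained,
# which reduces the amount of labeled data needed.
for param in model.parameters():
    param.requires_grad = False

# Replace the final fully connected layer with a task-specific classifier.
model.fc = nn.Linear(model.fc.in_features, num_classes)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()
```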
The vast advantages of the 3D modelling industry have urged competitors to improve capturing techniques and processing pipelines to minimize labour requirements, save time, and reduce project risk. For digital 3D documentation and conservation projects, laser scanning and photogrammetry are typically compared before choosing between the two. Since both techniques have pros and cons, this paper examines the potential issues of each technique in terms of time, budget, accuracy, density, methodology, and ease of use. A terrestrial laser scanner and close-range photogrammetry are tested to document a unique, invaluable artefact (the Lady of Hatra) located in Iraq for future data fusion scenarios.
The objective of this study is to determine which of the logistic regression model and the linear discriminant function has the better predictive ability, first using the original data and then using principal components to reduce the dimensionality of the variables. The data come from the 2012 socio-economic family survey of Baghdad province and comprise a sample of 615 observations with 13 variables: 12 explanatory variables, with the dependent variable being the number of workers and unemployed. A comparison of the two methods was carried out, and it showed that the logistic regression model outperforms the linear discriminant function.
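A minimal sketch of this kind of comparison is given below, assuming scikit-learn and synthetic data in place of the 2012 Baghdad survey sample (615 observations, 12 explanatory variables), which is not reproduced here; the choice of five principal components is likewise a hypothetical illustration, not the study's setting.

```python
# Compare logistic regression and linear discriminant analysis,
# with and without a principal-component reduction of the 12 variables.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for the survey data: 615 observations, 12 explanatory variables.
X, y = make_classification(n_samples=615, n_features=12, random_state=0)

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "linear discriminant": LinearDiscriminantAnalysis(),
    "PCA + logistic regression": make_pipeline(PCA(n_components=5), LogisticRegression(max_iter=1000)),
    "PCA + linear discriminant": make_pipeline(PCA(n_components=5), LinearDiscriminantAnalysis()),
}

for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name}: mean CV accuracy = {acc:.3f}")
```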
Sixty samples of commercially available contact lens solutions were collected from students at the Pharmacy College / Baghdad University. The types of lenses used varied from medical to cosmetic. Both used and unused solutions were cultured to diagnose any microbial contamination. Thirty-six (60%) of the used samples showed bacterial growth, while fungal growth was absent. Pseudomonas aeruginosa accounted for the highest number of isolates (25%), followed by E. coli (21%), Staphylococcus epidermidis (6.6%), Pseudomonas fluorescens (5%), and Proteus mirabilis (1.6%). Only one unused (sealed) sample showed growth of P. fluorescens.
This research aims to test the ability of glass waste powder to adsorb cadmium from aqueous solutions. The glass waste was collected from the Glass Manufacturing Factory in Ramadi. The effects of concentration and reaction time on sorption were tested through a series of laboratory experiments. Four Cd concentrations (20, 40, 60, and 80) were used, and each concentration was tested ten times, for 5, 10, 15, 20, 25, 30, 35, 40, 45, and 50 min. The solid (glass waste) to liquid ratio was fixed at 2 g to 30 ml in each experiment, with a total solution volume of 30 ml. The pH, total dissolved salts, and electrical conductivity were measured at 30 °C. The equilibrium concentration was determined at 25 minutes, thereafter it was noted that the sorption
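The batch-sorption bookkeeping implied by this setup can be sketched as follows; the equilibrium concentrations in the example are hypothetical placeholders rather than the paper's measurements, and the concentrations are assumed to be in mg/L.

```python
# Batch sorption metrics: capacity q = (C0 - Ce) * V / m and removal efficiency.
V_L = 0.030   # solution volume: 30 ml
m_g = 2.0     # adsorbent mass: 2 g of glass waste powder

def sorption_metrics(c0, ce):
    """Return (capacity in mg/g, removal in %) from initial and equilibrium concentrations."""
    q = (c0 - ce) * V_L / m_g
    removal = 100.0 * (c0 - ce) / c0
    return q, removal

# Initial concentrations from the study; equilibrium values below are hypothetical.
for c0, ce in [(20, 4.0), (40, 9.5), (60, 16.0), (80, 24.0)]:
    q, r = sorption_metrics(c0, ce)
    print(f"C0={c0} mg/L: q={q:.3f} mg/g, removal={r:.1f}%")
```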
In this research, Co3O4 was prepared by chemical spray pyrolysis, and the deposited films were suitable for assessing film properties and applications as a photodetector device. The optical and optoelectronic properties of cobalt oxide were studied, together with the effect of different Br doping ratios (2, 5, and 8)%. The optical energy gap for direct transitions was evaluated and found to decrease as the Br percentage increases. Hall measurements showed that all the films are p-type. The current–voltage characteristics of the Br:Co3O4/Si heterojunction show that the forward current in the dark varies with the applied voltage. The spectral response, specific detectivity, and quantum efficiency of the Co3O4/Si detector with 8% Br were evaluated, with the maximum value at 673 nm.
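The abstract does not state how the direct optical gap was extracted; a common approach, sketched below under that assumption, is a Tauc plot of (αhν)² against photon energy with linear extrapolation to zero. The spectrum here is synthetic, not the measured Br:Co3O4 data.

```python
# Tauc-plot estimate of a direct-allowed optical gap from a synthetic spectrum.
import numpy as np

energy_ev = np.linspace(1.5, 3.1, 200)                  # photon energy h*nu (eV)
eg_true = 1.9                                           # synthetic "true" gap (eV)
tauc = np.clip(energy_ev - eg_true, 0.0, None) * 1e10   # idealized (alpha*h*nu)^2

# Fit the linear region above the absorption edge and extrapolate to (alpha*h*nu)^2 = 0.
mask = tauc > 0.2 * tauc.max()
slope, intercept = np.polyfit(energy_ev[mask], tauc[mask], 1)
eg_estimate = -intercept / slope
print(f"Estimated direct optical gap: {eg_estimate:.2f} eV")  # ~1.90 eV
```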
The Bridge Maintenance Management System (BMMS) is an application that uses existing data from a Bridge Management System database to monitor and analyse the current performance of bridges, as well as to estimate their current and future maintenance and rehabilitation needs. In a transportation context, maintenance management is described as a cost-effective process to operate, construct, and maintain physical assets. This requires analytical tools to support the allocation of resources, including materials, equipment, personnel, and supplies. Therefore, a Geographic Information System (GIS) can be considered one tool for developing road and bridge maintenance.
Owing to their remarkable characteristics, refractory molybdenum nitride (MoNx)-based compounds have been deployed in a wide range of strategic industrial applications. This review reports the electronic and structural properties that render MoNx materials as potent catalytic surfaces for numerous chemical reactions and surveys the syntheses, procedures, and catalytic applications in pertinent industries such as the petroleum industry. In particular, hydrogenation, hydrodesulfurization, and hydrodeoxygenation are essential processes in the refinement of oil segments and their conversions into commodity fuels and platform chemicals. N-vacant sites over a catalyst’s surface are a significant driver of diverse chemical phenomena. Studies
In this paper, discriminant analysis is used to classify the most widespread heart disease, coronary heart disease, into two groups (patient, not patient) based on the discriminating features of ten predictor variables believed to cause the disease. A random sample for each group is employed, and stepwise procedures are performed to delete those variables that are not important for separating the groups. Tests of significance for the discriminant analysis are carried out, and the misclassification rates are estimated.
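A hedged sketch of this workflow using scikit-learn is given below: linear discriminant analysis combined with a sequential (stepwise-like) variable selection step and a cross-validated estimate of the misclassification rate. The data are synthetic stand-ins for the ten predictor variables, not the clinical sample used in the paper.

```python
# Linear discriminant analysis with forward (stepwise-like) variable selection
# and a cross-validated misclassification-rate estimate.
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Two groups (patient / not patient) described by ten predictor variables (synthetic).
X, y = make_classification(n_samples=200, n_features=10, n_informative=5, random_state=1)

lda = LinearDiscriminantAnalysis()
selector = SequentialFeatureSelector(lda, n_features_to_select=5, direction="forward", cv=5)
model = make_pipeline(selector, lda)

accuracy = cross_val_score(model, X, y, cv=5).mean()
print(f"Estimated misclassification rate: {1 - accuracy:.3f}")
```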
If the Industrial Revolution enabled the replacement of humans with machines, the digital revolution is moving towards replacing our brains with artificial intelligence, so it is necessary to consider how this radical transformation affects the graphic design ecosystem. Hence the research problem emerged: what are the effects of artificial intelligence on graphic design? The research aimed to identify the capabilities and effects of artificial intelligence applications in graphic design. The theoretical framework addressed two main axes: the first is the concept of artificial intelligence, and the second is artificial intelligence applications in graphic design. The descriptive approach adopted a method of content analysis.