Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance, yet many applications have only small or inadequate datasets. Labeled data usually have to be produced by manual annotation, which typically involves human annotators with extensive background knowledge; this annotation process is costly, time-consuming, and error-prone. Every DL framework must be fed a significant amount of labeled data to learn representations automatically, and, in general, more data yields a better DL model, although performance is also application dependent. This issue is the main barrier that leads many applications to dismiss DL, since having sufficient data is the first step toward any successful and trustworthy DL application. This paper presents a holistic survey of state-of-the-art techniques for training DL models under three challenges: small datasets, imbalanced datasets, and lack of generalization. The survey starts by listing the learning techniques. Next, the types of DL architectures are introduced. After that, state-of-the-art solutions to the lack of training data are listed, such as Transfer Learning (TL), Self-Supervised Learning (SSL), Generative Adversarial Networks (GANs), Model Architecture (MA), Physics-Informed Neural Networks (PINNs), and the Deep Synthetic Minority Oversampling Technique (DeepSMOTE). These solutions are followed by practical tips on data acquisition prior to training, as well as recommendations for ensuring the trustworthiness of the training dataset.
The survey ends with a list of applications that suffer from data scarcity; for each, several alternatives are proposed to generate more data, including Electromagnetic Imaging (EMI), Civil Structural Health Monitoring, Medical Imaging, Meteorology, Wireless Communications, Fluid Mechanics, Microelectromechanical Systems, and Cybersecurity. To the best of the authors' knowledge, this is the first review that offers a comprehensive overview of strategies to tackle data scarcity in DL.
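Among the imbalanced-data remedies the survey names is DeepSMOTE, which extends the classic SMOTE idea of synthesizing minority-class samples by interpolation. As a minimal, hedged sketch of that underlying idea (classic SMOTE in plain NumPy, not the deep variant from the survey), each synthetic point is drawn on the line segment between a minority sample and one of its k nearest minority neighbors; the function name and toy data here are illustrative, not from the survey:

```python
import numpy as np

def smote_oversample(X_min, n_new, k=3, rng=None):
    """Generate synthetic minority samples by interpolating between each
    sample and one of its k nearest minority neighbors (classic SMOTE)."""
    rng = np.random.default_rng(rng)
    n = len(X_min)
    # Pairwise distances within the minority class only.
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)            # a point is not its own neighbor
    neighbors = np.argsort(d, axis=1)[:, :k]
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(n)                # pick a random minority sample
        j = rng.choice(neighbors[i])       # and one of its nearest neighbors
        lam = rng.random()                 # interpolation factor in [0, 1]
        synthetic.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.asarray(synthetic)

# Toy minority class: five 2-D points inside the unit square.
X_min = np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.], [0.5, 0.5]])
X_new = smote_oversample(X_min, n_new=10, k=2, rng=0)
print(X_new.shape)  # (10, 2)
```

Because every synthetic point is a convex combination of two real minority samples, the new points stay inside the convex hull of the minority class; DeepSMOTE performs the same interpolation in the latent space of an autoencoder instead of the raw feature space.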
This paper describes a practical study on the impact that learning tools and strategies, including a Bluetooth broadcasting system, an interactive board, a real-time response system, a notepad, free internet access, computer-based examinations, and classroom interaction, had on undergraduate student performance, achievement, and involvement with lectures. The goal of this study is to test the hypothesis that the use of such learning techniques, tools, and strategies improves student learning, especially among the poorest-performing students. It also gives a practical comparison between the traditional and interactive ways of learning in terms of lecture time, number of tests, types of tests, student scores, and student involvement with lectures.
A new method is presented for the determination of allopurinol at the microgram level, based on its ability to reduce the yellow absorption spectrum of triiodide (I3-) at the maximum wavelength (λmax = 350 nm). The optimum conditions, such as the concentrations of the reactant materials, standing time, and order of addition, were studied to obtain high sensitivity (ε = 27229 L·mol-1·cm-1; Sandell sensitivity: 0.0053 µg·cm-2), a wide calibration range (1-9 µg·mL-1), good stability (more than 24 h), and good repeatability (RSD%: 2.1-2.6%). The recovery was 98.17-100.5% and the relative error (Erel%) was 0.50-1.83%. Interference from xanthine, cysteine, creatinine, urea, and glucose at 20-, 40-, and 60-fold excess relative to the analyte was also studied.
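The reported Sandell sensitivity can be checked against the molar absorptivity: it is the mass of analyte per unit area that produces an absorbance of 0.001 in a 1 cm cell under the Beer-Lambert law (A = εcl). A short sketch of that arithmetic, assuming the standard molar mass of allopurinol (136.11 g/mol), which is not stated in the abstract:

```python
# Sandell sensitivity: mass per unit area (µg/cm²) giving A = 0.001 at l = 1 cm.
M = 136.11      # molar mass of allopurinol, g/mol (assumed, not in the abstract)
eps = 27229.0   # molar absorptivity from the abstract, L·mol⁻¹·cm⁻¹

c = 0.001 / eps                       # mol/L that gives A = 0.001 (l = 1 cm)
sandell = c * M * 1e6 / 1000.0        # g·cm/L → µg/cm² (×10⁶ µg/g, ÷10³ cm³/L)
print(round(sandell, 4))              # ≈ 0.005 µg/cm²
```

The computed value (~0.0050 µg/cm²) agrees with the reported 0.0053 µg/cm² to within rounding of ε.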
This study aimed at identifying the difficulties that public basic school principals in the Jerash governorate face in editing formal letters and correspondence, and the means of dealing with these problems. To collect data, the researchers developed a questionnaire. The population of the study, which also represents its sample, consisted of 129 principals: 65 males and 64 females.
The results of the study revealed that the principals face difficulties in office and file management, in preparing plans and reports, and in writing formal letters and answering them. Several recommendations were presented, among which were organizing training sessions and workshops to train the principals on how to deal with these problems.
Environmental pollution is regarded as a major problem, and traditional strategies such as chemical or physical remediation are not sufficient to overcome it. Petroleum-contaminated soil causes ecological problems and represents a danger to human health. Bioremediation has received remarkable attention; it is a procedure that uses a biological agent to remove toxic waste from contaminated soil. This approach is easy to handle, inexpensive, and environmentally friendly, and its results are highly satisfactory. Bioremediation is a biodegradation process in which organic contaminants are completely mineralized to inorganic compounds, carbon dioxide, and water. This review discusses the bioremediation of petroleum-contaminated soil.
This paper considers a new double integral transform called the Double Sumudu-Elzaki Transform (DSET). The DSET is combined with a semi-analytical method, the variational iteration method, into a hybrid scheme (DSETVIM) to obtain numerical solutions of nonlinear PDEs with fractional-order derivatives. The proposed dual method decreases the number of calculations required, so combining these two methods speeds up computation of the solution. The suggested technique is tested on four problems. The results demonstrated that solving these types of equations using the DSETVIM was more advantageous and efficient.
Linear programming currently occupies a prominent position in various fields and has wide applications; its importance lies in being a means of studying the behavior of a large number of systems. It is also the simplest and easiest type of model that can be built to address industrial, commercial, military, and other problems, and through it an optimal quantitative value can be obtained. In this research, we dealt with the post-optimality solution, also known as sensitivity analysis, using the principle of shadow prices. The solution to a problem is not complete once the optimal solution is reached: any change in the values of the model constants, that is, the inputs of the model, will change the optimal solution.
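A shadow price is the rate at which the optimal objective value changes when the right-hand side of a constraint is relaxed by one unit. A minimal sketch with SciPy's `linprog` (the toy objective and constraints below are illustrative, not from the research); since `linprog` minimizes, the maximization objective is negated and the dual values in `res.ineqlin.marginals` change sign:

```python
from scipy.optimize import linprog

# Maximize 3x + 2y subject to x + y <= 4, x + 3y <= 6, x >= 0, y >= 0.
# linprog minimizes, so we pass the negated objective coefficients.
res = linprog(c=[-3, -2],
              A_ub=[[1, 1], [1, 3]],
              b_ub=[4, 6],
              bounds=[(0, None), (0, None)],
              method="highs")

print(res.x)                            # optimal point: (4, 0)
print(-res.fun)                         # optimal value of the max problem: 12
shadow_prices = -res.ineqlin.marginals  # duals of the <= constraints
print(shadow_prices)                    # [3, 0]
```

Here the first constraint is binding, so relaxing `x + y <= 4` to `x + y <= 5` raises the optimum by its shadow price of 3 (from 12 to 15), while the second constraint has slack and a shadow price of 0.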
In this study, nano-TiO2 was prepared with titanium isopropoxide (TTIP) as the titanium source. The catalyst was synthesized using phosphotungstic acid (PTA), and stearyl trimethyl ammonium bromide (STAB) was used as the structure-directing agent. The calcined TiO2 nanoparticles were characterized by X-ray diffraction (XRD), X-ray fluorescence spectroscopy (XRF), nitrogen adsorption/desorption measurements, atomic force microscopy (AFM), and Fourier-transform infrared (FTIR) spectroscopy. The TiO2 nanomaterials were prepared in three crystalline forms (amorphous, anatase, and anatase-rutile). The results showed that the anatase TiO2 nanoparticles have good catalytic activity.
This research investigates the behavior of reinforced concrete (RC) deep beams strengthened with carbon fiber reinforced polymer (CFRP) strips. The experimental part of this research was carried out by testing seven RC deep beams having the same dimensions and steel reinforcement, divided into two groups according to the strengthening scheme. Group one consisted of three deep beams strengthened with vertical U-wrapped CFRP strips, while group two consisted of three deep beams strengthened with inclined CFRP strips oriented at 45° to the longitudinal axis of the beam. The remaining beam was kept unstrengthened as a reference beam. For each group, the variable considered