Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance, yet many applications have too little data to train a DL framework. Labeled data usually comes from manual annotation by human annotators with extensive domain knowledge, a process that is costly, time-consuming, and error-prone. Because every DL framework learns representations automatically from a significant amount of labeled data, more data generally yields a better model, although the required amount is application dependent. This issue is the main barrier that leads many applications to dismiss DL, since having sufficient data is the first step toward any successful and trustworthy DL application. This paper presents a holistic survey of state-of-the-art techniques for training DL models under three challenges: small datasets, imbalanced datasets, and lack of generalization. The survey starts by listing the learning techniques, then introduces the types of DL architectures. After that, it lists state-of-the-art solutions to the lack of training data, such as Transfer Learning (TL), Self-Supervised Learning (SSL), Generative Adversarial Networks (GANs), Model Architecture (MA), Physics-Informed Neural Networks (PINNs), and the Deep Synthetic Minority Oversampling Technique (DeepSMOTE). These solutions are followed by tips on data acquisition prior to training, as well as recommendations for ensuring the trustworthiness of the training dataset. The survey ends with a list of applications that suffer from data scarcity and, for each, proposes alternatives for generating more data, including Electromagnetic Imaging (EMI), Civil Structural Health Monitoring, Medical Imaging, Meteorology, Wireless Communications, Fluid Mechanics, Microelectromechanical Systems (MEMS), and Cybersecurity. To the best of the authors' knowledge, this is the first review that offers a comprehensive overview of strategies to tackle data scarcity in DL.
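As a concrete illustration of the first surveyed remedy, the sketch below shows transfer learning in PyTorch: an ImageNet-pretrained backbone is frozen and only a small task-specific head is trained, so a modest labeled dataset can suffice. The model choice, class count, and dummy batch are illustrative assumptions, not prescriptions from the survey.

```python
# Minimal transfer-learning sketch (hypothetical setup): fine-tune a
# pretrained ResNet-18 on a small custom dataset by freezing the backbone
# and training only a new classification head.
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 5  # assumed label-set size of the small target dataset

# Load ImageNet-pretrained weights, then freeze all backbone parameters.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False

# Replace the final fully connected layer; only this layer will be trained.
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One training step on a dummy batch (stand-in for a real small dataset).
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, NUM_CLASSES, (8,))
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
print(f"loss: {loss.item():.4f}")
```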
The 3D electro-Fenton technique, owing to its high efficiency, is one of the technologies suggested for eliminating organic pollutants from wastewater. The type of particle electrode used in the 3D electro-Fenton process is one of the most crucial variables because it affects the formation of reactive species and serves as the source of iron ions. The electrolytic cell in the current study consisted of graphite as the anode, carbon fiber (CF) modified with graphene as the cathode, and iron foam particles as the third electrode. A response surface methodology (RSM) approach was used to optimize the 3D electro-Fenton process. The RSM results revealed that the quadratic model has a high R² of 99.05%. At 4 g L⁻¹ iron foam particles, a time of 5 h, and …
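For readers unfamiliar with RSM, the hedged sketch below fits a full quadratic response-surface model and reports its R². The factor names, ranges, and response values are synthetic placeholders standing in for the study's experimental design, not its actual data.

```python
# Illustrative RSM-style quadratic fit on synthetic data (not the study's
# measurements): three assumed factors and a made-up removal efficiency.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
# Hypothetical factors: iron foam dose (g/L), time (h), applied current (A).
X = rng.uniform([1.0, 1.0, 0.1], [5.0, 6.0, 1.0], size=(30, 3))
# Synthetic removal efficiency (%) following a quadratic trend plus noise.
y = (60 + 8 * X[:, 0] - X[:, 0] ** 2 + 5 * X[:, 1] - 0.4 * X[:, 1] ** 2
     + 10 * X[:, 2] + rng.normal(0, 1.5, 30))

# Full quadratic model: linear, interaction, and squared terms (degree 2).
rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
rsm.fit(X, y)
print(f"R^2 = {rsm.score(X, y):.4f}")
```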
Many studies have evaluated the effect of platelet-rich plasma (PRP) on the treatment of non-union fractures, but few have investigated its effect on the union of femoral neck fractures or the functional outcome in young adults. The aim of this study was to evaluate the union time and functional outcome in young adult patients with femoral neck fractures managed by three cannulated screws injected with PRP versus fixation only. This prospective study included 24 patients diagnosed with femoral neck fractures within 24 hours of presentation. Twelve cases in group A were managed by closed reduction and three-cannulated-screw fixation injected with PRP; twelve patients in group B were managed only by closed reduction and …
This dissertation studies the application of the equivalence theory developed by Mona Baker to translation from Persian into Arabic. Among various translation methodologies, Mona Baker's bottom-up equivalence approach is unique in several ways. Baker's translation approach is a multistep process: it starts with the smallest linguistic unit, "the word", and then moves above word level toward the translation of the entire text. Equivalence at the word level, i.e., the word-for-word method, is the core of Baker's approach. This study evaluates the use of Baker's approach in translation from Persian to Arabic, mainly because finding the correct equivalence is a major challenge in this translation. Additionally, …
This study was designed to compare the effect of two types of viral hepatitis, A and E (HAV and HEV), on liver function in Iraqi individuals by measuring the biochemical changes associated with hepatitis. The study was performed on 58 HEV- and 66 HAV-infected patients compared with 28 healthy subjects. The measured biochemical tests included total serum bilirubin, serum transaminases (ALT and AST), alkaline phosphatase (ALP), and gamma-glutamyl transferase (GGT). The study showed that adolescents and young adults (17–29 years) were mostly affected by HEV, while children (5–12 years) were frequently affected by HAV. The severity of liver damage in HEV patients was higher than in HAV patients as a result of high serum transaminase …
Churning of employees from organizations is a serious problem. Turnover, or churn, of employees within an organization needs to be addressed, since it has a negative impact on the organization. Manual detection of employee churn is quite difficult, so machine learning (ML) algorithms have frequently been used for employee churn detection as well as for categorizing employees according to turnover. To date, only one study has looked into the categorization of employees using ML. A novel multi-criteria decision-making (MCDM) approach coupled with the DE-PARETO principle has been proposed to categorize employees; this is referred to as the SNEC scheme. An AHP-TOPSIS DE-PARETO PRINCIPLE model (AHPTOPDE) has been designed that uses a 2-stage MCDM s…
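To make the TOPSIS stage of such an AHP-TOPSIS pipeline concrete, the sketch below ranks a toy set of employees against weighted criteria. The decision matrix, weights, and criterion directions are illustrative assumptions rather than the paper's data.

```python
# Hedged TOPSIS sketch: rank employees (alternatives) against weighted
# criteria; all numbers here are invented for illustration.
import numpy as np

# Rows: employees; columns: criteria (e.g., performance, tenure, absences).
decision = np.array([
    [8.0, 5.0, 2.0],
    [6.0, 7.0, 1.0],
    [9.0, 3.0, 4.0],
])
weights = np.array([0.5, 0.3, 0.2])      # e.g., from an AHP pairwise step
benefit = np.array([True, True, False])  # absences treated as a cost

# 1. Vector-normalize each criterion column, then apply the weights.
normalized = decision / np.linalg.norm(decision, axis=0)
weighted = normalized * weights

# 2. Ideal and anti-ideal solutions, respecting each criterion's direction.
ideal = np.where(benefit, weighted.max(axis=0), weighted.min(axis=0))
anti = np.where(benefit, weighted.min(axis=0), weighted.max(axis=0))

# 3. Closeness coefficient: higher means closer to the ideal profile.
d_pos = np.linalg.norm(weighted - ideal, axis=1)
d_neg = np.linalg.norm(weighted - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)
print(np.argsort(-closeness))  # employee indices ranked best-first
```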
Background: Spontaneous abortion constitutes one of the most important adverse pregnancy outcomes affecting human reproduction, and its risk factors are affected not only by biological and demographic factors such as age, gravidity, and previous history of miscarriage, but also by individual women's personal social characteristics and by the larger social environment. Objective: To identify environmental effects on women with spontaneous abortion. Methodology: A non-probability (purposive) sample of 200 women who were suffering from spontaneous abortion in the maternity units of four hospitals in Baghdad City, which include Al-Elwia Maternity Teaching Hospital and Baghdad Teaching Hospital in the Al-Russafa sector, and Al-Karckh Maternity Hospita…
In this paper, the error distribution function of the single index model is estimated by the empirical distribution function and by the kernel distribution function. The refined minimum average variance estimation (RMAVE) method is used to estimate the single index model. Simulation experiments compare the two estimators of the error distribution function at different sample sizes; the results show that the kernel distribution function is better than the empirical distribution function.
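The comparison described above can be sketched as follows: both estimators of the error distribution function are computed on simulated residuals, the empirical distribution function as a step-function average of indicators and the kernel distribution function as a smooth average of Gaussian-kernel CDFs. The bandwidth rule and the simulated residuals are assumptions for illustration; the single-index fitting step (RMAVE) is taken as already done.

```python
# Sketch of the two error-CDF estimators on synthetic residuals.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
residuals = rng.normal(0, 1, size=100)  # stand-in for model residuals
x = np.linspace(-3, 3, 7)               # evaluation points

# Empirical distribution function: proportion of residuals <= x.
ecdf = (residuals[None, :] <= x[:, None]).mean(axis=1)

# Kernel distribution function: F_hat(x) = (1/n) * sum Phi((x - e_i) / h),
# with a rule-of-thumb (Silverman-style) bandwidth h.
h = 1.06 * residuals.std() * len(residuals) ** (-1 / 5)
kcdf = norm.cdf((x[:, None] - residuals[None, :]) / h).mean(axis=1)

for xi, e, k in zip(x, ecdf, kcdf):
    print(f"x={xi:+.1f}  ECDF={e:.3f}  kernel CDF={k:.3f}")
```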
Background: Fibromyalgia syndrome (FMS) is the most common rheumatic cause of diffuse pain, multiple regional musculoskeletal pain, and disability. Objective: To assess the contribution of serum lipoprotein (A) to the pathogenesis of FMS. Methods: One hundred twenty-two FMS patients were compared with 60 age- and sex-matched healthy control individuals. All FMS features and criteria were applied to patients and controls; patients with secondary FMS were excluded. Serum lipoprotein (A) [Lp(A)], body mass index (BMI), and serum lipid profile were determined for both groups. Results: There was a statistically significant difference between patients and controls in serum lipoprotein…
In this research, nanofibers were prepared using an electrospinning method. Three types of polymer (PVA, VC, PMMA) were used at different concentrations, and the applied voltage and the gap length were varied. It was observed that VC performed better than the other polymers.