Data scarcity is a major challenge in training deep learning (DL) models, which demand large amounts of data to achieve strong performance. Unfortunately, many applications have too little data to train DL frameworks. Labeled data are usually produced by manual annotation, which typically requires human annotators with extensive domain knowledge; this annotation process is costly, time-consuming, and error-prone. Every DL framework is fed a significant amount of labeled data in order to learn representations automatically, and, although performance is application dependent, more data generally yields a better DL model. This issue is the main barrier keeping many applications from adopting DL: having sufficient data is the first step toward any successful and trustworthy DL application. This paper presents a holistic survey of state-of-the-art techniques for training DL models under three challenges: small datasets, imbalanced datasets, and lack of generalization. The survey starts by listing the learning techniques. Next, the types of DL architectures are introduced. After that, state-of-the-art solutions to the lack of training data are presented, such as Transfer Learning (TL), Self-Supervised Learning (SSL), Generative Adversarial Networks (GANs), Model Architecture (MA), Physics-Informed Neural Networks (PINNs), and the Deep Synthetic Minority Oversampling Technique (DeepSMOTE). These solutions are followed by tips on the data acquisition needed prior to training, as well as recommendations for ensuring the trustworthiness of the training dataset.
The survey ends with a list of applications that suffer from data scarcity, and for each application several alternatives are proposed for generating more data, including Electromagnetic Imaging (EMI), Civil Structural Health Monitoring, Medical Imaging, Meteorology, Wireless Communications, Fluid Mechanics, Microelectromechanical Systems, and Cybersecurity. To the best of the authors' knowledge, this is the first review to offer a comprehensive overview of strategies for tackling data scarcity in DL.
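As a concrete illustration of the oversampling family the survey covers, the sketch below implements the classic SMOTE interpolation step in plain NumPy: each synthetic minority sample lies on the segment between a real sample and one of its nearest neighbours. This is a minimal sketch, not the DeepSMOTE method itself (DeepSMOTE applies the same idea in an encoder's latent space); the function name, `k`, and `seed` parameters are illustrative assumptions.

```python
import numpy as np

def smote_oversample(X_minority, n_new, k=3, seed=0):
    """Generate n_new synthetic minority samples by interpolating
    between each chosen sample and one of its k nearest neighbours
    (the classic SMOTE idea, in raw feature space)."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X_minority, dtype=float)
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(X))
        # Euclidean distance from sample i to every sample.
        d = np.linalg.norm(X - X[i], axis=1)
        neighbours = np.argsort(d)[1:k + 1]   # skip the sample itself
        j = rng.choice(neighbours)
        gap = rng.random()                    # interpolation factor in [0, 1)
        synthetic.append(X[i] + gap * (X[j] - X[i]))
    return np.array(synthetic)
```

Because every synthetic point is a convex combination of two real minority samples, the generated data stay inside the convex hull of the minority class, which is what keeps this augmentation plausible.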
In this study, hydroxyapatite (HAP, Ca10(PO4)6(OH)2) was prepared as a bioceramic material with biological specifications useful for orthopedic and dental implant applications. Wet chemical processing tends to form fine-grained, uniform nanocrystalline material by controlling the interstitial factors that affect grain size and crystallinity, in order to give mechanical and/or constituent properties similar to those of natural bone. Fluorinated hydroxyapatite (4-6 wt% F; FHA, Ca10(PO4)6(OH)2−xFx) was developed by a new method because it increases strength and gives higher corrosion resistance in biofluids than pure HAP, and moreover reduces the risk of dental caries. The phases and functional groups …
Since the emergence of the science of international relations as an independent academic field, various theories and trends have appeared, trying to understand and explain the international reality, give a clear picture of what happens within the international system of interactions and influences, and search for tools for stability and peace in international relations. Among these theories is feminist theory, a new intellectual trend among international relations theories that has tried to explain what happens in world politics and in international relations in particular. The main issue with which feminist theory is concerned is the subordination of women …
Recently, the mechanism of cloud computing has become widely accepted all over the world and is used by most enterprise businesses in order to increase their productivity. However, some concerns remain about the security provided by the cloud environment. Thus, in this research project, we discuss the evolution of the cloud computing paradigm for large business applications such as CRM, and we introduce a new framework for secure cloud computing using the method of IT auditing. Our approach is directed toward establishing a cloud computing framework for CRM applications through the use of checklists that follow the data flow of the CRM application and its lifecycle. Those checklists …
In this work, a simulation study was carried out to design a novel spiral rectangular patch microstrip antenna for ultra-wideband applications using the High Frequency Structure Simulator (HFSS) software. A substrate with a dielectric constant of 4.4 and a height of 2.10 mm (commercially available substrate heights are about 0.8-1.575 mm) was used for the design of the proposed antenna. The design basis for enhancing bandwidth in the frequency range 6.63-10.93 GHz is increasing the edge areas, which positively affects the antenna's efficiency. This design also lowers the cost of the antenna by reducing the area of the patch. It was noticed that the bandwidth of the antenna under study increases to 4.30 GHz.
Mixed-effects conditional logistic regression is evidently more effective for studying qualitative differences in longitudinal pollution data, as well as their implications for heterogeneous subgroups. This study argues that conditional logistic regression is a robust evaluation method for environmental studies, through the analysis of environmental pollution as a function of oil production and environmental factors. Consequently, it has been established theoretically that the primary objective of model selection in this research is to identify the candidate model that is optimal for the conditional design. The candidate model should achieve generalizability, goodness of fit, and parsimony, and should establish an equilibrium between bias and variability …
Statistical learning theory serves as the foundational bedrock of machine learning (ML), which in turn represents the backbone of artificial intelligence, ushering in innovative solutions to real-world challenges. Its origins can be traced to the point where statistics and the field of computing meet, evolving into a distinct scientific discipline. Machine learning can be distinguished by its fundamental branches, encompassing supervised learning, unsupervised learning, semi-supervised learning, and reinforcement learning. Within this tapestry, supervised learning takes center stage, divided into two fundamental forms: classification and regression. Regression is tailored for continuous outcomes, while classification specializes in categorical outcomes …
Challenges facing the transition of traditional cities to smart cities: a study of the challenges faced in the transition of a traditional area, such as the Al-Kadhimiya city center, to the smart style.
A skip list data structure is essentially a simulation of a binary search tree. Skip list algorithms are simpler and faster and use less space. Conceptually, this data structure uses parallel sorted linked lists. Searching in a skip list is faster than searching in a regular sorted linked list. Because a skip list is a two-dimensional data structure, it is implemented using a two-dimensional network of nodes with four pointers. The search, insert, and delete operations each take expected O(log n) time. The skip list can be modified to implement the order-statistic operations RANK and SEARCH BY RANK while maintaining the same expected time. Keywords: skip list, parallel linked list, randomized algorithm, rank.
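The operations described above can be sketched as a minimal singly-linked skip list with randomized levels: search descends from the highest level, moving right while the next key is smaller, and insert splices a new node into every level it reaches. This is a hedged sketch (class names, the fixed `MAX_LEVEL` of 16, and the 1/2 promotion probability are illustrative assumptions), not the paper's implementation, which uses four pointers per node.

```python
import random

class Node:
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * (level + 1)   # one forward pointer per level

class SkipList:
    MAX_LEVEL = 16

    def __init__(self, seed=None):
        self.head = Node(None, self.MAX_LEVEL)  # sentinel; key never compared
        self.level = 0
        self.rng = random.Random(seed)

    def _random_level(self):
        # Coin flips: each extra level with probability 1/2,
        # giving expected O(log n) height.
        lvl = 0
        while self.rng.random() < 0.5 and lvl < self.MAX_LEVEL:
            lvl += 1
        return lvl

    def search(self, key):
        node = self.head
        for lvl in range(self.level, -1, -1):
            while node.forward[lvl] and node.forward[lvl].key < key:
                node = node.forward[lvl]
        node = node.forward[0]
        return node is not None and node.key == key

    def insert(self, key):
        # Record the rightmost node visited at each level.
        update = [self.head] * (self.MAX_LEVEL + 1)
        node = self.head
        for lvl in range(self.level, -1, -1):
            while node.forward[lvl] and node.forward[lvl].key < key:
                node = node.forward[lvl]
            update[lvl] = node
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        new = Node(key, lvl)
        for i in range(lvl + 1):
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new
```

Usage: `sl = SkipList(seed=42); sl.insert(3); sl.search(3)` returns `True`. The expected O(log n) bound comes from the geometric level distribution, not from any rebalancing.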
Design in the late twentieth and early twenty-first centuries has been distinguished by ever-new methods through which all technical media can be adapted in the formation of two-dimensional and three-dimensional figures and shapes. This led to the emergence of endless sets of design ideas characterized by heterogeneity of form and by design solutions unlike those that preceded them. Designers could reach these creations, in various architectural and artistic fields, only through computer programs, especially those concerned with the activation of mathematical logic and what are known as algorithms in the formation and construction of form, which led to the emergence of the "parametric direction". The research problem is summarized …