Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance, yet many applications have too little data to train DL frameworks. Labeled data usually has to be produced by manual annotation, which typically requires human annotators with extensive background knowledge; this annotation process is costly, time-consuming, and error-prone. Every DL framework must be fed a significant amount of labeled data to automatically learn representations, and, in general, more data yields a better DL model, although performance is also application dependent. This issue is the main barrier that leads many applications to dismiss the use of DL; having sufficient data is the first step toward any successful and trustworthy DL application. This paper presents a holistic survey of state-of-the-art techniques for training DL models under three challenges: small datasets, imbalanced datasets, and lack of generalization. The survey starts by listing the learning techniques. Next, the types of DL architectures are introduced. After that, state-of-the-art solutions to the lack of training data are listed, such as Transfer Learning (TL), Self-Supervised Learning (SSL), Generative Adversarial Networks (GANs), Model Architecture (MA), Physics-Informed Neural Networks (PINN), and the Deep Synthetic Minority Oversampling Technique (DeepSMOTE). These solutions are followed by tips on data acquisition prior to training, as well as recommendations for ensuring the trustworthiness of the training dataset. The survey ends with a list of applications that suffer from data scarcity, and several alternatives are proposed for generating more data in each application, including Electromagnetic Imaging (EMI), Civil Structural Health Monitoring, Medical Imaging, Meteorology, Wireless Communications, Fluid Mechanics, Microelectromechanical Systems, and Cybersecurity. To the best of the authors’ knowledge, this is the first review that offers a comprehensive overview of strategies to tackle data scarcity in DL.
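As a brief illustration of one of the listed solutions, the sketch below shows a typical transfer-learning setup in which an ImageNet-pretrained backbone is frozen and only a new classification head is trained on a small target dataset; the framework (PyTorch/torchvision), the class count, and the training step are illustrative assumptions and are not taken from the surveyed paper.

```python
# Minimal transfer-learning sketch (illustrative; not the paper's method).
# Assumes PyTorch and torchvision >= 0.13 are available.
import torch
import torch.nn as nn
from torchvision import models

# Start from an ImageNet-pretrained backbone and freeze its weights,
# so only the new classification head is trained on the scarce data.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False

num_classes = 5  # hypothetical number of target classes
model.fc = nn.Linear(model.fc.in_features, num_classes)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

def train_step(images, labels):
    """One optimisation step on a mini-batch of the small target dataset."""
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```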
The architectural complexity of Systems on Chips (SoCs) is the result of integrating a large number of cores on a single chip. Design approaches must address the system's particular challenges, such as reliability, performance, and power constraints. Monitoring has become a necessary part of testing, debugging, and performance evaluation of SoCs at run time, as on-chip monitoring is employed to provide environmental information such as temperature, voltage, and error data. Real-time system validation exploits this monitoring to determine whether a system operates properly within its designed parameters. The paper explains the common monitoring operations in SoCs, showing the functionality of thermal, voltage, and soft-error monitors. The different
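As a software-side illustration of run-time thermal monitoring (not part of the paper's on-chip monitor designs), the sketch below polls a Linux thermal-zone sensor; the sysfs path, polling interval, and temperature limit are platform-dependent assumptions.

```python
# Illustrative run-time thermal monitoring sketch (assumes a Linux target
# that exposes on-chip thermal sensors through sysfs; path varies per SoC).
import time

THERMAL_ZONE = "/sys/class/thermal/thermal_zone0/temp"  # assumed sensor node
LIMIT_C = 85.0  # hypothetical design limit in degrees Celsius

def read_temp_celsius() -> float:
    """Read the on-chip temperature; sysfs reports millidegrees Celsius."""
    with open(THERMAL_ZONE) as f:
        return int(f.read().strip()) / 1000.0

while True:
    temp = read_temp_celsius()
    if temp > LIMIT_C:
        print(f"WARNING: temperature {temp:.1f} C exceeds design limit")
    time.sleep(1.0)  # poll once per second
```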
Research summary
Today, the Islamic nation is going through one of the most dangerous phases it has ever experienced. This phase is characterized by the following:
The division of the nation into states and its weakness in most of its doctrinal, political, social, economic, and moral aspects.
The targeting of the nation's faith and capabilities by its enemies, the emergence of loyalty to those enemies among some groups of society, and the spread of misconceptions in the Muslim community.
The spread of the spirit of rebellion across all segments of society and efforts to stir up the people against the rulers and to put pressure on them.
All these manifestations and others require the nation's wise men
An efficient combination of the Adomian Decomposition iterative technique coupled with the Elzaki transformation (ETADM) for solving the Telegraph equation and the Riccati non-linear differential equation (RNDE) is introduced in a novel way to obtain an accurate analytical solution. The method is an elegant combination of the Elzaki transform, the series expansion method, and the Adomian polynomials. The suggested method converts differential equations into iterative algebraic equations, thus reducing processing and analytical work, and the technique solves the problem of calculating the Adomian polynomials. The method's efficiency was investigated using some numerical instances, and the findings demonstrate that it is easier to use than many other numerical procedures. It has
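To make the general idea concrete, the sketch below outlines an Elzaki-Adomian iteration for a generic Riccati equation; the notation, the coefficient functions, and the exact form of the recursion are illustrative assumptions based on standard ETADM formulations rather than the precise scheme of this paper.

```latex
% Illustrative ETADM recursion for a generic Riccati equation
% u'(t) = p(t)\,u^0 + q(t)\,u(t) + r(t)\,u^2(t), with u(0) = \alpha.
% Elzaki transform and its derivative property:
\[
E[f(t)](v) = v \int_0^{\infty} f(t)\, e^{-t/v}\, dt, \qquad
E[u'(t)] = \frac{E[u(t)]}{v} - v\, u(0).
\]
% Applying E to the Riccati equation and solving for E[u]:
\[
E[u] = v^2 \alpha + v\, E\!\left[\, p + q\,u + r\,u^2 \,\right].
\]
% Decompose u = \sum_{n\ge 0} u_n and u^2 = \sum_{n\ge 0} A_n
% (Adomian polynomials, e.g. A_0 = u_0^2, A_1 = 2 u_0 u_1), then invert term by term:
\[
u_0 = E^{-1}\!\left[\, v^2 \alpha + v\, E[p] \,\right], \qquad
u_{n+1} = E^{-1}\!\left[\, v\, E\!\left[ q\, u_n + r\, A_n \right] \,\right], \quad n \ge 0,
\]
% so the analytical approximation is the truncated series u \approx \sum_{n=0}^{N} u_n.
```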
In this article, the research presents a general overview of deep learning-based AVSS (audio-visual source separation) systems. AVSS has achieved exceptional results in a number of areas, including decreasing noise levels, boosting speech recognition, and improving audio quality. The advantages and disadvantages of each deep learning model are discussed throughout the research as it reviews various current experiments on AVSS. The TCD-TIMIT dataset (which contains high-quality audio and video recordings created especially for speech recognition tasks) and the VoxCeleb dataset (a sizable collection of brief audio-visual clips of human speech) are just a couple of the useful datasets summarized in the paper that can be used to test A
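As a rough sketch of the kind of architecture such reviews cover, the example below fuses an audio spectrogram branch with a lip-video branch to predict a time-frequency separation mask; the layer choices, feature dimensions, and the assumption of time-aligned audio and video features are hypothetical and not drawn from any specific system in the article.

```python
# Illustrative audio-visual source-separation skeleton (hypothetical dimensions;
# assumes video features have been upsampled to the audio frame rate).
import torch
import torch.nn as nn

class AVSeparator(nn.Module):
    def __init__(self, freq_bins=257, video_feat=512, hidden=256):
        super().__init__()
        self.audio_net = nn.LSTM(freq_bins, hidden, batch_first=True)
        self.video_net = nn.LSTM(video_feat, hidden, batch_first=True)
        # Fused features predict a time-frequency mask for the target speaker.
        self.mask_head = nn.Sequential(
            nn.Linear(2 * hidden, freq_bins), nn.Sigmoid()
        )

    def forward(self, spec, lip_feats):
        # spec: (batch, time, freq_bins) magnitude spectrogram
        # lip_feats: (batch, time, video_feat) per-frame lip-region embeddings
        a, _ = self.audio_net(spec)
        v, _ = self.video_net(lip_feats)
        mask = self.mask_head(torch.cat([a, v], dim=-1))
        return mask * spec  # estimated target-speaker spectrogram

model = AVSeparator()
est = model(torch.randn(2, 100, 257), torch.randn(2, 100, 512))
```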
Deep Learning Techniques For Skull Stripping of Brain MR Images
The State attaches great importance to its foreign relations and intends to strengthen them in order to ensure the achievement of the highest national goals and interests. External relations between countries are among the most prominent features of foreign policy, which depends on a combination of internal and external factors. The modern relations between Iraq and Tunisia go back to the period before Tunisia's independence, when Iraq supported Tunisia's independence from France in the 1940s. Although these relations did not cause any disturbance between the two countries, they remained weak and did not develop in any field except the sports and cultural ones, for which we will determine the reasons and the pos
Drought is a natural phenomenon in many arid, semi-arid, and even wet regions, which shows that no region worldwide is excluded from the occurrence of drought. Extreme droughts have been driven by global warming and climate change. Therefore, it is essential to review the studies conducted on drought in order to make use of the researchers' recommendations. Drought has been classified into meteorological, agricultural, hydrological, and socio-economic types. In addition, researchers have described the severity of drought using various indices, which require different input data. The indices used by various researchers include the Joint Deficit Index (JDI), Effective Drought Index (EDI), Streamflow Drought Index (SDI), Sta
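As an illustration of how such indices turn raw input data into a severity measure, the sketch below computes a simplified standardized precipitation-style index by fitting a gamma distribution to accumulated precipitation and mapping it to the standard normal; the window length and synthetic data are hypothetical, and real implementations (e.g., of the SPI) handle further details such as zero-precipitation probability.

```python
# Simplified standardized precipitation-style index (illustrative only;
# not taken from any specific study cited here).
import numpy as np
from scipy import stats

def standardized_index(precip, window=3):
    """precip: 1-D array of monthly precipitation totals (hypothetical data)."""
    # Accumulate precipitation over a rolling window (e.g., 3 months).
    acc = np.convolve(precip, np.ones(window), mode="valid")
    # Fit a gamma distribution to the accumulated totals.
    shape, loc, scale = stats.gamma.fit(acc, floc=0)
    # Map cumulative probabilities to standard-normal quantiles.
    cdf = stats.gamma.cdf(acc, shape, loc=loc, scale=scale)
    return stats.norm.ppf(cdf)

rng = np.random.default_rng(0)
index = standardized_index(rng.gamma(2.0, 30.0, size=120))  # 10 years of months
print(index[:5])  # negative values indicate drier-than-normal conditions
```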
Face recognition and emotion recognition are important bases for human-machine interaction. Different algorithms have been developed and tested to recognize a person's face and emotion. In this paper, an enhanced face and emotion recognition algorithm is implemented based on deep learning neural networks. A universal database and personal images were used to test the proposed algorithm, and the Python programming language was used to implement it.
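Since the architecture is not described here, the sketch below is only a minimal example of a Python deep-learning emotion classifier (PyTorch assumed); the layer sizes, input resolution, and emotion labels are illustrative assumptions rather than the paper's implementation.

```python
# Minimal CNN emotion-classifier sketch (hypothetical architecture and labels).
import torch
import torch.nn as nn

EMOTIONS = ["angry", "happy", "neutral", "sad", "surprised"]  # assumed label set

class EmotionCNN(nn.Module):
    def __init__(self, num_classes=len(EMOTIONS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # Assumes 48x48 grayscale face crops (a common emotion-dataset format).
        self.classifier = nn.Linear(32 * 12 * 12, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = EmotionCNN()
logits = model(torch.randn(4, 1, 48, 48))  # batch of 4 face crops
print(logits.argmax(dim=1))                # predicted emotion indices
```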