A survey on deep learning tools dealing with data scarcity: definitions, challenges, solutions, tips, and applications
Abstract
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance, yet many applications have too little data to train DL frameworks. Manual labeling is usually needed to provide labeled data, typically by human annotators with extensive background knowledge; this annotation process is costly, time-consuming, and error-prone. Every DL framework is fed a significant amount of labeled data to learn representations automatically, and in general more data yields a better DL model, although performance is also application-dependent. This issue is the main barrier preventing many applications from adopting DL, and having sufficient data is the first step toward any successful and trustworthy DL application. This paper presents a holistic survey of state-of-the-art techniques for training DL models under three challenges: small datasets, imbalanced datasets, and lack of generalization. The survey starts by listing the learning techniques, then introduces the types of DL architectures. After that, it lists state-of-the-art solutions to the lack of training data, such as Transfer Learning (TL), Self-Supervised Learning (SSL), Generative Adversarial Networks (GANs), Model Architecture (MA), Physics-Informed Neural Networks (PINNs), and the Deep Synthetic Minority Oversampling Technique (DeepSMOTE). These solutions are followed by tips on the data acquisition needed prior to training, as well as recommendations for ensuring the trustworthiness of the training dataset.
The survey ends with a list of applications that suffer from data scarcity; for each, several alternatives are proposed for generating more data, including Electromagnetic Imaging (EMI), Civil Structural Health Monitoring, Medical Imaging, Meteorology, Wireless Communications, Fluid Mechanics, Microelectromechanical Systems, and Cybersecurity. To the best of the authors' knowledge, this is the first review that offers a comprehensive overview of strategies to tackle data scarcity in DL.
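Among the listed solutions, the interpolation idea behind DeepSMOTE can be illustrated in a few lines. The sketch below is a minimal, classical SMOTE-style oversampler in feature space (DeepSMOTE itself applies the same interpolation inside an autoencoder's latent space); the function name and parameters are illustrative, not from the paper.

```python
import numpy as np

def smote_like_oversample(X_min, n_new, k=3, rng=None):
    """Generate n_new synthetic minority samples by interpolating
    between each chosen point and one of its k nearest neighbours --
    the core SMOTE idea that DeepSMOTE lifts into a latent space."""
    rng = np.random.default_rng(rng)
    X_min = np.asarray(X_min, dtype=float)
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        # distances from sample i to every minority sample
        d = np.linalg.norm(X_min - X_min[i], axis=1)
        neighbours = np.argsort(d)[1:k + 1]   # skip the point itself
        j = rng.choice(neighbours)
        lam = rng.random()                    # interpolation factor in [0, 1)
        synthetic.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.vstack(synthetic)
```

Because each synthetic point is a convex combination of two real points, the generated data stay inside the minority class's convex hull.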
Publication Date
Mon May 11 2020
Journal Name
Baghdad Science Journal
DEO: A Dynamic Event Order Strategy for t-way Sequence Covering Array Test Data Generation

Sequence covering array (SCA) generation has been an active research area in recent years. Unlike sequence-less covering arrays (CAs), the order of events varies during test case generation. This paper reviews the state of the art of SCA strategies. Earlier works reported that finding a minimal test suite is an NP-hard problem. In addition, most existing SCA generation strategies have a high order of complexity because they generate all combinatorial interactions in a one-test-at-a-time fashion. Reducing this complexity by adopting a one-parameter-at-a-time approach for SCA generation is a challenging process; such a reduction also facilitates support for higher strengths of coverage…
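For readers unfamiliar with the coverage criterion behind SCAs, the sketch below checks which t-way event sequences a candidate test suite still leaves uncovered; every generation strategy, including DEO, must drive this set to empty. It illustrates the standard criterion only, not the DEO algorithm itself.

```python
from itertools import combinations, permutations

def uncovered_sequences(tests, events, t):
    """Return the t-way event sequences not covered by any test.

    A length-t sequence (e1, ..., et) is covered by a test if its
    events appear in the test in that relative order, not necessarily
    adjacently -- the usual SCA coverage criterion."""
    def covers(test, seq):
        pos = -1
        for e in seq:
            try:
                pos = test.index(e, pos + 1)   # next occurrence after pos
            except ValueError:
                return False
        return True

    # every ordered t-sequence of distinct events
    targets = {p for c in combinations(events, t) for p in permutations(c)}
    return {s for s in targets if not any(covers(tst, s) for tst in tests)}
```

For strength t = 2 over three events, the two tests (1, 2, 3) and (3, 2, 1) already cover all six ordered pairs, which is why 2-way SCAs are so small.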

Publication Date
Sun Oct 01 2023
Journal Name
Bulletin Of Electrical Engineering And Informatics
A novel data offloading scheme for QoS optimization in 5G based internet of medical things

The internet of medical things (IoMT) is expected to become one of the most widely deployed technologies worldwide. With 5th-generation (5G) transmission, market opportunities and hazards related to the IoMT can be better exploited and detected. This framework describes a strategy for proactively addressing concerns and offering a forum to promote development, change attitudes, and maintain people's confidence in the broader healthcare system without compromising security. It is combined with a data offloading system to speed up the transmission of medical data and improve quality of service (QoS). Building on this, we propose the enriched energy-efficient fuzzy (EEEF) data offloading technique to enhance the delivery of data…
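The abstract does not detail the EEEF rule base, but the general shape of a fuzzy offloading decision can be sketched as follows. The two inputs (battery level and link quality) and the triangular membership ranges are illustrative assumptions, not the EEEF design.

```python
def triangular(x, a, b, c):
    """Triangular membership: rises from a, peaks at b, falls to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def offload_degree(battery_pct, link_quality_pct):
    """Degree (0..1) to which a device should offload its medical data.
    One Mamdani-style rule: IF battery is LOW AND link is GOOD THEN offload,
    with AND taken as the minimum of the antecedent memberships."""
    low_battery = triangular(battery_pct, -1.0, 0.0, 60.0)        # "battery is LOW"
    good_link = triangular(link_quality_pct, 40.0, 100.0, 101.0)  # "link is GOOD"
    return min(low_battery, good_link)
```

A device at 10% battery on a 90% quality link scores high and should offload; a fully charged device scores zero and processes locally.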

Publication Date
Mon Apr 09 2018
Journal Name
Al-khwarizmi Engineering Journal
Experimental Evaluation and Finite Element Simulation to Produce Square Cup by Deep Drawing Process

The deep drawing process for producing a square cup is very complex because many process parameters control it, and it is therefore associated with defects such as earing, wrinkling, and fracture. The aim of this study is to examine the effect of selected process parameters and determine the values that give the best results. The thickness distributions and cup depths were used to estimate the effect of the parameters numerically; experimental verification was carried out only for the conditions giving the best numerical predictions, in order to reduce the time, effort, and cost of producing a square cup with fewer defects. The numerical analysis is used to study…

... Show More
View Publication Preview PDF
Crossref (3)
Crossref
Publication Date
Sat Dec 31 2022
Journal Name
Iraqi Geological Journal
Geochemical Criteria for Discriminating Shallow and Deep Environments in Oligocene-Miocene Succession, Western Iraq

A geochemical study of the Oligocene-Miocene succession (the Anah, Euphrates, and Fatha formations), western Iraq, was carried out to discriminate their depositional environments. Different major- and trace-element patterns were observed between these formations. The major elements (Ca, Mg, Fe, Mn, K, and Na) and trace elements (Li, V, Cr, Co, Ni, Cu, Zn, Ga, Rb, Sr, Zr, Cs, Ba, Hf, W, Pb, Th, and U) are a function of the depositional environment setting. The reefal facies have lower concentrations of MgO, Li, Cr, Co, Ni, Ga, Rb, Zr, and Ba than the marine and lagoonal facies, but higher concentrations of CaO, V, and Sr. The dolomitic limestone facies are enriched in V and U while depleted in Li, Cr, Ni, Ga, Rb, Sr, Zr, Ba, and…

Publication Date
Thu Feb 01 2024
Journal Name
Baghdad Science Journal
Estimating the Parameters of Exponential-Rayleigh Distribution for Progressively Censoring Data with S-Function about COVID-19

The two parameters of the Exponential-Rayleigh (ER) distribution were estimated using the maximum likelihood estimation method (MLE) for progressively censored data. Estimated values for the two scale parameters were obtained using real COVID-19 data taken from the Iraqi Ministry of Health and Environment, AL-Karkh General Hospital. A Chi-square test was then utilized to determine whether the sample data followed the ER distribution. A nonlinear membership function (S-function) was employed to find fuzzy numbers for these parameter estimators, and a ranking function was used to transform the fuzzy numbers into crisp numbers. Finally, mean square error (MSE) was used to compare the outcomes of the survival…
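As a concrete illustration of the MLE step, the sketch below fits a two-parameter linear-hazard model h(t) = λ + θt, i.e. survival S(t) = exp(−λt − θt²/2), by a refined grid search. This is one common "exponential-Rayleigh" parameterisation; the paper's exact form, and its handling of progressive censoring, may differ, and the sketch assumes complete (uncensored) data.

```python
import numpy as np

def neg_log_lik(t, lam, theta):
    """NLL under hazard h(t) = lam + theta*t, survival
    S(t) = exp(-lam*t - theta*t**2/2); log f = log h + log S."""
    return -np.sum(np.log(lam + theta * t) - lam * t - theta * t ** 2 / 2)

def fit_exp_rayleigh(t, lo=1e-3, hi=5.0, rounds=4, n=40):
    """Crude MLE via an iteratively refined 2-D grid search (no SciPy)."""
    t = np.asarray(t, dtype=float)
    l_lo, l_hi, th_lo, th_hi = lo, hi, lo, hi
    for _ in range(rounds):
        lams = np.linspace(l_lo, l_hi, n)
        thetas = np.linspace(th_lo, th_hi, n)
        # keep the grid point with the smallest NLL, then zoom in around it
        _, lam, theta = min(
            (neg_log_lik(t, l, th), l, th) for l in lams for th in thetas
        )
        dl, dth = (l_hi - l_lo) / n, (th_hi - th_lo) / n
        l_lo, l_hi = max(lo, lam - dl), lam + dl
        th_lo, th_hi = max(lo, theta - dth), theta + dth
    return lam, theta
```

Samples can be drawn by inverse-transform sampling: the cumulative hazard λt + θt²/2 = −log(U) is a quadratic in t, solvable in closed form, which makes it easy to check the fit against known parameters.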

Publication Date
Tue Oct 01 2013
Journal Name
Journal Of Economics And Administrative Sciences
Comparison Robust M Estimate With Cubic Smoothing Splines For Time-Varying Coefficient Model For Balance Longitudinal Data

In this research, a comparison has been made between robust M-estimators for the cubic smoothing splines technique, used to avoid the problems of non-normal data and contaminated errors, and the traditional estimation method for cubic smoothing splines, using two comparison criteria (MADE and WASE) for different sample sizes and contamination levels. The aim is to estimate the time-varying coefficient functions for balanced longitudinal data, which consist of observations obtained from (n) independent subjects, each measured repeatedly at a set of specific time points (m), since the repeated measurements within subjects are usually correlated…
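The robust M-estimation idea being compared can be sketched in a simpler linear setting. The code below is Huber-weighted iteratively reweighted least squares, not the authors' smoothing-spline implementation; the tuning constant 1.345 is the usual Huber default, and the scale is re-estimated each iteration via the MAD.

```python
import numpy as np

def huber_irls(X, y, delta=1.345, iters=50):
    """Robust M-estimate of linear coefficients via iteratively
    reweighted least squares with Huber weights."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    beta = np.linalg.lstsq(X, y, rcond=None)[0]   # OLS starting value
    for _ in range(iters):
        r = y - X @ beta
        s = np.median(np.abs(r - np.median(r))) / 0.6745  # robust scale (MAD)
        if s == 0:
            s = 1.0
        u = np.abs(r / s)
        # Huber weights: 1 inside the delta band, downweighted outside
        w = np.where(u <= delta, 1.0, delta / np.maximum(u, 1e-12))
        W = w[:, None] * X
        beta = np.linalg.solve(X.T @ W, X.T @ (w * y))
    return beta
```

Gross outliers receive weights near zero, so the fitted coefficients stay close to the clean-data values where ordinary least squares would be pulled away.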

Publication Date
Thu Feb 01 2018
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of Slice inverse regression with the principal components in reducing high-dimensions data by using simulation

This research aims to study dimension-reduction methods that overcome the curse of dimensionality when traditional methods fail to provide good parameter estimates, so the problem must be dealt with directly. Two approaches were used to handle high-dimensional data: the non-classical sliced inverse regression (SIR) method, together with a proposed weighted standard SIR (WSIR) method, and principal component analysis (PCA), the general method used for reducing dimensions. Both SIR and PCA are based on forming linear combinations of a subset of the original explanatory variables, which may suffer from the problems of heterogeneity and linear…
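The PCA baseline in this comparison can be sketched with a singular value decomposition. This is a generic PCA projection, not the paper's SIR/WSIR code; SIR additionally slices on the response variable, which is omitted here.

```python
import numpy as np

def pca_reduce(X, k):
    """Project X onto its first k principal components
    (the unsupervised linear reduction SIR is compared against)."""
    X = np.asarray(X, dtype=float)
    Xc = X - X.mean(axis=0)          # centre the data
    # right singular vectors of the centred data are the principal directions
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T
```

When the data lie near a k-dimensional subspace, the projection retains almost all the variance, which is exactly the property PCA exploits for dimension reduction.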

Publication Date
Tue Aug 01 2023
Journal Name
Journal Of Materials Science: Materials In Electronics
Fabrication and characterization of CoxMn0.25−xMg0.75Fe2O4 nanoparticles for H2S sensing applications

Publication Date
Wed May 24 2023
Journal Name
College Of Islamic Sciences
Analyzing the likeness of fundamentalists and examples of its jurisprudential applications

New events in every era are endless, and the legal texts are not required to address every event individually and attach its ruling to it.
At the same time, every event or action must have a Shari'a ruling according to the Wise Lawgiver, and our scholars have observed this in every event presented to them, etc.
It is well known that reaching the legal ruling on a matter by examining the detailed evidence requires following the path of the rules and regulations governing the overall evidence, which are organized under the principles of jurisprudence.
Therefore, any disagreement about the form or content of these rules will affect the differences among jurists on particular issues when examining the detailed evidence…
