Towards Accurate SDG Research Categorization: A Hybrid Deep Learning Approach Using Scopus Metadata

The complexity and variety of language found in policy and academic documents make the automatic classification of research papers against the United Nations Sustainable Development Goals (SDGs) challenging. This study presents a complete deep learning pipeline that combines Bidirectional Long Short-Term Memory (BiLSTM) and Convolutional Neural Network (CNN) architectures and uses both pre-trained and contextual word embeddings to increase semantic understanding, with the primary aim of improving the accuracy and comprehensibility of SDG text classification and thereby enabling more effective policy monitoring and research evaluation. Our approach comprises exhaustive preprocessing operations, including stemming, stopword removal, and measures to address class imbalance, followed by document representation via Global Vectors (GloVe), Bidirectional Encoder Representations from Transformers (BERT), and FastText embeddings. The hybrid BiLSTM-CNN model is trained and evaluated on several benchmark datasets, including SDG-labeled corpora and relevant external datasets such as GoEmotions and Ohsumed, to provide a complete assessment of its generalizability. Moreover, this study applies zero-shot prompt-based categorization with GPT-3.5/4 and Flan-T5 and runs comparative tests against leading models such as Robustly Optimized BERT Pretraining Approach (RoBERTa) and Decoding-enhanced BERT with Disentangled Attention (DeBERTa), providing a comprehensive benchmark against current approaches. Experimental results show that the proposed hybrid model achieves competitive performance, with contextual embeddings greatly improving classification accuracy. The study explains model decision processes and improves transparency using interpretability techniques, including SHapley Additive exPlanations (SHAP) analysis and attention visualization. These results emphasize the value of incorporating prompt engineering techniques alongside deep learning architectures for effective and interpretable SDG text categorization. This work offers a scalable and transparent solution for automating the evaluation of SDG research, with potential applications in broader policy analysis and scientific literature mining.
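The abstract gives no implementation details; as a hedged illustration of the kind of hybrid architecture it describes, the sketch below stacks a BiLSTM on top of a 1D CNN over an embedding layer initialized from pretrained vectors. The vocabulary size, sequence length, layer widths, and the 17-class SDG output are assumptions for demonstration, not the authors' configuration.

```python
# Minimal sketch of a hybrid BiLSTM-CNN text classifier (assumed sizes, not the paper's exact setup).
import numpy as np
from tensorflow.keras import layers, models

VOCAB_SIZE = 20000   # assumed vocabulary size
EMBED_DIM = 300      # e.g. GloVe 300-d vectors
MAX_LEN = 256        # assumed maximum document length (tokens)
NUM_CLASSES = 17     # one class per SDG

# Placeholder for a pretrained embedding matrix (GloVe/FastText rows would be loaded here).
embedding_matrix = np.random.normal(size=(VOCAB_SIZE, EMBED_DIM)).astype("float32")

inputs = layers.Input(shape=(MAX_LEN,), dtype="int32")
x = layers.Embedding(VOCAB_SIZE, EMBED_DIM,
                     weights=[embedding_matrix],
                     trainable=False)(inputs)                                 # frozen pretrained embeddings
x = layers.Conv1D(128, kernel_size=5, activation="relu", padding="same")(x)  # local n-gram features
x = layers.MaxPooling1D(pool_size=2)(x)
x = layers.Bidirectional(layers.LSTM(64))(x)                                  # long-range context
x = layers.Dropout(0.3)(x)
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)

model = models.Model(inputs, outputs)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```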

Publication Date
Wed Nov 26 2025
Journal Name
Al-Bahith Al-A'alami
IRAQ YOUTH TRENDS TOWARDS CELEBRITY ADVERTISEMENTS ON SOCIAL MEDIA: (A research taken from a Master's Degree thesis)

In light of what the advertising arena is witnessing in terms of new ways and methods of delivering the advertising message to consumers, advertisers are finding new outlets to communicate with them, especially through social networking sites, which have become the first choice of advertising companies seeking to spread their goods and services. These companies now rely increasingly on celebrities appearing with their products and goods to draw the audience's attention towards them. The thesis aims to find out the attitudes of young people towards advertisements that feature famous celebrities on social networking sites. The researcher used the survey method, which aims to record, analyze and interpret the phenomenon after collecting the necessary…

Publication Date
Fri Apr 14 2023
Journal Name
Journal Of Big Data
A survey on deep learning tools dealing with data scarcity: definitions, challenges, solutions, tips, and applications
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance. Unfortunately, many applications have small or inadequate data to train DL frameworks. Usually, manual labeling is needed to provide labeled data, which typically involves human annotators with a vast background of knowledge. This annotation process is costly, time-consuming, and error-prone. Usually, every DL framework is fed a significant amount of labeled data to automatically learn representations. Ultimately, a larger amount of data would generate a better DL model, and its performance is also application dependent. This issue is the main barrier for…
Publication Date
Wed Mar 02 2022
Journal Name
Journal Of Educational And Psychological Researches
King Khalid University towards Strategies Compatible with Brain-Based Learning (BBL)

The study aimed to reveal the level of knowledge and the tendencies of postgraduate students specializing in curriculum and teaching methods at King Khalid University towards strategies compatible with brain-based learning (BBL), and then to put forward a proposed concept for developing that knowledge and those tendencies. To achieve this goal, a cognitive test and a scale of tendency towards applying strategies compatible with brain-based learning were prepared. The descriptive approach was used because it suits the goals of the study. The study sample consisted of (70) male and female postgraduate students…

Publication Date
Mon Oct 30 2023
Journal Name
Iraqi Journal Of Science
Machine Learning Approach for Facial Image Detection System

HM Al-Dabbas, RA Azeez, AE Ali, Iraqi Journal of Science, 2023

Publication Date
Thu Jun 06 2024
Journal Name
Journal Of Applied Engineering And Technological Science (jaets)
Deep Learning and Its Role in Diagnosing Heart Diseases Based on Electrocardiography (ECG)

Diagnosing heart disease has become a very important topic for researchers specializing in artificial intelligence, because AI is now involved in the diagnosis of most diseases, especially after the Corona pandemic, which pushed the world to turn to intelligent systems. The basic idea of this research is therefore to shed light on the diagnosis of heart diseases by relying on deep learning with a pre-trained model (EfficientNet-B3), using the electrical signals of the electrocardiogram and resampling them before introducing them to the neural network, with only trimming operations applied, because the ECG is an electrical signal whose parameters cannot be changed. The dataset (China Physiological Signal Challenge, CPSC2018) was ad…
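The preprocessing is only sketched in the abstract (resampling the raw ECG and trimming it before it enters the network). Below is a rough, hedged illustration of such a step; the target sampling rate and window length are assumed values, not the paper's settings.

```python
# Illustrative ECG preprocessing: resample to a fixed rate, then trim/pad to a fixed length.
# The rates and lengths below are assumptions for demonstration, not the paper's settings.
import numpy as np
from scipy.signal import resample

def preprocess_ecg(signal, source_fs, target_fs=250, target_len=2500):
    """Resample a 1-D ECG lead to target_fs and trim or zero-pad it to target_len samples."""
    signal = np.asarray(signal, dtype=np.float32)
    # Resample to the target sampling frequency.
    n_resampled = int(round(len(signal) * target_fs / source_fs))
    signal = resample(signal, n_resampled)
    # Trim if too long, zero-pad if too short (no other transformation of the raw signal).
    if len(signal) >= target_len:
        signal = signal[:target_len]
    else:
        signal = np.pad(signal, (0, target_len - len(signal)))
    return signal

# Example: a 12-second synthetic lead recorded at 500 Hz.
raw = np.sin(np.linspace(0, 24 * np.pi, 12 * 500)).astype(np.float32)
x = preprocess_ecg(raw, source_fs=500)
print(x.shape)  # (2500,)
```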

Publication Date
Thu Nov 01 2018
Journal Name
International Journal Of Biomathematics
A non-conventional hybrid numerical approach with multi-dimensional random sampling for cocaine abuse in Spain

This paper introduces a non-conventional approach with multi-dimensional random sampling to solve a cocaine abuse model with statistical probability. The mean Latin hypercube finite difference (MLHFD) method is proposed for the first time via hybrid integration of the classical numerical finite difference (FD) formula with the Latin hypercube sampling (LHS) technique, creating a random distribution for the model parameters, which are dependent on time t. The LHS technique gives the MLHFD method the advantage of producing fast variation of the parameters' values via a number of multidimensional simulations (100, 1000 and 5000). The generated Latin hypercube sample, which is random or non-deterministic in nature, is further integ…
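The abstract describes the MLHFD idea only at a high level: draw Latin hypercube samples of the uncertain model parameters, solve the model with a classical finite-difference scheme for each sample, and aggregate over many simulations. The sketch below illustrates that pattern on a generic logistic-type ODE; the equation, parameter range, step size, and ensemble size are stand-ins, not the cocaine-abuse model from the paper.

```python
# Illustrative combination of Latin hypercube sampling with an explicit finite-difference
# (forward Euler) solver, in the spirit of the MLHFD idea described in the abstract.
# The toy ODE du/dt = a*u*(1 - u) and the parameter range are assumptions for illustration.
import numpy as np
from scipy.stats import qmc

def fd_solve(a, u0=0.1, t_end=10.0, dt=0.01):
    """Forward-Euler finite-difference solution of du/dt = a*u*(1 - u)."""
    steps = int(t_end / dt)
    u = np.empty(steps + 1)
    u[0] = u0
    for k in range(steps):
        u[k + 1] = u[k] + dt * a * u[k] * (1.0 - u[k])
    return u

# Latin hypercube sample of the uncertain parameter a in an assumed range [0.2, 1.0].
n_sims = 1000
sampler = qmc.LatinHypercube(d=1, seed=0)
a_samples = qmc.scale(sampler.random(n=n_sims), l_bounds=[0.2], u_bounds=[1.0]).ravel()

# Run the finite-difference solver once per sample and average the trajectories.
solutions = np.stack([fd_solve(a) for a in a_samples])
mean_solution = solutions.mean(axis=0)
print(mean_solution[-1])  # mean state at t = 10 over the Latin hypercube ensemble
```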

Publication Date
Wed Oct 21 2015
Journal Name
Integrated Journal Of Engineering Research And Technology
A HYBRID CUCKOO SEARCH AND BACK-PROPAGATION ALGORITHMS WITH DYNAMIC LEARNING RATE TO SPEED UP THE CONVERGENCE (SUBPL) ALGORITHM

The BP algorithm is the most widely used supervised training algorithm for multi-layered feedforward neural networks. However, BP takes a long time to converge and is quite sensitive to the initial weights of a network. In this paper, a modified cuckoo search algorithm is used to obtain the optimal set of initial weights to be used by the BP algorithm, and the value of the BP learning rate is changed to improve the error convergence. The performance of the proposed hybrid algorithm is compared with the standard BP using simple data sets. The simulation results show that the proposed algorithm improves BP training in terms of quick convergence of the solution, depending on the slope of the error graph.
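The modified cuckoo search itself is not specified in the abstract; the sketch below only illustrates the overall pattern it describes: a heavy-tailed (Levy-flight-style) random search picks a good set of initial weights for a small feedforward network, which is then trained by backpropagation with a decaying ("dynamic") learning rate. The XOR data, network size, and schedules are illustrative assumptions.

```python
# Illustrative hybrid: a cuckoo-search-style random walk selects initial weights for a tiny MLP,
# which is then trained by backpropagation with a decaying ("dynamic") learning rate.
# The XOR data, network size, and schedules are assumptions, not the paper's configuration.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def unpack(w):
    # 2-3-1 network: W1 (2x3), b1 (1x3), W2 (3x1), b2 (1x1) packed into a 13-vector.
    return w[:6].reshape(2, 3), w[6:9].reshape(1, 3), w[9:12].reshape(3, 1), w[12:].reshape(1, 1)

def forward(w):
    W1, b1, W2, b2 = unpack(w)
    h = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))     # hidden layer (sigmoid)
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # output layer (sigmoid)
    return h, out

def loss(w):
    _, out = forward(w)
    return float(np.mean((out - y) ** 2))

# --- Cuckoo-search-style initialization: a population of nests perturbed by heavy-tailed steps.
dim, n_nests = 13, 15
nests = rng.normal(0, 1, size=(n_nests, dim))
for _ in range(100):
    for i in range(n_nests):
        step = rng.standard_cauchy(dim) * 0.1     # heavy-tailed (Levy-like) perturbation
        candidate = nests[i] + step
        if loss(candidate) < loss(nests[i]):      # greedy replacement of worse nests
            nests[i] = candidate
w = min(nests, key=loss).copy()

# --- Backpropagation fine-tuning with a decaying learning rate.
lr = 0.5
for epoch in range(2000):
    h, out = forward(w)
    W1, b1, W2, b2 = unpack(w)
    d_out = (out - y) * out * (1 - out)           # error signal at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)            # backpropagated error at the hidden layer
    grad = np.concatenate([(X.T @ d_h).ravel(), d_h.sum(0).ravel(),
                           (h.T @ d_out).ravel(), d_out.sum(0).ravel()])
    w -= lr * grad / len(X)
    lr = 0.5 / (1 + 0.001 * epoch)                # simple dynamic learning-rate schedule

print("final MSE:", loss(w))
```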

Publication Date
Sat Apr 01 2023
Journal Name
The Ocular Surface
Detecting dry eye from ocular surface videos based on deep learning

Publication Date
Mon Apr 07 2025
Journal Name
Al-nahrain Journal For Engineering Sciences
Navigating the Challenges and Opportunities of Tiny Deep Learning and Tiny Machine Learning in Lung Cancer Identification

Lung cancer is the most common dangerous disease and, if treated late, can lead to death. It is more likely to be treated successfully if discovered at an early stage, before it worsens. Distinguishing the size, shape, and location of lymphatic nodes can identify the spread of the disease around these nodes; thus, identifying lung cancer at an early stage is remarkably helpful for doctors. Lung cancer can be diagnosed successfully by expert doctors; however, limited experience may lead to misdiagnosis and cause medical issues for patients. In the line of computer-assisted systems, many methods and strategies can be used to predict the cancer malignancy level, which plays a significant role in providing precise abnormality detection…

Publication Date
Mon Sep 01 2025
Journal Name
Journal Of Information Hiding And Multimedia Signal Processing
Steganography Based on Image Compression Using a Hybrid Technique

Information security is a crucial factor when communicating sensitive information between two parties, and steganography is one of the techniques most widely used for this purpose. This paper aims to enhance the capacity and robustness of hiding information by compressing image data to a small size while maintaining high quality, so that the secret information remains invisible and only the sender and recipient can recognize the transmission. Three techniques are employed to conceal color and gray images: the Wavelet Color Process Technique (WCPT), the Wavelet Gray Process Technique (WGPT), and the Hybrid Gray Process Technique (HGPT). A comparison between the first and second techniques according to quality metrics, Root-Mean-Square Error (RMSE), Compression-…
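The WCPT, WGPT, and HGPT procedures are not detailed in the abstract; the sketch below only illustrates the generic building blocks it mentions: wavelet-based compression of a grayscale image by thresholding detail coefficients, followed by an RMSE quality check. The wavelet family, decomposition level, and threshold are assumed values, not the paper's techniques.

```python
# Illustrative wavelet compression of a grayscale image by thresholding detail coefficients,
# plus an RMSE quality metric. Wavelet, level, and threshold are assumptions for demonstration.
import numpy as np
import pywt

def wavelet_compress(image, wavelet="haar", level=2, threshold=10.0):
    """Zero out small detail coefficients, then reconstruct the image."""
    coeffs = pywt.wavedec2(image, wavelet=wavelet, level=level)
    approx, details = coeffs[0], coeffs[1:]
    thresholded = [tuple(pywt.threshold(d, threshold, mode="hard") for d in lvl) for lvl in details]
    return pywt.waverec2([approx] + thresholded, wavelet=wavelet)

def rmse(original, reconstructed):
    """Root-Mean-Square Error between the original and reconstructed images."""
    diff = original.astype(np.float64) - reconstructed.astype(np.float64)
    return float(np.sqrt(np.mean(diff ** 2)))

# Example with a synthetic 128x128 grayscale gradient image.
img = np.outer(np.linspace(0, 255, 128), np.ones(128))
rec = wavelet_compress(img)[:128, :128]   # waverec2 may pad odd sizes; crop back to the input size
print("RMSE:", rmse(img, rec))
```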
