The complexity and variety of language used in policy and academic documents make the automatic classification of research papers against the United Nations Sustainable Development Goals (SDGs) challenging. This study presents a complete deep learning pipeline that combines Bidirectional Long Short-Term Memory (BiLSTM) and Convolutional Neural Network (CNN) architectures with both pre-trained and contextual word embeddings to increase semantic understanding, with the primary aim of improving the accuracy and comprehensibility of SDG text classification, thereby enabling more effective policy monitoring and research evaluation. Our approach comprises exhaustive preprocessing operations, including stemming, stopword removal, and class-imbalance handling, followed by document representation via Global Vectors (GloVe), Bidirectional Encoder Representations from Transformers (BERT), and FastText embeddings. The hybrid BiLSTM-CNN model is trained and evaluated on several benchmark datasets, including SDG-labeled corpora and relevant external datasets such as GoEmotions and Ohsumed, providing a thorough assessment of the model's generalizability. Moreover, this study performs zero-shot prompt-based categorization with GPT-3.5/4 and Flan-T5 and runs comparative tests against leading models such as the Robustly Optimized BERT Pretraining Approach (RoBERTa) and Decoding-enhanced BERT with Disentangled Attention (DeBERTa), providing a comprehensive benchmark against current approaches. Experimental results show that the proposed hybrid model achieves competitive performance, with contextual embeddings markedly improving classification accuracy. The study explains model decision processes and improves transparency using interpretability techniques, including SHapley Additive exPlanations (SHAP) analysis and attention visualization.
These results emphasize the value of incorporating prompt engineering techniques alongside deep learning architectures for effective and interpretable SDG text categorization. With potential impact on broader applications in policy analysis and scientific literature mining, this work offers a scalable and transparent solution for automating the evaluation of SDG research.
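The zero-shot prompt-based categorization described above can be sketched as follows. This is a hypothetical illustration, not the authors' actual prompt: the label subset and the template wording are assumptions made for the example.

```python
# Illustrative subset of SDG labels (assumption: the paper's full label set
# and exact prompt template are not given in the abstract).
SDG_LABELS = [
    "SDG 1: No Poverty",
    "SDG 3: Good Health and Well-being",
    "SDG 7: Affordable and Clean Energy",
    "SDG 13: Climate Action",
]

def build_zero_shot_prompt(abstract: str, labels=SDG_LABELS) -> str:
    """Format a zero-shot classification prompt for a GPT- or Flan-T5-style model."""
    label_block = "\n".join(f"- {lab}" for lab in labels)
    return (
        "Classify the research abstract below into exactly one of these "
        "Sustainable Development Goals:\n"
        f"{label_block}\n\n"
        f"Abstract: {abstract}\n"
        "Answer with the label only."
    )

prompt = build_zero_shot_prompt(
    "Solar micro-grids reduce energy poverty in rural areas."
)
print(prompt.splitlines()[0])
```

The returned string would then be sent to the model's completion endpoint; because no labeled examples appear in the prompt, the setup is zero-shot.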
Background: Since the invention of the laser in 1960, lasers have been developed and adopted in many fields. Lasers can now be regarded as practical tools with unique properties that have been utilized effectively in several applications in the medical and biological sciences. Objectives: The aim of the current study was to prepare vaccines (live attenuated and killed) by irradiating bacteria with a low-level diode laser. Methods: Six bacterial isolates were obtained from human samples of diabetic foot infections and used for vaccine preparation. The experiment was conducted on fifteen adult male rabbits, divided into three groups of five rabbits each. Blood samples were collected from the marginal ear vein.
Goodness-of-fit tests are concerned with verifying the null hypothesis that the observations of a sample under study conform to a particular probability distribution. Such cases arise frequently in practical applications across all fields, particularly in genetics research, medical research, and the life sciences. In 1965, Shapiro and Wilk proposed their goodness-of-fit test with scale parameters
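As a minimal illustration of a goodness-of-fit test, the sketch below computes the classical chi-square statistic in pure Python. Note this is a deliberately simpler substitute for exposition: the Shapiro-Wilk test discussed in the abstract additionally requires tabulated order-statistic coefficients and is not reproduced here.

```python
# Minimal chi-square goodness-of-fit statistic (illustrative substitute for the
# Shapiro-Wilk test, which needs tabulated coefficients).
def chi_square_gof(observed, expected):
    """Return the chi-square statistic: sum of (O - E)^2 / E over all categories."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Example: testing a die for fairness over 60 rolls (10 expected per face).
observed = [8, 9, 12, 11, 10, 10]
expected = [10] * 6
stat = chi_square_gof(observed, expected)
print(round(stat, 2))  # -> 1.0
```

The statistic is then compared against a chi-square critical value with (number of categories − 1) degrees of freedom; a small value, as here, gives no reason to reject the hypothesized distribution.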
The aim of this paper is to present a numerical method, based on the Haar wavelet approach, for solving linear systems of Fredholm integral equations. Many test problems for which the exact solution is known are considered. The results of the suggested method are compared with those of another method (the trapezoidal method). The algorithm and program were written in MATLAB version 7.
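The Haar wavelet approach mentioned above can be outlined as follows (a hedged sketch; the notation is assumed, not taken from the paper). For a single Fredholm equation of the second kind, $u(x) = f(x) + \int_0^1 K(x,t)\,u(t)\,dt$, the unknown is expanded in the first $2M$ Haar functions $h_i$ and the equation is collocated at the midpoints of a uniform grid:

```latex
u(t) \approx \sum_{i=1}^{2M} a_i\, h_i(t), \qquad
\sum_{i=1}^{2M} a_i \left[ h_i(x_j) - \int_0^1 K(x_j,t)\, h_i(t)\, dt \right] = f(x_j),
\qquad x_j = \frac{j - 0.5}{2M},\quad j = 1,\dots,2M .
```

Solving this $2M \times 2M$ linear system yields the coefficients $a_i$; for a system of integral equations, the same collocation is applied blockwise, one block per unknown function.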
In this work, the fractional damped Burgers equation (FDBE), $D_t^{\alpha}u + u\,u_x - \nu\,u_{xx} + \lambda u = 0$, $0 < \alpha \le 1$,
The behavior of incompressible two-phase flow in a T-junction, a configuration that appears across many industries, is analyzed using a computational fluid dynamics (CFD) model. The level set method was based on the finite element method. In our work, the behavior of two-phase flow (oil and water) was studied and simulated using COMSOL software 4.3. Multiple variables were examined, such as the velocity distribution, shear rate, pressure, and volume fraction at various times. Inlet velocities of 0.2633, 0.1316, 0.0547, and 0.0283 m/s were employed for water and 0.1316 m/s for oil, and the pressure was set at the outlet as a boundary condition. It was observed through the program
Improving students' use of argumentation is front and center in the increasing emphasis on scientific practice in K-12 science and STEM programs. We explore the construct validity of scenario-based assessments of claim-evidence-reasoning (CER) and the structure of the CER construct with respect to a learning progression framework. We also seek to understand how middle school students progress. Establishing the purpose of an argument is a competency that a majority of middle school students meet, whereas quantitative reasoning is the most difficult, and the Rasch model indicates that the competencies form a unidimensional hierarchy of skills. We also find no evidence of differential item functioning between different scenarios, suggesting
Analyzing X-ray and computed tomography (CT) scan images using a convolutional neural network (CNN) is a subject of great interest, especially after the coronavirus disease 2019 (COVID-19) pandemic. In this paper, a study is conducted on CT scan images from 423 patients at Al-Kadhimiya (Madenat Al Emammain Al Kadhmain) hospital in Baghdad, Iraq, to diagnose whether they have COVID-19 using a CNN. The tested data set comprises 15,000 CT scan images chosen in a specific way to support a correct diagnosis. The activation function used in this research is a wavelet function, which differs from standard CNN activation functions. The convolutional wavelet neural network (CWNN) model proposed in this paper is compared with regular convolutional neural networks
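A wavelet used as an activation function can be sketched in a few lines. This is an assumption for illustration only: the abstract does not say which wavelet the CWNN uses, so the example takes one common choice, the Mexican-hat (Ricker) wavelet.

```python
import math

# Illustrative wavelet activation: the Mexican-hat (Ricker) wavelet.
# (Assumption: the paper's actual wavelet is not specified in the abstract.)
def mexican_hat(x: float) -> float:
    """Ricker wavelet: psi(x) = (1 - x^2) * exp(-x^2 / 2)."""
    return (1.0 - x * x) * math.exp(-x * x / 2.0)

print(mexican_hat(0.0))  # peak value: 1.0
```

Unlike a ReLU or sigmoid, this activation is localized and oscillatory (it peaks at 0, crosses zero at x = ±1, and decays to 0), which is the property a wavelet-based network exploits.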
Cryptography is the process of transforming a message to prevent unauthorized access to data. One of the main problems, and an important component, of secret-key cryptography is the key itself: for a higher level of secure communication, the key plays an essential role, and both parties must hold a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weak due to its weak key generation, so the key must be reconfigured to make this algorithm more secure, effective, and strong. An enhanced encryption key strengthens the security of the Triple Data Encryption Standard algorithm. This paper proposes a combination of two efficient encryption algorithms to
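One standard way to strengthen a weak key before using it with Triple DES is to stretch a shared secret through a key-derivation function. The sketch below is illustrative only: the abstract does not describe the paper's actual key-enhancement scheme, so PBKDF2 from Python's standard library stands in for it here.

```python
import hashlib

# Illustrative key strengthening (assumption: the paper's actual scheme is not
# shown in the abstract). PBKDF2 stretches a shared secret into a 24-byte key,
# i.e. three independent 8-byte DES subkeys for Triple DES keying option 1.
def derive_3des_key(secret: bytes, salt: bytes, iterations: int = 100_000) -> bytes:
    """Derive a 24-byte Triple-DES key from a shared secret via PBKDF2-HMAC-SHA256."""
    return hashlib.pbkdf2_hmac("sha256", secret, salt, iterations, dklen=24)

key = derive_3des_key(b"shared secret", b"per-session salt")
print(len(key))  # 24 bytes -> three 8-byte DES subkeys
```

The iteration count makes brute-forcing the original secret expensive, and a fresh salt per session ensures the derived Triple DES key differs even when the shared secret is reused.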