The complexity and variety of language used in policy and academic documents make the automatic classification of research papers against the United Nations Sustainable Development Goals (SDGs) challenging. This study presents a complete deep learning pipeline that combines Bidirectional Long Short-Term Memory (BiLSTM) and Convolutional Neural Network (CNN) architectures and uses both pre-trained and contextual word embeddings to increase semantic understanding, with the primary aim of improving the accuracy and comprehensibility of SDG text classification and thereby enabling more effective policy monitoring and research evaluation. Our approach comprises exhaustive preprocessing operations, including stemming, stopword removal, and measures to address class imbalance, followed by document representation via Global Vectors (GloVe), Bidirectional Encoder Representations from Transformers (BERT), and FastText embeddings. The hybrid BiLSTM-CNN model is trained and evaluated on several benchmark datasets, including SDG-labeled corpora and relevant external datasets such as GoEmotions and Ohsumed, providing a complete assessment of the model's generalizability. Moreover, this study employs zero-shot prompt-based categorization using GPT-3.5/4 and Flan-T5 and runs comparative tests against leading models such as the Robustly Optimized BERT Pretraining Approach (RoBERTa) and Decoding-enhanced BERT with Disentangled Attention (DeBERTa), providing a comprehensive benchmark against current approaches. Experimental results show that the proposed hybrid model achieves competitive performance, with contextual embeddings greatly improving classification accuracy. The study explains the model's decision processes and improves transparency using interpretability techniques, including SHapley Additive exPlanations (SHAP) analysis and attention visualization. These results emphasize the value of incorporating prompt engineering techniques alongside deep learning architectures for effective and interpretable SDG text categorization. With possible applications to broader uses in policy analysis and scientific literature mining, this work offers a scalable and transparent solution for automating the evaluation of SDG research.
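As a point of reference, the sketch below shows one plausible way to assemble a hybrid BiLSTM-CNN text classifier of the kind described in this abstract, written with Keras. The vocabulary size, sequence length, embedding dimension, and the 17 SDG output classes are illustrative assumptions, and the embedding layer would normally be initialized from pre-trained GloVe or FastText vectors rather than trained from scratch; this is not the authors' implementation.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Illustrative hyperparameters (assumptions, not values from the paper).
VOCAB, SEQ_LEN, EMB_DIM, NUM_CLASSES = 50_000, 256, 300, 17

inputs = layers.Input(shape=(SEQ_LEN,), dtype="int32")
# Embedding layer; in practice, load GloVe/FastText weights here.
x = layers.Embedding(VOCAB, EMB_DIM)(inputs)
# BiLSTM captures long-range, bidirectional context over the token sequence.
x = layers.Bidirectional(layers.LSTM(128, return_sequences=True))(x)
# Conv1D + global max pooling extracts salient local n-gram patterns.
x = layers.Conv1D(128, kernel_size=3, activation="relu")(x)
x = layers.GlobalMaxPooling1D()(x)
x = layers.Dropout(0.5)(x)
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)

model = models.Model(inputs, outputs)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

Swapping the static embedding layer for contextual BERT token representations would follow the same pattern, with the transformer outputs fed into the BiLSTM-CNN stack.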
Achieving goals effectively reflects the success of an institution. However, unless this indicator is coupled with efficiency in achieving those goals, the institution will merely be on a par with others in its achievements, and distinction will remain out of reach. The role of the teaching staff in pushing the institution or college towards excellence rests on their ability to motivate people on the one hand and their commitment to achieving excellence for the institution on the other. The importance of the research lies in helping the institution reach a prominent position through the excellence and creativity of its teaching and through competition between institutions that makes it more distinguished. The study seeks to achieve the goal of the real
The article considers the semantic and stylistic motivations for using obsolete lexicon (historicisms) in the text of a work of art. The specifics of how this process functions are presented against the background of the features of the contemporary Russian literary language. Attention is focused on the fact that the layer of obsolete lexical units belongs to the nationally specific vocabulary, the study of which shapes an understanding of the nature of the language being actualized. In addition, it should be noted that the semantics of historicisms is commensurate with culture: this is explained by the fact that the de-actualization of linguistic units runs parallel to sociocultural and political changes.
This study intends to examine the efficiency of student-centered learning (SCL) through Google Classroom in enhancing the readiness of fourth-stage female pre-service teachers. The research employs a quasi-experimental design with a control and an experimental group to compare the teaching readiness of participants before and after the intervention. The participants were 30 fourth-stage students at the University of Baghdad, College of Education for Women, Department of English; data were collected through an observation checklist to assess their teaching experience and questionnaires to assess their perceptions of using Google Classroom. Two sections were selected, C as the control group and D as the experimental one, each with (
During the last quarter of the twentieth century, the world economy witnessed a transformation in various commercial, technological, and financial fields that changed its structure and produced a new situation, represented mainly by the increased movement of foreign capital and the rapid expansion of international production and trade, in addition to enormous technological development and technology transfer. This led countries to become obsessed with competing at the global level and to strive to enter international markets and improve their competitiveness.
This paper describes the application of consensus optimization to a Wireless Sensor Network (WSN) system. A consensus algorithm is usually run for a certain number of iterations on a given graph topology. Nevertheless, the best Number of Iterations (NOI) required to reach consensus varies with any change in the number of nodes or other parameters of the graph topology. As a result, a time-consuming trial-and-error procedure would normally be applied to obtain the best NOI. The use of an intelligent optimization technique can effectively help to find the optimal NOI. The performance of the consensus algorithm has been considerably improved by the inclusion of Particle Swarm Optimization (PSO). As a case s
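To make the notion of the NOI concrete, the minimal sketch below runs average consensus on a small example graph and counts the iterations needed for all node values to agree within a tolerance; an optimizer such as PSO would then search over the parameters that influence this count. The 4-node topology, step size, and tolerance are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

# Example 4-node WSN topology (adjacency matrix is an assumption).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)
D = np.diag(A.sum(axis=1))
L = D - A                                   # graph Laplacian
eps = 0.2                                   # step size, kept below 1/(max degree)
W = np.eye(len(A)) - eps * L                # consensus weight matrix

x = np.array([10.0, 4.0, 7.0, 1.0])         # initial sensor readings
target = x.mean()                           # average consensus preserves the mean

for k in range(1, 101):
    x = W @ x                               # one consensus update per iteration
    if np.max(np.abs(x - target)) < 1e-3:   # convergence tolerance
        print(f"consensus reached after {k} iterations (NOI)")
        break
```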
Mobile ad-hoc networks (MANETs) are composed of mobile nodes communicating over a wireless medium without any fixed centralized infrastructure. Providing quality of service (QoS) support to multimedia streaming applications over MANETs is vital. This paper focuses on the QoS support provided by the Stream Control Transmission Protocol (SCTP) and the TCP-Friendly Rate Control (TFRC) protocol to multimedia streaming applications over MANETs. In this study, three QoS parameters were considered jointly: (1) packet delivery ratio (PDR), (2) end-to-end delay, and (3) throughput. Specifically, the authors analyzed and compared the simulated performance of the SCTP and TFRC transport protocols for delivering multimedia streaming over MANETs.
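For illustration, the hedged sketch below computes the three QoS metrics named above (PDR, average end-to-end delay, and throughput) from a tiny hypothetical packet trace; the trace format and field layout are assumptions, not the authors' simulation output.

```python
# Hypothetical per-packet trace: (packet_id, size_bytes, send_time_s, recv_time_s or None if lost)
packets = [
    (1, 512, 0.00, 0.08),
    (2, 512, 0.02, 0.11),
    (3, 512, 0.04, None),      # dropped packet
    (4, 512, 0.06, 0.15),
]

sent = len(packets)
received = [p for p in packets if p[3] is not None]

pdr = len(received) / sent                                       # packet delivery ratio
avg_delay = sum(p[3] - p[2] for p in received) / len(received)   # mean end-to-end delay
duration = max(p[3] for p in received) - min(p[2] for p in packets)
throughput_kbps = sum(p[1] for p in received) * 8 / duration / 1000

print(f"PDR = {pdr:.2f}, delay = {avg_delay*1000:.1f} ms, "
      f"throughput = {throughput_kbps:.1f} kbit/s")
```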
Attacks on data transferred over a network happen millions of times a day. To address this problem, a secure scheme for transferring data over a network is proposed. The proposed scheme uses two techniques to guarantee the secure transfer of a message: the message is first encrypted and then hidden in a video cover. The encryption technique is the RC4 stream cipher algorithm, used to increase the message's confidentiality, while the least significant bit (LSB) embedding algorithm is improved by adding an additional layer of security. The improvement of the LSB method comes from replacing the usual sequential selection with a random selection of the frames and the pixels wit
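A minimal sketch of the two-layer idea described above is given below: RC4 stream encryption of the message followed by LSB embedding at key-seeded random pixel positions instead of sequential ones. The flat pixel list stands in for video-frame pixels, real frame extraction (e.g. with OpenCV) is omitted, and the key, seed, and helper names are hypothetical.

```python
import random

def rc4(key: bytes, data: bytes) -> bytes:
    # Key-scheduling algorithm (KSA)
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # Pseudo-random generation algorithm (PRGA): XOR keystream with data
    out, i, j = bytearray(), 0, 0
    for byte in data:
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)

def embed_lsb_random(pixels: list, payload: bytes, seed: int) -> list:
    # Choose embedding positions in a key-seeded random order instead of sequentially.
    bits = [(byte >> k) & 1 for byte in payload for k in range(7, -1, -1)]
    positions = random.Random(seed).sample(range(len(pixels)), len(bits))
    stego = pixels[:]
    for pos, bit in zip(positions, bits):
        stego[pos] = (stego[pos] & ~1) | bit        # overwrite least significant bit
    return stego

key = b"session-key"                                # hypothetical shared key
cipher = rc4(key, b"secret message")                # layer 1: confidentiality
cover = [random.randint(0, 255) for _ in range(4096)]   # toy stand-in for frame pixels
stego = embed_lsb_random(cover, cipher, seed=42)    # layer 2: randomized LSB hiding
```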
The research aims to:
1 - Reveal the level of moral values among kindergarten children.
2 - Build an educational program designed to develop moral values in kindergarten children.
3 - Identify the impact of the program on the development of moral values in children.
A purposive sample of 40 children aged 5-6 years was selected. To achieve the objectives of the research, a measure of moral values for kindergarten children was applied to the children of both groups as a pre-test and a post-test.
Estimation of the parameters of a linear regression model is usually based on the ordinary least squares method, which rests on several basic assumptions, so the accuracy of the estimated parameters depends on the validity of these assumptions. Among them are homogeneity of variance and normally distributed errors; when they do not hold, the model becomes unrealistic, as in the case of studying a specific problem that may involve complex data arising from more than one model. The most successful technique for this purpose has been the robust minimizing maximum likelihood estimator (MM-estimator), which has proved its efficiency. To
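As a concrete illustration of robust estimation under contaminated data, the sketch below fits simulated data with ordinary least squares and with a robust M-estimator (Tukey biweight) via statsmodels' RLM, used here only as a stand-in for the MM-estimator discussed above; the data are synthetic and this is not the authors' procedure.

```python
import numpy as np
import statsmodels.api as sm

# Simulate a simple linear relationship and contaminate 10% of observations.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 100)
y = 2.0 + 1.5 * x + rng.normal(0, 1, 100)
y[:10] += 25                                  # outliers violating normal-error assumption

X = sm.add_constant(x)
ols_fit = sm.OLS(y, X).fit()                  # ordinary least squares
robust_fit = sm.RLM(y, X, M=sm.robust.norms.TukeyBiweight()).fit()

print("OLS coefficients:   ", ols_fit.params)     # pulled toward the outliers
print("Robust coefficients:", robust_fit.params)  # close to the true (2.0, 1.5)
```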