Automatic classification of research papers against the United Nations Sustainable Development Goals (SDGs) is difficult because of the complexity and variety of language used in policy and academic documents. This study presents a complete deep learning pipeline that combines Bidirectional Long Short-Term Memory (BiLSTM) and Convolutional Neural Network (CNN) architectures and uses both pre-trained and contextual word embeddings to increase semantic understanding, with the primary aim of improving the accuracy and comprehensibility of SDG text classification and thereby enabling more effective policy monitoring and research evaluation. Our approach comprises exhaustive preprocessing operations, including stemming, stopword removal, and measures to address class imbalance, followed by document representation via Global Vectors (GloVe), Bidirectional Encoder Representations from Transformers (BERT), and FastText embeddings. The hybrid BiLSTM-CNN model is trained and evaluated on several benchmark datasets, including SDG-labeled corpora and relevant external datasets such as GoEmotions and Ohsumed, providing a complete assessment of the model's generalizability. Moreover, this study applies zero-shot prompt-based categorization using GPT-3.5/4 and Flan-T5 and runs comparative tests against leading models such as the Robustly Optimized BERT Pretraining Approach (RoBERTa) and Decoding-enhanced BERT with Disentangled Attention (DeBERTa), providing a comprehensive benchmark against current approaches. Experimental results show that the proposed hybrid model achieves competitive performance, with contextual embeddings greatly improving classification accuracy. The study explains model decision processes and improves transparency using interpretability techniques, including SHapley Additive exPlanations (SHAP) analysis and attention visualization.
These results emphasize the value of incorporating prompt engineering techniques alongside deep learning architectures for effective and interpretable SDG text categorization. With possible applications in broader policy analysis and scientific literature mining, this work offers a scalable and transparent solution for automating the evaluation of SDG research.
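The preprocessing stage above mentions measures to address class imbalance. One common such measure (an assumption here; the abstract does not name the specific technique) is inverse-frequency class weighting, sketched minimally in plain Python:

```python
from collections import Counter

def class_weights(labels):
    """Inverse-frequency weights: weight_c = N / (K * n_c), where N is the
    number of samples, K the number of classes, and n_c the count of class c.
    Rare classes receive weights > 1 so the loss penalizes their errors more."""
    counts = Counter(labels)
    n, k = len(labels), len(counts)
    return {c: n / (k * m) for c, m in counts.items()}

# Toy SDG-style label set: "sdg3" is three times as frequent as "sdg7".
weights = class_weights(["sdg3", "sdg3", "sdg3", "sdg7"])
```

In a pipeline like the one described, such weights would typically be passed to a weighted cross-entropy loss when training the BiLSTM-CNN.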
The planning, design, and construction of excavations and foundations in soft to very soft clay soils are always difficult. These are problematic soils that cause trouble for structures built on them because of their low shear strength, high water content, and high compressibility. This work investigates the geotechnical behavior of soft clay treated with tyre ash burnt in air. The investigation comprises physical, chemical, consolidation, compaction, and shear tests, the California Bearing Ratio (CBR) test, and model tests. These tests were carried out on samples prepared from soft clay soil to which tyre ash was added in four percentages (2, 4, 6, and 8%). The results of the tests were as follows: the soil samples which
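Additive percentages such as those above are conventionally expressed by dry weight of soil; a small helper for batching such mixes (this convention is an assumption, since the abstract does not state it) is:

```python
def additive_mass(dry_soil_g, percent):
    """Mass of tyre ash (g) to add for a given percentage of dry soil weight."""
    return dry_soil_g * percent / 100.0

# Ash required for a 1000 g dry-soil specimen at each studied percentage.
masses = [additive_mass(1000.0, p) for p in (2, 4, 6, 8)]
```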
In this research, a qualitative and quantitative study was carried out to improve the colorimetric assay of aspirin using a visible spectrophotometer. The method depends on aqueous hydrolysis of aspirin followed by treatment with an acidic ferric chloride solution, which forms a violet-colored complex with the salicylic acid produced by the hydrolysis; this complex has a maximum absorption at 530 nm. The procedure was applied to determine the purity of aspirin powder and tablets. The results were closely comparable: good linearity was observed, with a high correlation coefficient (R = 0.998) and coefficient of determination (R² = 0.996), while the molar absorptivity was 1.3 × 10³ mole
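The linearity figures quoted above (R and R²) come from a least-squares fit of absorbance against concentration; a minimal sketch of that calculation, using hypothetical calibration data rather than the paper's measurements, is:

```python
import math

def linear_fit(x, y):
    """Least-squares line y = a*x + b plus the correlation coefficient R."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    b = my - a * mx
    r = sxy / math.sqrt(sxx * syy)
    return a, b, r

# Hypothetical calibration points: concentration (mol/L) vs absorbance at
# 530 nm, following Beer-Lambert (A = eps * l * c) with eps*l = 1300 here.
conc = [1e-4, 2e-4, 3e-4, 4e-4]
absb = [0.13, 0.26, 0.39, 0.52]
slope, intercept, r = linear_fit(conc, absb)
```

Squaring R from such a fit gives the coefficient of determination R² reported in the abstract.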
In this paper, a new method is investigated that uses evolutionary algorithms (EAs) to cryptanalyse one of the nonlinear stream cipher cryptosystems based on the Linear Feedback Shift Register (LFSR) unit, using a ciphertext-only attack. A Genetic Algorithm (GA) and Ant Colony Optimization (ACO) are used for attacking one of the nonlinear cryptosystems, the "shrinking generator", using different lengths of ciphertext and different lengths of combined LFSRs. GA and ACO proved their good performance in finding the initial values of the combined LFSRs. This work can be considered a warning for stream cipher designers to avoid the weak points, which may be f
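As a concrete reference for the attack target: the shrinking generator clocks two LFSRs together and keeps a bit of the data register only when the selector register outputs 1. The sketch below uses toy 3-bit registers with assumed taps (not the paper's configuration) and shows the scoring idea behind a ciphertext-only EA attack: a candidate initial state is scored by how many keystream bits it reproduces, and over this toy space the true state is the unique maximizer (verified here by exhaustive search, which a GA or ACO would replace at realistic register lengths).

```python
from itertools import product

def lfsr_stream(state, taps, n):
    """Generate n bits from an LFSR: output state[0], feed back XOR of taps."""
    state = list(state)
    out = []
    for _ in range(n):
        out.append(state[0])
        fb = 0
        for t in taps:
            fb ^= state[t]
        state = state[1:] + [fb]
    return out

def shrinking_generator(sel_state, dat_state, taps_sel, taps_dat, n_clocks):
    """Keep the data bit only when the selector bit is 1."""
    sel = lfsr_stream(sel_state, taps_sel, n_clocks)
    dat = lfsr_stream(dat_state, taps_dat, n_clocks)
    return [d for a, d in zip(sel, dat) if a == 1]

def fitness(candidate_dat, observed, sel_state, taps_sel, taps_dat, n_clocks):
    """Ciphertext-only style score: fraction of keystream bits reproduced."""
    ks = shrinking_generator(sel_state, candidate_dat, taps_sel, taps_dat, n_clocks)
    m = min(len(ks), len(observed))
    return sum(a == b for a, b in zip(ks[:m], observed[:m])) / m

# Toy setup: selector seed [1,0,1] with taps (0,1); data seed [0,1,1], taps (0,2).
observed = shrinking_generator([1, 0, 1], [0, 1, 1], (0, 1), (0, 2), 14)

# Exhaustive search over nonzero 3-bit data seeds stands in for the GA/ACO.
best = max((s for s in product([0, 1], repeat=3) if any(s)),
           key=lambda s: fitness(list(s), observed, [1, 0, 1], (0, 1), (0, 2), 14))
```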
The traditional technique for generating MPSK signals is to use an IQ modulator, which involves analog processing such as multiplication and addition, where inaccuracies may exist and lead to imbalance problems that affect the output modulated signal and hence the overall performance of the system. In this paper, a simple method is presented for generating MPSK using logic circuits that generate M carrier signals, each with a different, equally spaced phase shift. These carriers are then time-multiplexed, according to the data symbols, into the output modulated signal.
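The carrier-selection scheme can be sketched numerically: generate M copies of one carrier with equally spaced phase offsets 2πk/M, then concatenate the carrier segment indexed by each data symbol. The sampling rate, carrier frequency, and segment length below are hypothetical, and the hardware version would use logic circuits rather than floating-point math:

```python
import math

def mpsk_carriers(M, f, fs, samples_per_symbol):
    """M versions of the carrier cos(2*pi*f*t + 2*pi*k/M), k = 0..M-1."""
    return [
        [math.cos(2 * math.pi * f * i / fs + 2 * math.pi * k / M)
         for i in range(samples_per_symbol)]
        for k in range(M)
    ]

def modulate(symbols, carriers):
    """Time-multiplex: append the phase-shifted carrier chosen by each symbol."""
    signal = []
    for s in symbols:
        signal.extend(carriers[s])
    return signal

# QPSK (M = 4) with a hypothetical 1 kHz carrier sampled at 8 kHz.
carriers = mpsk_carriers(4, 1000.0, 8000.0, 8)
signal = modulate([0, 2], carriers)
```

Symbol 0 starts at phase 0 (sample value cos 0 = 1) and symbol 2 at phase π (sample value −1), which is the equally spaced phase structure the logic circuits realize.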
Scientific development has occupied a prominent place in the field of diagnosis, far from traditional procedures. Scientific progress and the development of cities have brought diseases that have spread with this development, perhaps the most prominent of which is diabetes. This study aims at accurate diagnosis without examining blood samples, using image analysis to compare two images of the affected person taken no less than ten years apart. Artificial intelligence programs were used to analyze the images and prove the validity of this study, by collecting samples of infected and healthy people and using one of the Python libraries, OpenCV, which is specialized in measuring changes in the human face and through which we can infer the
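The study compares two facial images with OpenCV; as a minimal, library-free stand-in for that comparison, one can compute a mean absolute difference over aligned grayscale pixels (the tiny images, metric, and values below are illustrative assumptions, not the paper's actual pipeline):

```python
def mean_abs_diff(img_a, img_b):
    """Mean absolute per-pixel difference of two same-sized grayscale images
    (lists of rows of 0-255 values); larger values mean more facial change."""
    if len(img_a) != len(img_b) or len(img_a[0]) != len(img_b[0]):
        raise ValueError("images must have the same dimensions")
    total = sum(abs(a - b)
                for ra, rb in zip(img_a, img_b)
                for a, b in zip(ra, rb))
    return total / (len(img_a) * len(img_a[0]))

# Two tiny illustrative 2x3 "images" of the same face, years apart.
before = [[10, 20, 30], [40, 50, 60]]
after_ = [[12, 20, 28], [40, 54, 60]]
score = mean_abs_diff(before, after_)
```

In an OpenCV pipeline, the same idea is typically performed on face-aligned crops rather than raw frames.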
The area of character recognition has received considerable attention from researchers all over the world during the last three decades. This research explores the best sets of feature extraction techniques and studies the accuracy of well-known classifiers for Arabic numerals using statistical methods, applying two approaches and comparing them. The first method, a linear discriminant function, yields results with an accuracy as high as 90% of original grouped cases correctly classified. In the second method, we propose an algorithm; the results show the efficiency of the proposed algorithm, which achieves recognition accuracies of 92.9% and 91.4%, providing higher efficiency than the first method.
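The first method above is a linear discriminant function; under the common simplifying assumption of identity covariance it reduces to assigning a feature vector to the nearest class mean, as in this sketch (the 2-D feature vectors and class labels are made up for illustration):

```python
def train_means(samples):
    """samples: {digit: list of feature vectors} -> per-class mean vectors."""
    means = {}
    for digit, vecs in samples.items():
        d = len(vecs[0])
        means[digit] = [sum(v[i] for v in vecs) / len(vecs) for i in range(d)]
    return means

def classify(x, means):
    """Linear discriminant with identity covariance:
    g_i(x) = m_i . x - |m_i|^2 / 2; the maximizing class is the nearest mean."""
    def g(m):
        return (sum(mi * xi for mi, xi in zip(m, x))
                - sum(mi * mi for mi in m) / 2)
    return max(means, key=lambda digit: g(means[digit]))

# Hypothetical 2-D features for two Arabic numerals.
means = train_means({"0": [[0.0, 1.0], [0.2, 0.8]],
                     "1": [[1.0, 0.0], [0.8, 0.2]]})
label = classify([0.9, 0.1], means)
```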
Gypseous soils are widely distributed, especially in Iraq, where arid areas with a hot climate are present. These soils are considered problematic; therefore, this work attempts to improve the geotechnical properties of such soil and reduce the danger of collapse due to wetting. In this research, an undisturbed soil sample with 30% gypsum content from Karbala city is used. The single oedometer collapse test is used to investigate the collapse characteristics of the natural soil and after treatment with 3%, 6%, 9%, 12% and 15% of cutback asphalt. Moreover, two selected additive percentages (9% and 12%) are used to evaluate the suitability of using cutback asphalt for improvement of the bearing capacity o
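For reference, the single oedometer test quantifies collapse via the collapse potential CP = Δe/(1 + e₀) × 100%, where Δe is the drop in void ratio on wetting at the test stress and e₀ is the initial void ratio. A sketch with illustrative numbers (not the paper's measured values):

```python
def collapse_potential(e0, e_before_wetting, e_after_wetting):
    """Collapse potential (%) from a single oedometer test:
    CP = (e_before - e_after) / (1 + e0) * 100."""
    return (e_before_wetting - e_after_wetting) / (1 + e0) * 100.0

# Illustrative void ratios: initial 0.80, 0.74 before wetting, 0.65 after.
cp = collapse_potential(0.80, 0.74, 0.65)
```

A falling CP with increasing asphalt content is what would indicate successful treatment.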
Smart water flooding (low salinity water flooding) has mainly been investigated in sandstone reservoirs. The main reasons for using low salinity water flooding are to improve oil recovery and to support the reservoir pressure.
In this study, two sandstone core plugs with different permeabilities, taken from southern Iraq, were used to examine the effect of water injection with different ion concentrations on oil recovery. The water types used were formation water, seawater, modified low salinity water, and deionized water.
The effects of water salinity, the injection flow rate, and the permeability of the core plugs were studied in order to identify the best conditions for low salinity
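In core-flood experiments like these, the headline quantity compared across water types is the recovery factor: the produced oil volume as a fraction of the oil initially in the core. A minimal helper (the volumes are illustrative, not the study's data) is:

```python
def recovery_factor(oil_produced_cc, oil_initial_cc):
    """Oil recovery factor (%) for a core-flood experiment."""
    return 100.0 * oil_produced_cc / oil_initial_cc

# Illustrative volumes: 4.2 cc produced from 7.0 cc initially in place.
rf = recovery_factor(4.2, 7.0)
```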