The complexity and variety of language in policy and academic documents make the automatic classification of research papers against the United Nations Sustainable Development Goals (SDGs) challenging. This study presents a complete deep learning pipeline that combines Bidirectional Long Short-Term Memory (BiLSTM) and Convolutional Neural Network (CNN) architectures, using both pre-trained and contextual word embeddings to strengthen semantic understanding. The primary aim is to improve the accuracy and comprehensibility of SDG text classification, thereby enabling more effective policy monitoring and research evaluation. Documents are represented via Global Vectors (GloVe) and Bidirectional Encoder Representations from Transformers (BERT) embeddings.
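As a rough illustration of such a hybrid BiLSTM-CNN architecture (not the authors' exact model; the vocabulary size, sequence length, layer widths, and 17-class output are placeholder assumptions), a minimal Keras sketch might look like this:

```python
# Hypothetical BiLSTM + CNN sketch for SDG text classification (Keras).
# All sizes below are illustrative assumptions, not the paper's settings.
import tensorflow as tf
from tensorflow.keras import layers, models

VOCAB_SIZE, SEQ_LEN, EMBED_DIM, NUM_SDGS = 20000, 256, 100, 17

model = models.Sequential([
    layers.Input(shape=(SEQ_LEN,)),
    # Embedding layer; in practice its weights could be initialized from
    # pre-trained GloVe vectors (set trainable=False to freeze them).
    layers.Embedding(VOCAB_SIZE, EMBED_DIM),
    # BiLSTM captures long-range context in both directions.
    layers.Bidirectional(layers.LSTM(128, return_sequences=True)),
    # CNN extracts local n-gram-like features from the LSTM outputs.
    layers.Conv1D(filters=64, kernel_size=5, activation="relu"),
    layers.GlobalMaxPooling1D(),
    layers.Dense(64, activation="relu"),
    layers.Dropout(0.5),
    # One output per SDG; softmax for single-label classification.
    layers.Dense(NUM_SDGS, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```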
In the present work, a kinetic study was performed on the extraction of phosphate from Iraqi Akashat phosphate ore using an organic acid. Leaching with lactic acid was studied for the separation of calcareous materials (mainly calcite). Reaction conditions were 2 wt% acid concentration and a 5 ml/g ratio of acid volume to ore weight. Reaction time was varied from 2 to 30 minutes (in 2-minute steps) to determine the reaction rate constant k from the change in calcite concentration. A further investigation determined the activation energy by varying the reaction temperature from 25 to 65 °C. The kinetic data showed that the selective leaching was controlled by the surface chemical reaction.
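As a minimal numerical sketch of the Arrhenius analysis this kind of study relies on (the rate constants below are made-up placeholders, not the paper's measured data), the activation energy can be estimated from a linear fit of ln k against 1/T:

```python
# Arrhenius fit sketch: ln k = ln A - Ea/(R*T).
# The rate constants k are hypothetical placeholders, not measured values.
import numpy as np

R = 8.314  # J/(mol*K), universal gas constant

T_celsius = np.array([25, 35, 45, 55, 65])          # reaction temperatures
T = T_celsius + 273.15                               # convert to kelvin
k = np.array([0.010, 0.018, 0.031, 0.052, 0.084])    # assumed rate constants, 1/min

# Linear regression of ln k vs 1/T; the slope equals -Ea/R.
slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)
Ea = -slope * R            # activation energy, J/mol
A = np.exp(intercept)      # pre-exponential factor, 1/min

print(f"Ea = {Ea / 1000:.1f} kJ/mol, A = {A:.3g} 1/min")
```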
Big data analysis has important applications in many areas, such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization for efficient and effective analysis. This research extends our previous work on an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms.
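To make one of the two building blocks concrete, here is a minimal sketch of entropy-based binary discretization on a single numeric attribute (a simplified textbook form of the technique, not the paper's multi-resolution algorithm; the toy data are invented):

```python
# Entropy-based binary discretization sketch: choose the cut point on a
# numeric attribute that minimizes the weighted class entropy of the split.
# Simplified illustration, not the paper's multi-resolution algorithm.
import numpy as np
from collections import Counter

def entropy(labels):
    counts = np.array(list(Counter(labels).values()), dtype=float)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def best_cut(values, labels):
    order = np.argsort(values)
    values, labels = np.asarray(values)[order], np.asarray(labels)[order]
    n, best = len(values), (None, float("inf"))
    # Candidate cuts are midpoints between consecutive distinct values.
    for i in range(1, n):
        if values[i] == values[i - 1]:
            continue
        cut = (values[i] + values[i - 1]) / 2
        h = (i / n) * entropy(labels[:i]) + ((n - i) / n) * entropy(labels[i:])
        if h < best[1]:
            best = (cut, h)
    return best  # (cut point, weighted class entropy)

# Toy data: attribute values with binary class labels.
vals = [1.0, 1.5, 2.0, 5.0, 5.5, 6.0]
labs = [0, 0, 0, 1, 1, 1]
print(best_cut(vals, labs))  # expect a cut near 3.5 with entropy 0.0
```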
Data Driven Requirement Engineering (DDRE) represents a vision for a shift from static traditional methods of doing requirements engineering to dynamic, data-driven, user-centered methods. Given the data now available and the increasingly complex requirements of software systems, whose functions must adapt to changing needs to retain the trust of their users, such an approach is needed as part of a continuous software engineering process. This need drives the emergence of new challenges in the discipline of requirements engineering. The problem addressed in this study is that discrepancies in the data hamper the requirements elicitation process, so that the resulting software ultimately fails to meet user needs.
... Show MoreThat the essential contribution of this research is a description of how complex systems analysis service of the properties of the queue in Baghdad Teaching Hospital using a technique network is techniques method (Q - GERT) an acronym of the words:
Queuing theory - Graphical Evaluation and Review Technique
With this graphical evaluation and review method, the flow of patients through the system can be traced. The system is represented as a probabilistic network diagram, the statistical distributions appropriate for arrival and departure times are identified using the ready-made package (Win QSB), and the network is analyzed by simulation.
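As a loose illustration of the kind of queueing simulation such a study rests on (a textbook single-server M/M/1 model with made-up arrival and service rates, not the hospital's fitted distributions or its Q-GERT network), mean waiting time can be estimated as follows:

```python
# Minimal M/M/1 queue simulation sketch with exponential interarrival and
# service times. Rates are illustrative, not fitted hospital data.
import random

random.seed(42)
LAMBDA, MU, N_PATIENTS = 4.0, 5.0, 100_000  # arrivals/hr, services/hr

arrival = service_end = 0.0
total_wait = 0.0
for _ in range(N_PATIENTS):
    arrival += random.expovariate(LAMBDA)          # next arrival time
    start = max(arrival, service_end)              # wait if server is busy
    total_wait += start - arrival
    service_end = start + random.expovariate(MU)   # service completion

sim_wait = total_wait / N_PATIENTS
theory_wait = LAMBDA / (MU * (MU - LAMBDA))        # M/M/1 mean wait in queue
print(f"simulated mean wait: {sim_wait:.3f} h, theory: {theory_wait:.3f} h")
```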
Image compression is a suitable technique to reduce the storage space of an image, increase the effective storage capacity of a device, and speed up transmission. In this paper, a new idea for image compression is proposed to improve the performance of the Absolute Moment Block Truncation Coding (AMBTC) method, using Weber's law condition to distinguish uniform blocks (i.e., blocks with low, nearly constant detail) from non-uniform blocks in the original image. All elements in the bitmap of each uniform block are then set to zero, after which a lossless method, Run Length Encoding, is used to further compress the bits representing the bitmaps of these uniform blocks. Via this simple idea, the result is improved.
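A minimal sketch of the core idea follows (the block size, the Weber threshold, and the exact form of the uniformity test are assumptions for illustration, not the paper's parameters):

```python
# AMBTC sketch with a Weber-law uniformity test per 4x4 block.
# Threshold value and test form are illustrative assumptions.
import numpy as np

BLOCK, WEBER_T = 4, 0.03  # block size and assumed Weber fraction threshold

def ambtc_block(block):
    """Encode one block as (low mean, high mean, bitmap)."""
    mean = block.mean()
    bitmap = block >= mean
    hi = block[bitmap].mean() if bitmap.any() else mean
    lo = block[~bitmap].mean() if (~bitmap).any() else mean
    return lo, hi, bitmap

def encode(img):
    coded = []
    for r in range(0, img.shape[0], BLOCK):
        for c in range(0, img.shape[1], BLOCK):
            blk = img[r:r+BLOCK, c:c+BLOCK].astype(float)
            mean = blk.mean()
            # Weber-law test: a relative intensity spread below the
            # threshold marks the block as uniform; its bitmap is forced
            # to all zeros, which run-length encoding then compresses
            # very cheaply.
            if mean > 0 and (blk.max() - blk.min()) / mean < WEBER_T:
                coded.append((mean, mean, np.zeros((BLOCK, BLOCK), bool)))
            else:
                coded.append(ambtc_block(blk))
    return coded

img = np.random.randint(0, 256, (16, 16))   # toy image
print(len(encode(img)), "blocks encoded")   # 16 blocks for a 16x16 image
```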