The research problem arose from the researchers’ sense of the importance of Digital Intelligence (DI): DI is a basic requirement for helping students engage with the digital world and use technology and digital techniques in a disciplined way, and students’ ideas are particularly susceptible to influence at this stage of exposure to modern technology. The research aims to determine the level of DI among university students using Artificial Intelligence (AI) techniques. To this end, the researchers constructed a DI scale; in its final form it consisted of 24 items distributed across 8 main skills, and the validity and reliability of the tool were confirmed. It was administered to a sample of 139 male and female students selected by stratified random sampling from the University of Baghdad, College of Education for Pure Sciences/Ibn Al-Haitham, Department of Computer. The proposed AI model utilized three techniques: Decision Tree (DT), Random Forest (RF), and Gradient Boosting Machine (GBM). The classification accuracy was 92.85% using DT and 95.23% using GBM. The RF technique was applied to identify the essential features, and the Pearson correlation was used to measure the correlation between features. The findings indicated that students indeed possess digital intelligence, underscoring the potential for tailored interventions to enhance their digital skills and competencies. This research not only sheds light on the current DI landscape among university students but also paves the way for targeted educational initiatives that foster digital literacy and proficiency in the academic setting.
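As a rough illustration of the kind of pipeline this abstract describes, the sketch below fits DT and GBM classifiers, ranks features with RF importances, and computes a Pearson correlation matrix using scikit-learn. The synthetic data, labels, and hyperparameters are stand-ins, not the study's actual questionnaire data or settings.

```python
# Hypothetical sketch of the abstract's pipeline: DT/GBM classification,
# RF feature importances, and Pearson correlation between features.
# Synthetic data stands in for the 139-student DI questionnaire responses.
import pandas as pd
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=139, n_features=8, n_informative=5,
                           random_state=42)  # 8 features ~ the 8 DI skills
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)

for name, model in [("DT", DecisionTreeClassifier(random_state=42)),
                    ("GBM", GradientBoostingClassifier(random_state=42))]:
    model.fit(X_tr, y_tr)
    print(name, "accuracy:", accuracy_score(y_te, model.predict(X_te)))

# Random Forest ranks the most essential features.
rf = RandomForestClassifier(random_state=42).fit(X_tr, y_tr)
print("feature importances:", rf.feature_importances_)

# Pearson correlation between the features.
print(pd.DataFrame(X).corr(method="pearson"))
```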
In regression testing, test case prioritization (TCP) is a technique for ordering the available test cases. TCP techniques can improve fault-detection performance, which is measured by the average percentage of fault detection (APFD). History-based TCP is a family of TCP techniques that use the history of past test executions to prioritize test cases. Assigning equal priority to multiple test cases is a common problem for most TCP techniques; however, this problem has not been explored for history-based TCP. To resolve such ties in regression testing, most researchers resort to randomly ordering the tied test cases. This study aims to investigate equal priority in history-based TCP techniques. The first objective is to implement …
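For context, APFD is conventionally defined as follows (this is the standard formula from the TCP literature, not one quoted in the abstract), where n is the number of test cases, m the number of faults, and TF_i the position in the prioritized suite of the first test case that reveals fault i:

```latex
% Standard APFD definition (Rothermel et al.):
APFD = 1 - \frac{TF_1 + TF_2 + \dots + TF_m}{n \cdot m} + \frac{1}{2n}
```

Higher APFD values indicate that faults are exposed earlier in the prioritized order.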
Cohesion is well known as the study of the relationships, whether grammatical and/or lexical, between the different elements of a particular text through the use of what are commonly called 'cohesive devices'. These devices bring connectivity and bind a text together. Moreover, the nature and number of such cohesive devices usually affect the understanding of a text, in the sense of making it easier to comprehend. The present study is intended to examine the use of grammatical cohesive devices in relation to narrative techniques. The story of Joseph from the Holy Quran has been selected for examination using Halliday and Hasan's Model of Cohesion (1976, 1989). The aim of the study is to examine comparatively to what extent the type …
To date, comprehensive reviews and discussions of the strengths and limitations of standalone and combined Remote Sensing (RS) approaches, and of Deep Learning (DL)-based RS datasets, in archaeology have been limited. The objective of this paper is therefore to review and critically discuss existing studies that have applied these advanced approaches in archaeology, with a specific focus on digital preservation and object detection. Standalone RS approaches, including range-based and image-based modelling (e.g., laser scanning and SfM photogrammetry), have several disadvantages in terms of spatial resolution, penetration, texture, colour, and accuracy. These limitations have led some archaeological studies to fuse/integrate multiple …
Image compression is a serious issue in computer storage and transmission; it works by making efficient use of the redundancy embedded within an image itself and, in addition, may exploit the limitations of human vision or perception to discard imperceivable information. Polynomial coding is a modern image compression technique based on a modelling concept that effectively removes the spatial redundancy embedded within the image; it is composed of two parts, the mathematical model and the residual. In this paper, a two-stage technique is proposed: the first stage utilizes a lossy predictor model along with multiresolution base and thresholding techniques, and the second stage incorporates near-lossless com…
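As a loose sketch of the predictor-plus-residual idea behind polynomial coding (the paper's actual model, multiresolution base, and thresholds are not reproduced here), the following assumes a simple two-neighbour linear predictor and a uniform near-lossless quantizer:

```python
# Minimal predictive-coding sketch: model each pixel from causal neighbours,
# keep the residual, and quantize it with a bounded-error step.
import numpy as np

def predict_residual(img, coeffs=(0.5, 0.5)):
    """Predict each pixel from its west and north neighbours; return residual."""
    img = img.astype(float)
    pred = np.zeros_like(img)
    a, b = coeffs  # assumed fixed model coefficients
    pred[1:, 1:] = a * img[1:, :-1] + b * img[:-1, 1:]  # W and N neighbours
    return img - pred  # residual: ideally low-entropy, cheap to encode

def near_lossless_quantize(residual, delta=2):
    """Uniform quantization keeping reconstruction error near +/- delta."""
    return np.round(residual / (2 * delta + 1)).astype(int)

img = np.random.randint(0, 256, (8, 8))  # stand-in image block
q = near_lossless_quantize(predict_residual(img))
```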
Rutting is a crucial concern impacting the stability and long-term performance of asphalt concrete pavements, negatively affecting drivers’ comfort and safety. This research aims to evaluate the permanent deformation of pavement under different traffic and environmental conditions using an Artificial Neural Network (ANN) prediction model. The model was built on the outcomes of an experimental uniaxial repeated-loading test of 306 cylindrical specimens. Twelve independent variables representing the materials’ properties, mix-design parameters, loading settings, and environmental conditions were implemented in the model, resulting in a total of 3214 data points. The network accomplished high prediction accuracy with an R…
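A minimal sketch of such an ANN regression setup, assuming scikit-learn's MLPRegressor and synthetic stand-ins for the 12 input variables and 3214 data points (the study's actual architecture and data are not reproduced):

```python
# Hedged sketch: 12 inputs (materials, mix design, loading, environment)
# mapped to permanent deformation via a small feed-forward network.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(3214, 12))                                  # 12 variables
y = X @ rng.normal(size=12) + rng.normal(scale=0.1, size=3214)   # stand-in target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
scaler = StandardScaler().fit(X_tr)  # scale inputs before the ANN
ann = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000,
                   random_state=0).fit(scaler.transform(X_tr), y_tr)
print("R^2:", r2_score(y_te, ann.predict(scaler.transform(X_te))))
```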
The concept of narration has acquired an aesthetic dimension far beyond the primitive human act imposed by the necessities of social communication in an ancient historical period; from this observation the research problem was formulated. The importance of the research lies in connecting the concept of narration with the elements of theatre directing. The research aims to discover the fields of narration in theatre directing as represented by perceived video, audio, and motion. The research was temporally delimited to 2014. The theoretical framework is divided into three chapters:
The first chapter addressed the concept of narration in literature and criticism; the second addressed …
Recent advancements in security approaches have significantly increased the ability to identify and mitigate any type of threat or attack in a network infrastructure, such as a software-defined network (SDN), and to protect the internet security architecture against a variety of threats or attacks. Machine learning (ML) and deep learning (DL) are among the most popular techniques for preventing distributed denial-of-service (DDoS) attacks on any kind of network. The objective of this systematic review is to identify, evaluate, and discuss recent efforts on ML/DL-based DDoS attack detection strategies in SDN networks. To reach this objective, we conducted a systematic review in which we searched for publications that used ML/DL approach…
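To make the reviewed setup concrete, here is an illustrative (not review-sourced) sketch of ML-based DDoS detection on SDN flow statistics; the feature set, traffic distributions, and model choice are all assumptions:

```python
# Illustrative sketch: classify flow records as benign vs. DDoS from
# simple flow statistics, the typical ML setup such studies evaluate.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import classification_report

rng = np.random.default_rng(1)
# Assumed features: packet rate, byte rate, flow duration, distinct sources.
benign = rng.normal([100, 5e4, 10, 20], [30, 1e4, 3, 5], size=(500, 4))
ddos = rng.normal([900, 4e5, 2, 400], [200, 8e4, 1, 80], size=(500, 4))
X = np.vstack([benign, ddos])
y = np.array([0] * 500 + [1] * 500)  # 0 = benign, 1 = DDoS

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)
sc = StandardScaler().fit(X_tr)  # flow features differ wildly in scale
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000,
                    random_state=1).fit(sc.transform(X_tr), y_tr)
print(classification_report(y_te, clf.predict(sc.transform(X_te))))
```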
Objectives: To identify the effectiveness of an instructional program concerning premarital screening for sexually transmitted diseases on students' knowledge at Baghdad University, and to examine the relationship between students' knowledge and certain studied variables. The study hypothesis: there is a difference in university students’ knowledge of premarital screening between the pre- and post-tests of the instructional program. Methodology: A quasi-experimental design (pretest-posttest approach) was conducted at six colleges: the College of Education Ibn Rushd, the College of Political Science, the College of Law, the College of Literature …