Sentiment analysis is one of the major fields in natural language processing whose main task is to extract sentiments, opinions, attitudes, and emotions from subjective text. Because of its importance in decision making and in people's trust in reviews on web sites, many academic studies have addressed sentiment analysis problems. Deep Learning (DL) is a powerful Machine Learning (ML) technique that has emerged with its ability to learn feature representations and discriminate data, leading to state-of-the-art prediction results. In recent years, DL has been widely used in sentiment analysis; however, its application to the Arabic language remains scarce, as most previous research addresses other languages such as English. The proposed model tackles Arabic Sentiment Analysis (ASA) using a DL approach. ASA is a challenging field because Arabic has a richer morphological structure than many other languages. In this work, a Long Short-Term Memory (LSTM) deep neural network is trained with a word-embedding layer as the first hidden layer for feature extraction. The results show that an accuracy of about 82% is achievable using the DL method.
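As a rough illustration of this architecture, the following sketch uses an embedding layer as the first hidden layer, followed by an LSTM and a sigmoid output for binary sentiment labels; the vocabulary size, sequence length, and layer dimensions are assumptions, not the paper's reported settings.

```python
# Minimal sketch (not the authors' exact architecture): Embedding -> LSTM -> sigmoid
from tensorflow.keras import layers, models

VOCAB_SIZE = 20000   # hypothetical vocabulary built from the Arabic corpus
EMBED_DIM = 128      # assumed embedding dimension
MAX_LEN = 100        # assumed padded review length

model = models.Sequential([
    layers.Embedding(VOCAB_SIZE, EMBED_DIM),      # word embedding as the first hidden layer
    layers.LSTM(64),                              # LSTM for sequence feature learning
    layers.Dense(1, activation="sigmoid"),        # positive/negative sentiment
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(x_train, y_train, validation_split=0.1, epochs=5, batch_size=64)
```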
Intelligent buildings offer various incentives for energy saving, yet they suffer from highly inefficient operation caused by non-stationary building environments. In the presence of such dynamic excitation, with high levels of nonlinearity and the coupled effects of temperature and humidity, the HVAC system transitions between underdamped and overdamped indoor conditions, which leads to highly inefficient energy use and fluctuating indoor thermal comfort. To address these concerns, this study develops a novel framework based on deep clustering of Lagrangian trajectories for multi-task learning (DCLTML) and adds a pre-cooling coil to the air handling unit (AHU) to alleviate the coupling issue. The proposed DCLTML exhibits great overall control and is
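Since only the outline of the framework is given above, the following heavily hedged sketch illustrates the generic deep-clustering step alone: encode HVAC/trajectory features with a small autoencoder and cluster the latent codes with k-means to group operating regimes. The dimensions, the feature set, and the use of k-means are illustrative assumptions, not the DCLTML design.

```python
# Generic deep-clustering sketch (illustrative only, not the DCLTML framework)
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

N_FEATURES = 8        # e.g. temperature, humidity, setpoints, loads (hypothetical)
LATENT_DIM = 3
N_CLUSTERS = 4

encoder = nn.Sequential(nn.Linear(N_FEATURES, 16), nn.ReLU(), nn.Linear(16, LATENT_DIM))
decoder = nn.Sequential(nn.Linear(LATENT_DIM, 16), nn.ReLU(), nn.Linear(16, N_FEATURES))
opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)

x = torch.randn(1024, N_FEATURES)          # placeholder trajectory samples
for _ in range(200):                       # reconstruction pre-training of the autoencoder
    opt.zero_grad()
    loss = nn.functional.mse_loss(decoder(encoder(x)), x)
    loss.backward()
    opt.step()

with torch.no_grad():
    z = encoder(x).numpy()                 # latent codes
labels = KMeans(n_clusters=N_CLUSTERS, n_init=10).fit_predict(z)  # operating-regime clusters
```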
Artificial intelligence techniques reach us in several forms; some are useful, but they can also be exploited in ways that harm us. One of these forms is called deepfakes. Deepfakes are used to completely modify video (or image) content so that it displays something that was not originally in it. The danger of deepfake technology lies in its impact on society through the loss of confidence in everything that is published. Therefore, in this paper we focus on deepfake detection technology from the viewpoint of two concepts: deep learning and forensic tools. The purpose of this survey is to give the reader a deeper overview of i) the environment of deepfake creation and detection, and ii) how deep learning and forensic tools have contributed to the detection
Empirical and statistical methodologies have been established to acquire accurate permeability identification and reservoir characterization, based on rock type and reservoir performance. The identification of rock facies is usually done either by using core analysis to visually interpret lithofacies or indirectly from well-log data. The use of well-log data for traditional facies prediction is characterized by uncertainty and can be time-consuming, particularly when working with large datasets. Machine Learning can therefore be used to predict such patterns more efficiently when applied to large data. Taking the electrofacies distribution into account, this work was conducted to predict permeability for the four wells, FH1, FH2, F
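As a minimal sketch of this kind of data-driven permeability prediction, the example below trains a tree-based regressor on hypothetical well-log inputs (GR, RHOB, NPHI, RT) against synthetic core permeability; the actual feature set and model configuration used for wells FH1 and FH2 are not specified above.

```python
# Illustrative permeability regression from well-log features (synthetic data)
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))                      # columns: GR, RHOB, NPHI, RT (placeholders)
y = np.exp(X @ np.array([0.5, -0.8, 0.6, 0.3]))    # placeholder core permeability (mD)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(X_tr, np.log10(y_tr))                    # permeability is usually modelled in log scale
pred = 10 ** model.predict(X_te)
print("R2 on held-out logs:", r2_score(np.log10(y_te), np.log10(pred)))
```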
Diabetes is one of the increasingly prevalent chronic diseases, affecting millions of people around the world. Diagnosis, prediction, proper treatment, and management of diabetes are therefore essential. Machine learning-based prediction techniques for diabetes data analysis can help in the early detection of the disease and the prediction of its consequences, such as hypo-/hyperglycemia. In this paper, we explore a diabetes dataset collected from the medical records of one thousand Iraqi patients. We applied three classifiers: the multilayer perceptron (MLP), K-Nearest Neighbours (KNN), and Random Forest. We conducted two experiments: the first used all 12 features of the dataset, where Random Forest outperformed the others with 98.8% accuracy. The second experiment used only five att
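A minimal sketch of the three-classifier comparison is shown below, using a synthetic stand-in for the 1000-patient dataset; the hyper-parameters and the synthetic data are assumptions for illustration only.

```python
# Illustrative comparison of MLP, KNN, and Random Forest with cross-validation
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the 1000-patient, 12-feature diabetes records
X, y = make_classification(n_samples=1000, n_features=12, random_state=0)

classifiers = {
    "MLP": make_pipeline(StandardScaler(), MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000)),
    "KNN": make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5)),
    "RandomForest": RandomForestClassifier(n_estimators=200, random_state=0),
}
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=10, scoring="accuracy")
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```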
Optical burst switching (OBS) is a new generation of optical communication technology. In an OBS network, an edge node first sends a control packet, called a burst header packet (BHP), which reserves the necessary resources for the upcoming data burst (DB). Once the reservation is complete, the DB travels to its destination through the reserved path. A notable attack on OBS networks is the BHP flooding attack, where an edge node sends BHPs to reserve resources but never actually sends the associated DBs. As a result, the reserved resources are wasted, and when this happens on a sufficiently large scale, a denial of service (DoS) may take place. In this study, we propose a semi-supervised machine learning approach using the k-means algorithm
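One common way to use k-means semi-supervisedly, sketched below under stated assumptions, is to cluster unlabeled per-node traffic statistics and then assign each cluster the majority label of the few labeled samples it contains; the paper's exact procedure and feature set may differ.

```python
# Illustrative semi-supervised k-means for flagging BHP-flooding behaviour
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
X = rng.normal(size=(600, 5))              # per-node stats, e.g. BHP rate, DB rate, unused-reservation ratio
y = np.full(600, -1)                       # -1 = unlabeled node
y[:40] = rng.integers(0, 2, 40)            # small labeled subset: 0 = behaving, 1 = BHP flooding

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
cluster_label = {}
for c in range(2):
    labeled = y[(km.labels_ == c) & (y != -1)]           # labeled samples falling in cluster c
    cluster_label[c] = int(np.round(labeled.mean())) if labeled.size else 0

pred = np.array([cluster_label[c] for c in km.labels_])  # predicted label for every edge node
```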
After the information revolution that occurred in the Western world, and the developments in all fields, especially education and e-learning, an integrated system emerged based on the effective employment of information and communication technology in teaching and learning, through an environment rich in computer and Internet applications. This allowed the community and the learner to access information sources and learning at any time and place, in a way that achieves mutual interaction between the elements of the system and the surrounding environment. The COVID-19 pandemic then caused a major interruption in all educational systems that had never happened before, and the disrupt
Corpus linguistics is a methodology for studying language through corpus-based research. It differs from a traditional (prescriptive) approach to studying a language in its insistence on the systematic study of authentic examples of language in use (a descriptive approach). A "corpus" is a large, machine-readable, systematically collected body of naturally occurring linguistic data, either written texts or transcriptions of recorded speech, which can be used as a starting point for linguistic description or as a means of verifying hypotheses about a language. In the past decade, interest has grown tremendously in the use of language corpora for language education. The ways in which corpora have been employed in language pedagogy
Linguistic research according to modern curricula:
It is one of the important matters occupying those concerned with linguistic studies, whether of Arabic or of other languages. Recent years have witnessed the advancement of this methodological approach, and books and studies have been written in Arabic on important, multifaceted issues concerning grammatical and linguistic origins and their reconciliation with new developments and ideas drawn mostly from Western studies.
The comparative approach, as they call it, is one of the modern approaches; it is based on comparing a language with its sister languages in the same family in order to identify the similarities and differences between them, and to know the c
To date, comprehensive reviews and discussions of the strengths and limitations of standalone and combined Remote Sensing (RS) approaches, and of Deep Learning (DL)-based RS datasets, in archaeology have been limited. The objective of this paper is therefore to review and critically discuss existing studies that have applied these advanced approaches in archaeology, with a specific focus on digital preservation and object detection. Standalone RS approaches, including range-based and image-based modelling (e.g., laser scanning and SfM photogrammetry), have several disadvantages in terms of spatial resolution, penetration, texture, colour, and accuracy. These limitations have led some archaeological studies to fuse/integrate multip
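To make the DL-based object-detection side concrete, the sketch below fine-tunes a generic pre-trained detector on remote-sensing image chips annotated with archaeological features; the detector choice, image size, and class list are illustrative assumptions rather than settings taken from the reviewed studies.

```python
# Illustrative fine-tuning of a pre-trained detector for archaeological features in RS imagery
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn, FasterRCNN_ResNet50_FPN_Weights
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

NUM_CLASSES = 3  # background + two hypothetical feature types (e.g. mounds, enclosures)

model = fasterrcnn_resnet50_fpn(weights=FasterRCNN_ResNet50_FPN_Weights.DEFAULT)
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)

# One dummy training step showing the expected input format (image chip + annotated boxes)
images = [torch.rand(3, 512, 512)]
targets = [{"boxes": torch.tensor([[100., 120., 180., 200.]]),
            "labels": torch.tensor([1])}]
model.train()
losses = model(images, targets)            # dict of classification / box-regression losses
total_loss = sum(losses.values())
```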
Diagnosing heart disease has become a very important topic for researchers specializing in artificial intelligence, because intelligent methods are now involved in the management of most diseases, especially after the Corona pandemic, which forced the world to turn to such solutions. The basic idea of this research is therefore to shed light on the diagnosis of heart diseases by relying on deep learning with a pre-trained model (EfficientNet-B3), using the electrical signals of the electrocardiogram and resampling the signal before introducing it to the neural network, with only trimming pre-processing operations, because it is an electrical signal whose parameters cannot be changed. The dataset (China Physiological Signal Challenge, CPSC2018) was ad
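A hedged sketch of this pipeline is given below: a 12-lead ECG record is trimmed and resampled, reshaped into a pseudo-image, and passed through a pre-trained EfficientNet-B3 with a replaced classifier head. The trimming length, the reshaping scheme, and the 9-class head are assumptions; the exact signal-to-input mapping is not described above.

```python
# Illustrative ECG -> EfficientNet-B3 pipeline (reshaping scheme is an assumption)
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import resample
from torchvision.models import efficientnet_b3, EfficientNet_B3_Weights

MAX_RAW = 30000      # assumed trimming length in raw samples
TARGET_LEN = 15000   # assumed length after resampling
NUM_CLASSES = 9      # CPSC2018 defines 9 rhythm/abnormality classes

def preprocess(record: np.ndarray) -> torch.Tensor:
    """record: (12, n_samples) raw ECG; trim, resample, and tile to a 3-channel input."""
    record = record[:, :MAX_RAW]                          # trimming only, no other signal changes
    resampled = resample(record, TARGET_LEN, axis=1)      # (12, TARGET_LEN)
    # Assumption: fold the 12 x TARGET_LEN matrix into a pseudo-image and average the leads
    img = torch.tensor(resampled, dtype=torch.float32).reshape(12, 100, 150)
    return img.mean(dim=0, keepdim=True).repeat(3, 1, 1)  # (3, 100, 150)

model = efficientnet_b3(weights=EfficientNet_B3_Weights.DEFAULT)
model.classifier[1] = nn.Linear(model.classifier[1].in_features, NUM_CLASSES)

batch = torch.stack([preprocess(np.random.randn(12, 30000))])  # dummy record
logits = model(batch)                                          # (1, NUM_CLASSES)
```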