The amount of educational data has increased rapidly in recent years. Educational Data Mining (EDM) techniques are used to detect valuable patterns in order to improve the educational process and raise the performance of all educational elements. The proposed work comprises three stages: preprocessing, feature selection, and an active-learning classification stage. The dataset, collected through EDM, lacked labeled data; it contained 2050 records gathered from questionnaires and from students' academic records. Twenty-five features were drawn from five factors: curriculum, teacher, student, educational environment, and family. Active learning was utilized in the classification. Four techniques were applied to classify the features: the Random Forest (RF) algorithm, Label Propagation (LP), Logistic Regression (LR), and the Multilayer Perceptron (MLP), with prediction accuracies of 95.121%, 92.195%, 92.292%, and 93.951%, respectively. The RF algorithm was also used to rank the features by importance.
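The RF classification and importance-ranking steps can be sketched as follows. This is a minimal illustration assuming scikit-learn; the paper's 2050-record dataset is not public, so a synthetic 25-feature stand-in replaces it, and the accuracy obtained here is not the paper's figure.

```python
# Illustrative sketch only: synthetic stand-in for the 2050-record,
# 25-feature educational dataset described in the abstract.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2050, n_features=25, n_informative=10,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Random Forest classifier, one of the four techniques compared in the paper.
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
accuracy = rf.score(X_te, y_te)

# Rank features by impurity-based importance, as the paper does with RF.
ranking = np.argsort(rf.feature_importances_)[::-1]
print(f"accuracy={accuracy:.3f}, top feature index={ranking[0]}")
```

The same train/test split can be reused with `LogisticRegression`, `MLPClassifier`, and a label-propagation model to reproduce the four-way comparison.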
The internet of medical things (IoMT) is expected to become one of the most widely distributed technologies worldwide. Using 5th-generation (5G) transmission, market opportunities and hazards related to the IoMT are identified and addressed. This framework describes a strategy for proactively addressing concerns and offering a forum to promote development, change attitudes, and maintain people's confidence in the broader healthcare system without compromising security. It is combined with a data-offloading system to speed up the transmission of medical data and improve the quality of service (QoS). Building on this, we propose the enriched energy-efficient fuzzy (EEEF) data-offloading technique to enhance the delivery of data.
In this research, a simple experiment in the field of agriculture was studied in terms of the effect of out-of-control noise arising from several causes, including the effect of environmental conditions on the observations of agricultural experiments. Discrete wavelet transformation was used, specifically the Coiflet transform of order 1 to 2 and the Daubechies transform of order 2 to 3, at two transform levels, (J-4) and (J-5), applying the hard, soft, and non-negative threshold rules. The wavelet transformation methods were compared using real data from an experiment of 26 observations, implemented in a MATLAB program. The researcher concluded that
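The thresholding step at the core of this denoising scheme can be sketched as follows. This is a hedged illustration only: a one-level Haar transform in NumPy stands in for the Coiflet/Daubechies transforms the study runs in MATLAB, and the signal and noise level are synthetic.

```python
# Wavelet threshold denoising sketch (Haar instead of Coiflet/Daubechies).
import numpy as np

def haar_dwt(x):
    """One-level Haar DWT: approximation (a) and detail (d) coefficients."""
    x = np.asarray(x, float)
    return (x[0::2] + x[1::2]) / np.sqrt(2), (x[0::2] - x[1::2]) / np.sqrt(2)

def haar_idwt(a, d):
    """Inverse of haar_dwt (perfect reconstruction)."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def threshold(d, t, rule="soft"):
    """Hard rule zeroes small coefficients; soft also shrinks the survivors."""
    if rule == "hard":
        return np.where(np.abs(d) > t, d, 0.0)
    return np.sign(d) * np.maximum(np.abs(d) - t, 0.0)

rng = np.random.default_rng(1)
clean = np.sin(np.linspace(0, 4 * np.pi, 64))
noisy = clean + 0.3 * rng.standard_normal(64)

a, d = haar_dwt(noisy)
denoised = haar_idwt(a, threshold(d, 0.3, "soft"))
print("noisy MSE:   ", np.mean((noisy - clean) ** 2))
print("denoised MSE:", np.mean((denoised - clean) ** 2))
```

Noise concentrates in the detail coefficients of a smooth signal, so thresholding them before reconstruction suppresses it; deeper decomposition levels such as the study's (J-4) and (J-5) repeat the same split on the approximation band.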
Sequence covering array (SCA) generation has been an active research area in recent years. Unlike sequence-less covering arrays (CAs), the order of events matters in the test-case generation process. This paper reviews the state of the art of SCA strategies; earlier works reported that finding a minimal test suite is an NP-hard problem. In addition, most existing SCA generation strategies have a high order of complexity because they generate all combinatorial interactions in a one-test-at-a-time fashion. Reducing this complexity by adopting a one-parameter-at-a-time approach for SCA generation is a challenging process; such a reduction also facilitates support for a higher strength of coverage.
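The coverage obligation that makes SCA generation hard can be made concrete with a small checker. This is a hypothetical sketch, not any strategy from the reviewed literature: it only verifies that every ordering of every t-event subset appears as a subsequence of some test.

```python
# Strength-t sequence-coverage check (illustrative, brute-force).
from itertools import combinations, permutations

def covers(test, seq):
    """True if `seq` occurs in `test` as a (not necessarily adjacent) subsequence."""
    it = iter(test)
    return all(e in it for e in seq)

def is_sca(suite, events, t=3):
    """True if every ordering of every t-subset of events is covered by some test."""
    targets = [p for c in combinations(events, t) for p in permutations(c)]
    return all(any(covers(test, s) for test in suite) for s in targets)

events = ["a", "b", "c"]
full = list(permutations(events))
print(is_sca(full, events))       # all 6 orderings present
print(is_sca(full[:5], events))   # one ordering missing
```

The number of targets grows as C(n, t) * t!, which is why exhaustive one-test-at-a-time generation becomes expensive and minimal-suite construction is NP-hard.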
Mixed-effects conditional logistic regression is evidently more effective in the study of qualitative differences in longitudinal pollution data, as well as their implications for heterogeneous subgroups. This study argues that conditional logistic regression is a robust evaluation method for environmental studies, through the analysis of environmental pollution as a function of oil production and environmental factors. It has been established theoretically that the primary objective of model selection in this research is to identify the candidate model that is optimal for the conditional design. The candidate model should achieve generalizability, goodness of fit, and parsimony, and strike a balance between bias and variability.
In this study, we present different methods of estimating the fuzzy reliability of a two-parameter Rayleigh distribution via the maximum likelihood estimator, the median first-order statistic estimator, the quartile estimator, the L-moment estimator, and a mixed Thompson-type estimator. The mean squared error (MSE) is used as the criterion for comparing the considered methods in simulations across different parameter values and sample sizes. The simulation results show that the fuzzy values outperform the crisp values for all sample sizes, and that the maximum likelihood and mixed Thompson estimators of fuzzy reliability perform better than the other methods in the sense of MSE, so that
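The classical (non-fuzzy) building blocks of this comparison can be sketched briefly: the ML estimate of the Rayleigh scale and the resulting reliability function. The paper's fuzzy extension and the four competing estimators are not reproduced here, and the sample below is simulated, not the paper's.

```python
# ML estimation of the Rayleigh scale and reliability R(t) (crisp sketch).
import numpy as np

def rayleigh_mle_scale(t, loc=0.0):
    """MLE of sigma for Rayleigh(loc, sigma): sigma^2 = sum((t-loc)^2)/(2n)."""
    t = np.asarray(t, float)
    return np.sqrt(np.sum((t - loc) ** 2) / (2 * len(t)))

def reliability(t, sigma, loc=0.0):
    """R(t) = exp(-(t-loc)^2 / (2 sigma^2)) for t >= loc."""
    return np.exp(-((t - loc) ** 2) / (2 * sigma ** 2))

rng = np.random.default_rng(0)
sample = rng.rayleigh(scale=2.0, size=5000)   # true sigma = 2, loc = 0
sigma_hat = rayleigh_mle_scale(sample)
print(f"sigma_hat={sigma_hat:.3f}, R(2)={reliability(2.0, sigma_hat):.3f}")
```

Plugging estimates from each method into R(t) and averaging the squared deviation from the true reliability over many replications yields the MSE comparison the abstract describes.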
In the current study, 2D seismic data from the west An-Najaf area (line WN-36) were received after several processing steps by the Oil Exploration Company in 2018. Surface-consistent amplitude compensation (SCAC) was applied to the seismic data. The processing sequence in our study started by sorting the data into common-midpoint (CMP) gathers in order to perform velocity analysis using the Interactive Velocity Analysis application (INVA) in the Omega system. Velocity semblance was prepared to perform the normal move-out (NMO) correction versus time. An accurate root-mean-square velocity (VRMS) was picked, controlled by the flatness of the primary events. The resultant seismic velocity section for the study area shows that the veloci
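The flatness criterion behind the velocity picking rests on the NMO relation t(x) = sqrt(t0^2 + x^2 / v_rms^2): with the correct v_rms, removing the move-out returns every offset to the zero-offset time t0. A minimal sketch with illustrative values (not from line WN-36):

```python
# Normal move-out (NMO) relation and correction, illustrative values.
import math

def nmo_traveltime(t0, offset, v_rms):
    """Two-way traveltime of a reflection at a given offset (s, m, m/s)."""
    return math.sqrt(t0 ** 2 + (offset / v_rms) ** 2)

def nmo_correction(t_obs, offset, v_rms, t0):
    """Subtract the move-out so a correctly picked event flattens to t0."""
    return t_obs - (nmo_traveltime(t0, offset, v_rms) - t0)

t0, v = 1.2, 2500.0          # zero-offset time (s), RMS velocity (m/s)
for x in (0.0, 500.0, 1000.0):
    t = nmo_traveltime(t0, x, v)
    print(f"offset={x:6.0f} m  t={t:.4f} s  corrected={nmo_correction(t, x, v, t0):.4f} s")
```

Semblance scanning simply repeats this correction over trial velocities and keeps the one that maximizes coherence, i.e. flattens the primaries.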
Arabic grammatical theory is characterized by features that distinguish it from other languages. In its entirety, it forms a homogeneous linguistic system that blends with the social nature of the Arabs, their beliefs, and their culture.
This means that the theory was born naturally, after the labor of preserving an integrated heritage, starting with its foundational legal text (the Qur'an) and ending with its many distinctive attributes.
Sibawayh carried the founding crucible of that theory, taking it over from his teacher, al-Khalil, and building on what he had achieved. It is redundant to point to his standing and the status of his book.
Thus came my research, titled: (c
The smart-city concept has attracted considerable research attention in recent years across diverse application domains, such as crime-suspect identification, border security, transportation, and aerospace. A specific focus has been increased automation using data-driven approaches, leveraging remote sensing and real-time streaming of heterogeneous data from various sources, including unmanned aerial vehicles, surveillance cameras, and low-earth-orbit satellites. One of the core challenges in exploiting such high-temporal-rate data streams, specifically videos, is the trade-off between streaming quality and limited transmission bandwidth. An optimal compromise is needed between video quality and subsequently, rec
In this paper, a handwritten digit classification system is proposed based on the Discrete Wavelet Transform and a Spiking Neural Network. The system consists of three stages: the first preprocesses the data; the second extracts features using the Discrete Wavelet Transform (DWT); and the third performs classification with a Spiking Neural Network (SNN). To evaluate the system, two standard databases are used: MADBase and MNIST. The proposed system achieved high classification accuracy: 99.1% on MADBase and 99.9% on MNIST.
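The DWT feature-extraction stage can be sketched as follows. This is an assumption-laden illustration: a one-level 2-D Haar transform (the paper does not specify its wavelet) applied to a synthetic 28x28 array standing in for one MNIST image; the SNN stage and database loading are omitted.

```python
# One-level 2-D Haar DWT: splits an image into LL, LH, HL, HH sub-bands,
# whose flattened coefficients serve as the classifier's feature vector.
import numpy as np

def haar_dwt2(img):
    img = np.asarray(img, float)
    # Pairwise average/difference along rows...
    lo = (img[:, 0::2] + img[:, 1::2]) / 2.0
    hi = (img[:, 0::2] - img[:, 1::2]) / 2.0
    # ...then along columns, yielding the four sub-bands.
    ll = (lo[0::2, :] + lo[1::2, :]) / 2.0
    lh = (lo[0::2, :] - lo[1::2, :]) / 2.0
    hl = (hi[0::2, :] + hi[1::2, :]) / 2.0
    hh = (hi[0::2, :] - hi[1::2, :]) / 2.0
    return ll, lh, hl, hh

rng = np.random.default_rng(0)
digit = rng.random((28, 28))          # stand-in for one 28x28 digit image
features = np.concatenate([b.ravel() for b in haar_dwt2(digit)])
print(features.shape)
```

For a 28x28 input this produces four 14x14 sub-bands (784 coefficients in total); the LL band concentrates most of the image energy, which is what makes DWT coefficients effective classifier features.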