Inspecting Hybrid Data Mining Approaches in Decision Support Systems for Humanities Texts Criticism

The majority of systems dealing with natural language processing (NLP) and artificial intelligence (AI) can assist in making automated or automatically supported decisions. However, these systems may struggle to identify the information (characterization) required to elicit a decision by extracting or summarizing relevant material from large text documents or vast bodies of content. When such documents are obtained online, for instance from social networking or social media sites, the textual content grows remarkably fast. The main objective of the present study is to survey the latest developments in applying text-mining techniques in the humanities for summarization and automated decision making. The survey considers (1) the automated decision-support techniques commonly used in the humanities, (2) the evolution of performance and the use of the stylometric approach in text mining, and (3) comparisons of the results of chunking text with different attributes in Burrows' Delta method. The study also provides an overview of the efficiency of applying selected data-mining (DM) methods together with various text-mining techniques to support critics' decisions in artistry, one field of the humanities, where the automatic choice of criticism was supported by a hybrid combination of these procedures.
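Burrows' Delta, mentioned in point (3), compares texts by z-scoring the relative frequencies of the most frequent words against a reference corpus and averaging the absolute z-score differences. A minimal sketch (the toy corpus and feature count are illustrative, not the study's data):

```python
import statistics
from collections import Counter

def rel_freqs(tokens, vocab):
    """Relative frequency of each feature word in one text."""
    counts = Counter(tokens)
    n = len(tokens)
    return [counts[w] / n for w in vocab]

def burrows_delta(corpus, text_a, text_b, n_features=3):
    """Burrows' Delta between two texts, z-scored against the corpus."""
    all_tokens = [tok for text in corpus for tok in text]
    vocab = [w for w, _ in Counter(all_tokens).most_common(n_features)]
    profiles = [rel_freqs(text, vocab) for text in corpus]
    means = [statistics.mean(col) for col in zip(*profiles)]
    stds = [statistics.stdev(col) for col in zip(*profiles)]
    za = [(f - m) / s for f, m, s in zip(rel_freqs(text_a, vocab), means, stds)]
    zb = [(f - m) / s for f, m, s in zip(rel_freqs(text_b, vocab), means, stds)]
    # Delta = mean absolute difference of the z-scores over the feature words
    return sum(abs(x - y) for x, y in zip(za, zb)) / len(vocab)
```

Chunking a text with different attributes, as compared in the survey, amounts to varying `n_features` and the tokenization before calling `burrows_delta`.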

Publication Date
Mon Feb 21 2022
Journal Name
Iraqi Journal For Computer Science And Mathematics
Fuzzy C means Based Evaluation Algorithms For Cancer Gene Expression Data Clustering

The influx of data in bioinformatics comes primarily in the form of DNA, RNA, and protein sequences, which places a significant burden on scientists and computers. Some genomics studies depend on clustering techniques to group similarly expressed genes into one cluster. Clustering is a type of unsupervised learning that divides unlabelled data into clusters; the k-means and fuzzy c-means (FCM) algorithms are examples of algorithms that can be used for it. Clustering is thus a common approach that partitions an input space into several homogeneous zones and can be achieved with a variety of algorithms. This study used three models to cluster a brain tumor dataset. The first model uses FCM, which …
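The FCM algorithm alternates two updates: fuzzy memberships from distances to the current centroids, and centroids as membership-weighted means. A minimal NumPy sketch of these standard update rules (a textbook illustration, not the study's implementation or its gene-expression data):

```python
import numpy as np

def fcm(X, c=2, m=2.0, n_iter=50, seed=0):
    """Fuzzy c-means: alternate fuzzy-membership and centroid updates.
    X: (n_samples, n_features); m > 1 is the fuzziness exponent."""
    rng = np.random.default_rng(seed)
    U = rng.random((c, len(X)))
    U = U / U.sum(axis=0)                      # memberships sum to 1 per sample
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um @ X) / Um.sum(axis=1, keepdims=True)
        d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2)
        d = np.maximum(d, 1e-12)               # avoid division by zero
        inv = d ** (-2.0 / (m - 1.0))          # u_ik ∝ d_ik^(-2/(m-1))
        U = inv / inv.sum(axis=0)
    return centers, U
```

Unlike k-means, every sample retains a graded membership in every cluster, which suits genes whose expression profiles sit between groups.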

Publication Date
Fri Jan 01 2010
Journal Name
Ibn Al-Haitham J. for Pure & Appl. Sci.
Evaluation of the Nuclear Data on the (α,n) Reaction for Natural Molybdenum

The cross sections for the (α,n) reaction were evaluated from the available International Atomic Energy Agency (IAEA) data and other published experimental data, together with the well-known international libraries such as ENDF, JENDL, and JEFF; these represent the most recent data. An energy range from threshold to 25 MeV was considered, in 1 MeV intervals. The average weighted cross sections over all available experimental and theoretical (JENDL) data were calculated for all the considered isotopes. The cross section of the element was then calculated from the cross sections of its isotopes, taking their natural abundances into account. A mathematical representative equation for each of the element…
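The two averaging steps described above can be sketched as follows: an inverse-variance weighted mean to combine several measurements at one energy, and an abundance-weighted sum to build the elemental cross section from isotopic values. The numbers below are illustrative placeholders, not the evaluated molybdenum data:

```python
def weighted_mean(values, errors):
    """Inverse-variance weighted average of repeated measurements."""
    weights = [1.0 / e ** 2 for e in errors]
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

def element_cross_section(isotopes):
    """Abundance-weighted elemental cross section.
    isotopes: list of (atom_fraction, cross_section_barns) pairs;
    the atom fractions should sum to 1."""
    return sum(frac * sigma for frac, sigma in isotopes)

# hypothetical example: two measurements of one isotope, then a
# two-isotope element with 60% / 40% natural abundance
sigma_iso = weighted_mean([2.1, 1.9], [0.1, 0.1])
sigma_elem = element_cross_section([(0.6, sigma_iso), (0.4, 1.0)])
```

A least-squares fit over the 1 MeV energy grid would then yield the representative equation per element mentioned in the abstract.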

Publication Date
Thu Jan 06 2022
Journal Name
Kuwait Journal Of Science
AVO analysis for high amplitude anomalies using 2D pre-stack seismic data

Amplitude variation with offset (AVO) analysis is an efficient tool for hydrocarbon detection and for identifying elastic rock properties and fluid types. It has been applied in the present study using reprocessed pre-stack 2D seismic data (1992, Caulerpa) from the north-west of the Bonaparte Basin, Australia. The AVO response along the 2D pre-stack seismic data in the Laminaria High, NW shelf of Australia, was also investigated. Three hypotheses were proposed to investigate the AVO behaviour of the amplitude anomalies, in which three different factors (fluid substitution, porosity, and thickness via a wedge model) were tested. The AVO models with the synthetic gathers were analysed using log information to find which of these is the …
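AVO behaviour is commonly modelled with the two-term Shuey approximation, R(θ) ≈ R0 + G·sin²θ, where the intercept R0 and gradient G are cross-plotted to classify anomalies. This is the standard textbook form, not necessarily the exact workflow of the study; the parameter values below are hypothetical:

```python
import math

def shuey_reflectivity(r0, gradient, theta_deg):
    """Two-term Shuey approximation: R(theta) = R0 + G * sin^2(theta)."""
    s = math.sin(math.radians(theta_deg))
    return r0 + gradient * s * s

# a Class III gas-sand style anomaly: negative intercept and negative
# gradient, so the amplitude brightens (grows more negative) with offset
near = shuey_reflectivity(-0.05, -0.2, 0.0)
far = shuey_reflectivity(-0.05, -0.2, 30.0)
```

Fluid substitution, porosity, and wedge-thickness changes each shift (R0, G), which is what the three hypotheses above test against the synthetic gathers.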

Publication Date
Tue Aug 31 2021
Journal Name
International Journal Of Intelligent Engineering And Systems
FDPHI: Fast Deep Packet Header Inspection for Data Traffic Classification and Management

Traffic classification is the task of categorizing traffic flows into application-aware classes such as chat, streaming, VoIP, etc. Most network traffic identification systems are feature-based; the features may be static signatures, port numbers, statistical characteristics, and so on. Although current data-flow classification methods are effective, they still lack inventive approaches to meet vital requirements such as real-time traffic classification, low power consumption, Central Processing Unit (CPU) utilization, etc. Our novel Fast Deep Packet Header Inspection (FDPHI) traffic classification proposal employs a one-dimensional convolutional neural network (1D-CNN) to automatically learn more representational c…
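The core 1D-CNN building block slides learned filters over the raw header bytes to produce feature maps. A minimal NumPy sketch of one valid-mode convolution layer with ReLU (the header bytes and filter weights are illustrative; this is not the FDPHI architecture itself):

```python
import numpy as np

def conv1d_relu(x, kernels, stride=1):
    """Valid-mode 1D convolution over a byte sequence, with ReLU."""
    k = kernels.shape[1]
    out_len = (len(x) - k) // stride + 1
    out = np.empty((kernels.shape[0], out_len))
    for i in range(out_len):
        out[:, i] = kernels @ x[i * stride : i * stride + k]
    return np.maximum(out, 0.0)

# hypothetical IPv4 header bytes, scaled to [0, 1] before convolving
header = np.array([0x45, 0x00, 0x00, 0x3C, 0x1C, 0x46], dtype=float) / 255.0
filters = np.array([[1.0, -1.0, 0.0],     # edge-like filter
                    [0.5, 0.5, 0.5]])     # local-average filter
features = conv1d_relu(header, filters)
```

Stacking such layers lets the network learn byte-pattern features directly from headers, avoiding hand-crafted signatures or port-based rules.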

Publication Date
Mon Sep 23 2019
Journal Name
Baghdad Science Journal
Hazard Rate Estimation Using Varying Kernel Function for Censored Data Type I

In this research, several estimators of the hazard function are introduced using a nonparametric method, namely the kernel function, for type I censored data with varying bandwidth and boundary kernels. Two types of bandwidth are used: local bandwidth and global bandwidth. Moreover, four types of boundary kernel are used, namely Rectangle, Epanechnikov, Biquadratic, and Triquadratic, and the proposed function was employed with all kernel functions. Two different simulation techniques are also used in two experiments to compare these estimators. In most cases, the results have shown that the local bandwidth is the best for all the …
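A common form of kernel hazard estimation for right-censored data smooths the Nelson-Aalen increments: each event time contributes K_h(t − t_i)/Y(t_i), where Y(t_i) is the number still at risk. A sketch with the Epanechnikov kernel, one of the four kernels compared above (this illustrates the general technique, not the paper's proposed boundary-corrected function):

```python
def epanechnikov(u):
    """Epanechnikov kernel: 0.75 * (1 - u^2) on |u| <= 1, else 0."""
    return 0.75 * (1.0 - u * u) if abs(u) <= 1.0 else 0.0

def hazard_estimate(t, times, events, h):
    """Kernel-smoothed Nelson-Aalen hazard estimate at time t.
    times: observed times; events: 1 = event, 0 = censored; h: bandwidth."""
    est = 0.0
    for ti, di in zip(times, events):
        if di:                                       # events only
            at_risk = sum(1 for tj in times if tj >= ti)
            est += epanechnikov((t - ti) / h) / (h * at_risk)
    return est
```

Using a local bandwidth means letting `h` vary with `t`; the boundary kernels replace `epanechnikov` near the edges of the observation window, where a symmetric kernel would spill outside the data.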

Publication Date
Fri Jan 01 2021
Journal Name
Ieee Access
Proposition of New Ensemble Data-Intelligence Models for Surface Water Quality Prediction

Publication Date
Fri Jan 01 2021
Journal Name
Annals Of Maxillofacial Surgery
The use of screw retained hybrid arch bar for maxillomandibular fixation in the treatment of mandibular fractures: A comparative study

Introduction: The use of screw-retained hybrid arch bars (HABs) is a relatively recent development in the treatment of mandibular fractures. The purpose of this study is to compare the clinical outcome of HAB with the conventional Erich arch bar (EAB) in the closed treatment of mandibular fractures. Materials and methods: This study included 18 patients who were treated for mandibular fractures with maxillomandibular fixation (MMF); patients were randomly assigned to a control group (n = 10), in which EAB was used, and a study group (n = 8), in which HAB was used. The outcome variables were the time required for application and removal, gingival inflammation scores, postoperative complications, and the incidence of wire-stick injury or glove perf…

Publication Date
Sun Dec 01 2013
Journal Name
2013 Sixth International Conference On Developments In Esystems Engineering
Ensure Security of Compressed Data Transmission

Data compression offers an attractive approach to reducing communication costs by using the available bandwidth effectively, so it makes sense to pursue research on algorithms that use the network most effectively. It is also important to consider security, since the data being transmitted is vulnerable to attacks. The basic aim of this work is to develop a module that combines compression and encryption on the same set of data, performing the two operations simultaneously. This is achieved by embedding encryption into compression algorithms, since both cryptographic ciphers and entropy coders bear a certain resemblance in the sense of secrecy. First, in the secure compression module, the given text is p…
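The paper embeds encryption inside the compressor itself; as a simplified stand-in, the pipeline below compresses with zlib and then masks the output with a keyed XOR keystream. The keystream is a toy construction for illustration only, not a secure cipher:

```python
import hashlib
import zlib

def keystream(key: bytes, n: int) -> bytes:
    """Toy keystream from iterated SHA-256 -- illustration only, NOT secure."""
    out, block = b"", hashlib.sha256(key).digest()
    while len(out) < n:
        out += block
        block = hashlib.sha256(block).digest()
    return out[:n]

def compress_encrypt(data: bytes, key: bytes) -> bytes:
    """Compress first (ciphertext is incompressible), then encrypt."""
    compressed = zlib.compress(data, level=9)
    ks = keystream(key, len(compressed))
    return bytes(a ^ b for a, b in zip(compressed, ks))

def decrypt_decompress(blob: bytes, key: bytes) -> bytes:
    ks = keystream(key, len(blob))
    return zlib.decompress(bytes(a ^ b for a, b in zip(blob, ks)))
```

The ordering matters: encrypting first would destroy the redundancy the entropy coder needs, which is why compression must come before (or be fused with) the cipher.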

Publication Date
Sat Dec 30 2023
Journal Name
Journal Of Economics And Administrative Sciences
The Cluster Analysis by Using Nonparametric Cubic B-Spline Modeling for Longitudinal Data

Longitudinal data is becoming increasingly common, especially in the medical and economic fields, and various methods have been developed to analyze this type of data.

In this research, the focus was on compiling and analyzing such data: cluster analysis plays an important role in identifying and grouping co-expressed sub-profiles over time, which are then fitted with a nonparametric smoothing cubic B-spline model. The cubic B-spline provides continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope; it is also flexible enough to capture complex patterns and fluctuations in the data.

The longitudinal balanced data profile was compiled into subgroup…
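A cubic smoothing B-spline of the kind described can be fitted with SciPy's `splrep`/`splev`; the smoothing factor `s` trades fidelity against smoothness. The longitudinal profile below is a hypothetical stand-in for one subject's repeated measurements, not the study's data:

```python
import numpy as np
from scipy.interpolate import splrep, splev

# hypothetical longitudinal profile: 12 visits with a noisy smooth trend
t = np.linspace(0.0, 11.0, 12)
y = np.sin(t / 2.0) + np.random.default_rng(1).normal(0.0, 0.1, size=t.size)

# cubic (k=3) smoothing B-spline; larger s gives a smoother curve
tck = splrep(t, y, k=3, s=0.5)
smooth = splev(t, tck)
curvature = splev(t, tck, der=2)   # continuous second derivative, as noted above
```

Clustering would then operate on the fitted spline coefficients (or the smoothed curves) rather than the raw noisy observations.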

Publication Date
Mon Mar 07 2022
Journal Name
Journal Of Educational And Psychological Researches
The Effectiveness of Cognitive Conflict Strategy in Comprehending Reading Among the Fifth Literary Students in the Subject of Literature and Texts

The aim of the current study is to identify the effectiveness of the cognitive-conflict strategy in reading comprehension among fifth-grade literary-stream students in the subject of literature and texts. The researcher used an experimental method with partial control. The sample consisted of 80 students distributed into control and experimental groups. The researcher prepared the scientific material, the behavioral objectives, the teaching plans, and the research instrument (a reading comprehension test).

The instrument's validity and reliability were calculated, and it was then administered to the sample. After treating the data statistically using SPSS, the results revealed a statistically significant difference at the si…
