Autism is a lifelong developmental condition that affects how people perceive the world and interact with others. An estimated one in 100 people is autistic, and autism affects almost four times as many boys as girls. The tools commonly used to collect autism datasets are fMRI, EEG, and, more recently, eye tracking. A preliminary study of patients' eye-tracking trajectories showed that a rudimentary statistical analysis (principal component analysis) yields interesting results on the parameters studied, such as the time spent in a region of interest. Another study, which applied tools from Euclidean and non-Euclidean geometry to the patients' eye trajectories, also produced interesting results. This research aims both to confirm the results of the preliminary study and to go further in understanding the processes involved in these experiments. Two tracks are followed: the first concerns the development of classifiers based on the statistical data already provided by the eye-tracking system; the second focuses on finding new descriptors from the eye trajectories. In this paper, K-means is used with the Vector Measure Constructor Method (VMCM); in addition, the support vector machine (SVM) technique is briefly considered. These methods play an important role in classifying people with and without autism spectrum disorder. The paper is a comparative study of these two methods.
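As a rough illustration of the clustering side of this comparison, here is a minimal K-means sketch in plain NumPy. The feature names (time in a region of interest, saccade amplitude) and all values are hypothetical, chosen only to show the mechanics; the VMCM step and the SVM comparison are not reproduced here.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain Lloyd's k-means: assign points to the nearest centroid,
    then recompute each centroid as the mean of its points."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # squared distance of every point to every centroid -> nearest label
        labels = np.argmin(((X[:, None] - centroids) ** 2).sum(axis=2), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

# Hypothetical per-subject eye-tracking features:
# [time spent in a region of interest (s), mean saccade amplitude (deg)]
X = np.array([[4.1, 2.0], [3.9, 2.2], [4.3, 1.8],   # one viewing pattern
              [1.2, 5.1], [0.9, 5.4], [1.4, 4.9]])  # a clearly different pattern
labels, cents = kmeans(X, k=2)
```

With two well-separated groups like these, the algorithm recovers the split regardless of which points seed the centroids.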
Abstract
The study aimed to assess the level of trainers' knowledge about the application of strategies and to find out the relationship between trainers' knowledge and their socio-demographic characteristics.
Methodology: A pre-experimental design was used with one group of 47 trainers working at private autism centers in Baghdad. Data were collected from 8 January 2022 to 13 February 2022 using a non-probability (convenience) sample. A self-administered technique, in which the trainers filled out the questionnaire form themselves, was used for data collection; the data were analyzed through descriptive and inferential statistics.
In this paper, a handwritten digit classification system is proposed based on the Discrete Wavelet Transform and a Spiking Neural Network. The system consists of three stages. The first stage preprocesses the data, and the second stage performs feature extraction based on the Discrete Wavelet Transform (DWT). The third stage performs classification and is based on a Spiking Neural Network (SNN). To evaluate the system, two standard databases are used: the MADBase database and the MNIST database. The proposed system achieved high classification accuracy rates of 99.1% on the MADBase database and 99.9% on the MNIST database.
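As a hedged sketch of the DWT feature-extraction stage (the paper does not specify the wavelet here, so a single-level 2-D Haar transform is assumed for illustration), the sub-band split can be written directly in NumPy:

```python
import numpy as np

def haar_dwt2(img):
    """One level of a 2-D Haar DWT: returns the approximation (LL) and
    detail (LH, HL, HH) sub-bands, each half the input size per axis."""
    # rows: average / difference of adjacent pixel pairs
    lo = (img[:, 0::2] + img[:, 1::2]) / 2.0
    hi = (img[:, 0::2] - img[:, 1::2]) / 2.0
    # columns: the same split applied to the row-filtered outputs
    ll = (lo[0::2, :] + lo[1::2, :]) / 2.0
    lh = (lo[0::2, :] - lo[1::2, :]) / 2.0
    hl = (hi[0::2, :] + hi[1::2, :]) / 2.0
    hh = (hi[0::2, :] - hi[1::2, :]) / 2.0
    return ll, lh, hl, hh

digit = np.random.default_rng(0).random((28, 28))  # stand-in for a 28x28 digit image
ll, lh, hl, hh = haar_dwt2(digit)
# Flattened sub-bands form the feature vector fed to the classifier stage
features = np.concatenate([b.ravel() for b in (ll, lh, hl, hh)])
```

In practice the LL band alone (or a second decomposition level) is often kept to reduce the feature dimension before classification.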
Malicious software (malware) performs a malicious function that compromises a computer system's security. Many methods have been developed to improve the security of computer system resources, among them firewalls, encryption, and Intrusion Detection Systems (IDS). An IDS can detect a newly unrecognized attack attempt and raise an early alarm to inform the system about the suspicious intrusion. This paper proposes a hybrid IDS for intrusion detection, especially malware detection, that considers both network-packet and host features. The hybrid IDS is designed using Data Mining (DM) classification methods for their ability to detect new, previously unseen intrusions accurately and automatically. It uses both anomaly and misuse detection
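To make the anomaly/misuse distinction concrete, here is a minimal Python sketch; the signature hashes are invented, and a simple z-score threshold stands in for the paper's DM classifiers:

```python
# Misuse detection matches known attack signatures; anomaly detection flags
# behaviour that deviates strongly from a learned baseline.
KNOWN_SIGNATURES = {"deadbeef", "c0ffee00"}  # hypothetical malware payload hashes

def misuse_detect(payload_hash):
    """Signature match against the known-malware database."""
    return payload_hash in KNOWN_SIGNATURES

def anomaly_detect(value, baseline_mean, baseline_std, threshold=3.0):
    """Flag a host feature (e.g. packets per second) more than
    `threshold` standard deviations from its baseline."""
    return abs(value - baseline_mean) / baseline_std > threshold

def hybrid_alert(payload_hash, pkt_rate, mean=100.0, std=15.0):
    """Raise an alarm if either detector fires - the hybrid idea."""
    return misuse_detect(payload_hash) or anomaly_detect(pkt_rate, mean, std)
```

A known signature triggers the misuse path even at normal traffic rates, while an unseen payload can still be caught by the anomaly path when its traffic deviates from the baseline.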
Monitoring weld quality is increasingly important because it enables great financial savings, especially in manufacturing, where defective welds lead to production losses and necessitate time-consuming and expensive repair. This research deals with the monitoring and controllability of the fusion arc welding process using an Artificial Neural Network (ANN) model. The effect of the weld parameters on weld quality was studied by implementing the experimental results obtained from welding a non-galvanized steel plate (ASTM BN 1323, 6 mm thickness) under different weld parameters (current, voltage, and travel speed), monitored by electronic systems and followed by destructive (tensile and bending) and non-destructive tests
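The ANN mapping from weld parameters to a quality score can be sketched as a forward pass through one hidden layer; this is not the authors' trained model - the weights here are random, the layer sizes and input values are assumptions, and only the network structure is illustrated:

```python
import numpy as np

def ann_forward(x, W1, b1, W2, b2):
    """One hidden tanh layer, sigmoid output in [0, 1] read as a weld-quality score."""
    h = np.tanh(W1 @ x + b1)
    return 1.0 / (1.0 + np.exp(-(W2 @ h + b2)))

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)   # 3 weld parameters -> 4 hidden units
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)   # 4 hidden units -> 1 quality output

x = np.array([180.0, 24.0, 5.0])   # hypothetical current (A), voltage (V), travel speed (mm/s)
x = (x - x.mean()) / x.std()       # crude input normalisation for the sketch
score = ann_forward(x, W1, b1, W2, b2)
```

In a real setup the weights would be fitted to the destructive-test results (tensile and bending outcomes) by backpropagation.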
Data generated from modern applications and the internet in healthcare is extensive and rapidly expanding. Therefore, one of the significant success factors for any application is understanding and extracting meaningful information using digital analytics tools. These tools positively impact the application's performance and handle the challenges faced in creating highly consistent, logical, and information-rich summaries. This paper has three main objectives. First, it provides several analytics methodologies that help to analyze datasets and extract useful information from them, as preprocessing steps in any classification model, to determine the dataset's characteristics. This paper also provides a comparative study
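A minimal example of the kind of per-feature summary such preprocessing produces (the column name and values are made up for illustration):

```python
def describe(column):
    """Per-feature summary used as a preprocessing step: count, missing
    values, range, and mean - enough to spot scale and completeness issues
    before feeding the data to a classifier."""
    present = [v for v in column if v is not None]
    n = len(present)
    return {
        "count": n,
        "missing": len(column) - n,
        "min": min(present),
        "max": max(present),
        "mean": sum(present) / n,
    }

ages = [34, 51, None, 29, 46]  # hypothetical patient-age column with one gap
stats = describe(ages)
```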
Biomass has been extensively investigated because of its role as a clean energy source. Tar and particulate formation problems are still the major challenges, especially in the implementation of gasification technologies into today's energy supply systems. The Laser-Induced Fluorescence (LIF) spectroscopy method is used to determine the aromatics and Polycyclic Aromatic Hydrocarbons (PAH) produced in high-temperature gasification. The effect of tar deposition when the gases are cooled has been greatly reduced by introducing a new measurement-cell concept. Samples of the PAH components were prepared at the standard concentrations of the measured PAHs using a gas chromatograph (GC). An OPO laser with tunable
The traditional centralized network management approach presents severe efficiency and scalability limitations in large-scale networks. The process of data collection and analysis typically involves huge transfers of management data to the manager, which causes considerable network load and bottlenecks on the manager side. All these problems are addressed using agent technology as a solution that distributes the management functionality over the network elements. The proposed system consists of a server agent that works together with client agents to monitor the logging (on, off) of the client computers and which user is working on each one. A file-system-watcher mechanism is used to indicate any change in files. The results were presented
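The file-watching idea can be sketched as a polling loop over modification times; this is a simplification (real watcher mechanisms are event-driven), and the function names are invented for illustration:

```python
import os

def snapshot(paths):
    """Record each file's current modification time."""
    return {p: os.stat(p).st_mtime for p in paths}

def changed_files(paths, last):
    """One polling round: return files whose mtime moved since the
    snapshot, and update the snapshot in place. In the proposed system
    the client agent would report these paths to the server agent."""
    out = []
    for p in paths:
        mtime = os.stat(p).st_mtime
        if mtime != last.get(p):
            out.append(p)
        last[p] = mtime
    return out
```

A client agent would call `changed_files` on a timer and forward any non-empty result to the server agent.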
Dust is a frequent contributor to health risks and to changes in the climate, and it is one of the most dangerous issues facing people today. Desertification, drought, agricultural practices, and sand and dust storms from neighboring regions bring on this issue. Deep learning (DL) regression based on long short-term memory (LSTM) is proposed as a solution to increase the accuracy of dust forecasting and monitoring. The proposed system has two parts: in the first step, LSTM and dense layers are used to build a dust-detection system, while in the second step, the proposed Wireless Sensor Network (WSN) and Internet of Things (IoT) model is used for forecasting and monitoring. The experimental DL system
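The LSTM step at the core of such a regression can be sketched as a single cell's forward pass in NumPy; the weights here are random and untrained, the hidden size and the dust readings are assumptions, and a dense readout stands in for the paper's dense layers:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step. W: (4H, D) input weights, U: (4H, H) recurrent
    weights, b: (4H,) bias; gate order i, f, o, g."""
    H = h.size
    z = W @ x + U @ h + b
    i = sigmoid(z[0:H])        # input gate
    f = sigmoid(z[H:2*H])      # forget gate
    o = sigmoid(z[2*H:3*H])    # output gate
    g = np.tanh(z[3*H:4*H])    # candidate cell state
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

rng = np.random.default_rng(0)
D, H = 1, 8                       # one dust-level reading in, 8 hidden units
W = rng.normal(size=(4*H, D))
U = rng.normal(size=(4*H, H))
b = np.zeros(4*H)
h, c = np.zeros(H), np.zeros(H)
series = [0.2, 0.5, 0.9, 0.7]     # hypothetical normalised dust readings
for x_t in series:
    h, c = lstm_step(np.array([x_t]), h, c, W, U, b)
forecast = h @ rng.normal(size=H)  # dense readout layer for the regression
```

The hidden state `h` carries the sequence history forward, which is what lets the model exploit temporal structure in the sensor stream.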