Biomarkers to detect Alzheimer’s disease (AD) would enable patients to gain access to appropriate services and may facilitate the development of new therapies. Given the large number of people affected by AD, there is a need for a low-cost, easy-to-use method to detect AD patients. The electroencephalogram (EEG) can potentially play a valuable role in this, but at present no single EEG biomarker is robust enough for use in practice. This study aims to provide a methodological framework for the development of robust EEG biomarkers that detect AD with clinically acceptable performance by exploiting the combined strengths of key biomarkers. A large number of existing and novel EEG biomarkers associated with slowing of the EEG, reduction in EEG complexity, and decrease in EEG connectivity were investigated. Support vector machine and linear discriminant analysis methods were used to find the combination of EEG biomarkers that detects AD with the best performance. A total of 325,567 EEG biomarkers were investigated, and a panel of six biomarkers was identified and used to create a diagnostic model with high performance (≥85% sensitivity and 100% specificity).
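The panel-combination idea can be illustrated with a minimal sketch: several biomarker values are combined into one linear discriminant score and the score is evaluated with the same sensitivity/specificity metrics the study reports. The biomarker values, weights, and threshold below are invented for illustration; they are not the study's panel.

```python
# Sketch: score a hypothetical panel of EEG biomarkers with a linear
# discriminant and evaluate sensitivity/specificity. Numbers are illustrative.

def panel_score(biomarkers, weights):
    """Weighted sum of biomarker values (a linear discriminant score)."""
    return sum(b * w for b, w in zip(biomarkers, weights))

def sensitivity_specificity(scores, labels, threshold):
    """labels: 1 = AD, 0 = control; predict AD when score >= threshold."""
    tp = sum(1 for s, y in zip(scores, labels) if y == 1 and s >= threshold)
    fn = sum(1 for s, y in zip(scores, labels) if y == 1 and s < threshold)
    tn = sum(1 for s, y in zip(scores, labels) if y == 0 and s < threshold)
    fp = sum(1 for s, y in zip(scores, labels) if y == 0 and s >= threshold)
    return tp / (tp + fn), tn / (tn + fp)

# Toy panel per subject: [slowing measure, complexity, connectivity].
subjects = [([1.8, 0.4, 0.3], 1), ([1.6, 0.5, 0.4], 1),
            ([0.9, 0.9, 0.8], 0), ([1.0, 0.8, 0.7], 0)]
weights = [1.0, -1.0, -1.0]  # slowing rises in AD; complexity/connectivity fall
scores = [panel_score(b, weights) for b, _ in subjects]
labels = [y for _, y in subjects]
sens, spec = sensitivity_specificity(scores, labels, threshold=0.5)
```

On this toy data the panel separates the groups perfectly; the study's point is that such separation requires combining biomarkers, not any single one.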
This paper examines the accuracy of behaviour-based detection systems, in which Application Programming Interface (API) calls are analyzed and monitored. The work identifies the problems affecting the accuracy of such detection models. A total of 4,744 API calls were extracted through analysis. The new approach provides an accurate discriminator and can reveal malicious API calls in PE malware with an accuracy of up to 83.2%. The results of this work were evaluated with discriminant analysis.
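Discriminant analysis over API-call features can be sketched as follows: a two-feature Fisher linear discriminant separating "malware-like" from "benign-like" call-frequency vectors. The feature choices and counts are hypothetical, not the paper's dataset.

```python
# Sketch: Fisher's linear discriminant on two API-call-frequency features
# (e.g., per-sample counts of two suspicious API calls). Data are invented.

def mean(vs):
    n = len(vs)
    return [sum(v[i] for v in vs) / n for i in range(2)]

def within_scatter(vs, m):
    """2x2 within-class scatter matrix of one class around its mean."""
    s = [[0.0, 0.0], [0.0, 0.0]]
    for v in vs:
        d = [v[0] - m[0], v[1] - m[1]]
        for i in range(2):
            for j in range(2):
                s[i][j] += d[i] * d[j]
    return s

malware = [[9.0, 8.0], [8.0, 9.0], [10.0, 7.0]]
benign  = [[2.0, 1.0], [1.0, 2.0], [2.0, 3.0]]
m1, m0 = mean(malware), mean(benign)
s1, s0 = within_scatter(malware, m1), within_scatter(benign, m0)
sw = [[s1[i][j] + s0[i][j] for j in range(2)] for i in range(2)]
# Invert the 2x2 Sw by hand and project the mean difference: w = Sw^-1 (m1 - m0)
det = sw[0][0] * sw[1][1] - sw[0][1] * sw[1][0]
diff = [m1[0] - m0[0], m1[1] - m0[1]]
w = [( sw[1][1] * diff[0] - sw[0][1] * diff[1]) / det,
     (-sw[1][0] * diff[0] + sw[0][0] * diff[1]) / det]
# Classify by comparing the projected score to the midpoint of projected means.
mid = sum(wi * (a + b) / 2 for wi, a, b in zip(w, m1, m0))

def is_malware(x):
    return sum(wi * xi for wi, xi in zip(w, x)) > mid
```

The discriminant direction `w` maximizes between-class separation relative to within-class scatter, which is the criterion behind the evaluation method the abstract names.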
Statistical control charts are widely used in industry for process and measurement control. In this paper we study the use of the Markov chain approach in calculating the average run length (ARL) of the cumulative sum (CUSUM) control chart for detecting shifts in the process mean, and of exponentially weighted moving average (EWMA) control charts for detecting shifts in the process mean and standard deviation. We also used EWMA charts based on the logarithm of the sample variance for monitoring the process standard deviation when the observations (products selected from the Al-Mamun factory) are independently and identically distributed (iid) from a normal distribution in continuous manufacturing.
Geophysics is a branch of the Earth sciences that deals with studying the Earth's interior through the variation of physical properties within rock layers. Applied geophysics relies on procedures involving the measurement of potential fields, such as the gravity method. One of the significant oil fields in southern Iraq is the Nahr Omar structure. A power spectrum analysis (SPA) technique was applied to gravity data collected within the chosen oil field area in order to confirm the salt dome in the subsurface layers. The SPA analysis yielded six surfaces representing the gravity variation values at depths of 14300, 3780, 3290, 2170, 810, and 93.5 m. Gravity surfaces have been converted to de
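Power-spectrum depth estimates of this kind come from the slope of ln(power) against wavenumber: for an ensemble of sources at depth h and angular wavenumber k, ln P(k) ≈ const − 2hk, so h = −slope/2. A minimal sketch with a synthetic, noise-free spectrum segment (the numbers are illustrative, not the Nahr Omar data), using one of the abstract's reported depths:

```python
# Sketch: recovering an interface depth from the slope of the log power
# spectrum. Synthetic data; h = -slope/2 for angular wavenumber in rad/m.

def depth_from_spectrum(wavenumbers, log_power):
    """Least-squares slope of ln(power) vs wavenumber; returns -slope/2."""
    n = len(wavenumbers)
    kbar = sum(wavenumbers) / n
    pbar = sum(log_power) / n
    num = sum((k - kbar) * (p - pbar) for k, p in zip(wavenumbers, log_power))
    den = sum((k - kbar) ** 2 for k in wavenumbers)
    slope = num / den
    return -slope / 2.0

# Synthetic spectrum segment for a single interface at 810 m depth:
ks = [i * 1e-4 for i in range(1, 11)]       # angular wavenumber, rad/m
lp = [10.0 - 2.0 * 810.0 * k for k in ks]   # ln(power), noise-free
depth = depth_from_spectrum(ks, lp)         # recovers 810 m
```

In practice each straight-line segment of the radially averaged spectrum yields one depth, which is how a single survey produces the several depth surfaces the abstract lists.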
Heart disease is a significant and impactful health condition that ranks as the leading cause of death in many countries. To aid physicians in diagnosing cardiovascular diseases, clinical datasets are available for reference. However, with the rise of big data and large medical datasets, it has become increasingly challenging for medical practitioners to predict heart disease accurately, because the abundance of unrelated and redundant features increases computational complexity and reduces accuracy. This study therefore aims to identify the most discriminative features within high-dimensional datasets while minimizing complexity and improving accuracy through an Extra Trees-based feature selection technique. The study assesses the efficac
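Extra Trees feature selection ranks features by the impurity reduction they contribute across a forest of extremely randomized trees, then keeps only the top-ranked ones. A minimal sketch using scikit-learn's `ExtraTreesClassifier` (an assumption about tooling; the study's exact pipeline and clinical features may differ, and the data below are synthetic):

```python
# Sketch: rank features by Extra Trees impurity importance and keep the top
# ones. One synthetic feature carries the signal; the rest are noise.
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier

rng = np.random.default_rng(0)
n = 300
informative = rng.integers(0, 2, n)          # stands in for a key clinical feature
noise = rng.normal(size=(n, 9))              # unrelated/redundant features
X = np.column_stack([informative + 0.1 * rng.normal(size=n), noise])
y = informative

model = ExtraTreesClassifier(n_estimators=100, random_state=0).fit(X, y)
ranking = np.argsort(model.feature_importances_)[::-1]
selected = ranking[:3]                       # keep the 3 most discriminative
```

Training a downstream classifier only on `X[:, selected]` is what reduces both the computational burden and the accuracy loss from redundant features that the abstract describes.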
The purpose of the current investigation is to distinguish working memory (WM) activity in five patients with vascular dementia (VaD), fifteen post-stroke patients with mild cognitive impairment, and fifteen healthy control (HC) individuals on the basis of background electroencephalography (EEG) activity. The elimination of EEG artifacts using wavelet transform (WT) pre-processing denoising is demonstrated in this study. In the current study, spectral entropy, permutation entropy, and approximate entropy were all explored. To improve classification using the k-nearest neighbors (kNN) classifier scheme, a comparative study of using fuzzy neighbourhood preserving analysis with QR-decomposition as a dimensionality reduction technique an
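Of the three entropy features, permutation entropy is the simplest to state: it is the Shannon entropy of the distribution of ordinal patterns in the signal, so regular signals score low and irregular ones score high. A minimal sketch (generic implementation on toy signals, not the study's EEG data):

```python
# Sketch: permutation entropy of a 1-D signal. A monotonic signal uses a
# single ordinal pattern (entropy 0); less regular signals score higher.
import math

def permutation_entropy(signal, order=3, normalize=True):
    """Shannon entropy of the ordinal-pattern distribution of `signal`."""
    counts = {}
    for i in range(len(signal) - order + 1):
        window = signal[i:i + order]
        pattern = tuple(sorted(range(order), key=lambda j: window[j]))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    h = -sum((c / total) * math.log2(c / total) for c in counts.values())
    if normalize:
        h /= math.log2(math.factorial(order))   # scale to [0, 1]
    return h

pe_ramp = permutation_entropy(list(range(100)), order=3)       # fully regular
pe_zigzag = permutation_entropy([i % 2 for i in range(100)], order=3)
```

Per-channel entropy values like these form the feature vectors that the kNN classifier then separates into the VaD, post-stroke, and HC groups.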
The Generalized Additive Model (GAM) is a multivariate smoother that has appeared recently in nonparametric regression analysis. This research is devoted to studying the mixed situation, i.e., phenomena whose behaviour changes from linear (with known functional form), represented in the parametric part of the model, to nonlinear (with unknown functional form, here a smoothing spline), represented in the nonparametric part. Furthermore, we propose a robust semiparametric GAM estimator, which is compared with two other existing techniques.
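The mixed linear-plus-smooth structure can be illustrated with a toy backfitting loop: the parametric slope and the nonparametric component are estimated alternately, each on the residuals of the other. A Nadaraya–Watson kernel smoother stands in for the smoothing spline, and all data are synthetic; this is only a sketch of the semiparametric idea, not the paper's robust estimator.

```python
# Sketch: backfitting for y = beta*x1 + f(x2) + noise, with a kernel
# smoother standing in for the smoothing spline. Synthetic data.
import math, random

def kernel_smooth(x, y, bandwidth=0.1):
    """Nadaraya-Watson smoother with a Gaussian kernel."""
    fitted = []
    for xi in x:
        wts = [math.exp(-0.5 * ((xi - xj) / bandwidth) ** 2) for xj in x]
        s = sum(wts)
        fitted.append(sum(w * yj for w, yj in zip(wts, y)) / s)
    return fitted

random.seed(0)
n = 200
x1 = [random.gauss(0, 1) for _ in range(n)]    # parametric covariate
x2 = [random.random() for _ in range(n)]       # nonparametric covariate
y = [2.0 * a + math.sin(2 * math.pi * b) + random.gauss(0, 0.1)
     for a, b in zip(x1, x2)]                  # true slope is 2

f_hat = [0.0] * n
beta = 0.0
for _ in range(10):                            # backfitting iterations
    partial = [yi - fi for yi, fi in zip(y, f_hat)]
    beta = sum(a * p for a, p in zip(x1, partial)) / sum(a * a for a in x1)
    resid = [yi - beta * a for yi, a in zip(y, x1)]
    f_hat = kernel_smooth(x2, resid)
    mbar = sum(f_hat) / n
    f_hat = [f - mbar for f in f_hat]          # center for identifiability
```

After a few iterations `beta` settles near the true slope while `f_hat` absorbs the smooth sine component, which is the division of labour between the parametric and nonparametric parts described above.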
With the fast progress of information technology and computer networks, it has become very easy to reproduce and share geospatial data because of its digital form. Consequently, the use of geospatial data suffers from various problems, such as data authentication, ownership proof, and illegal copying. These problems represent a big challenge for future uses of geospatial data. This paper introduces a new watermarking scheme to ensure copyright protection of the digital vector map. The main idea of the proposed scheme is to transform the digital map to the frequency domain using the Singular Value Decomposition (SVD) in order to determine suitable areas in which to insert the watermark data.
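The SVD-domain embedding idea can be sketched as follows: vertex coordinates are arranged in a matrix, watermark bits are added to its singular values with a small strength α, and extraction compares the singular values of the marked matrix with the originals. This is a generic SVD watermarking sketch, not the paper's exact scheme, and the coordinates below are synthetic.

```python
# Sketch: embed watermark bits into the singular values of a block of
# vector-map vertex coordinates, then extract them. Synthetic data.
import numpy as np

def embed(coords, watermark, alpha=0.01):
    """Add watermark bits to the singular values of a coordinate block."""
    u, s, vt = np.linalg.svd(coords, full_matrices=False)
    s_marked = s + alpha * watermark
    return u @ np.diag(s_marked) @ vt, s   # keep original s for extraction

def extract(marked, s_original, alpha=0.01):
    """Recover the bits by comparing singular values against the originals."""
    s_marked = np.linalg.svd(marked, compute_uv=False)
    return np.rint((s_marked - s_original) / alpha).astype(int)

rng = np.random.default_rng(1)
coords = rng.uniform(0, 100, size=(8, 2))  # 8 vertices of a toy polyline
watermark = np.array([1, 0])               # one bit per singular value
marked, s0 = embed(coords, watermark)
recovered = extract(marked, s0)
max_shift = np.abs(marked - coords).max()  # geometric distortion stays tiny
```

Because α is small, the marked vertices move by at most a fraction of a coordinate unit, which is why singular-value embedding preserves the map's geometric fidelity while carrying the copyright payload.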