Biomarkers to detect Alzheimer’s disease (AD) would enable patients to gain access to appropriate services and may facilitate the development of new therapies. Given the large number of people affected by AD, there is a need for a low-cost, easy-to-use method to detect AD patients. The electroencephalogram (EEG) can potentially play a valuable role in this, but at present no single EEG biomarker is robust enough for use in practice. This study aims to provide a methodological framework for developing robust EEG biomarkers that detect AD with clinically acceptable performance by exploiting the combined strengths of key biomarkers. A large number of existing and novel EEG biomarkers associated with slowing of the EEG, reduction in EEG complexity, and decrease in EEG connectivity were investigated. Support vector machine and linear discriminant analysis methods were used to find the combination of EEG biomarkers that detects AD with significant performance. A total of 325,567 EEG biomarkers were investigated, and a panel of six biomarkers was identified and used to create a diagnostic model with high performance (≥85% sensitivity and 100% specificity).
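As an illustrative sketch only (not the study's actual pipeline), the idea of scoring a small biomarker panel with an SVM and LDA could look like the following; all feature values are synthetic placeholders, not data from the study.

```python
# Illustrative sketch: combining a small panel of EEG biomarker features
# with an SVM and LDA, as the abstract describes. All values below are
# synthetic placeholders, not data from the study.
import numpy as np
from sklearn.svm import SVC
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# 40 subjects x 6 biomarkers (e.g. slowing, complexity, connectivity measures)
X = rng.normal(size=(40, 6))
y = np.array([0] * 20 + [1] * 20)   # 0 = control, 1 = AD
X[y == 1] += 0.8                    # inject synthetic group separation

results = {}
for name, clf in [("SVM", SVC(kernel="linear")),
                  ("LDA", LinearDiscriminantAnalysis())]:
    results[name] = cross_val_score(clf, X, y, cv=5).mean()
    print(name, "accuracy:", round(results[name], 2))
```

Cross-validated accuracy, rather than a single train/test split, is the usual way such a panel would be assessed.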
A database is an organized collection of data, structured and distributed so that users can access the stored information easily and conveniently. In the era of big data, however, traditional data-analytics methods may not be able to manage and process such large volumes of data. To develop an efficient way of handling big data, this work studies the use of the MapReduce technique to handle big data distributed on the cloud. The approach was evaluated on a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r…
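A minimal sketch of the map/shuffle/reduce stages the abstract refers to, in plain Python rather than on an actual Hadoop cluster; the channel names and sample values are made up for the example.

```python
# Minimal MapReduce-style sketch (Hadoop Streaming logic, simulated locally):
# map emits (key, value) pairs, shuffle groups by key, reduce aggregates.
# Channel names and amplitudes are illustrative, not real EEG data.
from itertools import groupby
from operator import itemgetter

records = [("Fp1", 2.0), ("Fp2", 3.0), ("Fp1", 4.0), ("Fp2", 1.0)]

# Map: emit (channel, amplitude) pairs
mapped = [(ch, v) for ch, v in records]

# Shuffle: sort and group by key, as the framework would between stages
mapped.sort(key=itemgetter(0))
grouped = {k: [v for _, v in g] for k, g in groupby(mapped, key=itemgetter(0))}

# Reduce: aggregate each key (here, mean amplitude per channel)
reduced = {k: sum(vs) / len(vs) for k, vs in grouped.items()}
print(reduced)   # {'Fp1': 3.0, 'Fp2': 2.0}
```

On Hadoop the same logic would be split into separate mapper and reducer processes, with the framework handling the shuffle across nodes.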
Cognitive radios have the potential to greatly improve spectral efficiency in wireless networks. Cognitive radios are considered lower-priority, or secondary, users of spectrum allocated to a primary user. Their fundamental requirement is to avoid interference with potential primary users in their vicinity. Spectrum sensing has been identified as a key enabling functionality to ensure that cognitive radios do not interfere with primary users, by reliably detecting primary-user signals. In addition, reliable sensing creates spectrum opportunities for increasing the capacity of cognitive networks. One of the key challenges in spectrum sensing is the robust detection of primary signals in highly negative signal-to-noise ratio (SNR) regimes. In this paper, …
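A hedged sketch of the classic energy-detection approach to spectrum sensing at negative SNR: compare the received signal's energy against a threshold calibrated from the noise statistics. The signal model and threshold choice below are illustrative, not taken from the paper.

```python
# Energy-detector sketch for spectrum sensing (illustrative parameters):
# decide "primary user present" when the measured energy exceeds a
# threshold derived from the known noise variance.
import numpy as np

rng = np.random.default_rng(1)
N = 4096                                      # number of sensed samples
noise_var = 1.0
snr_linear = 10 ** (-5 / 10)                  # -5 dB SNR

signal = np.sqrt(noise_var * snr_linear) * rng.standard_normal(N)
noise = np.sqrt(noise_var) * rng.standard_normal(N)
x = signal + noise                            # primary user transmitting

energy = np.sum(x ** 2) / N
# Threshold from noise statistics (Gaussian approximation for large N)
threshold = noise_var * (1 + 2.0 / np.sqrt(N))
print("primary user detected:", energy > threshold)
```

At deeply negative SNRs the energy gap shrinks, which is why longer sensing windows or cooperative sensing are typically needed.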
The Generalized Additive Model (GAM) has been considered a multivariate smoother that appeared recently in nonparametric regression analysis. This research is therefore devoted to studying the mixed situation, i.e. phenomena whose behaviour changes from linear (with known functional form), represented in the parametric part, to nonlinear (with unknown functional form; here, a smoothing spline), represented in the nonparametric part of the model. Furthermore, we propose a robust semiparametric GAM estimator, which is compared with two other existing techniques.
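A rough sketch of the semiparametric idea: a backfitting loop that alternates between a parametric (linear) fit and a nonparametric smoother. The running-mean smoother below is a crude stand-in for a smoothing spline, and the data are simulated, not from the study.

```python
# Backfitting sketch for a semiparametric model y = beta*x1 + f(x2) + noise.
# The smoother is a simple running mean standing in for a smoothing spline.
import numpy as np

rng = np.random.default_rng(2)
n = 200
x1 = rng.uniform(-1, 1, n)                  # parametric (linear) component
x2 = np.sort(rng.uniform(0, 2 * np.pi, n))  # nonparametric component
y = 2.0 * x1 + np.sin(x2) + 0.1 * rng.standard_normal(n)

def running_mean(r, k=15):
    # crude smoother over points already ordered by x2
    return np.convolve(r, np.ones(k) / k, mode="same")

f_hat = np.zeros(n)
for _ in range(20):                         # backfitting iterations
    beta = np.sum(x1 * (y - f_hat)) / np.sum(x1 ** 2)   # parametric step
    f_hat = running_mean(y - beta * x1)                  # smoothing step
    f_hat -= f_hat.mean()                                # identifiability
print("estimated slope:", round(beta, 2))
```

The centering step fixes the usual identifiability constraint (the smooth term has mean zero), so the intercept-like level is absorbed consistently.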
Copyright hacking and piracy have increased as the Internet has grown in popularity and access to multimedia material has expanded. Watermarking techniques have been used to achieve security, property protection, and authentication. This paper presents a summary of recent work on video watermarking techniques, with an emphasis on studies from 2018 to 2022, covering the various approaches, achievements, and attacks used as testing measures against these watermarking systems. According to the findings of this study, frequency-domain watermarking techniques are more popular and reliable than spatial-domain approaches. Hybrid DCT and DWT are the two most used techniques and achieve good results in the fi…
With the fast progress of information technology and computer networks, it has become very easy to reproduce and share geospatial data because of its digital form. Consequently, the use of geospatial data suffers from various problems such as data authentication, ownership proof, and illegal copying, etc. These problems represent a big challenge to future uses of geospatial data. This paper introduces a new watermarking scheme to ensure copyright protection of the digital vector map. The main idea of the proposed scheme is to transform the digital map using the Singular Value Decomposition (SVD) in order to determine suitable areas in which to insert the watermark data.
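A schematic sketch of SVD watermark embedding in general (a common scheme in the literature, not necessarily the paper's exact method): perturb the singular values of a data block by a scaled watermark, then reconstruct. The block contents and embedding strength are made up for illustration.

```python
# Schematic SVD watermark embedding (illustrative, not the paper's scheme):
# embed watermark bits into the singular values of a data block.
import numpy as np

rng = np.random.default_rng(3)
block = rng.uniform(0, 255, size=(8, 8))        # stand-in for map/vertex data
watermark = rng.integers(0, 2, size=8).astype(float)
alpha = 0.05                                     # embedding strength

U, S, Vt = np.linalg.svd(block)
S_marked = S + alpha * watermark                 # embed into singular values
block_marked = U @ np.diag(S_marked) @ Vt

# Non-blind extraction (requires the original singular values S);
# if the singular-value ordering is unchanged, this recovers the bits.
_, S_extracted, _ = np.linalg.svd(block_marked)
recovered = (S_extracted - S) / alpha
print(np.round(recovered, 2))
```

Singular values are comparatively stable under common geometric attacks, which is the usual motivation for embedding there.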
... Show MoreIn this paper a new technique based on dynamic stream cipher algorithm is introduced. The mathematical model of dynamic stream cipher algorithm is based on the idea of changing the structure of the combined Linear Feedback Shift Registers (LFSR's) with each change in basic and message keys to get more complicated encryption algorithm, and this is done by use a bank of LFSR's stored in protected file and we select a collection of LFSR's randomly that are used in algorithm to generate the encryption (decryption) key.
We apply the Basic Efficient Criteria (BEC) to the suggested Key Generator (KG) to test the output key. The results of applying the BEC demonstrate the robustness and efficiency of the proposed stream cipher cryptosystem.
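A toy sketch of the combined-LFSR idea behind such a key generator: several registers with different feedback taps are XOR-combined per output bit. The tap positions and seeds below are illustrative, not the paper's.

```python
# Toy combined-LFSR keystream sketch (illustrative taps and seeds only):
# each register shifts with XOR feedback; outputs are XOR-combined.
def lfsr(seed_bits, taps):
    state = list(seed_bits)
    while True:
        out = state[-1]
        fb = 0
        for t in taps:          # XOR the tapped stages for feedback
            fb ^= state[t]
        yield out
        state = [fb] + state[:-1]

r1 = lfsr([1, 0, 1], taps=[0, 2])       # 3-stage register
r2 = lfsr([1, 1, 0, 1], taps=[0, 3])    # 4-stage register

keystream = [next(r1) ^ next(r2) for _ in range(16)]
plaintext = [1, 0, 1, 1, 0, 0, 1, 0]
cipher = [p ^ k for p, k in zip(plaintext, keystream)]
decrypted = [c ^ k for c, k in zip(cipher, keystream)]
print(decrypted == plaintext)   # True: XOR with the same keystream inverts
```

Re-selecting the registers (and hence the feedback structure) per key is what makes the scheme "dynamic" in the sense the abstract describes.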
Heart disease is a significant and impactful health condition that ranks as the leading cause of death in many countries. To aid physicians in diagnosing cardiovascular diseases, clinical datasets are available for reference. However, with the rise of big data and large medical datasets, it has become increasingly challenging for medical practitioners to predict heart disease accurately, because an abundance of unrelated and redundant features increases computational complexity and degrades accuracy. This study therefore aims to identify the most discriminative features within high-dimensional datasets, minimizing complexity and improving accuracy through an Extra Trees-based feature selection technique. The study assesses the efficac…
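A hedged sketch of Extra Trees-based feature selection in general: rank features by impurity importance and keep the top-k. The dataset below is synthetic, not the clinical data from the study.

```python
# Extra Trees feature-selection sketch on synthetic data: only the first
# three features carry signal; the rest are redundant noise, mimicking
# the high-dimensional setting the abstract describes.
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier

rng = np.random.default_rng(4)
n, p = 300, 20
X = rng.standard_normal((n, p))
y = (X[:, 0] + X[:, 1] - X[:, 2] > 0).astype(int)   # signal in features 0-2

forest = ExtraTreesClassifier(n_estimators=200, random_state=0).fit(X, y)
importances = forest.feature_importances_
top5 = np.argsort(importances)[::-1][:5]            # keep the 5 best features
print("selected feature indices:", sorted(top5.tolist()))
```

Training a downstream classifier on only the selected columns is then what reduces complexity while preserving (or improving) accuracy.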