Stroke is the second leading cause of death and one of the most common causes of disability worldwide. Researchers have found that brain–computer interface (BCI) techniques can improve stroke patient rehabilitation. This study applied the proposed motor imagery (MI) framework to an electroencephalogram (EEG) dataset from eight subjects in order to enhance MI-based BCI systems for stroke patients. The preprocessing stage of the framework combines conventional filters with the independent component analysis (ICA) denoising approach. Fractal dimension (FD) and the Hurst exponent (Hur) were then calculated as complexity features, and Tsallis entropy (TsEn) and dispersion entropy (DispEn) were assessed as irregularity parameters. The MI-based BCI features were then statistically evaluated for each participant using two-way analysis of variance (ANOVA) to characterize individual performance across four classes (left hand, right hand, foot, and tongue). A dimensionality reduction algorithm, Laplacian Eigenmap (LE), was used to improve MI-based BCI classification performance. Finally, the groups of post-stroke patients were classified using k-nearest neighbors (KNN), support vector machine (SVM), and random forest (RF) classifiers. The findings show that LE with RF and with KNN achieved 74.48% and 73.20% accuracy, respectively; the integrated set of proposed features together with the ICA denoising technique therefore describes the proposed MI framework well and may be used to explore the four classes of MI-based BCI rehabilitation. This study should help clinicians, doctors, and technicians design effective rehabilitation programs for people who have had a stroke.
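As an illustrative sketch of one of the complexity features named above, the following computes a fractal dimension with Petrosian's estimator (the abstract does not state which FD algorithm was used, so this particular estimator and the synthetic signals are assumptions for demonstration only):

```python
import math
import random

def petrosian_fd(signal):
    """Petrosian fractal dimension of a 1-D signal (illustrative FD estimator)."""
    n = len(signal)
    diff = [signal[i + 1] - signal[i] for i in range(n - 1)]
    # number of sign changes in the first derivative
    n_delta = sum(1 for i in range(len(diff) - 1) if diff[i] * diff[i + 1] < 0)
    return math.log10(n) / (math.log10(n) + math.log10(n / (n + 0.4 * n_delta)))

random.seed(0)
n = 1000
smooth = [math.sin(2 * math.pi * 4 * i / n) for i in range(n)]   # four slow cycles
noisy = [s + 0.5 * random.gauss(0, 1) for s in smooth]           # same signal plus noise

# irregular activity raises the estimated fractal dimension
print(petrosian_fd(smooth) < petrosian_fd(noisy))
```

A more complex (noisier) signal produces more sign changes in its first derivative, so its estimated FD is higher; in an MI pipeline such values would be computed per channel and per trial before statistical testing.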
Ischemic stroke is a significant cause of morbidity and mortality worldwide. Autophagy, a process of intracellular degradation, has been shown to play a crucial role in the pathogenesis of ischemic stroke. Long non-coding RNAs (lncRNAs) have emerged as essential regulators of autophagy in various diseases, including ischemic stroke. Recent studies have identified several lncRNAs that modulate autophagy in ischemic stroke, including MALAT1, MIAT, SNHG12, H19, AC136007.2, C2dat2, MEG3, KCNQ1OT1, SNHG3, and RMRP. These lncRNAs regulate autophagy by interacting with key proteins involved in the autophagic process, such as Beclin-1, ATG7, and LC3. Understanding the role of lncRNAs in regulating autophagy in ischemic stroke may reveal new therapeutic targets.
Background: Several devices with different physical bases have been developed for the clinical measurement of corneal thickness; they are classified into four categories: Scheimpflug photography based, slit-scanning topography, optical coherence tomography (OCT) based, and ultrasound (US) based. Objective: To evaluate the precision of the new Scheimpflug–Placido disc corneal topography in the measurement of corneal thickness and to compare the measured values with those obtained by US pachymetry. Methods: The setting of this study was the LASIK center of the Eye Specialty Private Hospital, Baghdad, Iraq. Eyes of healthy subjects were examined with the Sirius topographer. Three consecutive measurements of central (CCT) and thinnest (TCT) corneal thickness were obtained
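Precision studies of this kind usually summarize the three consecutive readings with repeatability statistics. A minimal sketch of the within-subject standard deviation, coefficient of variation, and repeatability limit follows; the pachymetry values are hypothetical examples in micrometres, not data from the study:

```python
import statistics

def repeatability(per_eye_readings):
    """Repeatability statistics for repeated measurements.

    per_eye_readings: list of lists, each inner list holding the repeated
    readings (e.g. 3 consecutive CCT values in micrometres) for one eye.
    """
    # within-subject SD: square root of the mean of the per-eye variances
    sw = (sum(statistics.variance(r) for r in per_eye_readings)
          / len(per_eye_readings)) ** 0.5
    grand_mean = statistics.mean(x for r in per_eye_readings for x in r)
    cov = 100.0 * sw / grand_mean        # coefficient of variation, %
    limit = 2.77 * sw                    # repeatability limit (2.77 * Sw)
    return sw, cov, limit

# hypothetical CCT readings (um) for three eyes, three readings each
eyes = [[540, 542, 541], [530, 529, 531], [555, 556, 554]]
sw, cov, limit = repeatability(eyes)
print(round(sw, 2), round(cov, 3), round(limit, 2))
```

Two measurements of the same eye are expected to differ by less than the repeatability limit about 95% of the time, which is the usual basis for judging whether a new device is precise enough for clinical use.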
Objective: This research aims to test the correlation between job complexity and psychological detachment, to establish how burnout can affect this relationship, and to determine whether burnout can contribute to its development. Theoretical framework: The research addressed questions such as: how can psychological detachment lead the employee to keep away from work and isolate himself from the work environment; how can job complexity reinforce this behavior; and how can burnout strengthen the correlation between job complexity and psychological detachment? It then sought to extract recommendations that may contribute to enhancing the practice and adoption of these three variables (job complexity, psychological detachment, and burnout).
The study covers the subject of simplicity and complexity in the visual layout of children's magazine cover designs. Chapter one presents the problem, the importance, the goal, and the limits of the research, in addition to terminology. Chapter two contains the theoretical framework: the first topic addresses the role of simplicity and complexity in design, the second addresses visual design and design relations, and the third addresses design in magazine covers. Chapter three includes the research methodology, the analysis, the results, the conclusions, and the sources. The most important result of the research is that simplicity and complexity of design lend continuity and vitality and attract the attention of the receiver.
Biosignal analysis is one of the most important topics that researchers have tried to develop during the last century in order to understand numerous human diseases. Electroencephalograms (EEGs) are one of the techniques that provide an electrical representation of biosignals reflecting changes in the activity of the human brain. Monitoring the level of anesthesia is a very important subject, proposed to avoid both patient awareness caused by inadequate dosing of anesthetic drugs and excessive use of anesthesia during surgery. This article reviews the bases of these techniques and their development over the last decades and provides a synopsis of the relevant methodologies and algorithms that are used to analyze EEG signals.
The electrocardiogram (ECG) is an important physiological signal for cardiac disease diagnosis. Modern ECG monitoring devices are increasingly used and generate vast amounts of data requiring huge storage capacity. In order to decrease storage costs, or to make ECG signals suitable and ready for transmission through common communication channels, the ECG data volume must be reduced, so an effective data compression method is required. This paper presents an efficient technique for the compression of ECG signals, in which different transforms are used. First, the 1-D ECG data were segmented and aligned into a 2-D data array; a 2-D mixed transform was then implemented to compress the resulting array.
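The pipeline described here (fold the 1-D signal into a 2-D array, transform it, and keep only the significant coefficients) can be sketched as follows. A separable 2-D DCT stands in for the paper's "mixed transform", and the periodic test signal is synthetic rather than real ECG data, so both are assumptions for illustration:

```python
import math

def dct(x):
    """Orthonormal DCT-II of a sequence."""
    n = len(x)
    out = []
    for k in range(n):
        s = sum(x[i] * math.cos(math.pi * (2 * i + 1) * k / (2 * n)) for i in range(n))
        out.append(s * (math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)))
    return out

def idct(c):
    """Inverse of the orthonormal DCT-II."""
    n = len(c)
    out = []
    for i in range(n):
        s = c[0] * math.sqrt(1.0 / n)
        s += sum(c[k] * math.sqrt(2.0 / n) * math.cos(math.pi * (2 * i + 1) * k / (2 * n))
                 for k in range(1, n))
        out.append(s)
    return out

def transform2d(rows, f):
    rows = [f(r) for r in rows]               # transform each row...
    cols = [f(list(c)) for c in zip(*rows)]   # ...then each column
    return [list(r) for r in zip(*cols)]

def compress(signal, period, keep_ratio):
    """Fold a periodic 1-D signal into rows of one period each, 2-D DCT it,
    zero all but the largest coefficients, and reconstruct."""
    rows = [signal[i:i + period] for i in range(0, len(signal), period)]
    coeffs = transform2d(rows, dct)
    flat = sorted((abs(v) for r in coeffs for v in r), reverse=True)
    threshold = flat[int(keep_ratio * len(flat)) - 1]
    kept = [[v if abs(v) >= threshold else 0.0 for v in r] for r in coeffs]
    recon = [v for r in transform2d(kept, idct) for v in r]
    # percentage RMS difference (PRD) between original and reconstruction
    prd = 100.0 * math.sqrt(sum((a - b) ** 2 for a, b in zip(signal, recon))
                            / sum(a * a for a in signal))
    return recon, prd

period = 32
signal = [math.sin(2 * math.pi * i / period) + 0.5 * math.sin(4 * math.pi * i / period)
          for i in range(8 * period)]
recon, prd = compress(signal, period, keep_ratio=0.1)
print("PRD (%):", round(prd, 2))
```

Because aligned beats make the 2-D array highly redundant, the transform concentrates the energy into few coefficients, which is exactly what makes the 2-D arrangement more compressible than transforming the raw 1-D stream.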
Conditional logistic regression is often used to study the relationship between event outcomes and specific prognostic factors, and its predictive capabilities can be carried over into environmental studies. This research seeks to demonstrate a novel approach to implementing conditional logistic regression in environmental research through inference methods predicated on longitudinal data. Statistical analysis of longitudinal data requires methods that properly take into account the within-subject interdependence of the response measurements: if this correlation is ignored, inferences such as statistical tests and confidence intervals can be largely invalid.
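The warning in the last sentence can be made concrete. Under an exchangeable (compound-symmetry) correlation structure, the variance of a subject-level mean is inflated by the standard design-effect factor, so an independence-assuming standard error must be scaled up accordingly. A minimal sketch (the formula is standard; the numbers are hypothetical):

```python
import math

def design_effect(m, rho):
    """Variance inflation for the mean of m equally correlated
    measurements with pairwise correlation rho (exchangeable structure)."""
    return 1.0 + (m - 1) * rho

def corrected_se(naive_se, m, rho):
    """Scale a naive, independence-assuming standard error."""
    return naive_se * math.sqrt(design_effect(m, rho))

# e.g. 5 repeated measurements per subject with rho = 0.6:
# the naive SE understates the true SE by a factor of sqrt(3.4), about 1.84
print(design_effect(5, 0.6), round(corrected_se(1.0, 5, 0.6), 3))
```

A test statistic built from the naive standard error is therefore too large by the same factor, which is one simple way to see why confidence intervals and p-values become invalid when within-subject correlation is ignored.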
This investigation presents an experimental and analytical study of the behavior of reinforced concrete deep beams before and after repair. The original beams were first loaded under two-point loading up to failure, then repaired with epoxy resin and tested again. Three of the test beams contain shear reinforcement and the other two have none. The main variable among the beams was the percentage of longitudinal steel reinforcement (0, 0.707, 1.061, and 1.414%). The main objective of this research is to investigate the possibility of restoring the full load-carrying capacity of reinforced concrete deep beams, with and without shear reinforcement, by using epoxy resin as the repair material.