Traumatic spinal cord injury is a serious neurological disorder. Patients experience a wide range of symptoms attributable to the compromised nerve fiber tracts, including limb weakness, sensory impairment, and truncal instability, as well as a variety of autonomic abnormalities. This article discusses how machine learning classification can be used to characterize the initial impairment and subsequent recovery of electromyography (EMG) signals in a non-human primate model of traumatic spinal cord injury; the ultimate objective is to identify potential treatments. The work focuses specifically on finding a suitable classifier that differentiates between two distinct experimental stages (pre- and post-lesion) using EMG signals. Eight time-domain features were extracted from the collected EMG data. To overcome the class-imbalance problem, the synthetic minority oversampling technique (SMOTE) was applied. Several classification techniques were trained and their performance compared: multilayer perceptron, support vector machine, k-nearest neighbors, and radial basis function network. A confusion matrix and five statistical metrics (sensitivity, specificity, precision, accuracy, and F-measure) were used to evaluate the generated classifiers. The results showed that the multilayer perceptron performed best, with a total F-measure of 79.5% for the left-side data and 86.0% for the right-side data. This work will help to build a reliable classifier that can differentiate between these two phases using extracted time-domain EMG features.
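The five evaluation metrics named in the abstract all derive from the binary confusion matrix. The sketch below is our own illustration with invented counts, not the study's EMG results:

```python
# Computing sensitivity, specificity, precision, accuracy, and F-measure
# from a binary confusion matrix. The counts are hypothetical, chosen only
# to illustrate the formulas; they are not from the pre-/post-lesion study.

def metrics(tp: int, fn: int, tn: int, fp: int) -> dict:
    sensitivity = tp / (tp + fn)              # recall on the positive class
    specificity = tn / (tn + fp)              # recall on the negative class
    precision = tp / (tp + fp)
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    f_measure = 2 * precision * sensitivity / (precision + sensitivity)
    return {
        "sensitivity": sensitivity,
        "specificity": specificity,
        "precision": precision,
        "accuracy": accuracy,
        "f_measure": f_measure,
    }

# Invented example: 40 true positives, 10 false negatives,
# 35 true negatives, 15 false positives.
m = metrics(tp=40, fn=10, tn=35, fp=15)
```

The F-measure, being the harmonic mean of precision and sensitivity, is a reasonable single summary on imbalanced data, which is likely why the abstract reports it as the headline figure.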
Test case prioritization is a key part of system testing, intended to expose faults early in the development stage. Traditional prioritization techniques frequently fail to account for the complexities of large-scale test suites, evolving systems, and time constraints, and therefore cannot fully solve this problem. The study proposed here presents a hybrid meta-heuristic method that addresses these modern challenges. The strategy combines a genetic algorithm with the black hole optimization algorithm to strike a balance between exploring many candidate orderings and exploiting the best ones found so far. The proposed hybrid genetic black hole (HGBH) algorithm uses the …
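Meta-heuristic prioritizers such as the one described need a fitness function to score a candidate test ordering. A standard choice in the literature is APFD (Average Percentage of Faults Detected); the sketch below is our own illustration of that fitness function, not the paper's actual HGBH objective, and the fault matrix is invented:

```python
def apfd(order, fault_matrix):
    """APFD fitness for a test ordering.

    order: permutation of test indices (earlier = run sooner).
    fault_matrix[t][f]: truthy if test t reveals fault f.
    Assumes every fault is revealed by at least one test in the suite.
    """
    n, m = len(order), len(fault_matrix[0])
    total = 0
    for f in range(m):
        for pos, t in enumerate(order, start=1):
            if fault_matrix[t][f]:
                total += pos          # position of the first revealing test
                break
    return 1 - total / (n * m) + 1 / (2 * n)

# Invented example: 5 tests, 3 faults; tests 0, 1, 2 each reveal one fault.
faults = [
    [1, 0, 0],
    [0, 1, 0],
    [0, 0, 1],
    [0, 0, 0],
    [0, 0, 0],
]
good = apfd([0, 1, 2, 3, 4], faults)  # faults found at positions 1, 2, 3
bad = apfd([4, 3, 2, 1, 0], faults)   # faults found at positions 3, 4, 5
```

An ordering that reveals faults earlier scores higher, which is exactly the quantity a genetic or black-hole search would maximize over permutations of the suite.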
Biodiesel is an attractive energy source; here, a low-cost, green synthesis technique was used to prepare biodiesel via methanolysis of waste cooking oil over a catalyst derived from waste snail shells. The present work aimed to investigate the production of biodiesel fuel from waste materials. The catalyst was greenly synthesized from waste snail shells by calcination at times of 2–4 h and temperatures of 750–950 °C. The catalyst samples were characterized using X-ray diffraction (XRD), Brunauer-Emmett-Teller (BET) analysis, energy-dispersive X-ray spectroscopy (EDX), and Fourier-transform infrared spectroscopy (FT-IR). The reaction variables were varied over a MeOH:oil molar ratio of 10:1–30:1, a catalyst loading of 3–11 wt%, and 50– …
To ensure that a software/hardware product is of sufficient quality and functionality, it is essential to thoroughly test and evaluate the many individual software components that make up the application. Many different approaches to software testing exist, including combinatorial testing and covering arrays. The difficulty of handling problems such as the two-way combinatorial explosion raises a further issue: time. Using a client-server architecture, this research introduces a parallel implementation of the TWGH algorithm. Experiments were conducted to demonstrate the efficiency of this technique, and the findings were used to determine the increase in speed and co…
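Two-way (pairwise) combinatorial testing aims to cover every pair of parameter values with far fewer rows than the full Cartesian product. The sketch below is a simple greedy covering-array generator of our own, intended only to illustrate the idea; it is not the TWGH algorithm the research parallelizes:

```python
from itertools import combinations, product

def pairs_of(row):
    """All ((param_index, value), (param_index, value)) pairs in one test row."""
    return set(combinations(enumerate(row), 2))

def pairwise_suite(params):
    """Greedy 2-way covering array: params is a list of value lists."""
    uncovered = set()
    for i, j in combinations(range(len(params)), 2):
        uncovered |= {((i, vi), (j, vj)) for vi in params[i] for vj in params[j]}
    suite = []
    while uncovered:
        # Pick the full row that covers the most still-uncovered pairs.
        best = max(product(*params), key=lambda row: len(pairs_of(row) & uncovered))
        suite.append(best)
        uncovered -= pairs_of(best)
    return suite

# Hypothetical configuration: three parameters with two values each.
params = [["on", "off"], ["v1", "v2"], ["x", "y"]]
suite = pairwise_suite(params)
```

Even this naive greedy version covers all 12 value pairs with fewer rows than the 8 exhaustive combinations; the exhaustive-candidate scan inside the loop is also what makes larger instances expensive, which motivates the parallel, client-server approach described above.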
The recent emergence of sophisticated large language models (LLMs) such as GPT-4, Bard, and Bing has revolutionized the domain of scientific inquiry, particularly in the realm of large pre-trained vision-language models. This pivotal transformation is driving new frontiers in various fields, including image processing and digital media verification. At the heart of this evolution, our research focuses on the rapidly growing area of image authenticity verification, a field gaining immense relevance in the digital era. The study specifically addresses the emerging challenge of distinguishing authentic images from deepfakes, a task that has become critically important in a world increasingly reliant on digital media …
MCM-48 zeolites have unique surface and structural properties, as shown in the results, and are very sensitive to prepare. MCM-48 was experimentally prepared and used as a second-generation acid catalyst for esterification of oleic acid, a model oil serving as a free fatty acid source, with ethanol. The catalyst was characterized by various methods, which indicate that the prepared MCM-48 closely matches the profile of common commercial MCM-48 zeolite. The XRF results show that SiO2 dominates the chemical composition at 99.1%, as expected for silica-based MCM-48, and the SEM results show the cubic c…
Background: Unlike normal EEG patterns, epileptiform abnormal patterns are characterized by different morphologies, such as high-frequency oscillations (HFOs) of ripples on spikes, spikes and waves, continuous and sporadic spikes, and poly-spikes. Several studies have reported that HFOs can be novel biomarkers in human epilepsy research. Method: To regenerate and investigate these patterns, we have proposed three large-scale brain network models (BNMs), built by linking the Stefanescu-Jirsa 2D (S-J 2D) neural mass model (NMM) with our own structural connectivity derived from realistic biological data, the so-called large-scale connectome. These models include multiple network connectivity of brain regions at different …
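The core construction described, a 2D neural mass model at each region, coupled through a structural connectivity matrix, can be sketched generically. The illustration below uses FitzHugh-Nagumo nodes (also a two-variable model) rather than the S-J 2D model, and a made-up 4-node connectome; all parameters are our assumptions, not the study's:

```python
import numpy as np

def simulate(W, steps=2000, dt=0.05, k=0.5):
    """Euler-integrate FitzHugh-Nagumo nodes coupled via connectome W.

    Stand-in for a BNM: each node is a 2D neural mass; W (region-by-region
    weights) plays the role of the structural connectivity. Hypothetical
    parameters throughout.
    """
    n = W.shape[0]
    rng = np.random.default_rng(0)
    v = rng.standard_normal(n) * 0.1   # fast (membrane-like) variable
    w = np.zeros(n)                    # slow recovery variable
    trace = np.empty((steps, n))
    for t in range(steps):
        coupling = k * (W @ v)                     # input from connected regions
        dv = v - v**3 / 3 - w + 0.5 + coupling
        dw = 0.08 * (v + 0.7 - 0.8 * w)
        v += dt * dv
        w += dt * dw
        trace[t] = v
    return trace

# Invented 4-region ring connectome with weak weights.
W = 0.1 * np.array([[0, 1, 0, 1],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [1, 0, 1, 0]], dtype=float)
trace = simulate(W)
```

With the drive and coupling chosen here each node settles onto a limit cycle, giving sustained network oscillations; in the actual BNMs, varying the connectivity and node parameters is what lets different epileptiform morphologies emerge.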
The study aimed to design a training program based on anaerobic threshold training and to examine its effects on the concentration of lactic acid, the LDH enzyme, VO2max, and the cortisol hormone. The researchers used the experimental method with a one-group design, on a sample of (8) male 400 m runners, applying several instruments and procedures, most notably a training program prepared to run for 10 weeks with 3 training units weekly, (70–90 min) each, at a training intensity of 85–100% of each player's ability. After finishing the training program, conducting the pre- and post-tests, and statistically checking the results, the resea…