Traumatic spinal cord injury is a serious neurological disorder. Patients experience a plethora of symptoms attributable to the compromised nerve fiber tracts, including limb weakness, sensory impairment, and truncal instability, as well as a variety of autonomic abnormalities. This article discusses how machine learning classification can be used to characterize the initial impairment and subsequent recovery reflected in electromyography signals in a non-human primate model of traumatic spinal cord injury. The ultimate objective is to identify potential treatments for traumatic spinal cord injury. This work focuses specifically on finding a suitable classifier that differentiates between two distinct experimental stages (pre- and post-lesion) using electromyography signals. Eight time-domain features were extracted from the collected electromyography data. To overcome the imbalanced-dataset issue, the synthetic minority oversampling technique (SMOTE) was applied. Several machine learning classification techniques were applied, including the multilayer perceptron, support vector machine, K-nearest neighbors, and radial basis function network, and their performances were compared. A confusion matrix and five statistical metrics (sensitivity, specificity, precision, accuracy, and F-measure) were used to evaluate the generated classifiers. The results showed that the best classifier for both the left- and right-side data is the multilayer perceptron, with a total F-measure of 79.5% for the left side and 86.0% for the right side. This work will help to build a reliable classifier that can differentiate between these two phases using the extracted time-domain electromyography features.
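For illustration, the following Python sketch mirrors the pipeline described above: a handful of common time-domain EMG features (MAV, RMS, waveform length, and zero crossings, standing in for the paper's eight unnamed features), SMOTE to balance the classes, a multilayer perceptron, and a confusion matrix with the F-measure. The synthetic data, feature set, and model settings are illustrative assumptions, not the study's configuration.

```python
# A minimal sketch, not the study's code: synthetic windows stand in for the
# segmented EMG recordings, and MAV, RMS, waveform length and zero crossings
# stand in for the paper's eight (unnamed) time-domain features.
import numpy as np
from imblearn.over_sampling import SMOTE
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import confusion_matrix, f1_score

def time_domain_features(window):
    """A few common time-domain EMG features for one signal window."""
    mav = np.mean(np.abs(window))                                    # mean absolute value
    rms = np.sqrt(np.mean(window ** 2))                              # root mean square
    wl = np.sum(np.abs(np.diff(window)))                             # waveform length
    zc = np.sum(np.signbit(window[:-1]) != np.signbit(window[1:]))   # zero crossings
    return np.array([mav, rms, wl, zc])

# Dummy data: 300 windows, label 0 = pre-lesion, 1 = post-lesion (imbalanced on purpose).
rng = np.random.default_rng(0)
windows = rng.standard_normal((300, 256))
labels = np.r_[np.zeros(240, dtype=int), np.ones(60, dtype=int)]

X = np.vstack([time_domain_features(w) for w in windows])
X_res, y_res = SMOTE(random_state=0).fit_resample(X, labels)         # balance the two classes

X_tr, X_te, y_tr, y_te = train_test_split(X_res, y_res, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0).fit(X_tr, y_tr)

y_pred = clf.predict(X_te)
print(confusion_matrix(y_te, y_pred))
print("F-measure:", f1_score(y_te, y_pred))
```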
In this research, the Haar wavelet method has been utilized to approximate the numerical solution of linear state-space systems. The technique uses Haar wavelet functions and the Haar wavelet operational matrix to transform the state-space system into a system of linear algebraic equations, which is then solved in MATLAB over an interval from 0 to . The accuracy of the state variables can be enhanced by increasing the Haar wavelet resolution. The method has been applied to several examples, and the simulation results are illustrated graphically and compared with the exact solutions.
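A minimal sketch of this idea, assuming the standard formulation x'(t) = A x(t) + B u(t) on [0, 1): the state derivative is expanded in a Haar series, the operational matrix of integration recovers the state, and collocation yields a linear algebraic (Sylvester-type) system for the wavelet coefficients. The example system, step input, and resolution are made up, the operational matrix is built numerically rather than by the classical recursion, and NumPy stands in for MATLAB.

```python
# A minimal sketch, assuming x'(t) = A x(t) + B u(t) on [0, 1): expand x'(t)
# in a Haar series, integrate with the operational matrix P, and solve the
# resulting linear algebraic system for the wavelet coefficients.
import numpy as np

def haar_matrices(m):
    """Return the m x m Haar matrix H at the collocation points t_k = (k + 0.5)/m
    and the operational matrix of integration P with int_0^t H(tau) dtau ~ P H(t).
    m must be a power of two."""
    t = (np.arange(m) + 0.5) / m
    H = np.zeros((m, m))
    Q = np.zeros((m, m))                    # Q[i, k] = exact integral of h_i up to t_k
    H[0], Q[0] = 1.0, t
    i, j = 1, 0
    while i < m:
        for q in range(2 ** j):
            a, b, c = q / 2 ** j, (q + 0.5) / 2 ** j, (q + 1) / 2 ** j
            H[i] = np.where((t >= a) & (t < b), 1.0, 0.0) - np.where((t >= b) & (t < c), 1.0, 0.0)
            Q[i] = np.clip(t - a, 0, b - a) - np.clip(t - b, 0, c - b)
            i += 1
        j += 1
    return H, Q @ np.linalg.inv(H)

# Illustrative 2-state system x' = A x + B u with a unit step input u(t) = 1.
A = np.array([[0.0, 1.0], [-2.0, -3.0]])
B = np.array([[0.0], [1.0]])
x0 = np.array([[1.0], [0.0]])
m = 64                                      # Haar resolution; increase for more accuracy

H, P = haar_matrices(m)
d = np.zeros((m, 1)); d[0, 0] = 1.0         # Haar coefficients of the constant function 1
U = np.ones((1, m)) @ np.linalg.inv(H)      # Haar coefficients of u(t) = 1

# With x'(t) ~ C H(t) and x(t) ~ (C P + x0 d^T) H(t), collocation gives the
# Sylvester-type equation  C - A C P = A x0 d^T + B U,  solved by vectorisation.
n = A.shape[0]
rhs = A @ x0 @ d.T + B @ U
vecC = np.linalg.solve(np.eye(n * m) - np.kron(P.T, A), rhs.flatten(order="F"))
C = vecC.reshape((n, m), order="F")

X = (C @ P + x0 @ d.T) @ H                  # approximate states at the collocation points
print(X[:, :4])
```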
Market share is a major indicator of business success. Understanding the impact of numerous economic factors on market share is critical to a company’s success. In this study, we examine the market shares of two manufacturers in a duopoly economy and present an optimal pricing approach for increasing a company’s market share. We create two mathematical models based on ordinary differential equations to investigate market success. The first model takes into account quantity demand and investment in R&D, whereas the second model investigates a more realistic relationship between quantity demand and pricing.
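The paper's two models are not reproduced in this abstract; as a purely illustrative stand-in, the sketch below integrates a generic Lanchester-style duopoly share ODE in which each firm's customer-capture rate depends inversely on its price. The coefficient, prices, and time horizon are made up.

```python
# A purely illustrative stand-in (not the paper's models): a Lanchester-style
# duopoly in which each firm's customer-capture rate is inversely proportional
# to its price. The coefficient k, the prices, and the horizon are made up.
from scipy.integrate import solve_ivp

def share_dynamics(t, s, p1, p2, k=1.0):
    """s[0] is firm 1's market share; firm 2 holds 1 - s[0]. Each firm captures
    the rival's customers at a rate proportional to its price attractiveness."""
    a1, a2 = k / p1, k / p2                 # lower price -> higher attraction
    return [a1 * (1.0 - s[0]) - a2 * s[0]]

# Firm 1 prices at 1.8, firm 2 at 2.2; both start with a 50% share.
sol = solve_ivp(share_dynamics, (0.0, 10.0), [0.5], args=(1.8, 2.2))
print("firm 1 share after 10 periods:", round(sol.y[0, -1], 3))
```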
In networking communication systems such as vehicular ad hoc networks, high vehicular mobility leads to rapid shifts in vehicle density, incoherent inter-vehicle communication, and challenges for routing algorithms. The routing algorithm must avoid forwarding packets through segments where the network density is low and the scale of network disconnection is high, as this can lead to packet loss, interruptions, and increased communication overhead for route recovery. Hence, attention needs to be paid to both segment status and traffic. The aim of this paper is to present an intersection-based, segment-aware algorithm for geographic routing in vehicular ad hoc networks. This algorithm makes available the best route f …
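As a rough illustration of the segment-aware idea (this is not the paper's algorithm, whose details are truncated above), the sketch below scores candidate road segments at an intersection by traffic density, connectivity, and geographic progress toward the destination, and skips near-empty segments entirely. All field names, weights, and values are hypothetical.

```python
# A hedged sketch of segment-aware next-hop selection at an intersection;
# the scoring weights, thresholds, and Segment fields are illustrative only.
from dataclasses import dataclass

@dataclass
class Segment:
    name: str
    vehicle_density: float      # vehicles per km, normalised to [0, 1]
    connectivity: float         # fraction of time the segment was connected
    progress_to_dest: float     # geographic progress toward destination, [0, 1]

def score(seg: Segment, w_density=0.3, w_conn=0.4, w_prog=0.3, min_density=0.1):
    if seg.vehicle_density < min_density:   # avoid near-empty segments entirely
        return 0.0
    return (w_density * seg.vehicle_density
            + w_conn * seg.connectivity
            + w_prog * seg.progress_to_dest)

candidates = [
    Segment("north", 0.05, 0.90, 0.8),      # sparse segment: likely to drop packets
    Segment("east",  0.60, 0.70, 0.6),
    Segment("south", 0.40, 0.95, 0.5),
]
best = max(candidates, key=score)
print("forward packet via:", best.name)
```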
Biodiesel is an attractive energy source; a low-cost, green synthesis technique was utilized for biodiesel preparation via methanolysis of waste cooking oil using a catalyst derived from waste snail shells. The present work aimed to investigate the production of biodiesel fuel from waste materials. The catalyst was greenly synthesized from waste snail shells through a calcination process at calcination times of 2–4 h and temperatures of 750–950 °C. The catalyst samples were characterized using X-ray diffraction (XRD), Brunauer-Emmett-Teller (BET) analysis, energy-dispersive X-ray spectroscopy (EDX), and Fourier-transform infrared spectroscopy (FT-IR). The reaction variables were varied over the ranges of 10:1–30:1 molar ratio of MeOH to oil, 3–11 wt% catalyst loading, 50– …
The aim of the present study was to distinguish between healthy children and children with epilepsy using electroencephalography (EEG). Two biomarkers, the Hurst exponent (H) and Tsallis entropy (TE), were used to investigate the background EEG activity of 10 healthy children and 10 children with epilepsy. EEG artifacts were removed using a Savitzky-Golay (SG) filter. As hypothesized, the t-test revealed significant changes in irregularity and complexity in the epileptic EEG compared with healthy control subjects (p < 0.05). The increased complexity observed in the H and TE results of epileptic subjects makes them suggested EEG biomarkers associated with epilepsy and a reliable tool for the detection and identification of this di …
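A minimal sketch of the described pipeline follows: Savitzky-Golay smoothing, a rescaled-range (R/S) estimate of the Hurst exponent, Tsallis entropy of the amplitude distribution, and an independent-samples t-test between groups. The filter settings, entropic index q, and the particular Hurst estimator are illustrative choices rather than the study's, and the signals are synthetic stand-ins.

```python
# A hedged sketch of the analysis pipeline; parameters and estimators are
# illustrative assumptions, and the signals below are synthetic stand-ins.
import numpy as np
from scipy.signal import savgol_filter
from scipy.stats import ttest_ind

def hurst_rs(x):
    """Rough rescaled-range (R/S) estimate of the Hurst exponent."""
    x = np.asarray(x, dtype=float)
    sizes = [n for n in (16, 32, 64, 128, 256) if n <= len(x) // 2]
    rs = []
    for n in sizes:
        chunks = x[: (len(x) // n) * n].reshape(-1, n)
        dev = np.cumsum(chunks - chunks.mean(axis=1, keepdims=True), axis=1)
        r = dev.max(axis=1) - dev.min(axis=1)
        s = chunks.std(axis=1)
        rs.append(np.mean(r[s > 0] / s[s > 0]))
    slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return slope

def tsallis_entropy(x, q=2.0, bins=64):
    """Tsallis entropy of the amplitude distribution: (1 - sum p^q) / (q - 1)."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def eeg_biomarkers(raw_eeg):
    clean = savgol_filter(raw_eeg, window_length=51, polyorder=3)  # artifact smoothing
    return hurst_rs(clean), tsallis_entropy(clean)

# Dummy signals standing in for the two groups (10 healthy, 10 epileptic).
rng = np.random.default_rng(1)
healthy = [eeg_biomarkers(rng.standard_normal(2048)) for _ in range(10)]
epileptic = [eeg_biomarkers(np.cumsum(rng.standard_normal(2048))) for _ in range(10)]

h_healthy, te_healthy = zip(*healthy)
h_epi, te_epi = zip(*epileptic)
print("Hurst   t-test p =", ttest_ind(h_healthy, h_epi).pvalue)
print("Tsallis t-test p =", ttest_ind(te_healthy, te_epi).pvalue)
```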
Do’a and Zikr al-Mā’thur (authentic supplications and remembrance of ALLAH ‘Azza wa Jalla) can be suggested to Muslims to help them deal with challenges or issues in life. Counselling cases affect a person’s feelings, and Do’a and Zikr al-Mā’thur are often applied as a counselling intervention. Unfortunately, the authentic Do’a and Zikr al-Mā’thur are dispersed across many resources that are not visible to users, not all online resources offer users access to accurate Do’a and Zikr al-Mā’thur, and dubious Do’a and Zikr al-Mā’thur are frequently credited to the Prophet (pbuh). The goal of this research is to develop an ontology …
Searching a combined binary codebook of the Block Truncation Coding (BTC) method and Vector Quantization (VQ), i.e., a full codebook search for each input image vector to find the best-matched code word, requires a long time. Therefore, in this paper, after designing a small binary codebook, we adopted a new method that rotates each binary code word in this codebook in directions from 90° to 270° in steps of 90°. Then, we systematized each code word depending on its angle into four types of binary codebooks (i.e., Pour when , Flat when , Vertical when , or Zigzag). The proposed scheme was used to decrease the time of the coding procedure, with very small distortion per block, by designing s …
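The sketch below illustrates only the general rotation idea; it is not the paper's scheme, and the elided angle thresholds for the four codebook types are not reconstructed. Each binary code word is treated as a small bit pattern, its 90°/180°/270° rotations are generated with np.rot90, and the full search is replaced by a lookup restricted to code words sharing a rotation-invariant signature.

```python
# A hedged illustration of pruning a binary codebook search with 90-degree
# rotations; the codebook size, block shape and grouping rule are assumptions.
import numpy as np

def rotations(block):
    """The block together with its 90, 180 and 270 degree rotations."""
    return [np.rot90(block, k) for k in range(4)]

def canonical_key(block):
    """Rotation-invariant key: the lexicographically smallest rotation."""
    return min(r.tobytes() for r in rotations(block))

def build_index(codebook):
    """Group code words by their canonical key to shrink the search space."""
    index = {}
    for i, word in enumerate(codebook):
        index.setdefault(canonical_key(word), []).append(i)
    return index

def nearest_codeword(block, codebook, index):
    """Search only code words that share the block's rotation signature."""
    candidates = index.get(canonical_key(block), range(len(codebook)))
    return min(candidates, key=lambda i: np.count_nonzero(codebook[i] != block))

# Toy 4x4 binary codebook and query block.
rng = np.random.default_rng(2)
codebook = [rng.integers(0, 2, size=(4, 4), dtype=np.uint8) for _ in range(32)]
index = build_index(codebook)
query = codebook[5].copy()
print("best match index:", nearest_codeword(query, codebook, index))
```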
The recent emergence of sophisticated Large Language Models (LLMs) such as GPT-4, Bard, and Bing has revolutionized the domain of scientific inquiry, particularly in the realm of large pre-trained vision-language models. This pivotal transformation is driving new frontiers in various fields, including image processing and digital media verification. At the heart of this evolution, our research focuses on the rapidly growing area of image authenticity verification, a field gaining immense relevance in the digital era. The study is specifically geared towards addressing the emerging challenge of distinguishing between authentic images and deep fakes, a task that has become critically important in a world increasingly reliant on digital med …