Traumatic spinal cord injury is a serious neurological disorder. Patients experience a plethora of symptoms that can be attributed to the nerve fiber tracts that are compromised, including limb weakness, sensory impairment, and truncal instability, as well as a variety of autonomic abnormalities. This article discusses how machine learning classification can be used to characterize the initial impairment and subsequent recovery of electromyography signals in a non-human primate model of traumatic spinal cord injury. The ultimate objective is to identify potential treatments for traumatic spinal cord injury. This work focuses specifically on finding a suitable classifier that differentiates between two distinct experimental stages (pre- and post-lesion) using electromyography signals. Eight time-domain features were extracted from the collected electromyography data. To overcome the imbalanced-dataset issue, the synthetic minority oversampling technique (SMOTE) was applied. Several machine learning classification techniques were applied, including the multilayer perceptron, support vector machine, K-nearest neighbors, and radial basis function network, and their performances were compared. A confusion matrix and five other statistical metrics (sensitivity, specificity, precision, accuracy, and F-measure) were used to evaluate the performance of the generated classifiers. The results showed that the best classifier for both the left- and right-side data is the multilayer perceptron, with a total F-measure of 79.5% for the left side and 86.0% for the right side. This work will help to build a reliable classifier that can differentiate between these two phases using the extracted time-domain electromyography features.
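As a sketch of the classification pipeline described above (SMOTE oversampling followed by a multilayer perceptron, evaluated with a confusion matrix and the five listed metrics), the following Python snippet assumes the eight time-domain EMG features are already extracted into X and the pre-/post-lesion labels into y; the placeholder data, network size, and split are illustrative assumptions, not the authors' settings.

```python
# Minimal sketch: SMOTE oversampling + MLP classifier + confusion-matrix metrics.
# The feature matrix, labels, and hyperparameters below are illustrative only.
import numpy as np
from imblearn.over_sampling import SMOTE
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 8))             # placeholder for 8 time-domain features per EMG window
y = (rng.random(300) < 0.25).astype(int)  # imbalanced labels: 0 = pre-lesion, 1 = post-lesion

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

# Oversample only the training data so the test set keeps the true class balance.
X_bal, y_bal = SMOTE(random_state=0).fit_resample(X_train, y_train)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X_bal, y_bal)

tn, fp, fn, tp = confusion_matrix(y_test, clf.predict(X_test)).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
precision   = tp / (tp + fp)
accuracy    = (tp + tn) / (tp + tn + fp + fn)
f_measure   = 2 * precision * sensitivity / (precision + sensitivity)
print(sensitivity, specificity, precision, accuracy, f_measure)
```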
Survival analysis is one of the modern methods of analysis, based on the fact that the dependent variable represents the time until the event of interest in the study. There are many survival models that deal with the impact of explanatory factors on the likelihood of survival, including the model proposed by the statistician David Cox, one of the most important and most common survival models. It consists of two functions: a parametric function that does not depend on survival time, and a nonparametric function that does depend on survival time, which is why the Cox model is defined as a semi-parametric model. In contrast, the set of parametric models depends on the parameters of the time-to-event distribution, such as ...
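For reference, the Cox model summarized above is conventionally written as the product of a nonparametric baseline hazard and a parametric, time-independent covariate term; this standard formulation is given here only as a sketch and is not quoted from the abstract:

```latex
h(t \mid \mathbf{x}) \;=\; h_0(t)\,\exp\!\bigl(\beta_1 x_1 + \beta_2 x_2 + \cdots + \beta_p x_p\bigr)
\;=\; h_0(t)\,\exp\!\bigl(\boldsymbol{\beta}^{\top}\mathbf{x}\bigr)
```

Here h_0(t) is the nonparametric part that depends on survival time, while exp(β'x) is the parametric part that does not.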
Under cyclic loading, aluminum alloys exhibit a shorter fatigue life than steel alloys of similar strength, and this is considered the Achilles' heel of such alloys. A nanosecond fiber laser was used to apply a high-speed laser shock peening process to thin aluminum plates in order to enhance the fatigue life by introducing compressive residual stresses. The effect of three working parameters, namely the pulse repetition rate (PRR), spot size (ω), and scanning speed (v), on limiting fatigue failure was investigated. The optimum results, represented by the longest fatigue life, were obtained at a PRR of 22.5 kHz, ω of 0.04 mm, and at both v values of 200 and 500 mm/s. The research yielded significant results, represented by a maximum percentage increase in the fatigue ...
In this paper, we use Bernstein polynomials to derive the modified Simpson's 3/8 rule and the composite modified Simpson's 3/8 rule for solving one-dimensional linear Volterra integral equations of the second kind, and we find that the solution computed by this procedure is very close to the exact solution.
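The modified rules derived from Bernstein polynomials are not reproduced here; for context only, the general form of a one-dimensional linear Volterra integral equation of the second kind and the standard (unmodified) Simpson's 3/8 rule are:

```latex
u(x) \;=\; f(x) \;+\; \lambda \int_{a}^{x} K(x,t)\,u(t)\,dt , \qquad a \le x \le b ,
\qquad
\int_{x_0}^{x_3} g(t)\,dt \;\approx\; \frac{3h}{8}\bigl[g(x_0) + 3g(x_1) + 3g(x_2) + g(x_3)\bigr]
```

where h is the step size and x_i = x_0 + ih.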
In this paper, an algorithm for binary codebook design is used in the vector quantization technique, which is employed to improve the acceptability of the absolute moment block truncation coding (AMBTC) method. The vector quantization (VQ) method is used to compress the bitmap (the output of the first method, AMBTC). In this paper, the binary codebook can be generated for many images by randomly choosing the code vectors from a set of binary image vectors, and this codebook is then used to compress the bitmaps of all these images. The bitmap of an image is chosen for compression with this codebook based on the criterion of the average bitmap replacement error (ABPRE). This technique is suitable for reducing bit rates ...
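The VQ codebook stage is specific to the paper, but the AMBTC stage whose bitmap it compresses follows the standard absolute moment block truncation coding procedure. The sketch below encodes and reconstructs blocks with that standard procedure, using an assumed 4x4 block size and a toy image.

```python
# Minimal sketch of standard AMBTC block coding (the stage whose bitmap output is
# later compressed with the VQ codebook); the block size and test image are
# illustrative assumptions, not taken from the paper.
import numpy as np

def ambtc_block(block: np.ndarray):
    """Encode one block as (low mean, high mean, binary bitmap)."""
    mean = block.mean()
    bitmap = (block >= mean)
    high = block[bitmap].mean() if bitmap.any() else mean
    low = block[~bitmap].mean() if (~bitmap).any() else mean
    return low, high, bitmap

def ambtc_decode(low: float, high: float, bitmap: np.ndarray) -> np.ndarray:
    """Reconstruct a block: 1-bits get the high mean, 0-bits the low mean."""
    return np.where(bitmap, high, low)

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(8, 8))          # toy 8x8 grayscale image
k = 4                                              # block size
recon = np.zeros_like(image, dtype=float)
for r in range(0, image.shape[0], k):
    for c in range(0, image.shape[1], k):
        low, high, bmp = ambtc_block(image[r:r+k, c:c+k].astype(float))
        recon[r:r+k, c:c+k] = ambtc_decode(low, high, bmp)
print(np.abs(image - recon).mean())                # mean absolute reconstruction error
```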
This paper presents a study of a syndrome coding scheme for different binary linear error-correcting codes from code families such as BCH, BKLC, Golay, and Hamming. The study is carried out on Wyner's wiretap channel model, where the main channel is error-free and the eavesdropper channel is a binary symmetric channel with crossover error probability (0 < Pe ≤ 0.5), to show the security performance of error-correcting codes, in terms of equivocation rate, when they are used in the single-staged syndrome coding scheme. Generally, these codes are not designed for secure information transmission, and they have low equivocation rates when used in the syndrome coding scheme. Therefore, to improve the transmission ...
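As a minimal illustration of single-staged syndrome coding with one of the listed code families, the sketch below uses a [7,4] Hamming code: the secret message is carried by the syndrome of the transmitted word and a random codeword provides the randomization. The matrices and message are illustrative assumptions, not the paper's construction.

```python
# Syndrome coding sketch over GF(2) with the [7,4] Hamming code: the 3-bit secret
# message m is embedded as the syndrome of the transmitted word, and a random
# codeword (zero syndrome) randomizes the transmission. All values are illustrative.
import numpy as np

H = np.array([[1, 0, 1, 0, 1, 0, 1],       # parity-check matrix of the [7,4] Hamming code
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
G = np.array([[1, 1, 1, 0, 0, 0, 0],       # generator matrix: rows are codewords (H @ G.T = 0 mod 2)
              [1, 0, 0, 1, 1, 0, 0],
              [0, 1, 0, 1, 0, 1, 0],
              [1, 1, 0, 1, 0, 0, 1]])

def encode(m: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Pick a coset representative with syndrome m, then add a random codeword."""
    coset_rep = np.zeros(7, dtype=int)
    coset_rep[[0, 1, 3]] = m                # columns 0, 1, 3 of H form the identity
    codeword = rng.integers(0, 2, size=4) @ G % 2
    return (coset_rep + codeword) % 2

def decode(x: np.ndarray) -> np.ndarray:
    """The legitimate receiver recovers the message as the syndrome of x."""
    return H @ x % 2

rng = np.random.default_rng(0)
m = np.array([1, 0, 1])
x = encode(m, rng)
print(decode(x))                            # -> [1 0 1]
```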
In recent years, the demand for air travel has increased and many people have traveled by plane. Most passengers, however, feel stressed due to the limited cabin space. In order to make these passengers more comfortable, a personal air-conditioning system for the entire seat is needed, because the human body experiences discomfort from localized heating or cooling, and it is therefore necessary to provide appropriate airflow to each part of the body. In this paper, a personal air-conditioning system consisting of six vertically installed air-conditioning vents is proposed. To clarify the setting temperature of each vent, the airflow around the passenger and the operative temperature of each part of the body are investigated ...
Optimizing the access point (AP) deployment is of great importance in wireless applications owing to the requirement to provide efficient and cost-effective communication. Quality of Service (QoS), highly targeted by many researchers and by industry, is a primary parameter and objective, along with AP placement and overall deployment cost. This study proposes and investigates a multi-level optimization algorithm based on Binary Particle Swarm Optimization (BPSO). It aims at an optimal multi-floor AP placement with effective coverage that makes it more capable of supporting QoS and cost-effectiveness. Five pairs (coverage, AP placement) of weights, signal threshold ...
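The multi-level algorithm itself is not reproduced here; the sketch below shows plain binary PSO applied to a toy single-floor AP placement problem, with a fitness that rewards coverage and penalizes the number of deployed APs. The candidate sites, coverage radius, and weights are illustrative assumptions.

```python
# Binary PSO sketch for AP placement: each particle is a bit vector over candidate
# AP sites (1 = deploy an AP there); fitness rewards covered demand points and
# penalizes AP count. Grid size, radius, and weights are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
sites = rng.uniform(0, 50, size=(20, 2))     # candidate AP locations on a 50x50 m floor
clients = rng.uniform(0, 50, size=(60, 2))   # demand points that need coverage
RADIUS, W_COVER, W_COST = 12.0, 1.0, 0.02    # coverage radius (m) and objective weights

def fitness(bits: np.ndarray) -> float:
    if not bits.any():
        return -1.0
    d = np.linalg.norm(clients[:, None, :] - sites[None, bits.astype(bool), :], axis=2)
    coverage = (d.min(axis=1) <= RADIUS).mean()      # fraction of covered demand points
    return W_COVER * coverage - W_COST * bits.sum()

n_particles, n_iter = 30, 100
pos = rng.integers(0, 2, size=(n_particles, len(sites)))
vel = np.zeros_like(pos, dtype=float)
pbest, pbest_val = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_val.argmax()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = (rng.random(pos.shape) < 1 / (1 + np.exp(-vel))).astype(int)   # sigmoid bit update
    vals = np.array([fitness(p) for p in pos])
    improved = vals > pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmax()].copy()

print("APs deployed:", gbest.sum(), "fitness:", round(fitness(gbest), 3))
```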
The COVID-19 pandemic is sweeping the world and has left a large proportion of the workforce unable to commute to work. This has resulted in employees and employers seeking alternative work arrangements, including in the software industry. The global market and the international presence of many companies then create the need to implement global virtual teams (GVTs). GVT members are increasingly engaged in globalized business environments across space, time, and organizational boundaries via information and communication technologies. Despite the advancement of technology, project managers still face many challenges in communication. Hence, becoming a successful project manager is still a big challenge for them. This study ...
An efficient modification and a novel technique combining the homotopy concept with the Adomian decomposition method (ADM) to obtain an accurate analytical solution for the Riccati matrix delay differential equation (RMDDE) is introduced in this paper. Both methods are very efficient and effective. The whole integral part of the ADM is used instead of the integral part of the homotopy technique. The major feature of the current technique is that it yields a large convergence region for the iterative approximate solutions. The results acquired by this technique give better approximations over a larger region than previous ones. Finally, the results were obtained via an efficient and easy technique that may also be applied to other nonlinear problems.
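For orientation only, the standard Adomian decomposition scheme underlying the combined technique can be sketched in generic operator form (this notation is an assumption and is not quoted from the paper):

```latex
u \;=\; \sum_{n=0}^{\infty} u_n , \qquad
N(u) \;=\; \sum_{n=0}^{\infty} A_n(u_0,\dots,u_n) , \qquad
u_0 = f , \quad u_{n+1} = L^{-1}\!\bigl[A_n\bigr], \; n \ge 0
```

where the A_n are the Adomian polynomials of the nonlinear term N(u) and L^{-1} denotes the inverse of the linear (integral) operator.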