Traumatic spinal cord injury is a serious neurological disorder. Patients experience a wide range of symptoms attributable to the compromised nerve fiber tracts, including limb weakness, sensory impairment, and truncal instability, as well as a variety of autonomic abnormalities. This article discusses how machine learning (ML) classification can be used to characterize the initial impairment and subsequent recovery of electromyography (EMG) signals in a non-human primate model of traumatic spinal cord injury. The ultimate objective is to identify potential treatments for traumatic spinal cord injury. This work focuses specifically on finding a suitable classifier that differentiates between two distinct experimental stages (pre- and post-lesion) using EMG signals. Eight time-domain features were extracted from the collected EMG data. To overcome the imbalanced-dataset issue, the synthetic minority oversampling technique (SMOTE) was applied. Several ML classification techniques were applied, including multilayer perceptron, support vector machine, K-nearest neighbors, and radial basis function network, and their performances were compared. A confusion matrix and five other statistical metrics (sensitivity, specificity, precision, accuracy, and F-measure) were used to evaluate the performance of the generated classifiers. The results showed that the best classifier for the left- and right-side data is the multilayer perceptron, with a total F-measure of 79.5% for the left side and 86.0% for the right side. This work will help to build a reliable classifier that can differentiate between these two phases by utilizing extracted time-domain EMG features.
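The abstract does not enumerate the eight time-domain features used. As an illustrative sketch only, four features commonly computed from windowed EMG (mean absolute value, root mean square, waveform length, and zero crossings; the choice and the `zc_threshold` parameter are assumptions, not taken from the paper):

```python
import numpy as np

def time_domain_features(emg, zc_threshold=0.01):
    """Compute four common time-domain EMG features for one window."""
    mav = np.mean(np.abs(emg))                      # mean absolute value
    rms = np.sqrt(np.mean(emg ** 2))                # root mean square
    wl = np.sum(np.abs(np.diff(emg)))               # waveform length
    # zero crossings whose amplitude change exceeds a small noise threshold
    zc = np.sum((emg[:-1] * emg[1:] < 0) &
                (np.abs(emg[:-1] - emg[1:]) > zc_threshold))
    return np.array([mav, rms, wl, zc])

# Example: features for a simulated 200-sample EMG window
rng = np.random.default_rng(0)
window = rng.normal(0.0, 0.5, 200)
features = time_domain_features(window)
```

A feature vector like this, computed per window and per muscle, is the kind of input the compared classifiers (MLP, SVM, KNN, RBF network) would receive.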
Theoretical studies and the experience of advanced economies have shown that investment expenditure is neither the only factor in nor the main source of production growth; the efficient use of fixed assets is more important in this process. This depends on a group of factors called non-investment economic growth, namely disembodied technical progress of an organizational nature. Its distinguishing feature is that it influences production growth without any large additional investment expenditure or any additional increment in inputs, and such growth cannot be achieved without activating the factors of non-investment economic growth, which continue to exert a negative influence when left inactive.
This research studies the rheological properties (plastic viscosity, yield point, and apparent viscosity) of non-Newtonian fluids under the effect of temperature using different chemical additives: xanthan gum (XC-polymer), carboxymethyl cellulose (high and low viscosity), polyacrylamide, polyvinyl alcohol, starch, quebracho, and chrome lignosulfonate. The samples were prepared by mixing 22.5 g of bentonite with 350 ml of water and adding the additives in four different concentrations (3, 6, 9, and 13 g) using a Hamilton Beach mixer. The rheological properties of the prepared samples were measured using an 8-speed Fann viscometer. All samples were fitted to the Bingham plastic model. The temperature range studied
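For a rotational viscometer of the Fann 35 type, the three Bingham plastic properties named in the abstract follow from the standard API field equations applied to the 600 rpm and 300 rpm dial readings. A minimal sketch (the dial readings below are hypothetical, not data from the study):

```python
def bingham_properties(theta600, theta300):
    """Standard API field equations for two-speed viscometer dial readings."""
    pv = theta600 - theta300        # plastic viscosity, cP
    yp = theta300 - pv              # yield point, lb/100 ft^2
    av = theta600 / 2.0             # apparent viscosity at 600 rpm, cP
    return pv, yp, av

# Hypothetical dial readings for illustration
pv, yp, av = bingham_properties(theta600=46, theta300=30)
# pv = 16 cP, yp = 14 lb/100 ft^2, av = 23.0 cP
```

Repeating this calculation at each test temperature gives the temperature dependence of the three properties for every additive concentration.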
The study aims to identify the effects of dubbed Turkish drama on the public through a survey of a sample of women's views. The study also attempts to monitor the causes and motives behind the act of viewing and to identify its various effects. To achieve these goals, the researcher relies on the descriptive approach, using a questionnaire and interviews to collect data. The study ends with a number of results
Groupwise non-rigid image alignment is a difficult non-linear optimization problem involving many parameters and often large datasets. Previous methods have explored various metrics and optimization strategies. Good results have previously been achieved with simple metrics requiring complex optimization, often with many unintuitive parameters that need careful tuning for each dataset. In this chapter, the problem is restructured to use a simpler, iterative optimization algorithm with very few free parameters. The warps are refined using an iterative Levenberg-Marquardt minimization to the mean, based on updating the locations of a small number of points and incorporating a stiffness constraint. This optimization approach is effective
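The core idea, a Levenberg-Marquardt fit of a few control-point displacements toward the mean with a stiffness penalty, can be sketched in one dimension. Everything here (signal shapes, five control points, the stiffness weight 0.1) is an illustrative assumption, not the chapter's actual formulation:

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(disp, signal, mean_signal, ctrl_x, x, stiffness):
    # Interpolate the control-point displacements to a dense warp field
    warp = x + np.interp(x, ctrl_x, disp)
    warped = np.interp(warp, x, signal)
    data_term = warped - mean_signal
    # Stiffness constraint: penalize bending of the displacement field
    smooth_term = stiffness * np.diff(disp, 2)
    return np.concatenate([data_term, smooth_term])

# Toy example: align a shifted sinusoid to the "mean" signal
x = np.linspace(0, 1, 200)
mean_signal = np.sin(2 * np.pi * x)
signal = np.sin(2 * np.pi * (x - 0.05))      # shifted copy to be aligned
ctrl_x = np.linspace(0, 1, 5)                 # very few free parameters
fit = least_squares(residuals, x0=np.zeros(5), method='lm',
                    args=(signal, mean_signal, ctrl_x, x, 0.1))
```

Because the stiffness term is folded into the residual vector, the standard Levenberg-Marquardt machinery handles both the data fit and the regularization in one least-squares problem.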
The study was performed to investigate Salmonella in meat and to compare the VIDAS UP Salmonella (SPT) assay with traditional isolation methods. Forty-two local and imported meat samples (beef and chicken) were collected from local markets in the city of Baghdad between December 2013 and February 2014. The samples were cultured on enrichment and differential media and examined with VIDAS; isolates were confirmed by cultivation on chromogenic agar, biochemical tests, and the API 20E system, in addition to serological tests. The serotypes were determined at the Central Public Health Laboratory / National Institute of Salmonella. The results showed that contamination in imported meat was higher than in local meat, 11.9% and 2
This paper introduces a non-conventional approach with multi-dimensional random sampling to solve a cocaine abuse model with statistical probability. The mean Latin hypercube finite difference (MLHFD) method is proposed for the first time via hybrid integration of the classical numerical finite difference (FD) formula with the Latin hypercube sampling (LHS) technique to create a random distribution for the model parameters, which are dependent on time t. The LHS technique gives the MLHFD method the advantage of producing fast variation of the parameter values via a number of multidimensional simulations (100, 1000, and 5000). The generated Latin hypercube sample, which is random or non-deterministic in nature, is further integrated with the FD method
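The LHS step described above can be sketched with SciPy's quasi-Monte Carlo module. The three parameters and their ranges below are placeholders, not the cocaine-abuse model's actual parameters; the point is the stratification property that distinguishes LHS from plain random sampling:

```python
import numpy as np
from scipy.stats import qmc

# Draw a Latin hypercube sample for 3 model parameters over 100 runs
sampler = qmc.LatinHypercube(d=3, seed=42)
unit_sample = sampler.random(n=100)           # points in [0, 1)^3

# Scale to hypothetical parameter ranges (illustrative values only)
lower, upper = [0.1, 0.5, 1.0], [0.9, 2.0, 5.0]
params = qmc.scale(unit_sample, lower, upper)

# Stratification: each of the 100 equal-width bins along every
# dimension contains exactly one point
for dim in range(3):
    bins = np.floor(unit_sample[:, dim] * 100).astype(int)
    assert len(np.unique(bins)) == 100
```

Each row of `params` would then drive one FD solve, and averaging the resulting solutions over all rows yields the "mean" in MLHFD.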
Logistic regression is one of the statistical methods used to describe and estimate the relationship between a random response variable (Y) and explanatory variables (X), where the dependent variable is a binary response taking two values (one when a specific event occurs and zero when it does not), such as injured/uninjured or married/unmarried. A large number of explanatory variables can lead to the problem of multicollinearity, which makes the estimates inaccurate. The maximum likelihood method and the ridge regression method were used to estimate a binary-response logistic regression model, adopting the jackknife
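The combination described, ridge-penalized maximum likelihood for a binary logit under multicollinearity, plus a jackknife over the fitted coefficients, can be sketched as follows. The data are synthetic and the penalty strength `C=1.0` is an arbitrary illustration, not a value from the study:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n, p = 60, 5
X = rng.normal(size=(n, p))
X[:, 4] = X[:, 3] + 0.01 * rng.normal(size=n)   # near-collinear columns
y = (X[:, 0] + X[:, 3] + rng.normal(size=n) > 0).astype(int)

# Ridge-penalized maximum likelihood: the L2 penalty shrinks coefficients,
# stabilizing the estimates under multicollinearity
model = LogisticRegression(penalty='l2', C=1.0).fit(X, y)

# Jackknife: refit leaving out one observation at a time
jack = np.array([
    LogisticRegression(penalty='l2', C=1.0)
    .fit(np.delete(X, i, axis=0), np.delete(y, i))
    .coef_.ravel()
    for i in range(n)
])
coef_jack = jack.mean(axis=0)          # jackknife estimate of coefficients
se_jack = np.sqrt((n - 1) / n * ((jack - coef_jack) ** 2).sum(axis=0))
```

The leave-one-out spread gives a resampling-based standard error for each coefficient, which is the usual motivation for pairing the jackknife with a shrinkage estimator.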
Gas hydrate formation is considered one of the major problems facing the oil and gas industry, as it poses a significant threat to the production, transportation, and processing of natural gas. These solid structures can nucleate and agglomerate gradually until a large hydrate cluster forms, which can clog flow lines, chokes, valves, and other production facilities. Thus, an accurate predictive model is necessary for designing natural gas production systems at safe operating conditions and mitigating the issues induced by hydrate formation. In this context, a thermodynamic model for the gas hydrate equilibrium conditions and cage occupancies of N2 + CH4 and N2 + CO2 gas mixtures
The undetected error probability is an important measure for assessing the communication reliability provided by any error coding scheme. Two error coding schemes, namely Joint crosstalk avoidance and Triple Error Correction (JTEC) and JTEC with Simultaneous Quadruple Error Detection (JTEC-SQED), provide both crosstalk reduction and multi-bit error correction/detection features. The available undetected error probability model yields an upper-bound value that does not give an accurate estimate of the reliability provided. This paper presents an improved mathematical model to estimate the undetected error probability of these two joint coding schemes. According to the decoding algorithm, the errors are classified into patterns and their decoding
Paper Type: Review article.
In this paper, previous studies on fuzzy regression are presented. Fuzzy regression is a generalization of the traditional regression model that formulates the relationship between independent and dependent variables in a fuzzy environment. It can be introduced by a non-parametric model as well as a semi-parametric model. Results obtained from the previous studies and their conclusions are put forward in this context. We then suggest a novel method of estimation via new weights instead of the old weights, and introduce another suggestion based on artificial neural networks.
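As background to the reviewed approaches, the classical possibilistic formulation of fuzzy regression (Tanaka-style) reduces to a linear program: fit symmetric triangular fuzzy coefficients with centers a and spreads c, minimizing total spread while every observation stays inside the h-level of the predicted fuzzy output. The sketch below, including the toy data and h = 0.5, is a generic illustration, not any specific model from the review:

```python
import numpy as np
from scipy.optimize import linprog

def tanaka_fuzzy_regression(X, y, h=0.5):
    """Possibilistic (Tanaka-style) fuzzy linear regression via an LP."""
    Xd = np.column_stack([np.ones(len(y)), X])   # add intercept column
    n, p = Xd.shape
    absX = np.abs(Xd)
    k = 1.0 - h
    # Decision vector z = [a_0..a_{p-1}, c_0..c_{p-1}]
    obj = np.concatenate([np.zeros(p), absX.sum(axis=0)])  # total spread
    A_ub = np.vstack([np.hstack([-Xd, -k * absX]),   # y_i <= center + k*spread
                      np.hstack([ Xd, -k * absX])])  # y_i >= center - k*spread
    b_ub = np.concatenate([-y, y])
    bounds = [(None, None)] * p + [(0, None)] * p    # spreads nonnegative
    res = linprog(obj, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    a, c = res.x[:p], res.x[p:]
    return a, c

# Illustrative data (not from the reviewed studies)
X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([2.1, 3.9, 6.2, 7.8])
a, c = tanaka_fuzzy_regression(X, y)
```

Weighted variants of fuzzy regression, such as the new weights suggested here, typically modify either the spread objective or the inclusion constraints of this same LP.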