In this study, a traumatic spinal cord injury (TSCI) classification system is proposed using a convolutional neural network (CNN) with automatically learned features from electromyography (EMG) signals in a non-human primate (NHP) model. A comparison between the proposed classification system and a classical classification method (k-nearest neighbors, kNN) is also presented. Developing such an NHP model with a suitable assessment tool (i.e., a classifier) is a crucial step in detecting the effect of TSCI using EMG, which is expected to be essential in evaluating the efficacy of new TSCI treatments. Intramuscular EMG data were collected from an agonist/antagonist tail muscle pair during the pre- and post-spinal-cord-lesion periods from five Macaca fascicularis monkeys. The proposed classifier is based on a CNN that takes filtered, segmented EMG signals from the pre- and post-lesion periods as inputs, while the kNN is designed using four hand-crafted EMG features. The results suggest that the CNN provides a promising classification technique for TSCI compared with conventional machine learning classification. The kNN with hand-crafted EMG features classified the pre- and post-lesion EMG data with F-measures of 89.7% and 92.7% for the left- and right-side muscles, respectively, while the CNN with the EMG segments classified the data with F-measures of 89.8% and 96.9% for the left- and right-side muscles, respectively. Finally, the proposed deep learning classification model (CNN), with its ability to learn high-level features from EMG segments, shows high potential and promising results for use as a TSCI classification system. Future studies can confirm this finding by considering more subjects.
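The abstract does not name the four hand-crafted EMG features used by the kNN, but a common choice in the EMG literature is mean absolute value (MAV), waveform length (WL), zero crossings (ZC), and slope sign changes (SSC). A minimal sketch of such a feature extractor, assuming those four features, could look like:

```python
def emg_features(segment, threshold=0.01):
    """Four time-domain EMG features (assumed here; the study does not
    list them): MAV, WL, ZC, and SSC for one filtered EMG segment."""
    # Mean absolute value: average rectified amplitude.
    mav = sum(abs(v) for v in segment) / len(segment)
    # Waveform length: cumulative absolute change between samples.
    wl = sum(abs(b - a) for a, b in zip(segment, segment[1:]))
    # Zero crossings: sign changes larger than a small noise threshold.
    zc = sum(1 for a, b in zip(segment, segment[1:])
             if a * b < 0 and abs(a - b) > threshold)
    # Slope sign changes: local extrema exceeding the threshold.
    ssc = 0
    for i in range(1, len(segment) - 1):
        d1 = segment[i] - segment[i - 1]
        d2 = segment[i] - segment[i + 1]
        if d1 * d2 > 0 and (abs(d1) > threshold or abs(d2) > threshold):
            ssc += 1
    return [mav, wl, zc, ssc]
```

Feature vectors computed this way, one per EMG segment, would then feed a kNN classifier, while the CNN branch consumes the filtered segments directly.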
In data transmission, a change in a single bit of the received data may lead to misunderstanding or even disaster. Every bit of the transmitted information has high priority, especially in fields such as the receiver's address. The ability to detect every single-bit change is therefore a key issue in the data transmission field.
The ordinary single-parity detection method can detect an odd number of errors efficiently but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, showed better results but still failed to cope with an increasing number of errors.
Two novel methods were suggested to detect binary bit-change errors when transmitting data over a noisy medium. Those methods were: 2D-Checksum me
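As a concrete illustration of the baseline schemes discussed above (the classical single and two-dimensional parity checks, not the proposed 2D-Checksum method itself), a single parity bit catches any odd number of flipped bits but misses even counts, while 2D parity adds one parity bit per row and per column of a bit block:

```python
def parity_bit(bits):
    """Even parity: XOR of all bits, so the codeword has an even ones count."""
    p = 0
    for b in bits:
        p ^= b
    return p

def single_parity_ok(codeword):
    """True if no error is detected; an even number of flips goes unnoticed."""
    return parity_bit(codeword) == 0

def two_d_parity(block):
    """Row and column parity bits for a rectangular bit matrix."""
    row_par = [parity_bit(row) for row in block]
    col_par = [parity_bit(col) for col in zip(*block)]
    return row_par, col_par

# Single parity detects one flipped bit but misses two.
data = [1, 0, 1, 1]
codeword = data + [parity_bit(data)]
one_flip = [1 ^ codeword[0]] + codeword[1:]
two_flips = [1 ^ codeword[0], 1 ^ codeword[1]] + codeword[2:]
```

Checking `one_flip` reports an error, but `two_flips` passes the single-parity check unnoticed, which is exactly the weakness the abstract describes.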
In this paper, fatigue damage accumulation was studied using several methods, i.e. Corton-Dalon (CD), Corton-Dalon-Marsh (CDM), a new non-linear model, and an experimental method. The fatigue-lifetime predictions based on the two classical methods, Corton-Dalon (CD) and Corton-Dalon-Marsh (CDM), are uneconomic and non-conservative, respectively. However, satisfactory predictions were obtained by applying the proposed non-linear model (present model) to medium carbon steel, compared with the experimental work. Many shortcomings of the two classical methods are related to their inability to take into account surface treatment effects such as shot peening. It is clear that the new model shows a much better and cons
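For reference, one common textbook statement of the Corten-Dolan-type cumulative damage rule (which the CD method above follows in spirit; the exact formulation, constants, and stress levels used in the paper are not shown in this excerpt) predicts life under multi-level block loading from the life N1 at the highest stress level:

```python
def corten_dolan_life(n1, sigma1, blocks, d):
    """Predicted total fatigue life under block loading.

    n1     -- cycles to failure at the highest stress level sigma1
    blocks -- list of (alpha_i, sigma_i): cycle fraction and stress per level
    d      -- empirical material exponent (the value below is hypothetical)

    N = n1 / sum(alpha_i * (sigma_i / sigma1) ** d)
    """
    denom = sum(alpha * (sigma / sigma1) ** d for alpha, sigma in blocks)
    return n1 / denom
```

With all cycles applied at the highest level, the rule reduces to N = N1; adding lower-stress cycles extends the predicted life, which is the behaviour a non-linear accumulation model refines.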
Breast cancer constitutes about one fourth of the registered cancer cases among the Iraqi population (1) and is the leading cause of death among Iraqi women (2). Each year more women are exposed to the vicious ramifications of this disease, which include death if left unmanaged, or the negative sequelae, cosmetic and psychological, that they would experience after undergoing radical mastectomy. The World Health Organization (WHO) documented that early detection and screening, when coupled with adequate therapy, could offer a reduction in breast cancer mortality, noting that the low survival rates in less developed countries, including Iraq, are mainly attributed to the lack of early detection programs couple
Most medical datasets suffer from missing data, due to the expense of some tests or human error while recording them. This issue affects the performance of machine learning models because the values of some features will be missing. Therefore, a specific type of method is needed for imputing these missing data. In this research, the salp swarm algorithm (SSA) is used for generating and imputing the missing values in the Pima Indian diabetes disease (PIDD) dataset; the proposed algorithm is called ISSA. The obtained results showed that the classification performance of three different classifiers, which are support vector machine (SVM), K-nearest neighbour (KNN), and Naïve Bayes
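For readers unfamiliar with SSA, the following is a minimal, generic sketch of the standard salp-chain optimizer (following the usual leader/follower update equations; the imputation-specific fitness function and candidate-value encoding of ISSA are not shown in this excerpt):

```python
import math
import random

def ssa_minimize(fitness, lb, ub, dim, n_salps=20, iters=200, seed=0):
    """Generic salp swarm minimizer over a box [lb, ub]^dim."""
    rng = random.Random(seed)
    salps = [[rng.uniform(lb, ub) for _ in range(dim)] for _ in range(n_salps)]
    best, best_f = None, float("inf")
    for t in range(1, iters + 1):
        # Evaluate the whole chain and track the food source (best so far).
        for pos in salps:
            f = fitness(pos)
            if f < best_f:
                best, best_f = pos[:], f
        # c1 decays over time: exploration early, exploitation late.
        c1 = 2.0 * math.exp(-((4.0 * t / iters) ** 2))
        for i, pos in enumerate(salps):
            if i == 0:
                # Leader moves around the food source.
                for j in range(dim):
                    step = c1 * ((ub - lb) * rng.random() + lb)
                    pos[j] = best[j] + step if rng.random() < 0.5 else best[j] - step
            else:
                # Followers drift toward the salp ahead (chain behaviour).
                for j in range(dim):
                    pos[j] = 0.5 * (pos[j] + salps[i - 1][j])
            # Clamp back into the search box.
            for j in range(dim):
                pos[j] = min(max(pos[j], lb), ub)
    return best, best_f
```

In an imputation setting such as ISSA, each salp position would encode candidate values for the missing entries, and the fitness would score the resulting dataset, e.g. via classifier performance.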
This paper presents a new algorithm in an important research field, namely semantic word similarity estimation. A new feature-based algorithm is proposed for measuring word semantic similarity for the Arabic language, a highly systematic language whose words exhibit elegant and rigorous logic. The score of semantic similarity between two Arabic words is calculated as a function of their common and total taxonomical features. An Arabic knowledge source is employed for extracting the taxonomical features as the set of all concepts that subsume the concepts containing the compared words. Previously developed Arabic word benchmark datasets are used for optimizing and evaluating the proposed algorithm. In this paper,
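As an illustration of the common-versus-total feature idea, a Jaccard-style score over the sets of subsuming concepts could look like the sketch below (the paper's exact combination function may differ, and the concept labels in the test are hypothetical):

```python
def taxonomic_similarity(features_a, features_b):
    """Semantic similarity as the ratio of shared subsuming concepts to
    all subsuming concepts extracted for the two compared words."""
    features_a, features_b = set(features_a), set(features_b)
    common = features_a & features_b      # concepts subsuming both words
    total = features_a | features_b       # all concepts for either word
    return len(common) / len(total) if total else 0.0
```

Identical feature sets score 1.0 and disjoint sets score 0.0, so the measure is bounded and symmetric, which simplifies evaluation against benchmark datasets.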
The provision of safe water for people is a human right; historically, a large number of people have depended on groundwater as a source of water for their needs, such as agricultural, industrial, or human activities. Water resources have recently been affected by organic and/or inorganic contaminants as a result of population growth and increased anthropogenic activity, soil leaching, and pollution. Water resource remediation has become a serious environmental concern, since it has a direct impact on many aspects of people's lives. For decades, the pump-and-treat method has been considered the predominant process for the remediation of groundwater contaminated with organic and inorganic contaminants. On the other hand, this tech
The use of recycled aggregates from construction and demolition waste (CDW) can preserve natural aggregate resources, reduce the demand for landfill, and contribute to a sustainable built environment. Concrete demolition waste has been proven to be an excellent source of aggregates for new concrete production. At the technical, economic, and environmental levels, roller compacted concrete (RCC) applications benefit various civil construction projects. RCC is a homogeneous mixture best described as a zero-slump concrete placed with compacting equipment; it is used in storage areas, dams, and most often as a base for rigid pavements. The mix must be sufficiently dry to support