Recurrent strokes can be devastating, often resulting in severe disability or death. However, nearly 90% of the causes of recurrent stroke are modifiable, which means recurrent strokes can be averted by controlling risk factors, which are mainly behavioral and metabolic in nature. Previous works therefore suggest that a recurrent stroke prediction model could help in minimizing the possibility of a recurrent stroke. Previous works have shown promising results in predicting first-time stroke cases with machine learning approaches; however, there are few works on recurrent stroke prediction using machine learning methods. Hence, this work is proposed to perform an empirical analysis and to investigate the implementation of machine learning algorithms in recurrent stroke prediction models. This research aims to investigate and compare the performance of machine learning algorithms on public recurrent stroke clinical datasets. In this study, the Artificial Neural Network (ANN), Support Vector Machine (SVM), and Bayesian Rule List (BRL) are applied and their performance compared in the domain of recurrent stroke prediction. The results of the empirical experiments show that ANN scores the highest accuracy at 80.00%, followed by BRL with 75.91% and SVM with 60.45%.
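A minimal sketch of such a comparison, assuming scikit-learn and a synthetic dataset standing in for the clinical data (BRL is omitted here, as it requires a specialized rule-list library not assumed to be available):

```python
# Hedged sketch: comparing an ANN and an SVM classifier on synthetic
# binary data; features and labels are placeholders, not stroke data.
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "ANN": MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0),
    "SVM": SVC(kernel="rbf", random_state=0),
}
scores = {}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    scores[name] = accuracy_score(y_te, model.predict(X_te))
    print(f"{name}: {scores[name]:.2%}")
```

On real clinical data, the same loop would simply receive the recurrent-stroke feature matrix and labels in place of the synthetic arrays.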
Wireless sensor nodes are limited in power, computational capability, and memory. This paper suggests a method to reduce the power consumed by a sensor node. The work is based on an analogy between the routing problem and the distribution of an electrical field in a physical medium with a given density of charges. From this analogy a set of partial differential equations (Poisson's equation) is obtained, and a finite difference method is utilized to solve it numerically. A parallel implementation is then presented, based on domain decomposition: the original calculation domain is decomposed into several blocks, each of which is assigned to a processing element. All nodes then execute their computations in parallel.
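A serial sketch of the finite-difference step, assuming a unit square, zero Dirichlet boundaries, and a single point charge (the grid size, iteration count, and charge placement are illustrative; the paper's parallel version would split this grid into blocks, one per processing element):

```python
# Minimal sketch: solving Poisson's equation  lap(phi) = -rho  on a
# unit square with a Jacobi finite-difference iteration.
import numpy as np

n = 32                      # grid points per side (illustrative size)
h = 1.0 / (n - 1)           # grid spacing
rho = np.zeros((n, n))
rho[n // 2, n // 2] = 1.0   # a single point "charge"

phi = np.zeros((n, n))      # potential; boundary rows/columns stay zero
for _ in range(2000):       # fixed iteration count for simplicity
    # Jacobi update: NumPy evaluates the whole right-hand side from the
    # previous iterate before assigning, so old values are used throughout.
    phi[1:-1, 1:-1] = 0.25 * (
        phi[:-2, 1:-1] + phi[2:, 1:-1] +
        phi[1:-1, :-2] + phi[1:-1, 2:] +
        h * h * rho[1:-1, 1:-1]
    )
```

In a domain-decomposed version, each block would run the same update on its sub-array and exchange one row/column of "ghost" values with its neighbors between iterations.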
Assessing the actual accuracy of laboratory devices prior to first use is very important for knowing the capabilities of such devices and employing them across multiple domains. Since the device manual provides information and accuracy values only under laboratory conditions, an actual evaluation process is necessary.
In this paper, the accuracy of the cameras of the laser scanner (Stonex X-300) was evaluated; these cameras are attached to the device and play a supporting role in it. This is particularly important because the device manual does not contain sufficient information about them.
To determine the accuracy of these cameras when used in close range photogrammetry, the laser scanner (Stonex X-300) de
This paper proposes a new method for network self-fault management (NSFM) based on two technologies: an intelligent agent to automate fault management tasks, and Windows Management Instrumentation (WMI) to identify faults faster when resources are independent (different types of devices). The proposed network self-fault management reduces the network traffic load by reducing the requests and responses between server and client, which achieves less downtime for each node when a fault occurs in the client. The performance of the proposed system is measured by three metrics: efficiency, availability, and reliability. A high efficiency average is obtained depending on the faults occurring in the system, which reaches to
In the last few years, the Internet of Things (IoT) has gained remarkable attention in both the academic and industrial worlds. The main goal of the IoT lies in connecting everyday objects with different capabilities to the Internet so that they can share resources and carry out their assigned tasks. Most IoT objects are heterogeneous in terms of available energy, processing ability, memory storage, etc. However, one of the most important challenges facing IoT networks is energy-efficient task allocation. An efficient task allocation protocol in an IoT network should ensure the fair and efficient distribution of resources so that all objects can collaborate dynamically with limited energy. The canonical de
Stock markets move up and down over time, and some companies affect others due to their mutual dependency. In this work, the network model of the stock market is described as a complete weighted graph. This paper aims to investigate the Iraqi stock markets using graph theory tools. The vertices of this graph correspond to the Iraqi market companies, and the weights of the edges are set using the ultrametric distance of the minimum spanning tree.
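A hedged sketch of this construction, assuming NetworkX and synthetic return series in place of the Iraqi market data (the company names and the correlation-to-distance transform shown here are the standard Mantegna-style choices, not taken from the paper):

```python
# Sketch: build a complete weighted graph from correlations between
# synthetic return series, then extract its minimum spanning tree.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
returns = rng.normal(size=(250, 5))        # 250 days, 5 placeholder companies
corr = np.corrcoef(returns.T)              # 5 x 5 correlation matrix
dist = np.sqrt(2.0 * (1.0 - corr))         # correlation -> metric distance

G = nx.Graph()
names = [f"company_{i}" for i in range(5)]
for i in range(5):
    for j in range(i + 1, 5):              # complete graph: every pair
        G.add_edge(names[i], names[j], weight=dist[i, j])

mst = nx.minimum_spanning_tree(G)          # n - 1 = 4 edges connect 5 nodes
print(sorted(mst.edges(data="weight")))
```

The MST keeps only the strongest (shortest-distance) links, which is what makes the resulting hierarchy ultrametric.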
This paper critically examines the studies that investigated Social Network Sites in the Arab region, asking whether they made a practical contribution to the field of information and communication sciences or not. The study tried to resolve the ambiguity arising from the variety of names, and to identify the most important theoretical and methodological approaches used by these studies, highlighting their scientific limitations. The research discussed the most important concepts used by these studies, such as Interactivity, Citizen Journalism, Public Sphere, and Social Capital, and showed the problems of using them, because each concept arises from a specific view of these websites. The importation of these concepts from one cultural and social context to an Ara
This study introduces the effect of using the magnetic abrasive finishing (MAF) method for finishing flat surfaces. The experimental results allow considering the MAF method as a promising one for finishing flat surfaces, forming optimal physical and mechanical properties of the surface layer, removing defective layers, and decreasing the height of micro-irregularities. The characteristics that permit judging surface quality parameters after the MAF method are studied and compared with grinding.
A DEM (Digital Elevation Model) means that the topography of the earth's surface (such as terrain relief and ocean floors) can be described mathematically by elevations as functions of position, either in geographical coordinates (latitude-longitude system) or in rectangular coordinate systems (X, Y, Z). A DEM is therefore an array of numbers that represents the spatial distribution of terrain characteristics. In this paper, contour lines at different intervals were obtained from a high-resolution (1 m) digital elevation model of Al-Khamisah, Thi-Qar Governorate. The altitudes range between 1 m and 8.5 m, so the area is characterized by varying heights within a small spatial region, represented by multiple spots with flat surfaces.
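A toy sketch of the idea, assuming NumPy and a synthetic elevation surface spanning the same 1 m to 8.5 m range (the grid, surface shape, and 1 m contour interval are illustrative, not the Al-Khamisah data):

```python
# Toy sketch: a DEM as a 2-D array of elevations, and the 1 m contour
# levels spanning its range; values are synthetic.
import numpy as np

x = np.linspace(0.0, 1.0, 51)
y = np.linspace(0.0, 1.0, 51)
X, Y = np.meshgrid(x, y)
# A smooth hill rising from ~1 m at the edges to 8.5 m at the center.
dem = 1.0 + 7.5 * np.exp(-((X - 0.5) ** 2 + (Y - 0.5) ** 2) / 0.1)

# Contour levels at a 1 m interval covering the elevation range.
levels = np.arange(np.floor(dem.min()), np.ceil(dem.max()) + 1.0, 1.0)
print("elevation range:", round(dem.min(), 2), "to", round(dem.max(), 2), "m")
print("contour levels:", levels)
```

A plotting library such as matplotlib could then trace `dem` at these `levels` to produce the contour map.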
This paper examines the change in the planning pattern in Lebanon, which relies on vehicles as a near-single mode of transport, and its redirection towards re-shaping the city and introducing concepts of "smooth or flexible" mobility in its schemes: the concept of a "compact city" with an infrastructure based on a flexible mobility culture. Taking into consideration the environmental, economic, and health risks of the existing model, the paper focuses on the four foundations of the concept of a "city based on a flexible mobility culture", and provides a SWOT analysis to encourage a shift in the planning methodology.
Alzheimer’s Disease (AD) is the most prevalent type of dementia. The prevalence of AD is estimated at around 5% after 65 years of age and reaches a staggering 30% for those over 85 in developed countries. AD destroys brain cells, causing people to lose their memory, mental functions, and the ability to continue daily activities. The findings of this study are likely to aid specialists in their decision-making process by using patients’ Magnetic Resonance Imaging (MRI) to distinguish patients with AD from Normal Controls (NC). Performance evaluation was applied to 346 Magnetic Resonance images from the Alzheimer's Disease Neuroimaging Initiative (ADNI) collection. The Deep Belief Network (DBN) classifier was used to fulfill the classification f
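A loose sketch of the DBN idea, assuming scikit-learn: a single Bernoulli RBM feature extractor feeding a logistic classifier (a real DBN stacks several RBMs, and the MRI data is replaced here by scikit-learn's built-in digits dataset):

```python
# Loose stand-in for a Deep Belief Network: one RBM layer learning
# features, followed by a logistic classifier; data is sklearn digits,
# not MRI scans.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import BernoulliRBM
from sklearn.pipeline import Pipeline

X, y = load_digits(return_X_y=True)
X = X / 16.0                        # scale pixel values to [0, 1] for the RBM
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

dbn_like = Pipeline([
    ("rbm", BernoulliRBM(n_components=64, n_iter=10, random_state=0)),
    ("clf", LogisticRegression(max_iter=1000)),
])
dbn_like.fit(X_tr, y_tr)
acc = dbn_like.score(X_te, y_te)
print(f"test accuracy: {acc:.2%}")
```

A full DBN would pre-train several stacked RBMs layer by layer before fine-tuning; on MRI data the input vector would be the flattened (or feature-extracted) image.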