Recurrent strokes can be devastating, often resulting in severe disability or death. However, nearly 90% of the causes of recurrent stroke are modifiable, which means recurrent strokes can be averted by controlling risk factors that are mainly behavioral and metabolic in nature. Previous work therefore suggests that a recurrent stroke prediction model could help minimize the likelihood of a recurrent stroke. Earlier studies have shown promising results in predicting first-time strokes with machine learning approaches, but there is limited work on recurrent stroke prediction using machine learning methods. Hence, this work performs an empirical analysis and investigates the implementation of machine learning algorithms in recurrent stroke prediction models. The research aims to investigate and compare the performance of machine learning algorithms on public recurrent stroke clinical datasets. In this study, an Artificial Neural Network (ANN), a Support Vector Machine (SVM), and a Bayesian Rule List (BRL) are applied and their performance compared in the domain of recurrent stroke prediction. The empirical experiments show that the ANN scores the highest accuracy at 80.00%, followed by BRL with 75.91% and SVM with 60.45%.
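The comparison described above can be sketched with off-the-shelf scikit-learn estimators. The snippet below is a minimal illustration, not the authors' code: the feature matrix X, the labels y, and the feature dimensions are synthetic placeholders for the clinical dataset, and the Bayesian Rule List is omitted because it needs a separate package (e.g. imodels).

```python
# Minimal sketch (assumptions noted above): cross-validated comparison of an
# ANN (MLP) and an SVM on a recurrent-stroke-style tabular dataset.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 12))                       # 12 hypothetical clinical features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=500) > 0).astype(int)  # 1 = recurrent stroke

models = {
    "ANN": make_pipeline(StandardScaler(),
                         MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)),
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf")),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name}: {acc:.2%}")
```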
Assessing the actual accuracy of laboratory devices prior to first use is very important for knowing the capabilities of such devices and employing them in multiple domains. Because the device manual only provides accuracy values obtained under laboratory conditions, an evaluation of the actual accuracy is necessary.
In this paper, the accuracy of the cameras of the Stonex X-300 laser scanner was evaluated; these cameras are attached to the device and play a supporting role in it. This is particularly important because the device manual does not contain sufficient information about them.
To determine the accuracy of these cameras when used in close-range photogrammetry, the Stonex X-300 laser scanning device …
The limitations of wireless sensor nodes are power, computational capability, and memory. This paper suggests a method to reduce the power consumed by a sensor node. The work is based on an analogy between the routing problem and the distribution of an electric field in a physical medium with a given charge density. From this analogy a partial differential equation (Poisson's equation) is obtained, and a finite difference method is utilized to solve it numerically. A parallel implementation is then presented, based on domain decomposition: the original calculation domain is decomposed into several blocks, each of which is given to a processing element. All nodes then execute computations in parallel.
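A minimal serial sketch of the numerical core described above, under stated assumptions: a 5-point finite-difference discretization of Poisson's equation on a unit square solved with Jacobi iteration. The grid size, the point "charge" density rho, and the tolerance are illustrative choices; the paper's parallel version additionally splits this grid into blocks handled by separate processing elements.

```python
# Solve d2u/dx2 + d2u/dy2 = -rho with zero Dirichlet boundaries (serial sketch).
import numpy as np

n = 64                               # grid points per side (assumed)
h = 1.0 / (n - 1)                    # grid spacing
rho = np.zeros((n, n))
rho[n // 2, n // 2] = 1.0 / h**2     # a single point "charge" (e.g. the sink node)

u = np.zeros((n, n))                 # potential field
for _ in range(5000):
    u_new = u.copy()
    # 5-point stencil: u[i,j] = 1/4 * (sum of neighbours + h^2 * rho[i,j])
    u_new[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1] +
                                u[1:-1, :-2] + u[1:-1, 2:] +
                                h**2 * rho[1:-1, 1:-1])
    if np.max(np.abs(u_new - u)) < 1e-8:
        u = u_new
        break
    u = u_new
```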
Stock markets move up and down over time, and some companies affect others due to their mutual dependencies. In this work, the network model of the stock market is described as a complete weighted graph. The paper aims to investigate the Iraqi stock market using graph-theory tools. The vertices of the graph correspond to companies on the Iraqi market, and the weights of the edges are set to the ultrametric distances obtained from the minimum spanning tree.
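The graph construction can be illustrated with a short sketch. This is not the paper's code: the return series is a random placeholder for the Iraqi market data, and the correlation-based distance d_ij = sqrt(2(1 - rho_ij)) is the standard choice for such market graphs, assumed here.

```python
# Build a minimum spanning tree over a complete weighted graph of stock correlations.
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

rng = np.random.default_rng(1)
returns = rng.normal(size=(250, 10))            # 250 days x 10 hypothetical companies
corr = np.corrcoef(returns, rowvar=False)       # pairwise correlation matrix
dist = np.sqrt(np.clip(2.0 * (1.0 - corr), 0.0, None))  # correlation-based distance

mst = minimum_spanning_tree(dist)               # sparse matrix holding the MST edges
rows, cols = mst.nonzero()
for i, j in zip(rows, cols):
    print(f"edge {i} -- {j}: weight {mst[i, j]:.3f}")
```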
This study introduces the effect of using the magnetic abrasive finishing (MAF) method for finishing flat surfaces. The experimental results allow the MAF method to be considered a promising technique for finishing flat surfaces, forming optimal physical and mechanical properties of the surface layer, removing defective layers, and decreasing the height of micro-irregularities. The characteristics that permit judging surface-quality parameters after the MAF method are studied and then compared with grinding.
This study examines the efficiency of student-centered learning (SCL) through Google Classroom in enhancing the readiness of fourth-stage female pre-service teachers. The research employs a quasi-experimental design with a control group and an experimental group to compare the teaching readiness of participants before and after the intervention. The participants were 30 fourth-stage students at the University of Baghdad, College of Education for Women, Department of English; data were collected through an observation checklist to assess their teaching experience and questionnaires to assess their perceptions of using Google Classroom. Two sections were selected, C as the control group and D as the experimental one, each with (…)
Background/Objectives: The purpose of this study was to classify Alzheimer's disease (AD) patients from normal control (NC) subjects using Magnetic Resonance Imaging (MRI). Methods/Statistical analysis: The performance evaluation is carried out on 346 MR images from the Alzheimer's Disease Neuroimaging Initiative (ADNI) dataset. A Deep Belief Network (DBN) classifier is used for the classification task. The network is trained on a sample training set, and the resulting weights are then used to check the system's recognition capability. Findings: As a result, this paper presents a novel automated classification system for AD determination. The suggested method offers good performance; the experiments carried out show that the …
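As a rough stand-in for the classification pipeline, the sketch below stacks a single Restricted Boltzmann Machine feature extractor with a logistic-regression head, since scikit-learn does not ship a full Deep Belief Network; the image matrix, its dimensions, and the labels are placeholders rather than the ADNI data.

```python
# Illustrative RBM + logistic-regression pipeline (not the authors' DBN).
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.neural_network import BernoulliRBM
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(346, 4096))   # 346 images, 64x64 flattened (hypothetical sizes)
y = rng.integers(0, 2, size=346)          # 0 = NC, 1 = AD (placeholder labels)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = Pipeline([
    ("rbm", BernoulliRBM(n_components=256, learning_rate=0.05, n_iter=10, random_state=0)),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```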
This paper examines the change in the planning pattern in Lebanon, which relies on vehicles as a near-exclusive mode of transport, and its redirection towards re-shaping the city and introducing concepts of "smooth" or "flexible" mobility in its schemes: the concept of a "compact city" with an infrastructure based on a flexible-mobility culture. Taking into consideration the environmental, economic, and health risks of the existing model, the paper focuses on the four foundations of the concept of a "city based on a flexible-mobility culture" and provides a SWOT analysis to encourage a shift in the planning methodology.
Deepfake is a type of artificial intelligence used to create convincing image, audio, and video hoaxes; it concerns celebrities and ordinary people alike because such fakes are easy to manufacture. Deepfakes are hard to recognize by people and by current approaches, especially high-quality ones. As a defense against Deepfake techniques, various methods to detect Deepfakes in images have been suggested. Most of them have limitations, such as only working with a single face in an image, or requiring the face to be forward-facing with both eyes and the mouth open, depending on which part of the face they work on. In addition, few focus on the impact of pre-processing steps on the detection accuracy of the models. This paper introduces a framework design focused on this aspect.
The increasing complexity of attacks necessitates innovative intrusion detection systems (IDS) to safeguard critical assets and data. There is a higher risk of cyberattacks such as data breaches and unauthorised access since cloud services have come into more frequent use. The project's goal is to find out how Artificial Intelligence (AI) could enhance the ability of an IDS to identify and classify network traffic and to detect anomalous activities. Online threats can be identified with an IDS, and an IDS is required to keep networks secure. Efficient IDS must also be created for the cloud platform, since it is constantly growing and permeating more aspects of our daily life. However, using standard intrusion …
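A minimal, hedged illustration of the kind of traffic classifier such an AI-assisted IDS might use is sketched below; the flow features, the labels, and the model choice (a random forest) are assumptions for illustration, and real experiments would use a labelled benchmark such as NSL-KDD or CIC-IDS.

```python
# Toy supervised classifier over network-flow features (illustrative only).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
# hypothetical flow features: duration, bytes sent, bytes received, packet rate
X = rng.normal(size=(2000, 4))
y = (X[:, 3] + rng.normal(scale=0.5, size=2000) > 1.0).astype(int)  # 1 = anomalous

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te), target_names=["normal", "anomalous"]))
```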
The ongoing research to improve the clinical outcome of titanium implants has resulted in the implementation of multiple approaches to deliver osteogenic growth factors that accelerate and sustain osseointegration. Here we show the presentation of human bone morphogenetic protein 7 (BMP-7) adsorbed to titanium discs coated with poly(ethyl acrylate) (PEA). We have previously shown that PEA promotes fibronectin organization into nanonetworks that expose integrin- and growth-factor-binding domains, allowing a synergistic interaction at the integrin/growth-factor-receptor level. Here, titanium discs were coated with PEA and fibronectin and then decorated with ng/mL doses of BMP-7. Human mesenchymal stem cells were used to investigate cellular responses …