Recurrent strokes can be devastating, often resulting in severe disability or death. However, nearly 90% of the causes of recurrent stroke are modifiable, which means recurrent strokes can be averted by controlling risk factors that are mainly behavioral and metabolic in nature. Previous work therefore suggests that a recurrent stroke prediction model could help minimize the risk of recurrent stroke. Prior studies have shown promising results in predicting first-time strokes with machine learning approaches, but there is limited work on recurrent stroke prediction using machine learning methods. Hence, this work performs an empirical analysis and investigates the implementation of machine learning algorithms in recurrent stroke prediction models. The research investigates and compares the performance of machine learning algorithms on public clinical datasets of recurrent stroke. In this study, an Artificial Neural Network (ANN), a Support Vector Machine (SVM), and a Bayesian Rule List (BRL) are applied and their performance compared in the domain of recurrent stroke prediction. The empirical experiments show that the ANN scores the highest accuracy at 80.00%, followed by BRL with 75.91% and SVM with 60.45%.
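The comparison described above can be sketched with scikit-learn. This is a minimal illustration, not the paper's pipeline: the data here is synthetic (the study uses a public recurrent-stroke clinical dataset), the hyperparameters are arbitrary, and the Bayesian Rule List model is omitted because scikit-learn has no implementation of it.

```python
# Sketch: comparing an ANN (multilayer perceptron) and an SVM on a binary
# classification task, as in the paper's model comparison. Synthetic data
# stands in for the clinical dataset; BRL is omitted (not in scikit-learn).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "ANN": MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0),
    "SVM": SVC(kernel="rbf", random_state=0),
}
scores = {}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    scores[name] = accuracy_score(y_te, model.predict(X_te))
    print(f"{name}: {scores[name]:.2%}")
```

With a real dataset, the same loop extends naturally to cross-validation and additional metrics (sensitivity, specificity) that matter more than raw accuracy for clinical prediction.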
In this work, a test room measuring (2 × 1.5 × 1.5) m³ was built in Baghdad city, and the solar chimneys (SC) were designed with an aspect ratio (ar) greater than 12. The test room was supplied with several solar collectors: a vertical single-sided air pass with ar equal to 25, and a 45°-tilted double-sided air pass with ar equal to 50 for each pass; both collectors consist of a flat thermal energy storage box collector (TESB) covered by a transparent clear acrylic sheet. A third type of collector is an array of evacuated tubular collectors with a thermosyphon at 45°, installed at the bottom of the TESB of the vertical SC. The TESB was …
The penalized least squares method is a popular method for dealing with high-dimensional data, where the number of explanatory variables is larger than the sample size. The penalized least squares method offers high prediction accuracy and performs estimation and variable selection at once. It yields a sparse model, that is, a model with few variables, which can be interpreted easily. However, penalized least squares is not robust: it is very sensitive to the presence of outlying observations. To deal with this problem, a robust loss function can be used to obtain the robust penalized least squares method and a robust penalized estimator, and …
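The contrast between ordinary and robust penalized least squares can be illustrated with scikit-learn. This sketch pairs the Lasso (squared loss + L1 penalty) with `HuberRegressor` (robust Huber loss + L2 penalty); the paper's robust penalized estimator may use a different loss/penalty combination, and the data below is synthetic.

```python
# Sketch: penalized least squares (Lasso) vs. a robust penalized estimator
# (Huber loss + ridge penalty) on sparse data contaminated with outliers.
# Illustrative only; the specific loss/penalty pairing is an assumption.
import numpy as np
from sklearn.linear_model import Lasso, HuberRegressor

rng = np.random.default_rng(0)
n, p = 100, 20
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]                # sparse true coefficients
y = X @ beta + 0.1 * rng.normal(size=n)
y[:5] += 20.0                              # inject outlying observations

lasso = Lasso(alpha=0.1).fit(X, y)                          # sensitive to outliers
huber = HuberRegressor(alpha=0.1, max_iter=1000).fit(X, y)  # robust loss downweights them

err_lasso = np.linalg.norm(lasso.coef_ - beta)
err_huber = np.linalg.norm(huber.coef_ - beta)
print(f"Lasso coef error:  {err_lasso:.3f}")
print(f"Huber coef error:  {err_huber:.3f}")
```

The robust loss caps the influence of the five contaminated responses, so its coefficient estimates stay much closer to the true sparse vector.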
The objective of this research is to determine the extent to which Iraqi and Arab companies apply sustainability accounting and disclosure criteria, and to analyze the content of the annual financial reports of companies listed in the financial market to determine their compliance with the standards of the Sustainability Accounting Standards Board (SASB). The commitment of telecommunications companies to implementing sustainability issues related to the telecommunications services standard reached a general average of (54%) for the research sample, which means there is an acceptable degree of compliance in applying the standard, with the highest level of reporting against the criterion achieved by Jordan Telec…
The demand for internet applications has increased rapidly. Providing quality of service (QoS) for varied internet applications is a challenging task. One factor that significantly affects QoS is the transport layer, which provides end-to-end data transmission across a network. Currently, the most common transport protocols used by internet applications are TCP (Transmission Control Protocol) and UDP (User Datagram Protocol). There are also more recent transport protocols such as DCCP (Datagram Congestion Control Protocol), SCTP (Stream Control Transmission Protocol), and TFRC (TCP-Friendly Rate Control), which are in the standardization process of the Internet Engineering Task …
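The two classic transport semantics contrasted above can be demonstrated with Python's socket API on the loopback interface. This only illustrates the programming-model difference (connection-oriented byte stream vs. connectionless datagrams), not network-level QoS behaviour; ports are assigned by the OS.

```python
# Minimal loopback sketch: TCP (connection-oriented, reliable byte stream)
# vs. UDP (connectionless datagrams). Illustrates API semantics only.
import socket
import threading

# --- TCP: connection set-up, then a reliable byte stream ---
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))            # port 0: let the OS pick a free port
srv.listen(1)
tcp_port = srv.getsockname()[1]

def echo_once():
    conn, _ = srv.accept()
    conn.sendall(conn.recv(1024))     # echo the received bytes back
    conn.close()

t = threading.Thread(target=echo_once)
t.start()
cli = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
cli.connect(("127.0.0.1", tcp_port))
cli.sendall(b"hello tcp")
tcp_reply = cli.recv(1024)
cli.close(); t.join(); srv.close()

# --- UDP: no connection; each send is an independent datagram ---
udp_srv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
udp_srv.bind(("127.0.0.1", 0))
udp_cli = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
udp_cli.sendto(b"hello udp", udp_srv.getsockname())
udp_data, _ = udp_srv.recvfrom(1024)
udp_srv.close(); udp_cli.close()

print(tcp_reply, udp_data)
```

TCP's three-step pattern (listen, connect, exchange) is exactly the overhead that datagram protocols like UDP and DCCP trade away for lower latency.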
Face recognition and emotion recognition are important bases for human-machine interaction. To recognize a person's face and emotion, different algorithms have been developed and tested. In this paper, an enhanced face and emotion recognition algorithm is implemented based on deep learning neural networks. A universal database and personal images were used to test the proposed algorithm, which was implemented in the Python programming language.
Evaluation of the “Holy Quran & Islamic Education” Curriculum for the Second Intermediate Stage from the Perspectives of Teachers and Supervisors of the Subject: A Field Study Conducted in the Directorates-General of Education in Baghdad Governorate. It goes without saying that educational curricula for students at all stages of schooling are in great need of review, evaluation, and revision. The Islamic education curriculum is no exception, since it is a basic subject that plays a role in developing the individual’s moral and conscientious aspects, promotes his/her inner discipline, and helps establish coherence with the value system of the community to which he/she belongs.
Based on the foregoing, the evaluation process of …
The research aimed to measure the compatibility of big data with the organizational ambidexterity dimensions of the Asia cell mobile telecommunications company in Iraq, in order to determine the possibility of adopting the big data triple as an approach to achieving organizational ambidexterity.
The study adopted the descriptive analytical approach to collect and analyze data gathered with a questionnaire instrument developed on a Likert scale, after a comprehensive review of the literature related to the two basic study dimensions. The data were subjected to many statistical treatments in accordance with res…
Bearing capacity of soil is an important factor in designing shallow foundations; it directly determines foundation dimensions and consequently foundation performance. Calculating the bearing capacity of a soil requires many varying parameters, for example soil type, depth of foundation, and unit weight of soil, which makes the calculation highly parameter-dependent. This paper presents a comparison between the theoretical equation stated by Terzaghi and an Artificial Neural Network (ANN) technique for estimating the ultimate bearing capacity of a strip shallow footing on sandy soils. The results show very good agreement between the theoretical solution and the ANN technique. Results revealed that us…
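The theoretical baseline referred to above is Terzaghi's classical expression for a strip footing, q_ult = c·N_c + γ·D_f·N_q + 0.5·γ·B·N_γ. The sketch below evaluates it with the bearing capacity factors passed in as inputs (they depend on the friction angle and are read from Terzaghi's published charts); the numeric values are illustrative, not from the paper's dataset.

```python
# Terzaghi's ultimate bearing capacity for a strip footing:
#   q_ult = c*Nc + gamma*Df*Nq + 0.5*gamma*B*Ngamma
# Nc, Nq, Ngamma depend on the soil friction angle and are taken as given
# here (e.g. from Terzaghi's charts); inputs below are illustrative only.
def terzaghi_strip(c, gamma, Df, B, Nc, Nq, Ngamma):
    """Ultimate bearing capacity (kPa) of a strip footing.

    c     -- soil cohesion (kPa); ~0 for the sandy soils in the study
    gamma -- unit weight of soil (kN/m^3)
    Df    -- foundation depth (m)
    B     -- footing width (m)
    """
    return c * Nc + gamma * Df * Nq + 0.5 * gamma * B * Ngamma

# Example: cohesionless sand, phi = 30 deg (factor values approximate)
q_ult = terzaghi_strip(c=0.0, gamma=18.0, Df=1.0, B=2.0,
                       Nc=37.2, Nq=22.5, Ngamma=19.7)
print(round(q_ult, 1), "kPa")
```

An ANN trained on (soil type, depth, unit weight, width) inputs effectively learns this mapping directly from data, which is why close agreement with the closed-form equation is the expected outcome.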
Fossil fuel consumption is increasing globally, as it represents the main source of energy around the world, and the sources of heavy oil exceed those of light oil; different techniques are therefore used to reduce the viscosity and increase the mobility of heavy crude oil. This study focuses on experimental tests and modeling, with a Back Feed Forward Artificial Neural Network (BFF-ANN), of the dilution technique for reducing the viscosity of a heavy oil collected from south Iraq oil fields using organic solvents: organic diluents at different weight percentages (5, 10 and 20 wt.%) of n-heptane, toluene, and mixtures of different ratios …
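A common first-order estimate for the viscosity of a diluted heavy oil, often used as a baseline before fitting a data-driven model such as the ANN above, is the Arrhenius (logarithmic) mixing rule, ln μ_blend = Σ w_i ln μ_i. The component viscosities below are illustrative assumptions, not the study's measured values.

```python
# Arrhenius (logarithmic) mixing rule for a heavy oil/solvent blend:
#   ln(mu_blend) = sum_i w_i * ln(mu_i),  w_i = mass fractions.
# The viscosities below are illustrative, not the paper's measurements.
import math

def arrhenius_blend_viscosity(fractions, viscosities):
    """Blend viscosity from mass fractions and component viscosities (cP)."""
    return math.exp(sum(w * math.log(mu) for w, mu in zip(fractions, viscosities)))

mu_heavy, mu_solvent = 5000.0, 0.56        # cP; assumed example values
mus = []
for wt in (0.05, 0.10, 0.20):              # the diluent ratios from the study
    mu = arrhenius_blend_viscosity([1 - wt, wt], [mu_heavy, mu_solvent])
    mus.append(mu)
    print(f"{wt:.0%} solvent -> {mu:.0f} cP")
```

Even this simple rule reproduces the qualitative trend, a steep viscosity drop with small solvent fractions, that the experimental dilution tests quantify.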
Steady-state laminar mixed convection and radiation through an inclined rectangular duct with an interior circular tube is investigated numerically for a thermally and hydrodynamically fully developed flow. The two heat transfer mechanisms, convection and radiation, are treated independently and simultaneously. The governing equations are the continuity, momentum, and energy equations. These equations are normalized and solved using the vorticity-stream function and Body-Fitted Coordinates (B.F.C.) methods. The finite difference approach with the Line Successive Over-Relaxation (LSOR) method is used to obtain all the computational results, and the B.F.C. method is used to generate the grid of the problem. A computer program (Fortr…
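The finite-difference plus successive over-relaxation scheme named above can be illustrated on the simplest elliptic problem. This minimal sketch applies point SOR (not the line variant used in the paper) to the 2-D Laplace equation on a unit square with one hot wall; the grid, boundary conditions, and relaxation factor are illustrative assumptions.

```python
# Minimal finite-difference + SOR illustration: 2-D Laplace equation on a
# square with the top wall held at T=1 and the other walls at T=0. This is
# point SOR, not the LSOR variant in the paper; grid/BCs are illustrative.
import numpy as np

n, omega, tol = 21, 1.8, 1e-6
T = np.zeros((n, n))
T[0, :] = 1.0                      # hot top wall; other walls stay at 0

for sweep in range(10000):
    max_change = 0.0
    for i in range(1, n - 1):
        for j in range(1, n - 1):
            gs = 0.25 * (T[i+1, j] + T[i-1, j] + T[i, j+1] + T[i, j-1])
            new = (1 - omega) * T[i, j] + omega * gs
            max_change = max(max_change, abs(new - T[i, j]))
            T[i, j] = new          # over-relaxed Gauss-Seidel update in place
    if max_change < tol:
        break

print(sweep, "sweeps; centre temperature =", T[n // 2, n // 2])
```

By superposition the exact centre value is 0.25, so the solver's result there is a quick self-check. Line SOR, as in the paper, solves a whole grid line per step with a tridiagonal system, which converges faster on strongly anisotropic grids.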