Heart disease is a significant health condition that ranks as the leading cause of death in many countries. To aid physicians in diagnosing cardiovascular diseases, clinical datasets are available for reference. However, with the rise of big data and large medical datasets, it has become increasingly challenging for medical practitioners to accurately predict heart disease, because the abundance of unrelated and redundant features increases computational complexity and hinders accuracy. This study therefore aims to identify the most discriminative features within high-dimensional datasets, minimizing complexity and improving accuracy through an Extra Tree-based feature selection technique. The study assesses the efficacy of several classification algorithms on four reputable datasets, using both the full feature set and the reduced feature subset selected through the proposed method. The results show that the feature selection technique achieves outstanding classification accuracy, precision, and recall, reaching 97% accuracy when used with the Extra Tree classifier. The research reveals the promising potential of the feature selection method for improving classifier accuracy by focusing on the most informative features while decreasing the computational burden.
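The Extra Tree-based selection described above can be sketched with scikit-learn. This is a minimal illustration on a synthetic stand-in dataset, not the study's actual pipeline; the dataset, estimator sizes, and the "median" threshold are assumptions for demonstration.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.feature_selection import SelectFromModel
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for a clinical dataset: 20 features, few informative.
X, y = make_classification(n_samples=500, n_features=20,
                           n_informative=5, n_redundant=10,
                           random_state=0)

# Rank features by Extra Trees impurity-based importance and keep
# only those at or above the median importance.
selector = SelectFromModel(
    ExtraTreesClassifier(n_estimators=200, random_state=0),
    threshold="median")
X_reduced = selector.fit_transform(X, y)

# Compare classifier accuracy on the full vs. reduced feature sets.
clf = ExtraTreesClassifier(n_estimators=200, random_state=0)
acc_full = cross_val_score(clf, X, y, cv=5).mean()
acc_reduced = cross_val_score(clf, X_reduced, y, cv=5).mean()
```

The reduced subset typically matches or exceeds full-feature accuracy here while roughly halving the input dimensionality, which mirrors the trade-off the abstract describes.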
Metaheuristics in the swarm intelligence (SI) class have proven efficient and have become popular methods for solving different optimization problems. Based on their usage of memory, metaheuristics can be classified into algorithms with memory and algorithms without memory (memory-less). The absence of memory in some metaheuristics leads to the loss of information gained in previous iterations: such metaheuristics tend to drift away from promising areas of the solution search space, which leads to non-optimal solutions. This paper aims to review memory usage and its effect on the performance of the main SI-based metaheuristics. The investigation covers SI metaheuristics, memory-based and memory-less metaheuristics, and memory characteristics…
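The role of memory in SI metaheuristics can be illustrated with particle swarm optimization, where each particle remembers its personal best position (pbest) and the swarm remembers the global best (gbest). This is a minimal NumPy sketch on the sphere function; the parameter values are conventional choices, not taken from the paper.

```python
import numpy as np

def pso_sphere(n_particles=30, dim=5, iters=200, seed=0):
    """Minimal PSO on the sphere function f(x) = sum(x^2),
    illustrating explicit memory: pbest (per particle) and
    gbest (swarm-wide) retain information across iterations."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))   # positions
    v = np.zeros_like(x)                         # velocities
    f = (x ** 2).sum(axis=1)
    pbest, pbest_f = x.copy(), f.copy()          # per-particle memory
    g = pbest[pbest_f.argmin()].copy()           # swarm-level memory
    w, c1, c2 = 0.7, 1.5, 1.5                    # inertia, cognitive, social
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        # Velocity update pulls each particle toward both memories.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        f = (x ** 2).sum(axis=1)
        improved = f < pbest_f                   # refresh memories
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        g = pbest[pbest_f.argmin()].copy()
    return (g ** 2).sum()

best = pso_sphere()
```

Dropping the `pbest`/`gbest` terms would make the update memory-less, and the swarm would lose exactly the accumulated search information the abstract warns about.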
In the current worldwide health crisis produced by coronavirus disease (COVID-19), researchers and medical specialists began looking for new ways to tackle the epidemic. According to recent studies, machine learning (ML) has been effectively deployed in the health sector. Medical imaging sources (radiography and computed tomography) have aided in the development of artificial intelligence (AI) strategies to tackle the coronavirus outbreak. As a result, a classical machine learning approach for coronavirus detection from computerized tomography (CT) images was developed. In this study, a convolutional neural network (CNN) model was used for feature extraction and a support vector machine (SVM) for the classification of axial…
A single mode–no core–single mode fiber structure with a section of tuned no-core fiber (NCF) diameter for sensing changes in relative humidity has been experimentally demonstrated. The sensor performance with tuned NCF diameter was investigated to maximize the evanescent fields. Tuned diameters of 100, 80, and 60 μm were obtained by a chemical etching process based on hydrofluoric acid immersion. The highest wavelength sensitivity of 184.57 pm/%RH was obtained in the RH range of 30%–100% when the no-core fiber diameter was 60 μm, and the sensor response was measured in real time.
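Wavelength sensitivity of this kind is the slope of wavelength shift versus relative humidity. A short sketch of how it could be estimated from calibration points with a linear fit; the data below are idealized stand-ins, not the paper's measurements.

```python
import numpy as np

# Hypothetical calibration: wavelength shift (pm) vs. relative
# humidity (%RH), covering the reported 30%-100% range.
rh = np.array([30, 40, 50, 60, 70, 80, 90, 100], dtype=float)
shift_pm = 184.57 * (rh - 30.0)  # idealized linear response

# Sensitivity is the slope of the linear fit, in pm per %RH.
slope, intercept = np.polyfit(rh, shift_pm, 1)
```

On real calibration data the fit's residuals would also indicate how linear the sensor response is over the measured RH range.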
In this study, multi-objective optimization is utilized to optimize a turning operation and reveal the appropriate levels of its process features. The goal of this work is to evaluate the optimal combination of cutting parameters, such as feed, spindle speed, inclination angle, and workpiece material, to achieve the best surface quality. A Taguchi L9 mixed orthogonal array has been adopted to optimize surface roughness. Three rods, one for each of the three metals and each about 200 mm long, were used for this work; each rod was divided into three 50 mm parts. For brass, the optimum parametric mix for minimum Ra is A1, B1, and C3, i.e., a tool inclination angle of 5°, a feed rate of 0.01, and a spindle speed of 120…
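Taguchi analysis of surface roughness typically ranks parameter combinations by the smaller-the-better signal-to-noise ratio, S/N = -10·log10(mean(y²)). A brief sketch with hypothetical Ra replicates; the values are illustrative, not the study's measurements.

```python
import numpy as np

def sn_smaller_is_better(y):
    """Taguchi smaller-the-better signal-to-noise ratio (dB):
    S/N = -10 * log10(mean(y^2)). Higher S/N = lower roughness."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

# Hypothetical Ra replicates (um) for two L9 parameter combinations.
sn_a = sn_smaller_is_better([0.80, 0.85, 0.78])
sn_b = sn_smaller_is_better([1.60, 1.55, 1.65])
# The combination with the higher S/N ratio (here the first) would
# be selected as the optimum parametric mix for minimum Ra.
```

Averaging these S/N values per factor level is what yields statements like "the optimum mix is A1, B1, C3" in the abstract.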
Value-chain analysis can be used in the provision of financial information so that information quality meets and satisfies the needs of its users, particularly investors and lenders. Identifying users' needs for financial information, and understanding how their behavior is influenced by that information, allows the accounting profession to focus on improving its function in order to achieve its goal of satisfying those needs and rationalizing users' decisions. Accounting thought has found fertile ground in user preferences as one of the entrances to positive theorizing, which is based on the need to include knowledge in accounting hypotheses that explain…
Crime is a threat to any nation's security administration and jurisdiction. Crime analysis has therefore become increasingly important, since it assigns the time and place of crimes based on collected spatial and temporal data. However, older techniques, such as paperwork, investigative judges, and statistical analysis, are not efficient enough to accurately predict the time and location of crimes. When machine learning and data mining methods were deployed in crime analysis, prediction accuracy increased dramatically. In this study, various types of criminal analysis and prediction using several machine learning and data mining techniques, based on…
<p>Currently, breast cancer is one of the most common cancers and a leading cause of death among women worldwide, particularly in developing countries such as Iraq. Our work aims to predict whether a tumor is benign or malignant through models built using logistic regression and neural networks, and we hope it will help doctors detect the type of breast tumor. Four models were built using binary logistic regression and two different types of artificial neural networks, namely the multilayer perceptron (MLP) and the radial basis function (RBF) network. The validated and trained models were evaluated using several performance metrics such as accuracy, sensitivity, specificity, and AUC (area under the receiver operating characteristic curve)…</p>
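The logistic-regression half of such a pipeline can be sketched with scikit-learn on its bundled Wisconsin breast cancer dataset, which stands in for the study's own data; the split ratio and solver settings are assumptions, and the metrics shown (accuracy, AUC) are a subset of those the abstract lists.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Wisconsin breast cancer data: binary benign/malignant labels.
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# Standardize features, then fit a binary logistic regression.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_tr, y_tr)

acc = accuracy_score(y_te, model.predict(X_te))
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
```

The MLP and RBF models from the abstract could be compared against this baseline with the same held-out metrics.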
This study aims to employ modern spatial simulation models to predict the future growth of Al-Najaf city for the year 2036 by studying land-use change over the period 1986–2016, because of its importance in shaping future policy for the planning and decision-making process and in ensuring a sustainable urban future. Geographic information system and remote sensing software (GIS, IDRISI Selva) were used, as they are appropriate tools for exploring spatio-temporal changes from the local level to the global scale. The Markov chain model, a popular model that calculates the probability of future change based on the past, and the Cellular Automata…
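The Markov-chain step of such a projection multiplies current land-use shares by a transition probability matrix estimated from past maps. A minimal NumPy sketch; the classes, matrix entries, and area shares below are illustrative assumptions, not values derived from the Al-Najaf data.

```python
import numpy as np

# Hypothetical land-use classes: urban, agricultural, barren.
# Transition probabilities per 10-year step (each row sums to 1),
# as would be estimated from two past classified maps.
P = np.array([[0.95, 0.03, 0.02],   # urban mostly stays urban
              [0.20, 0.70, 0.10],   # agricultural under urban pressure
              [0.15, 0.05, 0.80]])  # barren slowly converts

state_2016 = np.array([0.30, 0.45, 0.25])  # area shares in 2016

# Project the 2036 shares: two 10-year steps of the chain.
state_2036 = state_2016 @ np.linalg.matrix_power(P, 2)
```

A Cellular Automata layer would then spatially allocate these projected quantities based on neighborhood and suitability rules, which is what the CA-Markov combination in IDRISI does.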