In recent years, predicting heart disease has become one of the most demanding tasks in medicine; roughly one person dies from heart disease every minute. Within healthcare, data science is critical for analyzing large amounts of data. Because predicting heart disease is such a difficult task, it is necessary to automate the process in order to avoid the risks connected with it and to assist health professionals in diagnosing heart disease accurately and rapidly. In this article, an efficient machine learning-based diagnosis system has been developed for the diagnosis of heart disease. The system is designed using machine learning classifiers such as Support Vector Machine (SVM), Naïve Bayes (NB), and K-Nearest Neighbor (KNN). The proposed work relies on the heart disease dataset from the UCI (University of California, Irvine) repository. This dataset is preprocessed before running the machine learning models to obtain better accuracy in the classification of heart disease. Furthermore, 5-fold cross-validation was employed to avoid the same samples being selected in both the training and testing phases. The experimental results show that the Naïve Bayes algorithm achieved the highest accuracy, 97%, compared to the other ML algorithms implemented.
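As a rough illustration of the evaluation protocol described above (a sketch, not the authors' code), the snippet below runs the three named classifiers under 5-fold cross-validation with scikit-learn; the file name heart.csv and the target column name are assumptions about how the UCI data might be stored locally.

```python
# Minimal sketch: 5-fold cross-validation of SVM, Naive Bayes, and KNN on a
# UCI-style heart-disease table. "heart.csv" and the "target" column name are
# hypothetical; any table with numeric features and a binary label would do.
import pandas as pd
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

df = pd.read_csv("heart.csv")
X, y = df.drop(columns=["target"]), df["target"]

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
models = {
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf")),
    "Naive Bayes": GaussianNB(),
    "KNN": make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5)),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=cv, scoring="accuracy")
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```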
The study of fixed points of maps satisfying certain contraction conditions has several applications and has been the focus of numerous research endeavors. On the other hand, the best proximity point (ƁƤƤ) emerges as an extension of the idea of the best approximation. The best approximation theorem ensures the existence of an approximate solution; the best proximity point theorem is considered for addressing the problem in order to arrive at an optimal approximate solution. This paper introduces a new kind of proximal contraction mapping and establishes the best proximity point theorem for such a mapping in fuzzy normed space. In the beginning, the concept of the best proximity point is introduced. The concept of prox…
The best proximity point is a generalization of a fixed point that is useful when the contraction map is not a self-map. On the other hand, best approximation theorems offer an approximate solution to the fixed point equation Tx = x; they are used to address the problem in order to arrive at a good approximation. This paper's main purpose is to introduce new types of proximal contraction for nonself mappings in fuzzy normed space and then to prove the best proximity point theorem for these mappings. At first, the definition of fuzzy normed space is given. Then the notions of the best proximity point and α̃-proximal admissibility in the context of fuzzy normed space are presented. The notion of α̃-ψ̃-proximal contractive mapping is introduced.
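To make the optimization interpretation concrete, the following sketch states the standard metric-space formulation of a best proximity point; the fuzzy-normed-space analogue used in these papers replaces the metric with a fuzzy norm, and the exact notation may differ.

```latex
% Sketch: best proximity point in the standard metric-space setting.
% For nonempty subsets $A, B$ of a metric space $(X, d)$, let
\[
  d(A,B) \;=\; \inf\{\, d(a,b) : a \in A,\ b \in B \,\}.
\]
% For a nonself map $T \colon A \to B$, the equation $Tx = x$ may have no
% solution, so one instead minimizes the error $d(x, Tx)$ over $x \in A$.
% A point $x^{*} \in A$ is a best proximity point of $T$ if
\[
  d(x^{*}, Tx^{*}) \;=\; d(A,B),
\]
% which reduces to an ordinary fixed point whenever $d(A,B) = 0$.
```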
The security of message information has drawn increasing attention nowadays, so cryptography has been used extensively. This research aims to generate secured cipher keys from retina information to increase the level of security. The proposed technique utilizes cryptography based on retina information. The main contribution is an original procedure that generates three types of keys in one system from the retina vessel end positions, improving on the approach of using three separate systems, each with one key. The distances between the center of the diagonals of the retina image and the retina vessel ends (diagonal center-end (DCE)) represent the first key. The distances between the center of the radius of the retina and the retina vessel ends (ra…
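A minimal sketch of the distance-to-key idea described above (not the paper's implementation): it assumes vessel endpoint coordinates have already been extracted from a segmented retina image, computes Euclidean distances from a reference center point, and hashes the quantized distance vector into a fixed-size key. The function name, the quantization, and the SHA-256 step are illustrative assumptions.

```python
# Sketch of "distances from a reference point to vessel ends -> cipher key".
# Assumes vessel endpoints were already extracted from a segmented retina image;
# SHA-256 is only an illustrative way to turn distances into a fixed-size key.
import hashlib
import numpy as np

def distances_to_key(endpoints, center):
    """endpoints: (N, 2) array of (x, y) vessel-end coordinates,
    center: (x, y) reference point (e.g., the diagonal center of the image)."""
    endpoints = np.asarray(endpoints, dtype=float)
    dists = np.linalg.norm(endpoints - np.asarray(center, dtype=float), axis=1)
    # Quantize so small segmentation jitter does not change the derived key.
    quantized = np.round(dists).astype(np.int32)
    return hashlib.sha256(quantized.tobytes()).hexdigest()

# Toy usage with made-up endpoint coordinates.
ends = [(12, 40), (103, 77), (64, 210), (180, 150)]
print(distances_to_key(ends, center=(128, 128)))
```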
The laser micro-cutting process is the most widely applied machining process, as it can be applied to practically all metallic and non-metallic materials. Nevertheless, it faces challenges in cutting quality criteria such as geometrical precision, surface quality, and numerous others. This article investigates the laser micro-cutting of PEEK composite material using a nano-fiber laser, owing to the significant importance and efficiency of lasers in various manufacturing processes. A design-of-experiments tool based on Response Surface Methodology (RSM) with a Central Composite Design (CCD) was used to generate the statistical model. This method was employed to analyze the influence of parameters including laser speed, …
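As a sketch of the RSM analysis step (not the study's actual design or data), the snippet below fits a full second-order response surface to a small central-composite-style experiment; the second factor name and all numerical values are invented for illustration.

```python
# Sketch: fitting a second-order (quadratic) response-surface model, as in RSM.
# The second factor and every value below are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures
from sklearn.pipeline import make_pipeline

# Coded factor levels of a (hypothetical) two-factor central composite design:
# columns = [laser speed, laser power], rows = experimental runs.
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
              [-1.414, 0], [1.414, 0], [0, -1.414], [0, 1.414],
              [0, 0], [0, 0], [0, 0]])
# Measured response per run, e.g. kerf width or surface roughness (made up).
y = np.array([4.1, 3.2, 4.8, 3.6, 4.5, 3.0, 3.9, 4.4, 3.7, 3.8, 3.6])

# Full quadratic model: linear, interaction, and squared terms.
rsm = make_pipeline(PolynomialFeatures(degree=2, include_bias=False),
                    LinearRegression())
rsm.fit(X, y)
print("R^2 on the design points:", rsm.score(X, y))
print("Predicted response at the center point:", rsm.predict([[0, 0]])[0])
```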
Project delays are among the most persistent problems confronting the construction sector, owing to the sector's complexity and the interdependence of its underlying delay risk sources. Machine learning provides a powerful set of techniques that can tackle such complex systems. The study aimed to identify and develop a well-organized predictive data tool that examines and learns from delay sources based on historical data from construction projects, using decision tree and naïve Bayesian classification algorithms. An intensive review of available data has been conducted to explore the real reasons and causes of construction project delays. The results show that the postpo…
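A hedged sketch of the classification step (not the study's data or code): it encodes a few hypothetical categorical delay-cause attributes of past projects and cross-validates the two classifier families named above to predict whether a project was delayed.

```python
# Sketch: decision tree and naive Bayes on hypothetical, categorical
# delay-cause attributes of past construction projects (toy data).
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import CategoricalNB
from sklearn.preprocessing import OrdinalEncoder
from sklearn.tree import DecisionTreeClassifier

# Invented toy records; the column names are illustrative assumptions.
df = pd.DataFrame({
    "owner_payment_issues": ["yes", "no", "yes", "no", "yes", "no", "yes", "no"],
    "design_changes":       ["many", "few", "many", "few", "few", "many", "many", "few"],
    "material_shortage":    ["yes", "no", "no", "no", "yes", "yes", "no", "no"],
    "delayed":              [1, 0, 1, 0, 1, 1, 0, 0],
})
X = OrdinalEncoder().fit_transform(df.drop(columns=["delayed"]))
y = df["delayed"]

for name, model in [("Decision tree", DecisionTreeClassifier(max_depth=3)),
                    ("Naive Bayes", CategoricalNB())]:
    acc = cross_val_score(model, X, y, cv=4).mean()
    print(f"{name}: mean CV accuracy = {acc:.2f}")
```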
The field of autonomous robotic systems has advanced tremendously in the last few years, allowing such systems to perform complicated tasks in various contexts. One of the most important and useful applications of guide robots is supporting the blind. Successful implementation of this work requires an accurate and robust self-localization system for guide robots in indoor environments. This paper proposes a self-localization system for guide robots. To implement it, images were collected from the perspective of a robot inside a room, and a deep learning model, namely a convolutional neural network (CNN), was used. An image-classification-based self-localization system for the guide robot delivers a more accura…
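A minimal sketch of the kind of CNN image classifier described (not the paper's architecture): a small Keras model that maps a robot camera image to one of several discrete room locations. The input resolution and the number of location classes are assumptions.

```python
# Sketch: a small CNN that classifies a robot camera image into one of
# NUM_LOCATIONS discrete indoor positions (all sizes below are assumptions).
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_LOCATIONS = 9          # e.g., a 3x3 grid of positions inside the room
IMG_SHAPE = (128, 128, 3)  # assumed camera image resolution

model = models.Sequential([
    layers.Input(shape=IMG_SHAPE),
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.GlobalAveragePooling2D(),
    layers.Dense(64, activation="relu"),
    layers.Dense(NUM_LOCATIONS, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# Training would use images collected from the robot's viewpoint, e.g.:
# model.fit(train_images, train_location_labels, epochs=20, validation_split=0.2)
```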
Interval methods for verified integration of initial value problems (IVPs) for ODEs have been used for more than 40 years. For many classes of IVPs, these methods have the ability to compute guaranteed error bounds for the flow of an ODE, where traditional methods provide only approximations to a solution. Overestimation, however, is a potential drawback of verified methods. For some problems, the computed error bounds become overly pessimistic, or integration even breaks down. The dependency problem and the wrapping effect are particular sources of overestimations in interval computations. Berz (see [1]) and his co-workers have developed Taylor model methods, which extend interval arithmetic with symbolic computations. The latter is an ef…
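To illustrate the dependency problem mentioned above, the snippet below uses a minimal interval class (a sketch, not a verified-integration library): evaluating x - x over an interval yields a wide enclosure instead of [0, 0], because the two occurrences of x are treated as independent.

```python
# Minimal interval arithmetic, just enough to show the dependency problem.
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        # Worst-case bounds: the operands are treated as independent even when
        # they are the same variable -- this is the dependency problem.
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

x = Interval(1.0, 2.0)
print("x - x     =", x - x)        # [-1.0, 1.0], although the true range is {0}
print("x + x - x =", x + x - x)    # [0.0, 3.0], overestimating the true range [1, 2]
```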