The electrical activity of the heart and the electrocardiogram (ECG) signal are fundamentally related. In the published literature, the ECG signal has been examined and applied to a wide range of problems: heart-rate monitoring, heart-rhythm analysis, detection and diagnosis of cardiac disease, identification of emotional states, and biometric identification, among others. The analysis of ECG data may involve several phases, depending on the type of study being done; preprocessing, feature extraction, feature selection, feature transformation, and classification are frequently included. Each stage must be completed carefully for the analysis to proceed smoothly, and accurate success metrics together with a well-constructed ECG signal database are further prerequisites. ECG segmentation and feature extraction are central to the identification and diagnosis of many cardiac illnesses. ECG signals are acquired for a variety of purposes, including the diagnosis of cardiovascular conditions, the identification of arrhythmias, physiological feedback, the detection of sleep apnea, routine patient monitoring, the prediction of sudden cardiac arrest, and systems that identify vital signs, emotional states, and physical activities. The ECG has been widely used for the diagnosis and prognosis of a variety of heart diseases, and a range of cardiac diseases can now be identified accurately by computerized systems that then generate an automated report.
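The preprocessing and feature-extraction stages described above can be illustrated with a minimal sketch. The detector below is a deliberately naive amplitude-threshold R-peak finder run on a synthetic spike train, not the method of any paper reviewed here; the function names and parameter values are illustrative assumptions.

```python
import numpy as np

def detect_r_peaks(signal, fs, threshold_ratio=0.6, refractory_s=0.25):
    """Naive R-peak detector: amplitude threshold plus a refractory period,
    standing in for the preprocessing/feature-extraction stages."""
    threshold = threshold_ratio * np.max(signal)
    refractory = int(refractory_s * fs)
    peaks, last = [], -refractory
    for i in range(1, len(signal) - 1):
        if (signal[i] > threshold
                and signal[i] >= signal[i - 1]
                and signal[i] >= signal[i + 1]
                and i - last >= refractory):
            peaks.append(i)
            last = i
    return np.array(peaks)

def heart_rate_bpm(peaks, fs):
    rr = np.diff(peaks) / fs          # RR intervals in seconds
    return 60.0 / np.mean(rr)         # mean heart rate

# Synthetic "ECG": one R-like spike per second (60 bpm) on a noisy baseline.
fs = 250
t = np.arange(0, 10, 1 / fs)
ecg = 0.05 * np.random.default_rng(0).standard_normal(t.size)
ecg[::fs] += 1.0

peaks = detect_r_peaks(ecg, fs)
print(round(heart_rate_bpm(peaks, fs)))   # prints 60
```

Real detectors (e.g. Pan–Tompkins-style pipelines) add bandpass filtering, differentiation, and adaptive thresholds; the sketch only shows where such a feature-extraction stage sits in the overall pipeline.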
This paper aims to provide an overview of the most important problems associated with using deep learning and machine learning to diagnose diseases from electrocardiography, to review research on these techniques and methods, and to discuss the major data sets used by researchers.
Cloud computing is a recently developed paradigm that aims to provide computing resources in the most effective and economical manner. Its fundamental idea is to share computing resources among a group of users. Cloud computing security is a collection of control-based techniques and strategies intended to comply with regulatory rules and to protect cloud-related information, data, applications, and infrastructure. Data integrity, in turn, is the guarantee that digital data are not corrupted and that only authorized people can access or modify them (i.e., that data consistency, accuracy, and confidence are maintained). This review presents an overview of cloud computing concepts, its importance in many
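A common way to check the integrity guarantee described above is to record a cryptographic digest before data are handed to the cloud and recompute it on retrieval; any corruption or tampering changes the digest. The sketch below uses Python's standard hashlib; the payload and variable names are illustrative assumptions, not part of the review.

```python
import hashlib

def digest(data: bytes) -> str:
    """SHA-256 fingerprint used to verify the integrity of stored data."""
    return hashlib.sha256(data).hexdigest()

# Before upload: record the fingerprint alongside the object.
payload = b"patient-records-2024.csv contents"
stored_digest = digest(payload)

# After download: recompute and compare.
retrieved = payload
assert digest(retrieved) == stored_digest          # data intact

tampered = payload + b"x"
assert digest(tampered) != stored_digest           # corruption detected
```

In practice the digest itself must be stored or transmitted through a trusted channel (or signed), since an attacker who can alter both data and digest defeats the check.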
The use of artificial neural networks, as a developed technique, has increased in many fields, including the auditing business. The contemporary auditor should cope with the challenges of technological evolution in the business environment by using computerized techniques such as artificial neural networks. This research is the first work in the field of applying modern artificial neural network techniques to auditing; it uses a multi-layer back-propagation neural network to detect fundamental mistakes in financial statements during auditing. The research aims at offering a methodology for the application of artificial neural networks wi
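A multi-layer back-propagation network of the kind the abstract names can be sketched in a few lines. The training data below are a synthetic XOR-style toy set (the paper's audit data are not public), so the features, labels, layer sizes, and learning rate are all illustrative assumptions; only the back-propagation mechanics are the point.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy set: two ratio-like features, label 1 = "suspect entry".
# XOR-style labels force the network to learn a non-linear boundary.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)   # hidden layer weights
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)   # output layer weights
sigmoid = lambda z: 1 / (1 + np.exp(-z))

for _ in range(10000):                     # plain back-propagation loop
    h = sigmoid(X @ W1 + b1)               # forward: hidden layer
    out = sigmoid(h @ W2 + b2)             # forward: output layer
    d_out = (out - y) * out * (1 - out)    # gradient at the output (MSE loss)
    d_h = d_out @ W2.T * h * (1 - h)       # gradient propagated back to hidden
    W2 -= 1.0 * h.T @ d_out; b2 -= 1.0 * d_out.sum(0)
    W1 -= 1.0 * X.T @ d_h;   b1 -= 1.0 * d_h.sum(0)

pred = (out > 0.5).astype(int).ravel()
print(pred)    # typically [0 1 1 0] once training has converged
```

The paper's actual architecture, features, and training regime may differ; this only shows the error-back-propagation update that defines the technique.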
Many Iraqi agricultural studies have used spraying techniques to apply chemical products, including pesticides and growth regulators. Various studies were performed to examine the effect of these substances at different concentrations on improving plant production. In order to adopt specific criteria for spraying research and to replicate it easily, it is necessary to report all information related to the spraying processes and the settings used to improve sprayer performance by increasing the amount of pesticide deposited on the target. The current study aims to survey Iraqi research in detail and analyse it randomly, and also to highlight the importance of information applied in sprayi
Continuous flow injection analysis (CFIA) is one of the simplest, easiest, and most versatile methods of analytical automation in wet chemical analysis. The method depends on changes in the physical and chemical properties of a zone of the specimen dispersed from the sample injected into the carrier stream. The CFIA technique analyses samples automatically and with high efficiency. Its PC compatibility also allows specimens to be handled automatically, reagents to be added, and reaction conditions to be closely monitored. CFIA is one of the automated chemical analysis methods in which successive specimen samples to be estimated are injected into a carrier stream of a flowing solution that meets the reagent and mixes at a spe
Wastewater recycling for non-potable uses has gained significant attention as a way to mitigate the high pressure on freshwater resources. This requires a sustainable technique for treating municipal wastewater as an alternative to conventional methods, especially in arid and semi-arid rural areas. One promising technique for achieving wastewater reuse is the constructed wetland (CW), which has been used extensively in most countries worldwide over recent decades. The present study introduces a significant review of the definition, classification, and components of CWs, identifying the mechanisms controlling the removal processes within such units. Vertical, horizontal, and hybrid CWs
Heart sound is an electric signal affected by several factors during recording, which add unwanted information to the signal. Recently, many studies have addressed noise removal and signal recovery problems. The first step in signal processing is noise removal, and many filters have been used and proposed for this problem. Here, a Hankel matrix is built from the given signal, and the signal is cleaned by removing unwanted information from the Hankel matrix. The first step is to detect the unwanted information by defining a binary operator; this operator is defined under some threshold. The unwanted information is replaced by zero, and the wanted information is kept in the estimated matrix. The resulting matrix
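One common reading of the scheme described above is singular-value thresholding: build the Hankel (trajectory) matrix, apply a binary keep/zero operator to its singular values under a threshold, and map the estimated matrix back to a signal by anti-diagonal averaging. The abstract does not specify its operator, so the SVD-based choice, the window length, and the threshold below are all assumptions made for illustration.

```python
import numpy as np

def hankel(x, L):
    """Hankel (trajectory) matrix: entry (i, j) = x[i + j]."""
    K = len(x) - L + 1
    return np.array([x[i:i + K] for i in range(L)])

def denoise(x, L=20, threshold=0.3):
    H = hankel(x, L)
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    keep = s > threshold * s[0]          # binary operator: 1 above threshold, 0 below
    H_est = (U[:, keep] * s[keep]) @ Vt[keep]
    # Map the estimated matrix back to a signal by averaging anti-diagonals.
    out = np.zeros(len(x)); counts = np.zeros(len(x))
    for i in range(H_est.shape[0]):
        for j in range(H_est.shape[1]):
            out[i + j] += H_est[i, j]; counts[i + j] += 1
    return out / counts

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
clean = np.sin(2 * np.pi * 5 * t)                 # stand-in for a heart-sound tone
noisy = clean + 0.3 * rng.standard_normal(t.size)
cleaned = denoise(noisy)
# The low-rank estimate should sit closer to the clean tone than the noisy input does.
print(np.mean((cleaned - clean) ** 2) < np.mean((noisy - clean) ** 2))  # True
```

The method works here because a pure sinusoid yields a rank-2 Hankel matrix, so its singular values stand well above the noise floor; real heart sounds need a larger retained rank and a tuned threshold.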