The electrocardiogram (ECG) is an important physiological signal for cardiac disease diagnosis. Modern ECG monitoring devices generate vast amounts of data that require large storage capacity; to decrease storage costs and make ECG signals suitable for transmission over common communication channels, the data volume must be reduced, so an effective data compression method is required. This paper presents an efficient technique for the compression of ECG signals in which different transforms are combined. First, the 1-D ECG data were segmented and aligned into a 2-D data array; a 2-D mixed transform was then applied to compress the ECG data in this 2-D form. The compression algorithms were implemented and tested using the multiwavelet, wavelet, and slantlet transforms, which together form the proposed mixed-transform method. A vector quantization technique was then employed to encode the mixed-transform coefficients. Selected records from the MIT/BIH arrhythmia database were tested comparatively, and the performance of the proposed methods was analyzed and evaluated using MATLAB. Simulation results show that the proposed methods give a high compression ratio (CR) for ECG signals compared with other available methods; for example, compressing record 100 yielded a CR of 24.4 with a percent root-mean-square difference (PRD) of 2.56%.
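A minimal sketch of this pipeline, assuming a plain 2-D wavelet stage (PyWavelets) with hard thresholding as stand-ins for the paper's mixed multiwavelet/slantlet transform and vector quantizer; `seg_len` and `keep` are illustrative parameters, not values from the paper:

```python
# Sketch: segment 1-D ECG into a 2-D array, transform, threshold, reconstruct.
import numpy as np
import pywt

def compress_ecg(signal, seg_len=256, wavelet="db4", keep=0.05):
    """Align the 1-D signal into rows, keep the largest `keep` fraction
    of 2-D wavelet coefficients, and report CR and PRD."""
    n_rows = len(signal) // seg_len
    ecg2d = signal[: n_rows * seg_len].reshape(n_rows, seg_len)

    coeffs = pywt.wavedec2(ecg2d, wavelet, level=2)
    arr, slices = pywt.coeffs_to_array(coeffs)

    thresh = np.quantile(np.abs(arr), 1 - keep)    # crude stand-in for VQ
    arr[np.abs(arr) < thresh] = 0.0

    rec = pywt.waverec2(
        pywt.array_to_coeffs(arr, slices, output_format="wavedec2"), wavelet)
    rec = rec[:n_rows, :seg_len]

    cr = arr.size / max(np.count_nonzero(arr), 1)  # compression ratio proxy
    prd = 100 * np.sqrt(np.sum((ecg2d - rec) ** 2) / np.sum(ecg2d ** 2))
    return rec.ravel(), cr, prd
```

The PRD line follows the usual definition, PRD = 100·sqrt(Σ(x − x̂)² / Σx²); a real codec would entropy-code the surviving coefficients rather than simply counting them.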
The aim of this research is to identify the extent to which the auditor can provide assurance on integrated reports, verifying the credibility of these reports and their implications for all parties dealing with the economic unit, as well as to measure the impact of the assurance procedures followed by auditors and their role in confirming these reports. The research methodology was designed after studying the previous literature related to the research variables; the relationships between these variables were then tested through a questionnaire targeting the community of auditors in the local environment, and the results of the study were …
In order to take measures for controlling soil erosion, it is required to estimate soil loss over the area of interest. Soil loss due to erosion can be estimated using predictive models such as the Universal Soil Loss Equation (USLE). The accuracy of these models depends on the parameters used in their equations; one of the most important is the (C) factor, which represents the effects of vegetation and other land cover. Estimating land cover by interpretation of remote sensing imagery involves the Normalized Difference Vegetation Index (NDVI), an indicator of vegetation cover. The aim of this study is to estimate (C) factor values for part of Baghdad city using NDVI derived from a Landsat-7 satellite image.
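As a sketch of the NDVI-to-C step, the exponential mapping of Van der Knijff et al., C = exp(−α·NDVI/(β − NDVI)) with α ≈ 2 and β ≈ 1, is a commonly used approximation; the study's exact NDVI-to-C relation is not reproduced here, and the band values below are invented:

```python
# Sketch: per-pixel NDVI from red/NIR reflectance, then a USLE C factor.
import numpy as np

def ndvi(red, nir):
    """NDVI = (NIR - Red) / (NIR + Red), computed per pixel."""
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    return (nir - red) / np.maximum(nir + red, 1e-9)

def c_factor(ndvi_arr, alpha=2.0, beta=1.0):
    """C = exp(-alpha * NDVI / (beta - NDVI)), clipped to [0, 1]."""
    c = np.exp(-alpha * ndvi_arr / (beta - ndvi_arr))
    return np.clip(c, 0.0, 1.0)

# Landsat-7 ETM+ band order: band 3 = red, band 4 = NIR (toy reflectances).
red_band = np.array([[0.12, 0.20], [0.15, 0.30]])
nir_band = np.array([[0.40, 0.25], [0.45, 0.32]])
print(c_factor(ndvi(red_band, nir_band)))
```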
An Iraqi crude atmospheric residual fraction supplied from the al-Dura refinery was treated to remove metal contaminants by a solvent extraction method, using various hydrocarbon solvents and concentrations. Extraction with three different solvents (n-hexane, n-heptane, and light naphtha) was found to be effective for the removal of oil-soluble metals from the heavy atmospheric residual fraction. Several variables were studied: solvent/oil ratios (4/1, 8/1, 10/1, 12/1, and 15/1), contact times (15, 30–60, 90, and 120 min), and temperatures (30, 45, 60, and 90 °C). The metals removal percent was …
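The removal figure reported for each run presumably follows the usual mass-balance arithmetic; a sketch, assuming metal concentrations (e.g., Ni or V in ppm) are measured in the feed and in the treated fraction (the numbers below are invented):

```python
# Sketch of the bookkeeping behind a solvent-extraction run.
def removal_percent(c_feed_ppm: float, c_treated_ppm: float) -> float:
    """Removal % = (C_feed - C_treated) / C_feed * 100."""
    return (c_feed_ppm - c_treated_ppm) / c_feed_ppm * 100.0

# Illustrative numbers: 60 ppm vanadium in the feed,
# 14 ppm after extraction at a 10/1 solvent/oil ratio.
print(f"{removal_percent(60.0, 14.0):.1f}% removed")
```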
Shadow detection and removal is an important task when dealing with color outdoor images. Shadows are generated by a local and relative absence of light: first, a local decrease in the amount of light that reaches a surface; second, a local change in the amount of light reflected by a surface toward the observer. Most shadow detection and segmentation methods are based on image analysis, but several factors can affect the detection result owing to the complexity of the circumstances. In this paper, a segmentation test is presented to detect shadows in an image, and a function concept is used to remove the shadow from the image.
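A minimal sketch of threshold-based shadow detection and correction; the paper's exact segmentation test and removal function are not specified here, so LAB-lightness thresholding and a mean-matching rescale stand in for them:

```python
# Sketch: detect dark (shadow) pixels in LAB space, then relight them.
import cv2
import numpy as np

def detect_shadow_mask(bgr):
    """Mark pixels whose LAB lightness falls well below the image mean."""
    lab = cv2.cvtColor(bgr, cv2.COLOR_BGR2LAB)
    L = lab[:, :, 0].astype(np.float64)
    return (L < L.mean() - L.std()).astype(np.uint8)

def remove_shadow(bgr, mask):
    """Rescale shadowed pixels so their mean lightness matches the lit area."""
    lab = cv2.cvtColor(bgr, cv2.COLOR_BGR2LAB).astype(np.float64)
    L = lab[:, :, 0]                       # view into the lightness channel
    if mask.any():
        gain = L[mask == 0].mean() / max(L[mask == 1].mean(), 1e-6)
        L[mask == 1] *= gain
    lab[:, :, 0] = np.clip(L, 0, 255)
    return cv2.cvtColor(lab.astype(np.uint8), cv2.COLOR_LAB2BGR)
```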
Steganography is a useful technique that helps secure data in communication using different data carriers such as audio, video, image, and text. The most popular type is image steganography, which mostly uses the least significant bit (LSB) technique to hide the data; however, the probability of detecting data hidden with this technique is high. RGB is a color model in which each pixel is represented by three bytes indicating the intensities of red, green, and blue, and LSB hiding can be applied in each of the three color channels. In this paper, an RGB image steganography method based on a genetic algorithm (GA) is proposed. The GA is used to generate a random key that represents the best ordering of secret (image/text) blocks to be …
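A minimal sketch of the LSB embedding that the GA ordering would drive; the `order` permutation below is a placeholder for the GA-generated key, and the cover/secret data are invented:

```python
# Sketch: write one secret bit into the LSB of each selected channel byte.
import numpy as np

def embed_lsb(cover, bits, order=None):
    flat = cover.reshape(-1).copy()            # R, G, B bytes interleaved
    idx = np.arange(len(bits)) if order is None else np.asarray(order)
    flat[idx] = (flat[idx] & 0xFE) | bits      # clear the LSB, set secret bit
    return flat.reshape(cover.shape)

def extract_lsb(stego, n_bits, order=None):
    flat = stego.reshape(-1)
    idx = np.arange(n_bits) if order is None else np.asarray(order)
    return flat[idx] & 1

cover = np.random.randint(0, 256, (4, 4, 3), dtype=np.uint8)
secret = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=np.uint8)
stego = embed_lsb(cover, secret)
assert np.array_equal(extract_lsb(stego, len(secret)), secret)
```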
Neural cryptography deals with the problem of key exchange between two neural networks using the mutual learning concept. The two networks exchange their outputs (in bits), and the key shared between the two communicating parties is eventually represented in the final learned weights, at which point the networks are said to be synchronized. The security of neural synchronization is put at risk if an attacker is capable of synchronizing with either of the two parties during the training process.
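A sketch of the mutual-learning exchange using two tree parity machines, the usual neural-cryptography construction; K, N, and L below are illustrative choices, and the Hebbian rule (update a hidden unit only when its output matches the common output) is the standard variant:

```python
# Sketch: two tree parity machines synchronize their weights by mutual learning.
import numpy as np

K, N, L = 3, 10, 3                               # hidden units, inputs, weight bound
rng = np.random.default_rng(0)

def tpm_output(w, x):
    sigma = np.sign(np.einsum("kn,kn->k", w, x))  # per-hidden-unit sign
    sigma[sigma == 0] = -1
    return sigma, int(np.prod(sigma))             # tau = product of hidden outputs

def hebbian_update(w, x, sigma, tau):
    for k in range(K):
        if sigma[k] == tau:                       # update matching units only
            w[k] = np.clip(w[k] + tau * x[k], -L, L)

wA = rng.integers(-L, L + 1, (K, N))
wB = rng.integers(-L, L + 1, (K, N))
steps = 0
while not np.array_equal(wA, wB):
    x = rng.choice([-1, 1], (K, N))               # public random input
    sA, tauA = tpm_output(wA, x)
    sB, tauB = tpm_output(wB, x)
    if tauA == tauB:                              # outputs exchanged in bits
        hebbian_update(wA, x, sA, tauA)
        hebbian_update(wB, x, sB, tauB)
    steps += 1
print(f"synchronized after {steps} exchanged inputs; key = final weights")
```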
Hyperglycemia (high blood sugar) is a complication of diabetes. This condition causes biochemical alterations in the cells of the body, which may lead to structural and functional problems throughout the body, including the eye. Diabetic retinopathy (DR) is a type of retinal degeneration induced by long-term diabetes that may lead to blindness. We propose a deep learning method for the early detection of retinopathy using an EfficientNet-B1 model and the APTOS 2019 dataset. We used the Gaussian filter, one of the most significant image-processing algorithms: it emphasizes edges in the dataset and reduces superfluous noise. We enlarge each retina picture to 224×224 (the input size used with EfficientNet-B1 here) and utilize data aug…
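A sketch of the described preprocessing and model setup in Keras; the five-class APTOS severity head, the 5×5 Gaussian kernel, and the file-path handling are assumptions, not details taken from the paper:

```python
# Sketch: Gaussian filtering, resize to 224x224, EfficientNet-B1 classifier.
import cv2
import tensorflow as tf

def preprocess(path, size=224):
    img = cv2.imread(path)
    img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)
    img = cv2.GaussianBlur(img, (5, 5), 0)       # suppress noise, keep edges
    return cv2.resize(img, (size, size))

base = tf.keras.applications.EfficientNetB1(
    include_top=False, weights="imagenet",
    input_shape=(224, 224, 3), pooling="avg")
model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(5, activation="softmax"),  # APTOS grades 0-4
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```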
As we live in the era of the fourth technological revolution, it has become necessary to use artificial intelligence to generate electric power from sustainable solar energy, especially in Iraq, which has gone through repeated crises and suffers from a severe shortage of electric power because of the wars and calamities it has endured. The impact of that period is still evident in all aspects of daily life experienced by Iraqis: the remnants of wars, siege, terrorism, earlier misguided policies, and regional interventions and their consequences, such as the destruction of electric power stations, alongside population growth, which must be matched by an increase in generating capacity, …
The penalized least squares method is a popular approach for dealing with high-dimensional data, where the number of explanatory variables is larger than the sample size. Penalized least squares offers high prediction accuracy and performs estimation and variable selection at once. It yields a sparse model, that is, a model with few variables, which can be interpreted easily. However, penalized least squares is not robust: it is very sensitive to the presence of outlying observations. To deal with this problem, a robust loss function can be used to obtain a robust penalized least squares method and a robust penalized estimator, and …
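A sketch of one such robust penalized estimator: Huber loss plus an L1 (lasso) penalty, minimized numerically; the specific loss, penalty, and toy data are assumptions, not the abstract's exact formulation:

```python
# Sketch: Huber loss + L1 penalty, fitted on p > n data with one outlier.
import numpy as np
from scipy.optimize import minimize

def huber(r, delta=1.345):
    """Quadratic near zero, linear in the tails: robust to outliers."""
    return np.where(np.abs(r) <= delta,
                    0.5 * r**2,
                    delta * (np.abs(r) - 0.5 * delta))

def robust_penalized_fit(X, y, lam=0.2):
    n, p = X.shape
    def objective(beta):
        return huber(y - X @ beta).sum() / n + lam * np.abs(beta).sum()
    return minimize(objective, np.zeros(p), method="Powell").x

rng = np.random.default_rng(1)
X = rng.standard_normal((20, 30))               # p > n: high-dimensional
beta_true = np.zeros(30)
beta_true[:3] = [2.0, -1.5, 1.0]                # sparse true model
y = X @ beta_true + 0.1 * rng.standard_normal(20)
y[0] += 50.0                                    # one outlying observation
print(np.round(robust_penalized_fit(X, y)[:5], 2))
```

The Huber loss caps each residual's influence, so the gross outlier in `y[0]` does not dominate the fit the way it would under squared error, while the L1 term still drives most coefficients to zero.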