The electrocardiogram (ECG) is an important physiological signal for the diagnosis of cardiac disease. Modern ECG monitoring devices are increasingly widespread and generate vast amounts of data that require large storage capacity. To decrease storage costs and to make ECG signals suitable for transmission through common communication channels, the data volume must be reduced, so an effective data compression method is required. This paper presents an efficient technique for the compression of ECG signals in which different transforms are combined. First, the 1-D ECG data was segmented and aligned into a 2-D data array; a 2-D mixed transform was then applied to compress the ECG data in the 2-D form. The compression algorithms were implemented and tested using the multiwavelet, wavelet, and slantlet transforms, which together form the proposed mixed-transform method. Vector quantization was then employed to quantize the mixed-transform coefficients. Selected records from the MIT/BIH arrhythmia database were tested comparatively, and the performance of the proposed methods was analyzed and evaluated using the MATLAB package. Simulation results showed that the proposed methods give a high compression ratio (CR) for ECG signals compared with other available methods. For example, compressing one record (record 100) yielded a CR of 24.4 with a percent root-mean-square difference (PRD) of 2.56%.
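The two figures of merit quoted above, CR and PRD, have standard definitions that can be sketched as follows (a minimal illustration in Python/NumPy, not the paper's MATLAB implementation; the toy signal and bit counts are hypothetical):

```python
import numpy as np

def compression_ratio(original_bits: int, compressed_bits: int) -> float:
    """CR: size of the original data divided by the size of the compressed data."""
    return original_bits / compressed_bits

def prd(original: np.ndarray, reconstructed: np.ndarray) -> float:
    """Percent root-mean-square difference between a signal and its reconstruction."""
    return 100.0 * np.sqrt(np.sum((original - reconstructed) ** 2) / np.sum(original ** 2))

# Hypothetical example: a synthetic stand-in for one ECG segment and a
# reconstruction with a small high-frequency error.
t = np.linspace(0, 1, 360)
x = np.sin(2 * np.pi * 5 * t)
x_rec = x + 0.01 * np.sin(2 * np.pi * 50 * t)
print(round(prd(x, x_rec), 2))       # around 1% for this toy error
print(compression_ratio(2400, 100))  # → 24.0
```

Under these definitions, a CR of 24.4 means the compressed stream is roughly 1/24 the size of the original, and a PRD of a few percent indicates a close reconstruction.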
Object tracking is one of the most important topics in the fields of image processing and computer vision. It is the process of finding interesting moving objects and following them from frame to frame. In this research, an Active Model-based object tracking algorithm is introduced. Active models are curves placed in an image domain that can evolve to segment the object of interest. The Adaptive Diffusion Flow Active Model (ADFAM) is one of the most well-known types of Active Model. It overcomes the drawbacks of earlier versions of Active Models, especially the leakage problem, noise sensitivity, and long narrow holes or concavities, and it is well known for its very good capabilities in the segmentation process. In this
Drilling fluid loss during drilling operations is an undesirable, expensive, and potentially hazardous problem.
The Nasiriyah oil field is one of the Iraqi oil fields that suffer from the lost circulation problem. The Dammam, um-Radoma, Tayarat, Shiranish, and Hartha formations are known to be the layers where circulation losses occur. Different types of lost circulation materials (LCMs), ranging from granular to flake and fibrous, were previously used to treat this problem.
This study presents the application of rice as a lost circulation material used to mitigate and stop losses when partial or total losses occur.
The experim
Flow measurements have gained importance in recent decades because of the shortage of water resources caused by climate change, which demands tight control of the water available for different uses. The classical technique of open-channel flow measurement by the integrating-float method is needed for measuring flow at locations where modern devices are unavailable for various reasons, such as their cost; classical techniques were therefore used to solve the problem. The present study examines the integrating-float method and defines the parameters affecting the acceleration of floating spheres in flowing water, analyzed using experimental measurements. The me
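For context, the textbook form of the integrating-float principle (assumed here; all numbers below are hypothetical) is that a buoyant sphere released at the bed rises at a nominally constant speed, so its horizontal drift divided by its rise time gives the depth-averaged velocity along the vertical it traversed:

```python
def depth_averaged_velocity(drift_m: float, rise_time_s: float) -> float:
    """Horizontal drift of the rising float divided by its rise time gives the
    depth-averaged velocity over the vertical it traversed (integrating-float
    principle, assuming a constant rise speed)."""
    return drift_m / rise_time_s

def discharge(drift_m: float, rise_time_s: float, area_m2: float) -> float:
    """Q = mean velocity x cross-sectional area (hypothetical one-vertical estimate)."""
    return depth_averaged_velocity(drift_m, rise_time_s) * area_m2

# Hypothetical reading: the sphere drifts 1.8 m downstream while rising for
# 4.0 s in a section of 2.5 m^2.
print(depth_averaged_velocity(1.8, 4.0))  # 0.45 m/s
print(discharge(1.8, 4.0, 2.5))           # 1.125 m^3/s
```

The study's focus on the spheres' acceleration matters precisely because the constant-rise-speed assumption in this sketch only holds approximately in real flows.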
Groupwise non-rigid image alignment is a difficult non-linear optimization problem involving many parameters and often large datasets. Previous methods have explored various metrics and optimization strategies. Good results have previously been achieved with simple metrics, but they require complex optimization, often with many unintuitive parameters that need careful tuning for each dataset. In this chapter, the problem is restructured to use a simpler, iterative optimization algorithm with very few free parameters. The warps are refined using an iterative Levenberg-Marquardt minimization to the mean, based on updating the locations of a small number of points and incorporating a stiffness constraint. This optimization approach is eff
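The core Levenberg-Marquardt update used in such schemes can be sketched on a toy problem (a hypothetical two-parameter linear fit, standing in for the point-location updates; this is not the chapter's groupwise objective, and the damping schedule shown is just one common choice):

```python
import numpy as np

def levenberg_marquardt(residual, jacobian, theta0, lam=1e-2, iters=50):
    """Minimal damped Gauss-Newton (Levenberg-Marquardt) loop.

    residual(theta) returns the residual vector r; jacobian(theta) returns the
    Jacobian J of r with respect to theta. The damping factor lam is decreased
    after an accepted step and increased after a rejected one.
    """
    theta = np.asarray(theta0, dtype=float)
    for _ in range(iters):
        r = residual(theta)
        J = jacobian(theta)
        step = np.linalg.solve(J.T @ J + lam * np.eye(J.shape[1]), J.T @ r)
        candidate = theta - step
        if np.sum(residual(candidate) ** 2) < np.sum(r ** 2):
            theta, lam = candidate, lam * 0.5  # accept step, relax damping
        else:
            lam *= 2.0                         # reject step, damp harder
    return theta

# Hypothetical toy problem: fit y = a*x + b.
x = np.linspace(0.0, 1.0, 20)
y = 3.0 * x + 1.0
res = lambda th: th[0] * x + th[1] - y
jac = lambda th: np.stack([x, np.ones_like(x)], axis=1)
print(levenberg_marquardt(res, jac, [0.0, 0.0]))  # converges to about [3, 1]
```

The appeal of this scheme, as the chapter notes, is that the damping factor is essentially the only free parameter of the optimizer itself.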
Exchange of information through communication channels can be unsafe, and communication media are not secure enough for sending sensitive information, so it is necessary to protect information from disclosure to unauthorized persons. This research presents a method for information security in which information is hidden in a cover image using the least significant bit (LSB) technique, after a text file has been encrypted using a secret sharing scheme. The positions for hiding the information within the cover image are then generated in a random manner, which makes the hidden data difficult to detect by image analysis or statistical analysis. The method thus provides two levels of information security: encryption of the text file using the secret sha
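The LSB-embedding step with pseudo-random positions can be sketched as follows (a minimal illustration, assuming a shared PRNG seed acts as the secret from which both sides regenerate the positions; the secret-sharing encryption stage is omitted, and the 8x8 cover image and bit string are hypothetical):

```python
import numpy as np

def embed_lsb(cover: np.ndarray, bits, seed: int) -> np.ndarray:
    """Hide a bit sequence in pseudo-randomly chosen pixel LSBs of a copy of cover."""
    stego = cover.copy().ravel()
    rng = np.random.default_rng(seed)          # the seed plays the role of a shared secret
    positions = rng.choice(stego.size, size=len(bits), replace=False)
    for pos, b in zip(positions, bits):
        stego[pos] = (stego[pos] & 0xFE) | b   # overwrite only the least significant bit
    return stego.reshape(cover.shape)

def extract_lsb(stego: np.ndarray, n_bits: int, seed: int):
    """Regenerate the same pseudo-random positions and read the LSBs back."""
    rng = np.random.default_rng(seed)
    positions = rng.choice(stego.size, size=n_bits, replace=False)
    flat = stego.ravel()
    return [int(flat[pos]) & 1 for pos in positions]

cover = np.arange(64, dtype=np.uint8).reshape(8, 8)   # toy 8x8 "image"
secret_bits = [1, 0, 1, 1, 0, 0, 1, 0]
stego = embed_lsb(cover, secret_bits, seed=42)
print(extract_lsb(stego, len(secret_bits), seed=42))  # → [1, 0, 1, 1, 0, 0, 1, 0]
```

Each embedded pixel changes by at most one intensity level, which is why LSB embedding is visually imperceptible, and the scattered positions avoid the sequential-pixel pattern that statistical steganalysis looks for.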
This paper determines the difference between a first, natural image and a second, infected image using logic gates. The proposed algorithm was applied first to binary images, then to grayscale images, and finally to color images. At the start of the proposed algorithm, the images are processed by applying convolution to images extended with zeros to obtain more visual detail and features; the images are then enhanced with an edge-detection filter (the Laplacian operator) and smoothed using a mean filter. To determine the change between the original image and the injured one, logic gates, in particular the XOR gate, are applied. Applying this technique to tooth decay, the comparison can locate inj
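The pipeline described above (zero-padding, mean-filter smoothing, Laplacian edge detection, then XOR of the two edge maps) can be sketched in pure NumPy (a minimal illustration with hypothetical toy images; the 3x3 kernels and the edge threshold are assumptions, not the paper's exact parameters):

```python
import numpy as np

def mean_filter3(img: np.ndarray) -> np.ndarray:
    """3x3 mean filter over a zero-padded image (the smoothing step)."""
    p = np.pad(img.astype(float), 1)
    h, w = img.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

def laplacian(img: np.ndarray) -> np.ndarray:
    """4-neighbour Laplacian on a zero-padded image (the edge-detection step)."""
    p = np.pad(img.astype(float), 1)
    return p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:] - 4.0 * p[1:-1, 1:-1]

def change_map(before: np.ndarray, after: np.ndarray, thresh: float) -> np.ndarray:
    """XOR of the two thresholded edge maps marks pixels where the images differ."""
    e1 = np.abs(laplacian(mean_filter3(before))) > thresh
    e2 = np.abs(laplacian(mean_filter3(after))) > thresh
    return np.logical_xor(e1, e2)

# Hypothetical pair: a flat image versus the same image with one bright spot
# (standing in for a natural versus an injured region).
before = np.zeros((8, 8), dtype=np.uint8)
after = before.copy()
after[3, 3] = 255
print(bool(change_map(before, after, thresh=1.0).any()))  # → True
```

XOR is the natural gate here because it is 1 exactly where the two edge maps disagree, i.e. where an edge appeared or disappeared between the two images.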
Attacking a stream cipher system using ciphertext only depends on the characteristics of the plaintext language and on the randomness of the key used in encryption, without detailed knowledge of the cipher algorithm, by benefiting from the balance between 0's and 1's in the key to reduce the key space that must be searched.
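To illustrate the idea (assuming the balance statistic refers to a bias of the keystream toward one bit value; the bias level, seed, and bit lengths below are hypothetical, not the paper's attack), a biased XOR keystream leaks plaintext bits because each ciphertext bit equals its plaintext bit whenever the key bit is 0:

```python
import random

def xor_encrypt(plain_bits, key_bits):
    """Bitwise XOR stream cipher: c_i = p_i XOR k_i."""
    return [p ^ k for p, k in zip(plain_bits, key_bits)]

random.seed(1)  # hypothetical demo values throughout
plain = [random.randint(0, 1) for _ in range(10_000)]
key = [0 if random.random() < 0.8 else 1 for _ in range(10_000)]  # biased keystream
cipher = xor_encrypt(plain, key)

# Whenever a key bit is 0, the ciphertext bit equals the plaintext bit, so a
# keystream biased toward 0 lets guessing p_i = c_i beat random guessing.
accuracy = sum(c == p for c, p in zip(cipher, plain)) / len(plain)
print(accuracy > 0.7)  # → True: far better than the 0.5 of an unbiased key
```

A perfectly balanced keystream would drive this accuracy back to 0.5, which is exactly why the attack targets deviations from balance.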