Compression of an ECG Signal Using Mixed Transforms

The electrocardiogram (ECG) is an important physiological signal for cardiac disease diagnosis. Modern ECG monitoring devices generate vast amounts of data that require huge storage capacity. To decrease storage costs and make ECG signals suitable for transmission through common communication channels, the ECG data volume must be reduced, so an effective data compression method is required. This paper presents an efficient technique for the compression of ECG signals based on mixed transforms. First, the 1-D ECG data were segmented and aligned into a 2-D data array; a 2-D mixed transform was then applied to compress the ECG data in 2-D form. The compression algorithms were implemented and tested using the multiwavelet, wavelet, and slantlet transforms, which together form the proposed mixed-transform method. Vector quantization was then applied to the mixed-transform coefficients. Selected records from the MIT/BIH arrhythmia database were tested, and the performance of the proposed methods was analyzed and evaluated in MATLAB. Simulation results showed that the proposed methods give a high compression ratio (CR) for ECG signals compared with other available methods; for example, compressing one record (record 100) yielded a CR of 24.4 with a percent root-mean-square difference (PRD) of 2.56%.
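As a minimal sketch of how such a transform codec is typically evaluated, the snippet below compresses a 1-D signal by thresholding the coefficients of a single wavelet transform (PyWavelets) and reports the CR and PRD metrics quoted above. It does not reproduce the paper's mixed multiwavelet/wavelet/slantlet stage or its vector quantizer; the synthetic signal and the fraction of retained coefficients are illustrative assumptions.

```python
import numpy as np
import pywt  # PyWavelets

def compress_1d(x, wavelet="db4", level=4, keep=0.05):
    """Keep only the largest `keep` fraction of wavelet coefficients."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    flat = np.concatenate(coeffs)
    thresh = np.quantile(np.abs(flat), 1.0 - keep)
    kept = [np.where(np.abs(c) >= thresh, c, 0.0) for c in coeffs]
    x_rec = pywt.waverec(kept, wavelet)[: len(x)]
    n_kept = sum(int(np.count_nonzero(c)) for c in kept)
    cr = len(x) / max(n_kept, 1)   # simple proxy for the compression ratio
    prd = 100.0 * np.sqrt(np.sum((x - x_rec) ** 2) / np.sum(x ** 2))
    return x_rec, cr, prd

# Synthetic stand-in for an MIT/BIH record (the real data are not bundled here).
t = np.linspace(0.0, 10.0, 3600)
sig = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 8.0 * t)
_, cr, prd = compress_1d(sig)
print(f"CR = {cr:.1f}, PRD = {prd:.2f}%")
```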

Publication Date
Tue Dec 01 2009
Journal Name
Journal Of Economics And Administrative Sciences
Using Artificial Neural Network Models For Forecasting & Comparison

Artificial neural network (ANN) methodology is an important and relatively new approach to building models for analysis, data evaluation, forecasting, and control without depending on a prior model or a classical statistical method that describes the behavior of the phenomenon. The methodology works by learning from the data to reach a robust, near-optimal model that represents the statistical phenomenon, and that model can then be used at any time and under any conditions. The Box-Jenkins (ARMAX) approach was used for comparison. This paper relies on received-power data to build a robust model for forecasting, analyzing, and controlling that power; the received power comes from…
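As a hedged illustration of the ANN side of such a comparison, the sketch below fits a small multilayer perceptron to lagged values of a series and scores one-step-ahead forecasts. The synthetic series, lag order, and network size are assumptions, since the paper's received-power data and exact architecture are not given here.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
series = np.sin(np.arange(300) / 10.0) + 0.1 * rng.standard_normal(300)

p = 5  # number of lags used as network inputs (assumed)
X = np.column_stack([series[i : len(series) - p + i] for i in range(p)])
y = series[p:]

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
model.fit(X[:-50], y[:-50])      # train on all but the last 50 points
print("test MSE:", np.mean((model.predict(X[-50:]) - y[-50:]) ** 2))
```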

Publication Date
Sun Jan 01 2023
Journal Name
Journal Of Engineering
Risk Assessment in BOT Contracts using AHP Technique

Risk assessment in build-operate-transfer (BOT) projects is very important for identifying and analyzing risks in order to make the appropriate decision on how to respond to them. In this paper, the analytic hierarchy process (AHP) technique was used to make the appropriate response decision for the most prominent risks generated in BOT projects. This involves comparing the criteria for each risk as well as the available alternatives, using matrix-based mathematical methods to reach an appropriate response decision for each risk. Ten common risks in BOT contracts, grouped into six main risk headings, are adopted for analysis in this paper. The procedure followed in this paper is the questionnaire method…
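The core matrix computation of AHP can be sketched as follows: derive priority weights from a pairwise-comparison matrix via its principal eigenvector and check judgment consistency. The 3-criterion matrix below is a made-up example, not data from the paper's questionnaire.

```python
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])          # hypothetical pairwise judgments

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)              # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                             # priority vector (weights)

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)     # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]      # Saaty's random index
print("weights:", w.round(3), "CR:", round(ci / ri, 3))  # CR < 0.10 is acceptable
```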

Publication Date
Fri Sep 09 2022
Journal Name
Research Anthology On Improving Medical Imaging Techniques For Analysis And Intervention
Groupwise Non-Rigid Image Alignment Using Few Parameters

Groupwise non-rigid image alignment is a difficult non-linear optimization problem involving many parameters and often large datasets. Previous methods have explored various metrics and optimization strategies. Good results have been achieved with simple metrics requiring complex optimization, often with many unintuitive parameters that need careful tuning for each dataset. In this chapter, the problem is restructured to use a simpler, iterative optimization algorithm with very few free parameters. The warps are refined using an iterative Levenberg-Marquardt minimization to the mean, based on updating the locations of a small number of points and incorporating a stiffness constraint. This optimization approach is eff…
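A toy sketch of "iterative minimization to the mean": each shape (here a small 1-D set of control points) is repeatedly warped toward the current group mean with a Levenberg-Marquardt least-squares step. Real groupwise non-rigid alignment operates on images with stiffness-regularized warps; this only shows the alternating structure of the optimization, and all data and parameter choices below are assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)
base = np.linspace(0.0, 1.0, 8)
shapes = [base + rng.normal(0.0, 0.05, base.shape) + rng.normal(0.0, 0.2)
          for _ in range(5)]                     # point sets with random offsets

params = np.zeros(len(shapes))                   # one translation per shape
for _ in range(10):                              # alternate: warp -> new mean
    mean = np.mean([s + t for s, t in zip(shapes, params)], axis=0)
    for i, s in enumerate(shapes):
        # Levenberg-Marquardt step pulling shape i toward the group mean
        res = least_squares(lambda t, s=s: (s + t) - mean,
                            x0=[params[i]], method="lm")
        params[i] = res.x[0]

aligned = [s + t for s, t in zip(shapes, params)]
print("residual spread:", np.std(aligned, axis=0).max())
```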

Publication Date
Sun Mar 30 2014
Journal Name
Iraqi Journal Of Chemical And Petroleum Engineering
Estimation Liquid Permeability Using Air Permeability Laboratory Data

Permeability data are of major importance in all reservoir simulation studies. Their importance increases in mature oil and gas fields because of their sensitivity to the requirements of some specific improved-recovery methods. However, the industry holds a huge stock of air permeability measurements against only a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study suggests a correlation for converting air permeability data, which are conventionally measured during laboratory core analysis, into liquid permeability. This correlation provides a feasible estimate in cases of data loss and poorly consolidated formations, or in cas…
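The fitted correlation itself is not reproduced in this excerpt. As a hedged illustration, the classical Klinkenberg relation, k_air = k_liquid (1 + b / P_mean), is the standard starting point for such conversions; the slippage factor b and mean test pressure below are hypothetical values.

```python
def liquid_perm_from_air(k_air_md: float, b_psi: float, p_mean_psi: float) -> float:
    """Invert the Klinkenberg relation for an equivalent liquid permeability (mD)."""
    return k_air_md / (1.0 + b_psi / p_mean_psi)

# Hypothetical core measurement: 120 mD air permeability at 40 psi mean pressure
print(liquid_perm_from_air(k_air_md=120.0, b_psi=8.0, p_mean_psi=40.0))  # ~100 mD
```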

Publication Date
Sun Oct 01 2023
Journal Name
Indonesian Journal Of Electrical Engineering And Computer Science
Intelligence framework dust forecasting using regression algorithms models

Dust is a common cause of health risks and also a driver of climate change, one of the most threatening problems facing humans. In the recent decade, climate change in Iraq, typified by increased droughts and desertification, has generated numerous environmental issues. This study forecasts dust in five central Iraqi districts using machine learning, within a supervised-learning framework built on five regression algorithms. It was assessed using a dataset from the Iraqi Meteorological Organization and Seismology (IMOS). Simulation results show that the gradient boosting regressor (GBR) achieves a mean square error of 8.345 and a total accuracy ratio of 91.65%. Moreover, the results show that the decision tree (DT), whose mean square error is 8.965, c…
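A hedged sketch of the modelling step: a gradient boosting regressor scored by mean squared error, as in the comparison above. Synthetic weather-like features stand in for the IMOS dataset, which is not available here.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.uniform(size=(500, 4))     # e.g. wind, humidity, temperature, pressure
y = 10 * X[:, 0] - 3 * X[:, 1] + rng.normal(0.0, 0.5, 500)  # synthetic dust index

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
gbr = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)
print("MSE:", mean_squared_error(y_te, gbr.predict(X_te)))
```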

Publication Date
Mon Oct 01 2018
Journal Name
Journal Of Economics And Administrative Sciences
Nurse Scheduling Problem Using Hybrid Simulated Annealing Algorithm

The nurse scheduling problem is a combinatorial optimization problem and is NP-hard, so it is difficult to solve to optimality. In this paper, we propose a hybrid simulated annealing algorithm for the nurse scheduling problem that builds on the simulated annealing algorithm and the genetic algorithm. The proposed hybrid simulated annealing algorithm (GS-h) is the best of the methods compared in this paper, since it achieved the minimum average total cost and the maximum numbers of solved, best, and optimal problems. The ratio of optimal solutions is 77% for the proposed algorithm (GS-h) and 28.75% for simulated annealing…
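A minimal simulated-annealing sketch for a toy nurse-rostering cost is given below; the paper's hybrid GS-h additionally incorporates genetic-algorithm operators, which are omitted here, and the instance size, cooling schedule, and penalty function are all assumptions.

```python
import math, random

random.seed(0)
nurses, days, shifts = 4, 7, 3   # toy instance; value `shifts` means a day off

def cost(roster):
    """Penalty: each (day, shift) pair should be covered by exactly one nurse."""
    c = 0
    for d in range(days):
        for s in range(shifts):
            c += abs(sum(1 for n in range(nurses) if roster[n][d] == s) - 1)
    return c

roster = [[random.randrange(shifts + 1) for _ in range(days)] for _ in range(nurses)]
cur, T = cost(roster), 5.0
while T > 0.01:
    n, d = random.randrange(nurses), random.randrange(days)
    old = roster[n][d]
    roster[n][d] = random.randrange(shifts + 1)    # random neighbouring move
    new = cost(roster)
    if new <= cur or random.random() < math.exp((cur - new) / T):
        cur = new                                  # accept (possibly worse) move
    else:
        roster[n][d] = old                         # reject: undo the move
    T *= 0.995                                     # geometric cooling schedule
print("final penalty:", cur)
```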

Publication Date
Wed Dec 13 2017
Journal Name
Al-khwarizmi Engineering Journal
Produced Water Treatment Using Ultrafiltration and Nanofiltration Membranes

The application of ultrafiltration (UF) and nanofiltration (NF) processes to the treatment of raw produced water has been investigated in the present study. Experiments with both ultrafiltration and nanofiltration were performed in a laboratory unit operated in a cross-flow pattern. Various types of hollow-fiber membranes were used, namely a polyvinyl chloride (PVC) UF membrane, two different polyethersulfone (PES) NF membranes, and a polyphenylsulfone (PPSU) NF membrane. It was found that the turbidity removal of the treated water is higher than 95% using the UF and NF membranes. The chemical oxygen demand (COD, 160 mg/l) and oil content (26.8 mg/l) found after treatment were within the allowable limits set…
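Removal figures like the one quoted above are conventionally computed from feed and permeate concentrations; a one-line helper (the values below are placeholders, not the study's measurements):

```python
def removal_percent(feed: float, permeate: float) -> float:
    """Percentage removal of a contaminant across a membrane."""
    return 100.0 * (1.0 - permeate / feed)

print(removal_percent(feed=100.0, permeate=4.0))  # 96.0 % removal
```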

Publication Date
Sun Apr 01 2018
Journal Name
Journal Of Economics And Administrative Sciences
Determine Optimal Preventive Maintenance Time Using Scheduling Method

In this paper, the reliability and maintenance schedules of some medical devices were estimated from a single variable, the time variable (failure times), on the assumption that the time variable for all devices follows the same distribution (the Weibull distribution).

The method of estimating the distribution parameters for each device was the OLS method.

The main objective of this research is to determine the optimal time for preventive maintenance of medical devices. Two methods were adopted to estimate the optimal preventive maintenance time. The first method builds the maintenance schedule from information on the cost of maintenance and the cost of stopping work and acc…
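A hedged sketch of the two steps described above: (1) OLS estimation of Weibull parameters from failure times via median-rank regression, and (2) a standard age-replacement optimum that trades a preventive cost cp against a failure cost cf. The failure times and costs below are made up, and this is a generic formulation, not necessarily the paper's exact schedule model.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical failure times (hours); the study's device data are not shown.
times = np.sort(np.array([310.0, 540.0, 720.0, 980.0, 1200.0, 1650.0]))
n = len(times)
F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)      # Bernard median ranks
x, y = np.log(times), np.log(-np.log(1.0 - F))   # linearised Weibull CDF
beta, intercept = np.polyfit(x, y, 1)            # OLS fit: slope = shape
eta = np.exp(-intercept / beta)                  # scale parameter

R = lambda t: np.exp(-(t / eta) ** beta)         # reliability function

def cost_rate(t, cp=1.0, cf=8.0):
    """Expected cost per unit time under age replacement at age t."""
    dt = t / 2000.0
    grid = (np.arange(2000) + 0.5) * dt          # midpoint integration grid
    mean_cycle = np.sum(R(grid)) * dt            # expected cycle length
    return (cp * R(t) + cf * (1.0 - R(t))) / mean_cycle

opt = minimize_scalar(cost_rate, bounds=(50.0, 3000.0), method="bounded")
print(f"shape={beta:.2f}, scale={eta:.0f}, optimal PM age={opt.x:.0f}")
```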

Publication Date
Tue Jun 30 2015
Journal Name
Iraqi Journal Of Chemical And Petroleum Engineering
Using Microbubbles to Improve Transmission Oil in Pipes

Drag reduction (DR) techniques are used to improve flow by saving flow energy. DR is applied in oil pipeline conduits, oil well operations, and flood-water disposal, and many drag reduction techniques are in use; one of these is microbubbles. In this work, a percentage drag reduction is obtained by pumping small air bubbles into the transported fluid: gas oil is used as the liquid transported in the pipeline, and air is pumped in as microbubbles. This study shows that the maximum value of drag reduction achieved is 25.11%.
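The drag-reduction percentage is conventionally computed from the pressure drops measured with and without the additive; a one-line helper (the values below are illustrative, not the study's measurements):

```python
def drag_reduction_percent(dp_without: float, dp_with: float) -> float:
    """DR% from pressure drops without and with the drag-reducing agent."""
    return 100.0 * (dp_without - dp_with) / dp_without

print(drag_reduction_percent(dp_without=4.3, dp_with=3.22))  # ~25.1 %
```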

Publication Date
Tue Mar 01 2011
Journal Name
Journal Of Economics And Administrative Sciences
Estimate the Nonparametric Regression Function Using Canonical Kernel

This research reviews the importance of estimating the nonparametric regression function using the so-called canonical kernel, which depends on rescaling the smoothing parameter; the smoothing parameter plays a large and important role in kernel estimation and gives the proper amount of smoothing.

The importance of this method is shown by applying these concepts to real data on the exchange rate of the U.S. dollar against the Japanese yen for the period from January 2007 to March 2010. The results demonstrated the preference of the nonparametric estimator with a Gaussian kernel over the other nonparametric and parametric regression estima…
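A sketch of a Nadaraya-Watson estimator with a Gaussian kernel is given below. The canonical-kernel idea rescales the bandwidth by the kernel's canonical bandwidth delta0 = (R(K) / mu2(K)^2)^(1/5), which for the Gaussian kernel is about 0.776, so that the same smoothing parameter is comparable across kernels. Synthetic data stand in for the exchange-rate series used in the paper, and the base bandwidth is an assumption.

```python
import numpy as np

def nw_gaussian(x_grid, x, y, h):
    """Nadaraya-Watson regression estimate on x_grid with bandwidth h."""
    u = (x_grid[:, None] - x[None, :]) / h
    w = np.exp(-0.5 * u**2)                 # Gaussian kernel weights
    return (w @ y) / w.sum(axis=1)

delta0 = (1.0 / (2.0 * np.sqrt(np.pi))) ** 0.2   # canonical bandwidth, Gaussian

rng = np.random.default_rng(3)
x = np.sort(rng.uniform(0.0, 3.0, 200))
y = np.sin(2 * x) + 0.2 * rng.standard_normal(200)
h = 0.3 * delta0                                 # canonically rescaled bandwidth
grid = np.linspace(0.0, 3.0, 50)
print(nw_gaussian(grid, x, y, h)[:5].round(3))
```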
