Image Compression Using 3-D Two-Level Technique

In this paper, three techniques for image compression are implemented: a three-dimensional (3-D) two-level discrete wavelet transform (DWT), a 3-D two-level discrete multiwavelet transform (DMWT), and a 3-D two-level hybrid (wavelet-multiwavelet) transform technique. Daubechies and Haar filters are used for the discrete wavelet transform, and critically sampled preprocessing is used for the discrete multiwavelet transform. The aim is to increase the compression ratio (CR) as the transformation level increases in the 3-D case, so the compression ratio is measured at each level. To assess compression quality, image data properties are measured: image entropy (He), percent root-mean-square difference (PRD%), energy retained (Er), and peak signal-to-noise ratio (PSNR). Based on the test results, a comparison between the three techniques is presented. The CR is the same for all three techniques and is largest at the 2nd level of the 3-D transform. The hybrid technique has the highest PSNR values and the lowest PRD% values at the 1st and 2nd levels of the 3-D transform, so the 3-D two-level hybrid is the best of the three techniques for image compression.
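A two-level Haar DWT of the kind described above can be sketched in plain Python on a 1-D signal; the thresholding rule and the coefficient-counting definition of CR below are illustrative assumptions, not the paper's exact procedure:

```python
import math

def haar_forward(x):
    # One level of the orthonormal Haar transform: averages and details.
    avg = [(a + b) / math.sqrt(2) for a, b in zip(x[0::2], x[1::2])]
    det = [(a - b) / math.sqrt(2) for a, b in zip(x[0::2], x[1::2])]
    return avg, det

def haar_inverse(avg, det):
    # Exact reconstruction from one level of coefficients.
    x = []
    for a, d in zip(avg, det):
        x.append((a + d) / math.sqrt(2))
        x.append((a - d) / math.sqrt(2))
    return x

def compress(signal, threshold):
    # Two-level Haar DWT: transform twice, zero small detail coefficients.
    a1, d1 = haar_forward(signal)
    a2, d2 = haar_forward(a1)
    coeffs = a2 + d2 + d1
    kept = [c if abs(c) >= threshold else 0.0 for c in coeffs]
    nonzero = sum(1 for c in kept if c != 0.0)
    cr = len(coeffs) / max(1, nonzero)  # simple coefficient-count CR
    return kept, cr
```

Extending this to 3-D means applying the same filter pair along each axis of the volume in turn, then repeating on the approximation sub-band for the second level.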

Publication Date
Thu Dec 31 2020
Journal Name
Journal of Accounting and Financial Studies (JAFS)
Application of Data Envelopment Analysis (DEA) to evaluate performance efficiency: applied research in the General Tax Authority

The aim of the research is to use Data Envelopment Analysis (DEA) to evaluate the performance efficiency of the eight branches of the General Tax Authority located in Baghdad: Karrada, Karkh parties, Karkh Center, Dora, Bayaa, Kadhimiya, New Baghdad, and Rusafa. The inputs are the numbers of non-accountable taxpayers, broken down by category: professions and commercial business, deduction, transfer of property ownership, real estate, and tenders. The outputs are determined according to a checklist of nine dimensions used to assess how efficiently the investigated branches invest their available resources.
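Full DEA scores each decision-making unit with a linear program; as a rough illustration of the underlying idea, the sketch below (a deliberate simplification, not the paper's model) flags a branch as inefficient if some other branch achieves at least as much output on every dimension while using no more input:

```python
def dominated(unit, others):
    """A unit is dominated if another unit uses <= input on every dimension
    and produces >= output on every dimension, with at least one difference."""
    uin, uout = unit
    for oin, oout in others:
        le_in = all(a <= b for a, b in zip(oin, uin))
        ge_out = all(a >= b for a, b in zip(oout, uout))
        strict = oin != uin or oout != uout
        if le_in and ge_out and strict:
            return True
    return False

def frontier(units):
    # Units dominated by no other unit lie on the efficiency frontier.
    return [i for i, u in enumerate(units)
            if not dominated(u, [v for j, v in enumerate(units) if j != i])]
```

A real DEA model (e.g. the CCR formulation) would instead solve one linear program per branch to obtain a continuous efficiency score between 0 and 1.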

Publication Date
Fri Dec 30 2022
Journal Name
Iraqi Journal Of Science
Design and Construction of Ultraviolet and Incoming Solar Irradiance Sensing Device

     In-situ measurements of ultraviolet (UV) and solar irradiance are very sparse in Nigeria because of cost; irradiance is instead estimated from meteorological parameters. In this work, a low-cost UV and pyranometer device was developed using locally sourced materials. The instrument consists of a UV sensor (ML8511), a photodiode (BPW34) housed in a carefully sealed vacuumed glass bulb, the UV and solar irradiance sensor amplifiers, a 16-bit analog-to-digital converter (ADS1115), an Arduino Mega 2560, a liquid crystal display (LCD), and a microSD card for data logging. The designed amplifier has an offset voltage of 0.8676 mV. The sensitivity of the irradiance device is 86.819 Wm-2/mV with a correcting factor of 27.77 Wm-2 and a
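Converting the amplifier's voltage reading into irradiance from the quoted sensitivity and correcting factor can be sketched as below; the linear form of the calibration is an assumption here, since the abstract states only the two constants, not the functional relationship:

```python
SENSITIVITY_WM2_PER_MV = 86.819  # quoted device sensitivity
CORRECTION_WM2 = 27.77           # quoted correcting factor

def irradiance_wm2(amp_output_mv):
    # Assumed linear calibration: irradiance = sensitivity * voltage + correction.
    # The exact calibration form used by the authors is not given in this excerpt.
    return SENSITIVITY_WM2_PER_MV * amp_output_mv + CORRECTION_WM2
```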

Publication Date
Mon Apr 11 2011
Journal Name
ICGST
Employing Neural Network and Naive Bayesian Classifier in Mining Data for Car Evaluation

In data mining, classification is a form of data analysis used to extract models that describe important data classes. Two well-known classification algorithms used in data mining are the Backpropagation Neural Network (BNN) and Naïve Bayes (NB). This paper investigates the performance of these two classification methods on the Car Evaluation dataset. A model was built for each algorithm and the results were compared. Our experimental results indicate that the BNN classifier yields higher accuracy than the NB classifier, but it is less efficient: training is time-consuming, and its black-box nature makes the model difficult to analyze.
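The NB side of such a comparison can be sketched in a few lines of plain Python; the toy samples and the use of Laplace smoothing below are illustrative assumptions, not the paper's experimental setup:

```python
import math
from collections import Counter, defaultdict

class CategoricalNB:
    """Naive Bayes for categorical features with Laplace smoothing."""

    def fit(self, X, y):
        self.priors = Counter(y)
        self.total = len(y)
        self.value_counts = defaultdict(Counter)  # (feature, class) -> value counts
        self.vocab = defaultdict(set)             # feature -> observed values
        for row, label in zip(X, y):
            for j, v in enumerate(row):
                self.value_counts[(j, label)][v] += 1
                self.vocab[j].add(v)
        return self

    def predict(self, row):
        best_label, best_logp = None, float("-inf")
        for label, n_label in self.priors.items():
            logp = math.log(n_label / self.total)  # class prior
            for j, v in enumerate(row):
                cnt = self.value_counts[(j, label)][v]
                # Laplace smoothing over the values seen for feature j
                logp += math.log((cnt + 1) / (n_label + len(self.vocab[j])))
            if logp > best_logp:
                best_label, best_logp = label, logp
        return best_label
```

The Car Evaluation dataset is entirely categorical (buying price, doors, safety, etc.), which is why a categorical likelihood is the natural choice here; a BNN would instead require one-hot encoding of those attributes.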

Publication Date
Sun Jun 05 2011
Journal Name
Baghdad Science Journal
Magnetic Deflection Coefficient Investigation for Low Energy Particles

In this research we numerically solved the Boltzmann transport equation in order to calculate transport parameters such as the drift velocity W, D/μ (the ratio of the diffusion coefficient to the mobility), and the momentum-transfer collision frequency νm, for the purpose of determining the magnetic drift velocity WM and the magnetic deflection coefficient ψ for low-energy electrons moving in an electric field E crossed with a magnetic field B (E×B), in nitrogen, argon, helium, and their gas mixtures, as functions of E/N (the ratio of electric field strength to gas number density), E/P300 (the ratio of electric field strength to gas pressure), and D/μ, covering different ranges of E/P300 at a temperature of 300 K (Kelvin). The results show
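For crossed fields, the classical guiding-centre drift velocity is v_d = (E×B)/|B|²; the small sketch below computes it for illustration (the full Boltzmann-equation solution in the paper is far more involved, since it accounts for collisions with the gas):

```python
def cross(a, b):
    # Standard 3-D vector cross product.
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def exb_drift(E, B):
    # v_d = (E x B) / |B|^2 : independent of the particle's charge and mass.
    b2 = sum(c * c for c in B)
    return tuple(c / b2 for c in cross(E, B))
```

Note that |v_d| = E/B when E ⊥ B, which gives a quick sanity check on any numerical result.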

Publication Date
Thu Nov 29 2018
Journal Name
Iraqi Journal Of Science
Application of the Predictive deconvolution on a seismic line Al-Najaf and Al-Muthanna Governorates in Southern Iraq

This study deals with the processing of field seismic data for a 54 km seismic line (7Gn 21) located within the administrative boundaries of the Najaf and Muthanna governorates in southern Iraq. The study was conducted in the Processing Department of the Oil Exploration Company using the Omega system, which contains a large number of processing programs; with these programs, both gap and spike predictive deconvolution were applied, and a final section was produced for each type. The gap predictive deconvolution improved the shallow reflectors, while for the deep reflectors it did not give a good improvement, thus giving a good continuity of the reflectors at
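Predictive deconvolution predicts the repeatable (multiple-related) part of a trace a fixed lag ahead and subtracts it; the single-coefficient sketch below, with made-up data, shows the idea (production systems such as Omega design a multi-tap Wiener prediction filter instead):

```python
def autocorr(x, lag):
    # Unnormalized autocorrelation of a trace at a given lag.
    return sum(x[t] * x[t - lag] for t in range(lag, len(x)))

def predictive_decon(trace, gap):
    # One-coefficient prediction filter: predict x[t] from x[t - gap],
    # then keep only the unpredictable part (the prediction error).
    f = autocorr(trace, gap) / autocorr(trace, 0)
    return [x - f * trace[t - gap] if t >= gap else x
            for t, x in enumerate(trace)]
```

On a trace dominated by a periodic multiple whose period equals the gap, nearly all the energy is predicted and removed, which is exactly the effect sought on the seismic section.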

Publication Date
Tue Dec 01 2015
Journal Name
Journal Of Engineering
Ten Years of OpenStreetMap Project: Have We Addressed Data Quality Appropriately? – Review Paper

It has increasingly been recognised that future developments in geospatial data handling will centre on geospatial data on the web: Volunteered Geographic Information (VGI). The evaluation of VGI data quality, including positional and shape similarity, has become a recurrent subject in the scientific literature over the last ten years. The OpenStreetMap (OSM) project is the most popular of the leading VGI platforms: an online geospatial database that produces and supplies free, editable geospatial datasets worldwide. The goal of this paper is to present a comprehensive overview of the quality assurance of OSM data. In addition, the credibility of open-source geospatial data is discussed, highlighting
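Positional accuracy of VGI features is commonly reported as the great-circle distance between matched OSM and reference points; a minimal haversine sketch (the coordinates in the test are made up, not data from the paper):

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in metres between two latitude/longitude points.
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def mean_positional_error(osm_pts, ref_pts):
    # Average displacement over matched (lat, lon) point pairs.
    d = [haversine_m(a[0], a[1], b[0], b[1]) for a, b in zip(osm_pts, ref_pts)]
    return sum(d) / len(d)
```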

Publication Date
Mon May 15 2017
Journal Name
Ibn Al-Haitham Journal for Pure and Applied Sciences
Proposed Methods To Prevent SQL Injection

  In the last decade, the web has rapidly become an attractive platform and an indispensable part of our lives. Unfortunately, as our dependency on the web increases, programmers focus more on functionality and appearance than on security, and this has drawn the interest of attackers to serious security problems that target web applications and web-based information systems, e.g. through SQL injection attacks. SQL injection, in simple terms, is the process of passing SQL code into interactive web applications that employ database services. Such applications accept user input, for example from forms, and then include this input in database requests, typically SQL statements, in a way that was not intended.
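The standard prevention is to keep user input out of the SQL text entirely by using parameterized queries; a minimal sketch with Python's built-in sqlite3 module (the table and payload are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

user_input = "' OR '1'='1"  # a classic injection payload

# Vulnerable: string concatenation lets the payload rewrite the query,
# turning it into  ... WHERE name = '' OR '1'='1'  (matches every row).
vulnerable = conn.execute(
    "SELECT * FROM users WHERE name = '" + user_input + "'").fetchall()

# Safe: the driver binds the value as data, never as SQL code,
# so the payload is just an (unmatched) literal name.
safe = conn.execute(
    "SELECT * FROM users WHERE name = ?", (user_input,)).fetchall()
```

The same placeholder-binding idea exists in every mainstream database driver; input validation and least-privilege database accounts are complementary defences, not replacements.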

Publication Date
Sat Jul 31 2021
Journal Name
Iraqi Journal Of Science
A review of Medical Diagnostics Via Data Mining Techniques

Data mining is one of the most popular analysis methods in medical research. It involves finding patterns and correlations in previously unexplored datasets. Data mining encompasses various areas of biomedical research, including data collection, clinical decision support, illness and safety monitoring, public health, and inquiry research. Health analytics frequently uses computational data mining methods such as clustering, classification, and regression. Studies of large numbers of diverse, heterogeneous documents, including biological and electronic records, have provided extensive material for medical and health research.

Publication Date
Fri Mar 01 2024
Journal Name
Baghdad Science Journal
Exploring the Challenges of Diagnosing Thyroid Disease with Imbalanced Data and Machine Learning: A Systematic Literature Review

Thyroid disease is a common condition affecting millions worldwide. Early diagnosis and treatment can help prevent more serious complications and improve long-term health outcomes. However, diagnosing thyroid disease can be challenging due to its variable symptoms and the limitations of diagnostic tests. By processing enormous amounts of data and detecting trends that may not be immediately evident to human doctors, Machine Learning (ML) algorithms may be capable of increasing the accuracy with which thyroid disease is diagnosed. This study seeks to discover the most recent ML-based and data-driven developments and strategies for diagnosing thyroid disease, while considering the challenges associated with imbalanced data in thyroid disease
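One common answer to the class imbalance highlighted above is to resample the training set; the naive random-oversampling sketch below (an illustration, not a technique attributed to any particular study in this review) duplicates minority-class examples until every class matches the majority:

```python
import random

def random_oversample(X, y, seed=42):
    """Duplicate minority-class samples until all classes match the majority size."""
    rng = random.Random(seed)
    by_class = {}
    for xi, yi in zip(X, y):
        by_class.setdefault(yi, []).append(xi)
    target = max(len(rows) for rows in by_class.values())
    X_out, y_out = [], []
    for label, rows in by_class.items():
        # Keep every original row, then draw random duplicates up to `target`.
        resampled = rows + [rng.choice(rows) for _ in range(target - len(rows))]
        X_out.extend(resampled)
        y_out.extend([label] * target)
    return X_out, y_out
```

In medical settings, cost-sensitive learning (class weights) and synthetic-sample methods such as SMOTE are popular alternatives, since naive duplication can encourage overfitting to the few minority examples.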

Publication Date
Sat Jul 31 2021
Journal Name
Iraqi Journal Of Science
A Parallel Clustering Analysis Based on Hadoop Multi-Node and Apache Mahout

     The conventional procedures of clustering algorithms are incapable of managing and analyzing the rapid growth of data generated from different sources. Using the concept of parallel clustering is one of the robust solutions to this problem. The Apache Hadoop architecture is one of the ecosystems that provide the capability to store and process data in a distributed and parallel fashion. In this paper, a parallel model is designed to run the k-means clustering algorithm in the Apache Hadoop ecosystem by connecting three nodes: one server (name) node and two client (data) nodes. The aim is to speed up the time of managing the massive scale
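A single MapReduce-style k-means iteration can be sketched in plain Python: each "mapper" emits per-cluster partial sums for its data partition, and the "reducer" merges them into new centroids. This is a conceptual sketch with toy data, not the paper's Hadoop/Mahout setup, where the partitions would live on the data nodes:

```python
def nearest(point, centroids):
    # Index of the closest centroid by squared Euclidean distance.
    return min(range(len(centroids)),
               key=lambda k: sum((p - c) ** 2
                                 for p, c in zip(point, centroids[k])))

def map_partition(points, centroids):
    # Mapper: emit (cluster -> (sum_vector, count)) partials for one partition.
    partials = {}
    for p in points:
        k = nearest(p, centroids)
        s, n = partials.get(k, ([0.0] * len(p), 0))
        partials[k] = ([a + b for a, b in zip(s, p)], n + 1)
    return partials

def reduce_partials(all_partials, old_centroids):
    # Reducer: merge partials from every mapper into new centroids.
    merged = {}
    for partials in all_partials:
        for k, (s, n) in partials.items():
            ms, mn = merged.get(k, ([0.0] * len(s), 0))
            merged[k] = ([a + b for a, b in zip(ms, s)], mn + n)
    new = list(old_centroids)
    for k, (s, n) in merged.items():
        new[k] = tuple(v / n for v in s)
    return new
```

Because each mapper sends only k small (sum, count) pairs rather than raw points, the shuffle stage stays cheap, which is what makes k-means a good fit for the Hadoop model.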
