Detecting Keratoconus by Using SVM and Decision Tree Classifiers with the Aid of Image Processing

Researchers have used methods such as image processing and machine learning, in addition to medical instruments such as the Placido disc, keratoscopy, and the Pentacam, to help diagnose a variety of diseases that affect the eye. Our paper aims to detect one of the diseases that affect the cornea, namely keratoconus, by using image processing techniques and pattern classification methods. The Pentacam is the device used to assess the cornea's health; it provides four maps that can distinguish changes on the surface of the cornea, which can be used for keratoconus detection. In this study, sixteen features were extracted from the four refractive maps, along with five readings from the Pentacam software. The classifiers utilized in our study are Support Vector Machine (SVM) and Decision Tree, which achieved classification accuracies of 90% and 87.5%, respectively, in detecting keratoconic corneas. The features were extracted using Matlab (R2011 and R2017) and Orange Canvas (Pythonw).
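As a hedged illustration of the two classifier families named above — not the paper's actual pipeline or its Pentacam-derived features — the sketch below trains a minimal linear SVM (hinge-loss subgradient descent) and a depth-1 decision tree on invented two-class data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for corneal feature vectors: two Gaussian clusters,
# y = -1 "normal" vs y = +1 "keratoconic". Demo data only, not the study's.
X = np.vstack([rng.normal(-1.0, 1.0, (50, 4)), rng.normal(1.0, 1.0, (50, 4))])
y = np.array([-1] * 50 + [1] * 50)

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=200):
    """Linear SVM fitted by full-batch subgradient descent on the hinge loss."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        viol = y * (X @ w + b) < 1              # margin violators
        if viol.any():
            w -= lr * (lam * w - (y[viol, None] * X[viol]).mean(axis=0))
            b -= lr * (-y[viol].mean())
        else:
            w -= lr * lam * w                   # only the regularizer acts
    return w, b

def decision_stump(X, y):
    """Depth-1 decision tree: exhaustive search for the best axis-aligned split."""
    best_acc, best_rule = 0.0, None
    for j in range(X.shape[1]):
        for t in X[:, j]:
            for s in (1, -1):
                acc = float((np.where(X[:, j] > t, s, -s) == y).mean())
                if acc > best_acc:
                    best_acc, best_rule = acc, (j, t, s)
    return best_rule, best_acc

w, b = train_linear_svm(X, y)
svm_acc = float((np.sign(X @ w + b) == y).mean())
rule, stump_acc = decision_stump(X, y)
print(f"SVM training accuracy:   {svm_acc:.2f}")
print(f"Stump training accuracy: {stump_acc:.2f}")
```

On real corneal maps one would use a full-depth tree and a kernelized SVM; this stump-versus-linear sketch only shows the mechanics.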

Publication Date
Tue Jun 30 2015
Journal Name
Al-khwarizmi Engineering Journal
Multi-Focus Image Fusion Based on Pixel Significance Using Contourlet Transform

Abstract

The objective of image fusion is to merge multiple source images in such a way that the final representation contains a higher amount of useful information than any single input. In this paper, a weighted-average fusion method is proposed. It depends on weights that are extracted from the source images using the contourlet transform. The extraction is done by setting the approximation coefficients of the transform to zero and then taking the inverse contourlet transform to obtain the details of the images to be fused. The performance of the proposed algorithm has been verified on several greyscale and colour test images and compared with some existing methods.
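A much-simplified sketch of the weighted-average idea, with the contourlet detail signal replaced by a plain discrete-Laplacian energy (an assumption for illustration only — NumPy has no contourlet transform):

```python
import numpy as np

def detail_energy(img):
    """Crude stand-in for contourlet details: |discrete Laplacian| per pixel."""
    d = np.zeros_like(img, dtype=float)
    d[1:-1, 1:-1] = np.abs(4 * img[1:-1, 1:-1]
                           - img[:-2, 1:-1] - img[2:, 1:-1]
                           - img[1:-1, :-2] - img[1:-1, 2:])
    return d

def box_blur(img):
    """5-point box blur, used only to fabricate defocused test inputs."""
    p = np.pad(img, 1, mode="edge")
    return (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]
            + p[1:-1, 1:-1]) / 5

def fuse(a, b, eps=1e-9):
    """Per-pixel weighted average; weights proportional to local detail energy."""
    wa, wb = detail_energy(a), detail_energy(b)
    return (wa * a + wb * b + eps * 0.5 * (a + b)) / (wa + wb + eps)

rng = np.random.default_rng(1)
base = rng.random((32, 32))
a, b = base.copy(), base.copy()
a[:, 16:] = box_blur(base)[:, 16:]   # input a: right half defocused
b[:, :16] = box_blur(base)[:, :16]   # input b: left half defocused
fused = fuse(a, b)
err = lambda img: float(np.abs(img - base).mean())
```

Each input is sharp in one half, so the detail-weighted average should track the sharp source on each side and come out closer to the original than either input.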

Publication Date
Mon Jun 05 2023
Journal Name
Journal Of Economics And Administrative Sciences
Using Statistical Methods to Increase the Contrast Level in Digital Images

This research deals with the use of a number of statistical methods, such as the kernel method, watershed, histogram, and cubic spline, to improve the contrast of digital images. The results obtained according to the RMSE and NCC measures have proven that the spline method is the most accurate compared with the other statistical methods.
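The two quality measures named above are standard; a minimal NumPy version (assuming images stored as float arrays) might look like:

```python
import numpy as np

def rmse(a, b):
    """Root-mean-square error between two equally sized images (0 = identical)."""
    return float(np.sqrt(np.mean((a.astype(float) - b.astype(float)) ** 2)))

def ncc(a, b):
    """Normalized cross-correlation: 1.0 means a perfect linear match."""
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

img = np.arange(16.0).reshape(4, 4)
print(rmse(img, img), ncc(img, img))      # identical images
print(rmse(img, img + 1.0))               # constant offset of 1
```

RMSE penalizes any intensity difference, while NCC is invariant to brightness and contrast shifts — which is why the two are usually reported together.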

 

Publication Date
Tue Oct 23 2018
Journal Name
Journal Of Economics And Administrative Sciences
Processing of missing values in survey data using Principal Component Analysis and probabilistic Principal Component Analysis methods

The idea of carrying out research on incomplete data came from the circumstances of our dear country and the horrors of war, which resulted in the loss of much important data in all aspects of economic, natural, health, and scientific life. The reasons for missingness differ: some lie outside the will of those concerned, while others are planned, because of cost, risk, or the lack of means for inspection. The missing data in this study were processed using Principal Component Analysis and self-organizing map methods using simulation. The variables of child health, and the variables affecting children's health, were taken into account: breastfeed
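One common simplified form of PCA-based imputation is iterative low-rank SVD reconstruction, sketched here on an invented rank-1 toy table rather than the study's child-health survey data:

```python
import numpy as np

def pca_impute(X, rank=1, iters=300):
    """Iterative PCA imputation: mean-fill NaNs, then alternately fit a rank-k
    SVD approximation and overwrite only the missing cells with it."""
    X = X.astype(float).copy()
    miss = np.isnan(X)
    X[miss] = np.take(np.nanmean(X, axis=0), np.where(miss)[1])  # column means
    for _ in range(iters):
        mu = X.mean(axis=0)
        U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
        approx = (U[:, :rank] * s[:rank]) @ Vt[:rank] + mu       # rank-k fit
        X[miss] = approx[miss]                                   # refill gaps
    return X

# Invented rank-1 table (NOT the survey data): delete two entries, then refill.
truth = np.outer([1.0, 2.0, 3.0, 4.0], [1.0, 0.5, 2.0])
X = truth.copy()
X[1, 2] = np.nan
X[3, 0] = np.nan
filled = pca_impute(X, rank=1)
```

Because the toy table is exactly rank 1, the iteration can recover the deleted cells almost exactly; real survey data would only be approximately low-rank.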

Publication Date
Sun Feb 25 2024
Journal Name
Baghdad Science Journal
Oil spill classification based on satellite image using deep learning techniques

An oil spill is a leakage from pipelines, vessels, oil rigs, or tankers that releases petroleum products into the marine environment or onto land; it can happen naturally or through human action and results in severe damage and financial loss. Satellite imagery is one of the powerful tools currently utilized for capturing vital information from the Earth's surface, but the complexity and the vast amount of data make it challenging and time-consuming for humans to process. With the advancement of deep learning techniques, these processes are now computerized to extract vital information from real-time satellite images. This paper applied three deep-learning algorithms for satellite image classification

Publication Date
Sat Feb 09 2019
Journal Name
Journal Of The College Of Education For Women
Comparative Study of Image Denoising Using Wavelet Transforms and Optimal Threshold and Neighbouring Window

NeighShrink is an efficient image denoising algorithm based on the discrete wavelet transform (DWT). Its disadvantage is that it uses a suboptimal universal threshold and an identical neighbouring window size in all wavelet subbands. Dengwen and Wengang proposed an improved method that can determine an optimal threshold and neighbouring window size for every subband by Stein's unbiased risk estimate (SURE). Its denoising performance is considerably superior to NeighShrink and also outperforms SURE-LET, an up-to-date denoising algorithm based on the SURE. In this paper different wavelet transform families are used with this improved method; the results show that the Haar wavelet has the lowest performance among
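To make the subband-thresholding idea concrete, here is a minimal sketch using a single-level Haar DWT and the *universal* threshold — i.e. plain soft thresholding, not NeighShrink's neighbouring-window rule or the SURE-optimized parameters discussed above:

```python
import numpy as np

def haar2d(img):
    """One level of the 2-D Haar DWT (image sides must be even)."""
    a = (img[::2, :] + img[1::2, :]) / np.sqrt(2)   # rows: low-pass
    d = (img[::2, :] - img[1::2, :]) / np.sqrt(2)   # rows: high-pass
    ll = (a[:, ::2] + a[:, 1::2]) / np.sqrt(2)
    lh = (a[:, ::2] - a[:, 1::2]) / np.sqrt(2)
    hl = (d[:, ::2] + d[:, 1::2]) / np.sqrt(2)
    hh = (d[:, ::2] - d[:, 1::2]) / np.sqrt(2)
    return ll, lh, hl, hh

def ihaar2d(ll, lh, hl, hh):
    """Exact inverse of haar2d."""
    a = np.empty((ll.shape[0], ll.shape[1] * 2))
    d = np.empty_like(a)
    a[:, ::2], a[:, 1::2] = (ll + lh) / np.sqrt(2), (ll - lh) / np.sqrt(2)
    d[:, ::2], d[:, 1::2] = (hl + hh) / np.sqrt(2), (hl - hh) / np.sqrt(2)
    img = np.empty((a.shape[0] * 2, a.shape[1]))
    img[::2, :], img[1::2, :] = (a + d) / np.sqrt(2), (a - d) / np.sqrt(2)
    return img

def denoise(img, sigma):
    """Soft-threshold the detail subbands with the universal threshold."""
    ll, lh, hl, hh = haar2d(img)
    t = sigma * np.sqrt(2 * np.log(img.size))
    shrink = lambda c: np.sign(c) * np.maximum(np.abs(c) - t, 0.0)
    return ihaar2d(ll, shrink(lh), shrink(hl), shrink(hh))

x = np.linspace(0.0, 1.0, 32)
clean = np.add.outer(x, x) / 2                       # smooth test image
rng = np.random.default_rng(5)
noisy = clean + rng.normal(0.0, 0.1, clean.shape)
den = denoise(noisy, sigma=0.1)
mse = lambda im: float(np.mean((im - clean) ** 2))
print(mse(noisy), mse(den))
```

NeighShrink and the improved method shrink each coefficient using the energy of a small window of its neighbours instead of this coefficient-wise rule; the transform/inverse pair is the same skeleton either way.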

Publication Date
Tue Jun 23 2020
Journal Name
Baghdad Science Journal
Content Based Image Retrieval (CBIR) by Statistical Methods

An image retrieval system is a computer system for browsing, searching, and retrieving images from a large database of digital images. The objective of Content-Based Image Retrieval (CBIR) methods is essentially to extract, from large image databases, a specified number of images similar in visual and semantic content to a so-called query image. The researchers developed a new retrieval mechanism that is mainly based on two procedures. The first relies on extracting the statistical features of both the original and the traditional image by using the histogram and statistical characteristics (mean, standard deviation). The second procedure relies on the T-
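A hedged sketch of the first procedure described above — a histogram plus mean/standard-deviation feature vector with nearest-neighbour ranking; the bin count and Euclidean distance are illustrative assumptions, not the paper's choices:

```python
import numpy as np

def features(img, bins=16):
    """Feature vector: normalized grey-level histogram + mean and std."""
    h, _ = np.histogram(img, bins=bins, range=(0.0, 1.0))
    return np.concatenate([h / img.size, [img.mean(), img.std()]])

def rank(query, database):
    """Database indices sorted by Euclidean distance to the query's features."""
    q = features(query)
    dists = [np.linalg.norm(features(img) - q) for img in database]
    return np.argsort(dists)

rng = np.random.default_rng(2)
dark = rng.uniform(0.0, 0.4, (16, 16))      # two dark images, one light image
light = rng.uniform(0.6, 1.0, (16, 16))
db = [dark, light, rng.uniform(0.0, 0.4, (16, 16))]
order = rank(rng.uniform(0.0, 0.4, (16, 16)), db)
print(order)    # the light image should be ranked last for a dark query
```

Because the dark query's histogram mass sits in the low bins, both dark database images outrank the light one — the essence of histogram-based CBIR.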

Publication Date
Tue Sep 19 2017
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
Evaluation of the Intensity Distribution by Image Processing

Image processing is an important source for image analysis in order to obtain variable parameters such as the intensity. In the present work a relation has been found between the intensity and the number of pixels in the image, and from this relation we have obtained in this paper the inten
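The intensity-versus-pixel-count relation is exactly what a grey-level histogram records; a tiny NumPy example:

```python
import numpy as np

# Intensity distribution: count of pixels at each grey level of a 3x3 image.
img = np.array([[0, 1, 1],
                [2, 1, 0],
                [2, 2, 2]])
levels, counts = np.unique(img, return_counts=True)
print(levels)   # grey levels present in the image
print(counts)   # number of pixels at each level; sums to the pixel total
```

Plotting `counts` against `levels` gives the intensity distribution the abstract refers to.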

Publication Date
Thu Dec 16 2021
Journal Name
Translational Vision Science & Technology
A Hybrid Deep Learning Construct for Detecting Keratoconus From Corneal Maps

Publication Date
Tue Jun 30 2020
Journal Name
Journal Of Economics And Administrative Sciences
Comparison Between Partial Least Squares Regression (PLSR) and Tree Regression (RT) by Using Simulation.

This research discusses the comparison between the partial least squares regression model and tree regression. These models represent two types of statistical method: the first, parametric, type is partial least squares, which is adopted both when the number of variables is greater than the number of observations and when the number of observations is larger than the number of variables; the second, nonparametric, type is tree regression, which divides the data hierarchically. The regression models for the two approaches were estimated and then compared, the comparison between these methods being made according to a Mean Square
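As a toy version of the parametric-versus-tree comparison by mean squared error — with ordinary least squares standing in for PLSR and a one-split stump for the full regression tree, both deliberate simplifications:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.uniform(-1.0, 1.0, (200, 3))
y = 2 * X[:, 0] - X[:, 1] + 0.1 * rng.normal(size=200)   # linear ground truth

def fit_ols(X, y):
    """Least-squares linear model (simplified stand-in for the parametric side)."""
    A = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return lambda Z: np.column_stack([np.ones(len(Z)), Z]) @ beta

def fit_stump(X, y):
    """One-split regression tree: best (feature, threshold) by training SSE."""
    best_sse, best_rule = np.inf, None
    for j in range(X.shape[1]):
        for t in np.quantile(X[:, j], np.linspace(0.1, 0.9, 9)):
            m = X[:, j] <= t
            lo, hi = y[m].mean(), y[~m].mean()
            sse = ((y[m] - lo) ** 2).sum() + ((y[~m] - hi) ** 2).sum()
            if sse < best_sse:
                best_sse, best_rule = sse, (j, t, lo, hi)
    j, t, lo, hi = best_rule
    return lambda Z: np.where(Z[:, j] <= t, lo, hi)

mse = lambda f: float(np.mean((f(X) - y) ** 2))
mse_ols, mse_stump = mse(fit_ols(X, y)), mse(fit_stump(X, y))
print(mse_ols, mse_stump)
```

On linear data the parametric model wins this MSE comparison; on strongly nonlinear or hierarchical data the tree side would — which is the kind of trade-off the simulation study measures.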

Publication Date
Tue Dec 05 2023
Journal Name
Baghdad Science Journal
Processing of Polymers Stress Relaxation Curves Using Machine Learning Methods

Currently, one of the topical areas of application of machine learning methods is the prediction of material characteristics. The aim of this work is to develop machine learning models for determining the rheological properties of polymers from experimental stress relaxation curves. The paper presents an overview of the main directions of metaheuristic approaches (local search, evolutionary algorithms) to solving combinatorial optimization problems. Metaheuristic algorithms for solving some important combinatorial optimization problems are described, with special emphasis on the construction of decision trees. A comparative analysis of algorithms for solving the regression problem in CatBoost Regressor has been carried out. The object of
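CatBoost itself is far more elaborate, but the underlying idea — gradient boosting over small decision trees — can be sketched in a few lines; the data below is an invented relaxation-style curve, not the paper's experimental set:

```python
import numpy as np

def boost_stumps(X, y, rounds=50, lr=0.3):
    """Minimal gradient boosting with one-split regression trees (squared loss).
    A toy stand-in for CatBoostRegressor, which is far more sophisticated."""
    pred = np.full(len(y), float(y.mean()))
    for _ in range(rounds):
        resid = y - pred                           # gradients for squared loss
        best_sse, best_rule = np.inf, None
        for j in range(X.shape[1]):
            for t in np.quantile(X[:, j], np.linspace(0.1, 0.9, 9)):
                m = X[:, j] <= t
                lo, hi = resid[m].mean(), resid[~m].mean()
                sse = ((resid[m] - lo) ** 2).sum() + ((resid[~m] - hi) ** 2).sum()
                if sse < best_sse:
                    best_sse, best_rule = sse, (j, t, lo, hi)
        j, t, lo, hi = best_rule
        pred += lr * np.where(X[:, j] <= t, lo, hi)   # shrunken stump step
    return pred

# Invented relaxation-style data: "stress" decaying in the first feature.
rng = np.random.default_rng(4)
X = rng.uniform(0.0, 5.0, (300, 2))
y = np.exp(-X[:, 0]) + 0.01 * rng.normal(size=300)
pred = boost_stumps(X, y)
mse_val = float(np.mean((pred - y) ** 2))
print(mse_val)
```

Each round fits a stump to the current residuals and adds a shrunken copy of it — the same additive scheme that CatBoost implements with oblivious trees, ordered boosting, and many regularization refinements.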
