Knee Meniscus Segmentation and Tear Detection Based on Magnetic Resonance Images: A Review of Literature

The meniscus has a crucial function in human anatomy, and Magnetic Resonance Imaging (M.R.I.) plays an essential role in meniscus assessment. Identifying cartilage lesions with conventional image processing approaches is difficult because M.R.I. data are highly diverse: a sequence comprises many images, and the region of interest may appear differently in each image of the series. Feature extraction therefore becomes more complicated, and traditional image processing grows very complex. In traditional image processing a human tells the computer what to look for, whereas a deep learning (D.L.) algorithm extracts the relevant features automatically. Such subtle changes are valuable when diagnosing a tissue sample: small, barely noticeable changes in pixel intensity may indicate early-stage cancer or torn tissue, details that even expert pathologists might miss. Artificial intelligence (A.I.) and D.L. have revolutionized radiology by improving the efficiency and accuracy of both interpretive and non-interpretive tasks, and it is worth considering how these applications work in practice. The Convolutional Neural Network (C.N.N.), a class of D.L. model, can be used to diagnose knee problems. Existing algorithms can detect and categorize cartilage lesions and meniscus tears on M.R.I., offer an automated quantitative evaluation of healing, and forecast who is most likely to suffer recurring meniscus tears based on radiographs.
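As an illustration of the C.N.N. approach the review discusses, the sketch below shows a minimal slice classifier in PyTorch. It is not any of the reviewed models: the layer sizes, the 256x256 grayscale input, and the two-class (tear / no tear) output are assumptions made purely for illustration.

```python
# Minimal sketch of a CNN slice classifier for meniscus-tear detection.
# Architecture, input resolution, and class count are illustrative assumptions.
import torch
import torch.nn as nn

class MeniscusCNN(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),            # global pooling keeps the head size-independent
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x):                       # x: (batch, 1, H, W) grayscale MRI slices
        return self.classifier(self.features(x).flatten(1))

model = MeniscusCNN()
logits = model(torch.randn(4, 1, 256, 256))     # four dummy slices
print(logits.shape)                             # torch.Size([4, 2])
```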

Publication Date
Sun Nov 19 2017
Journal Name
Journal Of Al-qadisiyah For Computer Science And Mathematics
Image Compression based on Fixed Predictor Multiresolution Thresholding of Linear Polynomial Nearlossless Techniques

Image compression is a serious issue in computer storage and transmission; it makes efficient use of the redundancy embedded within an image itself and may also exploit the limitations of human vision and perception to reduce imperceptible information. Polynomial coding is a modern image compression technique based on a modelling concept that effectively removes the spatial redundancy embedded within an image; it is composed of two parts, the mathematical model and the residual. In this paper a two-stage technique is proposed: the first stage utilizes the lossy predictor model along with a multiresolution base and thresholding techniques, and the second incorporates the near-lossless compression…
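To make the predictor-plus-residual idea concrete, here is a generic sketch of a fixed linear predictor with a near-lossless residual quantizer. It is not the paper's two-stage multiresolution scheme: the causal neighbours used (west, north, north-west) and the error bound `tau` are illustrative assumptions.

```python
# Sketch of fixed linear-predictor coding with a near-lossless residual quantizer.
import numpy as np

def predict(img: np.ndarray) -> np.ndarray:
    """Fixed predictor p = W + N - NW, computed over an edge-padded image."""
    padded = np.pad(img.astype(np.int32), 1, mode="edge")
    w  = padded[1:-1, :-2]
    n  = padded[:-2, 1:-1]
    nw = padded[:-2, :-2]
    return w + n - nw

def near_lossless_residual(img: np.ndarray, tau: int = 2):
    pred = predict(img)
    residual = img.astype(np.int32) - pred
    # Uniform quantization of the residual bounds the per-pixel error by tau.
    q = np.round(residual / (2 * tau + 1)).astype(np.int32)
    recon = np.clip(pred + q * (2 * tau + 1), 0, 255).astype(np.uint8)
    return q, recon

img = (np.random.rand(64, 64) * 255).astype(np.uint8)
q, recon = near_lossless_residual(img, tau=2)
assert np.max(np.abs(recon.astype(int) - img.astype(int))) <= 2   # near-lossless bound holds
```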

Publication Date
Sun Jan 14 2018
Journal Name
Journal Of Engineering
Optimum Design of Power System Stabilizer based on Improved Ant Colony Optimization Algorithm

This paper presents an improved technique for the Ant Colony Optimization (ACO) algorithm. The procedure is applied to a Single Machine Infinite Bus (SMIB) system with a power system stabilizer (PSS) at three different loading regimes. The simulations are carried out in MATLAB. The results show that with the Improved Ant Colony Optimization (IACO) the system gives better performance with fewer iterations than a previous modification of ACO. In addition, the probability of selecting an arc depends on the best ant's performance and the evaporation rate.
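The two ACO ingredients the abstract highlights, the arc-selection probability and an evaporation-based update reinforced by the best ant, can be sketched as follows. The values of alpha, beta, and rho and the use of PSS gain candidates as arcs are illustrative assumptions; the paper's exact IACO rule may differ.

```python
# Sketch of ACO arc selection and best-ant pheromone update with evaporation.
import numpy as np

def arc_probabilities(tau, eta, alpha=1.0, beta=2.0):
    """Probability of an ant choosing each arc from pheromone tau and heuristic eta."""
    weights = (tau ** alpha) * (eta ** beta)
    return weights / weights.sum()

def update_pheromone(tau, best_arc, best_cost, rho=0.1):
    """Evaporate all arcs, then reinforce the arc taken by the best ant."""
    tau = (1.0 - rho) * tau
    tau[best_arc] += 1.0 / best_cost
    return tau

tau = np.ones(5)                              # pheromone on 5 candidate arcs (e.g. PSS gain candidates)
eta = np.array([0.2, 0.5, 0.9, 0.4, 0.1])     # heuristic desirability of each arc
p = arc_probabilities(tau, eta)
best = int(np.argmax(p))                      # stand-in for the ant with the best performance index
tau = update_pheromone(tau, best, best_cost=1.8)
```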

Publication Date
Fri Mar 01 2019
Journal Name
Al-khwarizmi Engineering Journal
Improve Akaike’s Information Criterion Estimation Based on Denoising of Quadrature Mirror Filter Bank

Akaike’s Information Criterion (AIC) is a popular method for estimating the number of sources impinging on an array of sensors, a problem of great interest in several applications. The performance of AIC degrades at low Signal-to-Noise Ratio (SNR). This paper is concerned with the development and application of quadrature mirror filters (QMF) for improving the performance of AIC. A new system is proposed that estimates the number of sources by applying AIC to the outputs of a filter bank consisting of quadrature mirror filters. The proposed system can estimate the number of sources even at low SNR.
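The QMF stage of the proposed system is not reproduced here; the sketch below only shows the classical eigenvalue-based AIC enumeration rule (in the Wax and Kailath form), which is assumed to be the criterion being improved. The array geometry, SNR, and snapshot count in the demo are illustrative.

```python
# Sketch of AIC source enumeration from the eigenvalues of the sample covariance.
import numpy as np

def aic_num_sources(X: np.ndarray) -> int:
    """X: (M sensors, N snapshots), complex. Returns the estimated number of sources."""
    M, N = X.shape
    R = (X @ X.conj().T) / N                      # sample covariance matrix
    lam = np.sort(np.linalg.eigvalsh(R))[::-1]    # eigenvalues, descending
    aic = []
    for k in range(M):
        tail = lam[k:]                            # the M-k smallest eigenvalues
        geo = np.exp(np.mean(np.log(tail)))       # geometric mean
        ari = np.mean(tail)                       # arithmetic mean
        aic.append(-2 * (M - k) * N * np.log(geo / ari) + 2 * k * (2 * M - k))
    return int(np.argmin(aic))

# Two sources in noise on an 8-element array (illustrative setup).
rng = np.random.default_rng(0)
M, N = 8, 200
A = np.exp(1j * np.outer(np.arange(M), [0.5, 1.3]))               # steering vectors
S = rng.standard_normal((2, N)) + 1j * rng.standard_normal((2, N))
X = A @ S + 0.5 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))
print(aic_num_sources(X))                                          # typically 2 for this setup
```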

Publication Date
Sun Feb 25 2024
Journal Name
Baghdad Science Journal
An exploratory study of history-based test case prioritization techniques on different datasets

In regression testing, test case prioritization (TCP) is a technique for ordering the available test cases. TCP techniques can improve fault-detection performance, which is measured by the average percentage of faults detected (APFD). History-based TCP is a family of TCP techniques that uses past execution data to prioritize test cases. Assigning equal priority to several test cases is a common problem for most TCP techniques, yet it has not been explored for history-based TCP; to resolve such ties, most researchers resort to random ordering of the tied test cases. This study investigates equal priority in history-based TCP techniques. The first objective is to implement…
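For reference, the APFD metric used to compare prioritizations can be computed as sketched below. The test order and the fault-to-test mapping are toy inputs, not data from the study.

```python
# Sketch of the APFD metric: 1 - (TF1 + ... + TFm)/(n*m) + 1/(2n).
def apfd(order, faults):
    n, m = len(order), len(faults)
    position = {test: i + 1 for i, test in enumerate(order)}   # 1-based ranks
    # TF_i: rank of the first test in the order that reveals fault i.
    tf = [min(position[t] for t in detecting if t in position)
          for detecting in faults.values()]
    return 1 - sum(tf) / (n * m) + 1 / (2 * n)

order = ["t3", "t1", "t4", "t2"]
faults = {"f1": {"t1", "t2"}, "f2": {"t3"}, "f3": {"t4"}}
print(round(apfd(order, faults), 3))   # 0.625; higher is better
```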

Publication Date
Mon Oct 24 2022
Journal Name
Energies
Double-Slope Solar Still Productivity Based on the Number of Rubber Scraper Motions

In low-latitude areas, below 10° of latitude, the solar radiation entering the solar still increases as the cover slope approaches the latitude angle. However, the amount of condensed water that falls back toward the solar-still basin also increases in this case. Consequently, the yield of the solar still is significantly decreased and the accuracy of the prediction method is affected. This reduction in yield and prediction accuracy is inversely proportional to the time the condensed water stays on the inner side of the condensing cover without being collected, because more drops fall back into the basin of the solar still. Different numbers of scraper motions per hour (NSM), that is…

Publication Date
Wed Jun 24 2015
Journal Name
Chinese Journal Of Biomedical Engineering
Single Channel Fetal ECG Detection Using LMS and RLS Adaptive Filters

The ECG, which shows the electrophysiology of the heart, is an important tool for the primary diagnosis of heart diseases. In our method, a single maternal abdominal ECG signal is taken as the input, and the maternal P-QRS-T complexes of the original signal are averaged and repeated to form a reference signal. LMS and RLS adaptive filter algorithms are then applied. The results show that the fetal ECGs were successfully detected: the accuracy on the Daisy database was up to 84% for LMS and 88% for RLS, while on PhysioNet it was up to 98% and 96% for LMS and RLS, respectively.
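The LMS half of the approach can be sketched as a standard adaptive canceller: the abdominal recording is the primary input, the averaged maternal template is the reference, and the filter error approximates the fetal ECG. The filter length, step size, and the synthetic signals below are illustrative assumptions, not the Daisy or PhysioNet data.

```python
# Minimal LMS adaptive-canceller sketch for maternal-ECG suppression.
import numpy as np

def lms_cancel(primary, reference, taps=16, mu=0.01):
    w = np.zeros(taps)
    fetal = np.zeros_like(primary)
    for n in range(taps, len(primary)):
        x = reference[n - taps:n][::-1]      # most recent reference samples
        y = w @ x                            # estimate of the maternal component
        e = primary[n] - y                   # residual ~ fetal ECG + noise
        w += 2 * mu * e * x                  # LMS weight update
        fetal[n] = e
    return fetal

# Synthetic demo: a maternal template with a weaker "fetal" component added.
t = np.linspace(0, 10, 5000)
maternal = np.sin(2 * np.pi * 1.2 * t)
fetal_true = 0.2 * np.sin(2 * np.pi * 2.3 * t)
abdominal = maternal + fetal_true + 0.05 * np.random.randn(t.size)
fetal_est = lms_cancel(abdominal, maternal)
```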

Publication Date
Wed Dec 13 2023
Journal Name
2023 3rd International Conference on Intelligent Cybernetics Technology & Applications (ICICYTA)
GPT-4 versus Bard and Bing: LLMs for Fake Image Detection

The recent emergence of sophisticated Large Language Models (LLMs) such as GPT-4, Bard, and Bing has revolutionized the domain of scientific inquiry, particularly in the realm of large pre-trained vision-language models. This pivotal transformation is driving new frontiers in various fields, including image processing and digital media verification. At the heart of this evolution, our research focuses on the rapidly growing area of image authenticity verification, a field gaining immense relevance in the digital era. The study is specifically geared towards the emerging challenge of distinguishing between authentic images and deep fakes, a task that has become critically important in a world increasingly reliant on digital media…

Publication Date
Wed Nov 30 2022
Journal Name
Iraqi Journal Of Science
Breast Cancer Detection using Decision Tree and K-Nearest Neighbour Classifiers

Data mining plays a major role in healthcare for discovering hidden relationships in big datasets, especially in breast cancer diagnostics; breast cancer is one of the leading causes of death among women worldwide. In this paper two algorithms, decision tree and K-Nearest Neighbour, are applied to diagnose breast cancer grade in order to reduce its risk to patients. For the decision tree with feature selection, the Gini index gives an accuracy of 87.83%, while entropy-based feature selection gives an accuracy of 86.77%. In both cases, Age appeared as the most effective parameter, particularly when Age < 49.5, whereas Ki67 appeared as the second most effective parameter. Furthermore, K-Nearest Neighbour is based on the minimum…
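The two classifiers being compared can be sketched with scikit-learn as below. The paper's own clinical dataset (with Age and Ki67 features) is not available here, so the public Wisconsin breast-cancer dataset bundled with scikit-learn is used as a stand-in, and the tree depth, split criteria, and k = 5 are illustrative choices.

```python
# Sketch of the decision tree (Gini vs. entropy) and KNN comparison.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for criterion in ("gini", "entropy"):                 # the two split criteria compared
    tree = DecisionTreeClassifier(criterion=criterion, max_depth=4, random_state=0)
    print(criterion, tree.fit(X_tr, y_tr).score(X_te, y_te))

knn = KNeighborsClassifier(n_neighbors=5)             # minimum-distance (Euclidean) voting
print("knn", knn.fit(X_tr, y_tr).score(X_te, y_te))
```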

Publication Date
Mon Mar 01 2021
Journal Name
Al-khwarizmi Engineering Journal
Hurst Exponent and Tsallis Entropy Markers for Epileptic Detection from Children

The aim of the present study was to distinguish between healthy children and those with epilepsy using electroencephalography (EEG). Two biomarkers, the Hurst exponent (H) and Tsallis entropy (TE), were used to investigate the background EEG activity of 10 healthy children and 10 children with epilepsy. EEG artifacts were removed using a Savitzky-Golay (SG) filter. As hypothesized, a t-test showed significant changes in irregularity and complexity in the epileptic EEG compared with the healthy control subjects (p < 0.05). The increased complexity observed in the H and TE results of the epileptic subjects suggests them as EEG biomarkers associated with epilepsy and a reliable tool for the detection and identification of this di…
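The two markers can be computed roughly as sketched below: a rescaled-range (R/S) estimate of the Hurst exponent and Tsallis entropy of the amplitude histogram. The window sizes, bin count, and entropic index q are illustrative choices, not the study's exact settings, and the random signal stands in for a filtered EEG epoch.

```python
# Sketch of the Hurst exponent (R/S estimate) and Tsallis entropy markers.
import numpy as np

def hurst_rs(x, windows=(16, 32, 64, 128, 256)):
    """Rescaled-range estimate: R/S grows roughly as n**H."""
    rs, ns = [], []
    for n in windows:
        vals = []
        for i in range(len(x) // n):
            seg = x[i * n:(i + 1) * n]
            dev = np.cumsum(seg - seg.mean())
            r, s = dev.max() - dev.min(), seg.std()
            if s > 0:
                vals.append(r / s)
        if vals:
            rs.append(np.mean(vals)); ns.append(n)
    slope, _ = np.polyfit(np.log(ns), np.log(rs), 1)   # slope of log(R/S) vs log(n)
    return slope

def tsallis_entropy(x, q=2.0, bins=64):
    """S_q = (1 - sum(p_i**q)) / (q - 1) over the amplitude histogram."""
    p, _ = np.histogram(x, bins=bins)
    p = p / p.sum()
    p = p[p > 0]
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

eeg = np.random.randn(4096)                 # stand-in for a filtered EEG epoch
print(hurst_rs(eeg), tsallis_entropy(eeg))
```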

Publication Date
Fri Dec 01 2017
Journal Name
Journal Of Economics And Administrative Sciences
Multicollinearity in Multiple Nonparametric Regression, Detection and Treatment Using Simulation

Regression analysis is a cornerstone of statistics and mostly relies on the ordinary least squares method. As is well known, this method requires several conditions in order to operate accurately; when they are violated the results can be unreliable, and the absence of certain conditions can make it impossible to complete the analysis at all. Among those conditions is the multicollinearity problem, which we detect between the independent variables using the Farrar-Glauber test. Another requirement is linearity of the data, and when this last condition is not met we resort to the…
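The detection step can be sketched with the standard Farrar-Glauber chi-square statistic for the presence of multicollinearity among the regressors. The simulated design matrix below is purely illustrative and is not the paper's simulation setup.

```python
# Sketch of the Farrar-Glauber chi-square test for multicollinearity.
import numpy as np
from scipy import stats

def farrar_glauber_chi2(X):
    """X: (n observations, p regressors). Returns the chi-square statistic and p-value."""
    n, p = X.shape
    R = np.corrcoef(X, rowvar=False)                  # correlation matrix of regressors
    chi2 = -(n - 1 - (2 * p + 5) / 6.0) * np.log(np.linalg.det(R))
    df = p * (p - 1) // 2
    return chi2, stats.chi2.sf(chi2, df)

rng = np.random.default_rng(1)
x1 = rng.standard_normal(100)
x2 = 0.9 * x1 + 0.1 * rng.standard_normal(100)        # nearly collinear with x1
x3 = rng.standard_normal(100)
chi2, pval = farrar_glauber_chi2(np.column_stack([x1, x2, x3]))
print(chi2, pval)                                      # a small p-value flags multicollinearity
```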
