Clinical keratoconus (KCN) detection is a challenging and time-consuming task. In the diagnosis process, ophthalmologists must review demographic data and clinical ophthalmic examinations; the latter include slit-lamp examination, corneal topographic maps, and Pentacam indices (PI). We propose an Ensemble of Deep Transfer Learning (EDTL) based on corneal topographic maps. We consider four pretrained networks, SqueezeNet (SqN), AlexNet (AN), ShuffleNet (SfN), and MobileNet-v2 (MN), and fine-tune them on a dataset of KCN and normal cases, each including four topographic maps. We also consider a PI classifier. Our EDTL method then combines the output probabilities of each of the five classifiers to obtain a decision based on the fusion of probabilities. Individually, the classifier based on PI achieved 93.1% accuracy, whereas the deep classifiers reached classification accuracies over 90% only in isolated cases. Overall, the average accuracy of the deep networks over the four corneal maps ranged from 86% (SfN) to 89.9% (AN). The classifier ensemble increased the accuracy of the deep classifiers based on corneal maps to values ranging from 92.2% to 93.1% for SqN and from 93.1% to 94.8% for AN. Including specific combinations of corneal-map classifiers and PI in the ensemble increased the accuracy to 98.3%. Moreover, visualization of the first-layer filters of the networks and their Grad-CAMs confirmed that the networks had learned relevant clinical features. This study shows the potential of creating ensembles of deep classifiers fine-tuned with a transfer learning strategy, as it improved accuracy while producing learned filters and Grad-CAMs that agree with clinical knowledge. This is a step towards the potential clinical deployment of an improved computer-assisted diagnosis system for KCN detection, helping ophthalmologists to confirm the clinical decision and to perform fast and accurate KCN treatment.
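The probability-fusion step of such an ensemble can be sketched as follows; this is a minimal illustration assuming each classifier already outputs class probabilities for (KCN, normal), with hypothetical example values rather than the paper's measured outputs:

```python
# Minimal sketch of probability-level ensemble fusion for five classifiers.
# Classifier outputs below are hypothetical, not results from the study.
import numpy as np

def fuse_probabilities(prob_list, weights=None):
    """Average the output probabilities of several classifiers.

    prob_list: list of arrays of shape (n_classes,), one per classifier.
    weights:   optional per-classifier weights (default: uniform).
    """
    probs = np.asarray(prob_list, dtype=float)
    if weights is None:
        weights = np.ones(len(probs)) / len(probs)
    fused = np.average(probs, axis=0, weights=weights)
    return fused / fused.sum()  # renormalize for safety

# Hypothetical outputs of the five classifiers (SqN, AN, SfN, MN, PI)
# for one eye, as [P(KCN), P(normal)]:
outputs = [
    [0.80, 0.20],  # SqueezeNet
    [0.85, 0.15],  # AlexNet
    [0.60, 0.40],  # ShuffleNet
    [0.75, 0.25],  # MobileNet-v2
    [0.90, 0.10],  # Pentacam-indices classifier
]
fused = fuse_probabilities(outputs)
decision = "KCN" if fused[0] > fused[1] else "normal"
```

A weighted average (e.g. weighting the stronger PI classifier more heavily) is one natural variant of this fusion rule.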
Accurate extraction, study, and analysis of drainage-basin morphometric aspects are important for the accurate determination of the environmental factors that formed the basins, such as climate, tectonic activity, regional lithology, and vegetation cover.
This work was divided into three stages: the first was the delineation of the Al-Abiadh basin borders using a new approach based on three-dimensional modeling of the studied region and extraction of the drainage network pattern from Shuttle Radar Topography Mission (SRTM) data; the second was the classification of the Al-Abiadh basin streams according to their shape and widenings; and the third was ex
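Stream classification in drainage-network studies is commonly built on stream ordering; a minimal sketch of Strahler ordering over a drainage tree is shown below. The network structure here is hypothetical and only illustrates the ordering rule, not the Al-Abiadh data:

```python
# Hypothetical drainage network: each stream segment is represented by the
# list of its upstream tributaries; empty lists are headwater streams.
def strahler(tributaries):
    """Strahler order: headwaters are order 1; a junction of two streams
    of equal order i yields order i + 1, otherwise the maximum order."""
    if not tributaries:
        return 1
    orders = sorted((strahler(t) for t in tributaries), reverse=True)
    if len(orders) >= 2 and orders[0] == orders[1]:
        return orders[0] + 1
    return orders[0]

# Two order-1 headwaters join (order 2), that stream meets an order-1
# tributary (still order 2), then joins another order-2 branch (order 3).
network = [[[[], []], []], [[], []]]
outlet_order = strahler(network)
```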
A total of 37 Staphylococcus epidermidis isolates obtained from corneal scrapings of patients with bacterial keratitis, and 20 isolates from healthy eyes (as controls), all collected at Ibn Al-Haietham Eye Hospital / Baghdad, were tested for slime production. Of all isolates, 52.63% were slime-positive (23 isolates from patients and 7 from controls). Slime-positive S. epidermidis isolates exhibited higher resistance to antibiotics than slime-negative isolates.
The Diffie-Hellman key exchange protocol provides a way to establish shared secret keys between two parties, even if those parties have never communicated before. This paper suggests a new way to transfer keys through public or non-secure channels, based on video files sent over the channel from which keys are then extracted. The proposed key-generation method depends on the video file content, using the entropy value of the video frames. The proposed system addresses weaknesses of the Diffie-Hellman key exchange algorithm, namely MIMA (man-in-the-middle attack) and DLA (discrete logarithm attack). When the method used high-definition videos with a vast amount of data, the keys generated with a large number up to 5
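The idea of deriving key material from frame entropy can be sketched as follows. This is a minimal illustration under stated assumptions: frames are treated as raw byte arrays, and the final hashing step and sample frame data are illustrative stand-ins, not the paper's exact construction:

```python
# Sketch: derive a 256-bit key from the Shannon entropy of video frames.
# Frame bytes and the hashing step are illustrative assumptions.
import hashlib
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy of a byte sequence, in bits per byte."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def derive_key(frames) -> str:
    """Hash the per-frame entropy values into one 256-bit hex key."""
    digest = hashlib.sha256()
    for frame in frames:
        digest.update(f"{shannon_entropy(frame):.6f}".encode())
    return digest.hexdigest()

# Two toy "frames": one with varied content, one uniform.
frames = [bytes(i % 256 for i in range(1024)), b"\x00" * 1024]
key = derive_key(frames)
```

Because both endpoints observe the same public video, both can derive the same key without transmitting it directly, which is the property the abstract describes.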
Gravity and magnetic data are used to study the tectonic situation of the Kut-Dewania-Fajir area and its surroundings in central Iraq. The study applies the window method with different spacings to separate residual from regional anomalies in the gravity and magnetic data. Total Horizontal Derivative (THD) techniques are used to identify fault trends in the basement and sedimentary rocks from the gravity and magnetic data. The fault trends obtained from the gravity data are N30W, N60W, N80E, and N20E, and from the magnetic data are N30W, N70E, N20E, N10W, and N40E. It is believed that these faults extend from the basement to the lower layers of the sedimentary rocks, except the N60W trend, which is observed clearly in gravity in
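The Total Horizontal Derivative of a potential-field grid G is THD = sqrt((dG/dx)^2 + (dG/dy)^2), whose maxima outline edges such as faults. A minimal sketch on a synthetic grid (the field, grid spacing, and step location are illustrative, not the study's data):

```python
import numpy as np

def total_horizontal_derivative(field, dx=1.0, dy=1.0):
    """THD = sqrt((dG/dx)^2 + (dG/dy)^2); maxima mark lateral contacts."""
    gy, gx = np.gradient(field, dy, dx)  # axis 0 = y, axis 1 = x
    return np.hypot(gx, gy)

# Synthetic anomaly: a smooth step in the field across x = 0,
# mimicking a buried fault-like contact.
x = np.linspace(-10, 10, 201)
X, Y = np.meshgrid(x, x)
field = np.tanh(X)                    # sharp lateral gradient at x = 0
thd = total_horizontal_derivative(field, dx=0.1, dy=0.1)
peak_col = int(np.argmax(thd[100]))   # THD peaks at the step (column 100)
```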
In this research, a study is introduced on the effect of several environmental factors on the performance of an already constructed quality inspection system, which was designed using a transfer learning approach based on convolutional neural networks. The system comprised two sets of layers: a set of layers transferred from an already trained model (DenseNet121) and a set of custom classification layers. It was designed to discriminate between damaged and undamaged helical gears according to the configuration of the gear regardless of its dimensions, and the model showed good performance discriminating between the two products under ideal conditions of high-resolution images.
So, this study aimed at testing the system performance at poor s
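The transfer-learning pattern described above, a frozen pretrained feature extractor followed by a small trainable classification head, can be sketched as follows. The feature extractor here is a hypothetical stand-in (a fixed random projection), not DenseNet121 itself, and the toy "images" only mimic two separable product classes:

```python
# Sketch of the frozen-extractor + trainable-head pattern; the extractor
# and data are illustrative stand-ins for DenseNet121 and gear images.
import numpy as np

rng = np.random.default_rng(0)

def frozen_feature_extractor(images):
    """Stand-in for the transferred (frozen) layers: a fixed projection
    whose weights are never updated during head training."""
    W = np.random.default_rng(42).normal(size=(images.shape[1], 16))
    return np.tanh(images @ W)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_head(features, labels, lr=0.5, epochs=200):
    """Custom classification head: logistic regression trained by
    gradient descent while the extractor stays frozen."""
    w = np.zeros(features.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = sigmoid(features @ w + b)
        grad = p - labels
        w -= lr * features.T @ grad / len(labels)
        b -= lr * grad.mean()
    return w, b

# Toy "images": two separable groups standing in for undamaged (0)
# and damaged (1) gears.
images = np.vstack([rng.normal(-1, 0.3, (50, 8)), rng.normal(1, 0.3, (50, 8))])
labels = np.array([0] * 50 + [1] * 50)
feats = frozen_feature_extractor(images)
w, b = train_head(feats, labels)
accuracy = float(((sigmoid(feats @ w + b) > 0.5) == labels).mean())
```

Only `train_head` updates parameters; degrading the input images (the subject of this study) would change `feats` while the transferred weights remain fixed.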
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance, but many applications have only small or inadequate datasets for training DL frameworks. Usually, manual labeling is needed to provide labeled data, which typically involves human annotators with extensive background knowledge. This annotation process is costly, time-consuming, and error-prone. Every DL framework must be fed a significant amount of labeled data to automatically learn representations, and ultimately a larger amount of data yields a better DL model, although performance is also application-dependent. This issue is the main barrier for
Our aim in this paper is to prove the following: Let Ŕ be a ring having an idempotent element e (e ≠ 0, e ≠ 1). Suppose that R is a subring of Ŕ which satisfies:
(i) eR ⊆ R and Re ⊆ R;
(ii) xR = 0 implies x = 0;
(iii) eRx = 0 implies x = 0 (and hence Rx = 0 implies x = 0);
(iv) exeR(1 − e) = 0 implies exe = 0.
If D is a derivable map of R satisfying D(R_ij) ⊆ R_ij, i, j = 1, 2, then D is additive. This extends Daif's result to the case where R need not contain any non-zero idempotent element.
Plagiarism is described as using someone else's ideas or work without their permission. Using lexical and semantic text-similarity notions, this paper presents a plagiarism detection system for examining suspicious texts against available sources on the Web. The user can upload suspicious files in pdf or docx format. The system searches three popular search engines (Google, Bing, and Yahoo) for the source text and tries to identify the top five results of each search engine on the first retrieved page. The corpus is made up of the downloaded files and the scraped web-page text of the search engines' results. The corpus text and suspicious documents are then encoded as vectors. For lexical plagiarism detection, the system will
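The lexical-similarity step described above, comparing vector encodings of the suspicious document and corpus texts, can be sketched with term-frequency vectors and cosine similarity. The example sentences are illustrative, not from the paper's corpus, and this is only one common encoding, not necessarily the paper's exact one:

```python
# Sketch of lexical similarity: term-frequency vectors + cosine similarity.
# Example texts are illustrative, not from the paper's corpus.
import math
from collections import Counter

def cosine_similarity(text_a: str, text_b: str) -> float:
    va = Counter(text_a.lower().split())
    vb = Counter(text_b.lower().split())
    dot = sum(va[w] * vb[w] for w in set(va) & set(vb))
    norm = (math.sqrt(sum(c * c for c in va.values()))
            * math.sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0

suspicious = "plagiarism is using someone else's ideas without permission"
source = "plagiarism is using someone else's work without their permission"
score = cosine_similarity(suspicious, source)  # high lexical overlap
```

Scoring each corpus document this way and flagging pairs above a threshold is the standard way such a lexical check feeds a final plagiarism report.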