Lossy Image Compression Using Hybrid Deep Learning Autoencoder Based on K-means Clustering

Image compression plays an important role in reducing data size and storage requirements while significantly increasing transmission speed over the Internet. It has been an active research topic for several decades, and with the recent successes of deep learning in many areas of image processing, deep neural networks are increasingly being applied to image compression, handling images of various sizes. In this paper, we present an image compression architecture based on a Convolutional AutoEncoder (CAE), inspired by the way human eyes perceive the diverse colors and features of images. We propose a multi-layer hybrid deep learning system that combines the unsupervised CAE architecture with K-means color clustering to compress images and determine their size and color intensity. The system is trained and evaluated on the Kodak and Challenge on Learned Image Compression (CLIC) datasets. Experimental results show that the proposed method outperforms traditional autoencoder-based compression in both speed and the quality measures Peak Signal-to-Noise Ratio (PSNR) and Structural Similarity Index (SSIM), achieving high compression bit rates with a low Mean Squared Error (MSE). The recorded compression ratios ranged from 0.7117 to 0.8707 on the Kodak dataset and from 0.7191 to 0.9930 on the CLIC dataset, while the error coefficient dropped from 0.0126 to 0.0003, making the system more accurate than the compared autoencoder-based deep learning methods.
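The K-means color-clustering stage described above can be illustrated in isolation. This is a minimal sketch, not the authors' hybrid CAE system: the toy image, the function names, and the parameter choices are invented for the example. Pixels are grouped into a small palette, and PSNR is computed from the MSE in the same way as the quality measures the abstract reports.

```python
import numpy as np

def kmeans_palette(pixels, k=8, iters=20, seed=0):
    """Plain K-means over RGB pixels: returns k palette colors and labels."""
    rng = np.random.default_rng(seed)
    centers = pixels[rng.choice(len(pixels), size=k, replace=False)].astype(float)
    for _ in range(iters):
        # assign each pixel to its nearest palette color
        dists = ((pixels[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = pixels[labels == j].mean(axis=0)
    return centers, labels

def psnr(orig, recon, peak=255.0):
    """Peak Signal-to-Noise Ratio in dB, derived from the Mean Squared Error."""
    mse = np.mean((orig.astype(float) - recon.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

# toy "image": pixels drawn from three base colors plus small noise
rng = np.random.default_rng(1)
base = np.array([[250, 10, 10], [10, 250, 10], [10, 10, 250]])
img = base[rng.integers(0, 3, size=300)] + rng.integers(-5, 6, size=(300, 3))
centers, labels = kmeans_palette(img, k=3)
recon = centers[labels]            # every pixel replaced by its palette color
print(f"PSNR: {psnr(img, recon):.1f} dB")
```

Quantizing to a 3-color palette here keeps the reconstruction close to the source, so the PSNR stays high; lowering `k` trades quality for a smaller palette, which is the basic rate-distortion knob of this stage.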

Publication Date
Fri Oct 02 2020
Journal Name
International Journal of Pharmaceutical Research
A turbidimetric method for the quantitative determination of cyproheptadine hydrochloride in tablets using an optoelectronic detector based on the LEDs array

Scopus (17)
Crossref (2)
Publication Date
Thu Dec 30 2021
Journal Name
Periodicals of Engineering and Natural Sciences (PEN)
Design a system for an approved video copyright over cloud based on biometric iris and random walk generator using watermark technique

Scopus (57)
Crossref (12)
Publication Date
Sun Jan 01 2023
Journal Name
AIP Conference Proceedings
Iraqi stock market structure analysis based on minimum spanning tree

Stock markets move up and down over time, and some companies affect others because of their mutual dependence. In this work, the network model of the stock market is described as a complete weighted graph. This paper aims to investigate the Iraqi stock market using graph-theoretic tools. The vertices of this graph correspond to the companies listed on the Iraqi market, and the weights of the edges are set to the ultrametric distances obtained from the minimum spanning tree.
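The construction above can be sketched as follows. This is a minimal illustration under a common convention (the Mantegna distance d_ij = sqrt(2(1 - rho_ij)) derived from return correlations, and Prim's algorithm for the MST), not necessarily the exact procedure of the paper; the toy return series are invented for the example.

```python
import numpy as np

def corr_distance(returns):
    """Distance matrix d_ij = sqrt(2 * (1 - rho_ij)) from return series."""
    rho = np.corrcoef(returns)
    return np.sqrt(np.clip(2.0 * (1.0 - rho), 0.0, None))

def prim_mst(d):
    """Minimum spanning tree of a complete weighted graph (Prim's algorithm)."""
    n = len(d)
    in_tree = {0}
    edges = []
    while len(in_tree) < n:
        # pick the cheapest edge that grows the tree by one new vertex
        u, v = min(((i, j) for i in in_tree for j in range(n) if j not in in_tree),
                   key=lambda e: d[e[0], e[1]])
        edges.append((u, v, float(d[u, v])))
        in_tree.add(v)
    return edges

# toy example: three strongly correlated "companies" and one independent one
rng = np.random.default_rng(0)
common = rng.normal(size=200)
returns = np.vstack([common + 0.1 * rng.normal(size=200) for _ in range(3)]
                    + [rng.normal(size=200)])
mst = prim_mst(corr_distance(returns))
print(mst)
```

The MST keeps n - 1 of the n(n - 1)/2 edges of the complete graph, so the correlated companies end up joined by short edges while the independent one hangs off the tree with a distance near sqrt(2).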

Publication Date
Mon Aug 01 2016
Journal Name
Journal of Engineering
Behavior of Reinforced Concrete Deep Beams Strengthened with Carbon Fiber Reinforced Polymer Strips

This research investigates the behavior of reinforced concrete (RC) deep beams strengthened with carbon fiber reinforced polymer (CFRP) strips. The experimental part of this research tests seven RC deep beams with identical dimensions and steel reinforcement, divided into two groups according to the strengthening scheme. Group one consisted of three deep beams strengthened with vertical U-wrapped CFRP strips, while group two consisted of three deep beams strengthened with CFRP strips inclined at 45° to the longitudinal axis of the beam. The remaining beam was kept unstrengthened as a reference. For each group, the variable considered …
Publication Date
Wed Nov 28 2018
Journal Name
International Journal of Engineering & Technology
Modified Strut Effectiveness Factor for FRP-Reinforced Concrete Deep Beams

Few studies have attempted to assess the ultimate shear strength of fiber reinforced polymer (FRP)-reinforced concrete shallow beams, and little information has been reported on analyzing concrete deep beams reinforced with FRP bars. Most of these studies do not consider the combined contribution of the tensile strength of both the FRP reinforcement and the concrete. This study develops a simple strut effectiveness factor model to address this issue. Two failure modes, concrete splitting and crushing, were examined. Resistance to diagonal splitting is mainly provided by the longitudinal FRP reinforcement, the steel shear reinforcement, and the tensile strength of the concrete. The proposed model has been verified …
Crossref (2)
Publication Date
Tue Jun 01 2021
Journal Name
2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)
Alps: Adaptive Quantization of Deep Neural Networks with GeneraLized PositS

Scopus (14)
Crossref (14)
Publication Date
Thu Jan 01 2015
Journal Name
Iraqi Journal of Science
Keystroke Dynamics Authentication based on Naïve Bayes Classifier

Authentication is the process of determining whether someone or something is, in fact, who or what it declares itself to be. As dependence on computers and computer networks grows, the need for user authentication increases. A user's claimed identity can be verified by one of several methods, the most popular of which is something the user knows, such as a password or Personal Identification Number (PIN). Biometrics is the science and technology of authentication through a living individual's physiological or behavioral attributes. Keystroke authentication is a behavioral access control approach that identifies legitimate users by their typing behavior. The objective of this paper is to provide user …
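The core idea of a naïve Bayes keystroke classifier can be sketched as follows. This is a minimal illustration, not the paper's implementation: the dwell-time vectors, user names, and function names are all hypothetical. Each user's typing rhythm is modeled as per-feature Gaussians, and a new sample is attributed to the user under whose model it is most likely.

```python
import math

def fit_gaussian_nb(samples_by_user):
    """Per-user mean/variance of each timing feature (Gaussian naive Bayes)."""
    model = {}
    for user, samples in samples_by_user.items():
        n, dim = len(samples), len(samples[0])
        means = [sum(s[i] for s in samples) / n for i in range(dim)]
        vars_ = [max(sum((s[i] - means[i]) ** 2 for s in samples) / n, 1e-6)
                 for i in range(dim)]
        model[user] = (means, vars_)
    return model

def log_likelihood(model, user, x):
    """Log-probability of timing vector x under one user's Gaussian model."""
    means, vars_ = model[user]
    return sum(-0.5 * math.log(2 * math.pi * v) - (xi - m) ** 2 / (2 * v)
               for xi, m, v in zip(x, means, vars_))

def classify(model, x):
    return max(model, key=lambda u: log_likelihood(model, u, x))

# hypothetical key dwell times (ms) for two users typing the same password
train = {
    "alice": [[95, 110, 100], [100, 108, 97], [98, 112, 102]],
    "bob":   [[140, 160, 150], [145, 158, 148], [150, 162, 155]],
}
model = fit_gaussian_nb(train)
print(classify(model, [99, 109, 101]))  # matches alice's timing profile
```

The "naive" assumption is that the timing features are conditionally independent given the user, which keeps the model to one mean and one variance per feature.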
Publication Date
Sat Jun 26 2021
Journal Name
2021 IEEE International Conference on Automatic Control & Intelligent Systems (I2CACIS)
Vulnerability Assessment on Ethereum Based Smart Contract Applications

Scopus (10)
Crossref (5)
Publication Date
Sun Apr 23 2017
Journal Name
International Conference of Reliable Information and Communication Technology
Classification of Arabic Writer Based on Clustering Techniques

Arabic text categorization for pattern recognition is challenging. We propose, for the first time, a novel holistic clustering-based method for classifying Arabic writers. The categorization proceeds in stages. First, the document images are segmented into lines, words, and characters. Second, structural and statistical features are extracted from the segmented portions. Third, the F-measure is used to evaluate the performance of the extracted features and their combinations under different linkage methods, distance measures, and numbers of groups. Finally, experiments are conducted on the standard KHATT dataset of Arabic handwritten text, comprising varying samples from 1000 writers. The results in the generation …
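The third stage, scoring a clustering against known writer identities with the F-measure, can be sketched as follows. The paper does not specify which F-measure variant it uses, so this shows one common choice (the set-matching variant: each true class is matched to its best-overlapping cluster); the toy data are invented for the example.

```python
def cluster_f_measure(clusters, truth):
    """Set-matching F-measure: weight each true class by its size and score
    it with the best F1 over all predicted clusters."""
    n = sum(len(t) for t in truth)
    total = 0.0
    for t in truth:
        best = 0.0
        for c in clusters:
            overlap = len(set(t) & set(c))
            if overlap:
                p, r = overlap / len(c), overlap / len(t)
                best = max(best, 2 * p * r / (p + r))
        total += len(t) / n * best
    return total

# toy example: 6 handwriting samples, 2 true writers, 3 predicted clusters
clusters = [[0, 1, 2], [3, 4], [5]]
truth    = [[0, 1, 2], [3, 4, 5]]
print(round(cluster_f_measure(clusters, truth), 3))  # → 0.9
```

A score of 1.0 means the clustering reproduces the writer partition exactly; splitting one writer's samples across clusters, as above, lowers the recall term and hence the score.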
Scopus (6)
Publication Date
Sun Nov 01 2020
Journal Name
Journal of Physics: Conference Series
Improve topic modeling algorithms based on Twitter hashtags
With the growing use of social media, many researchers have become interested in extracting topics from Twitter. Tweets are short, unstructured, and messy texts, which makes it difficult to identify their topics. Topic modeling algorithms such as Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) were originally designed to derive topics from large documents such as articles and books, and they are often less effective when applied to short texts like tweets. Fortunately, Twitter offers many features that capture the interaction between users; in particular, tweets carry rich user-generated hashtags as keywords. In this paper, we exploit the hashtag feature to improve the topics learned …
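One common way to exploit hashtags for short-text topic modeling is to pool tweets that share a hashtag into longer pseudo-documents before running LSA or LDA. The sketch below shows only that pooling step; it is an assumption about the general technique, not the paper's exact method, and the sample tweets are invented.

```python
import re
from collections import defaultdict

def pool_by_hashtag(tweets):
    """Group tweets sharing a hashtag into one pseudo-document, so a topic
    model sees longer texts than individual tweets."""
    pools = defaultdict(list)
    for t in tweets:
        tags = re.findall(r"#(\w+)", t.lower())
        # tweets without hashtags fall into a catch-all pool
        for tag in tags or ["_untagged"]:
            pools[tag].append(t)
    return {tag: " ".join(ts) for tag, ts in pools.items()}

tweets = [
    "new phone camera is great #tech",
    "ai chips everywhere #tech #ai",
    "match was amazing #sports",
    "no tags here",
]
docs = pool_by_hashtag(tweets)
print(sorted(docs))  # → ['_untagged', 'ai', 'sports', 'tech']
```

A tweet with several hashtags lands in several pools, which is intentional: each hashtag acts as a weak topic label linking the tweets that use it.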
Scopus (20)
Crossref (19)