Lossy Image Compression Using Hybrid Deep Learning Autoencoder Based on K-means Clustering

Image compression plays an important role in reducing the size and storage requirements of data while significantly increasing its transmission speed over the Internet. It has been an important research topic for several decades, and with the great successes recently achieved by deep learning in many areas of image processing, deep learning is being used increasingly for image compression; deep neural networks have achieved considerable success in processing and compressing images of various sizes. In this paper, we present a structure for image compression based on a Convolutional AutoEncoder (CAE), inspired by the way human eyes perceive the different colors and features of images. We propose a multi-layer hybrid deep learning system that combines the unsupervised CAE architecture with K-means color clustering to compress images and determine their size and color intensity. The system is implemented on the Kodak and Challenge on Learned Image Compression (CLIC) datasets. Experimental results show that the proposed method outperforms traditional autoencoder-based compression, with better speed and better quality as measured by Peak Signal-to-Noise Ratio (PSNR) and the Structural Similarity Index (SSIM), together with high compression bit rates and a low Mean Squared Error (MSE). The recorded compression ratios ranged from 0.7117 to 0.8707 for the Kodak dataset and from 0.7191 to 0.9930 for the CLIC dataset, while the error coefficient ranged from 0.0126 down to 0.0003, making the system more accurate and of higher quality than the compared deep learning autoencoder methods.
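
As a rough illustration of the pipeline described above (not the authors' exact architecture), the sketch below pairs a small Keras convolutional autoencoder with scikit-learn K-means color clustering; the layer sizes, latent depth, and number of clusters are illustrative assumptions.

```python
# Minimal sketch (not the paper's exact architecture): a convolutional
# autoencoder compresses an image to a small latent map, and K-means
# quantizes the reconstructed colors. Layer sizes and k are illustrative.
import numpy as np
from sklearn.cluster import KMeans
from tensorflow.keras import layers, Model

def build_cae(h=128, w=128, c=3):
    inp = layers.Input(shape=(h, w, c))
    x = layers.Conv2D(32, 3, strides=2, padding="same", activation="relu")(inp)
    x = layers.Conv2D(16, 3, strides=2, padding="same", activation="relu")(x)
    latent = layers.Conv2D(8, 3, strides=2, padding="same", activation="relu")(x)  # compressed code
    x = layers.Conv2DTranspose(16, 3, strides=2, padding="same", activation="relu")(latent)
    x = layers.Conv2DTranspose(32, 3, strides=2, padding="same", activation="relu")(x)
    out = layers.Conv2DTranspose(c, 3, strides=2, padding="same", activation="sigmoid")(x)
    model = Model(inp, out)
    model.compile(optimizer="adam", loss="mse")
    return model

def kmeans_color_quantize(img, k=16):
    """Cluster the pixel colors of a reconstructed image into k centroids."""
    h, w, c = img.shape
    pixels = img.reshape(-1, c)
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(pixels)
    return km.cluster_centers_[km.labels_].reshape(h, w, c)

# Usage sketch: train on normalized image patches, then quantize the output.
# cae = build_cae(); cae.fit(train_imgs, train_imgs, epochs=50, batch_size=32)
# recon = cae.predict(test_imgs[:1])[0]
# compact = kmeans_color_quantize(recon, k=16)
```

Clustering the reconstructed colors limits the number of distinct intensities that must be stored, which is one way to trade color fidelity for a higher compression ratio.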

Publication Date
Fri Aug 31 2012
Journal Name
Al-khwarizmi Engineering Journal
Sub–Nyquist Frequency Efficient Audio Compression

This paper presents the application of a fast and efficient compressive sampling framework based on random sampling of sparse audio signals. It provides four important features: (i) it is universal across a variety of sparse signals; (ii) the number of measurements required for exact reconstruction is nearly optimal and far below the Nyquist sampling frequency; (iii) it has very low complexity and fast computation; and (iv) it is developed on a provable mathematical model from which trade-offs among streaming capability, computation/memory requirements, and the reconstruction quality of the audio signal can be quantified. Compressed sensing (CS) is an attractive compression scheme due to its…
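
A minimal sketch of the compressive-sampling idea (illustrative only, not the paper's exact framework): a signal that is sparse in the DCT basis is measured with a random matrix using far fewer samples than its length, then recovered with Orthogonal Matching Pursuit. The signal length, measurement count, and sparsity level are assumed values.

```python
# Compressed-sensing sketch: random sub-Nyquist measurements of a
# DCT-sparse signal, recovered with Orthogonal Matching Pursuit (OMP).
import numpy as np
from scipy.fft import idct
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
n, m, k = 1024, 256, 10                       # signal length, measurements (m << n), sparsity

# Synthetic audio-like signal: k active DCT coefficients.
coefs = np.zeros(n)
coefs[rng.choice(n, k, replace=False)] = rng.normal(size=k)
Psi = idct(np.eye(n), axis=0, norm="ortho")   # DCT synthesis basis
x = Psi @ coefs                               # time-domain signal

Phi = rng.normal(size=(m, n)) / np.sqrt(m)    # random measurement matrix
y = Phi @ x                                   # m random measurements

# Recover the sparse coefficients from y = (Phi @ Psi) @ coefs.
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k).fit(Phi @ Psi, y)
x_hat = Psi @ omp.coef_
print("relative reconstruction error:", np.linalg.norm(x - x_hat) / np.linalg.norm(x))
```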

Publication Date
Thu Jun 01 2017
Journal Name
International Journal Of Applied Engineering Research
A Proposed Method for Generating a Private Key Using Digital Color Image Features

In this paper, the goal of the proposed method is to protect data against different types of attacks by unauthorized parties. The basic idea is to generate a private key from specific features of a digital color image, namely its color channels (Red, Green, and Blue). The key is generated by computing the frequency of each blue-color value in the image, taking the maximum blue-color frequency, multiplying it by its value, and applying an addition step to produce the generated key. Once the private key is generated, it must be converted into binary form. The generated key is extracted from the blue channel of the keyed image, then we select a…
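
A rough sketch of the key-derivation steps as described in this excerpt: since the final "addition" step is not fully specified here, the combination below is an assumption for illustration only, and the image file name is hypothetical.

```python
# Sketch of the described key-derivation idea: count how often each
# blue-channel value occurs, take the most frequent value, combine it with
# its count, and emit the result as a binary string. The combination step
# below is an assumption; the excerpt does not define it exactly.
import numpy as np
from PIL import Image

def generate_key_from_blue(image_path: str) -> str:
    img = np.asarray(Image.open(image_path).convert("RGB"))
    blue = img[:, :, 2].ravel()
    counts = np.bincount(blue, minlength=256)   # frequency of each blue value 0..255
    value = int(counts.argmax())                # most frequent blue intensity
    freq = int(counts[value])                   # its frequency (the maximum frequency)
    key_int = freq * value + counts.sum()       # assumed "multiply then add" combination
    return bin(key_int)[2:]                     # binary representation of the key

# key_bits = generate_key_from_blue("keyed_image.png")   # hypothetical file
```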

Publication Date
Mon Jan 01 2024
Journal Name
Lecture Notes In Networks And Systems
Using Machine Learning to Control Congestion in SDN: A Review

Publication Date
Tue Dec 28 2021
Journal Name
2021 2nd Information Technology to Enhance E-Learning and Other Applications (IT-ELA)
Pedestrian and Objects Detection by Using Learning Complexity-Aware Cascades

Publication Date
Thu Jun 01 2023
Journal Name
IFIP Advances in Information and Communication Technology
Rapid Thrombogenesis Prediction in Covid-19 Patients Using Machine Learning

Machine Learning (ML) algorithms are increasingly being utilized in the medical field to manage and diagnose diseases, leading to improved patient treatment and disease management. Several recent studies have found that Covid-19 patients have a higher incidence of blood clots, and understanding the pathological pathways that lead to blood clot formation (thrombogenesis) is critical. Current methods of reporting thrombogenesis-related fluid dynamic metrics for patient-specific anatomies are based on computational fluid dynamics (CFD) analysis, which can take weeks to months for a single patient. In this paper, we propose an ML-based method for rapid thrombogenesis prediction in the carotid artery of Covid-19 patients. Our proposed system aims…
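
As a hedged illustration of the general idea of replacing slow CFD runs with an ML surrogate (the paper's actual features, target metric, and model are not given in this excerpt), the sketch below trains a regressor on synthetic patient features to approximate a CFD-derived thrombogenesis metric.

```python
# Illustrative only: learn a fast surrogate for a CFD-derived metric
# (e.g. a wall-shear-stress summary) from patient-specific features such as
# vessel diameter, stenosis degree, and inlet velocity. Data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 6))                              # stand-in patient features
y = X @ rng.normal(size=6) + 0.1 * rng.normal(size=200)    # stand-in CFD output

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)
model = RandomForestRegressor(n_estimators=200, random_state=1).fit(X_tr, y_tr)
print("R^2 on held-out patients:", r2_score(y_te, model.predict(X_te)))
```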

Publication Date
Wed Aug 05 2020
Journal Name
Advances In Civil Engineering
Strength compensation of deep beams with large web openings using carbon fiber–reinforced polymer sheets

This article presents the results of an experimental investigation of using carbon fiber-reinforced polymer sheets to enhance the behavior of reinforced concrete deep beams with large web openings in their shear spans. A set of 18 specimens was fabricated and tested up to failure to evaluate the structural performance in terms of cracking, deformation, and load-carrying capacity. All tested specimens were 1500 mm long, 500 mm deep, and 150 mm wide. The parameters studied were the opening size, the opening location, and the strengthening factor. Two deep beams were implemented as control specimens without openings and without strengthening. Eight deep beams were fabricated with openings but without strengthening, while…

Publication Date
Sat Sep 30 2017
Journal Name
Al-khwarizmi Engineering Journal
Robot Arm Path Planning Using Modified Particle Swarm Optimization based on D* algorithm

Much attention has been paid to the use of robot arms in various applications, so optimal path finding plays a significant role in upgrading and guiding the arm's movement. The essential function of path planning is to create a path that satisfies the aims of motion, including avoiding collisions with obstacles, reducing the time interval, decreasing the path traveling cost, and satisfying the kinematic constraints. In this paper, the free Cartesian space map of a 2-DOF arm is constructed to attain the joint variables at each point without collision. The D* algorithm and the Euclidean distance are applied to obtain the exact and estimated distances to the goal, respectively. The modified Particle Swarm Optimization…
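
A generic PSO sketch of the waypoint-optimization idea (not the authors' modified variant, and with a plain Euclidean cost standing in for the D*-guided distances): each particle encodes a set of 2-D waypoints, and the fitness combines path length with an obstacle penalty. The start, goal, obstacle, and swarm parameters are illustrative.

```python
# Plain PSO over 2-D waypoints: minimize path length plus an obstacle penalty.
import numpy as np

rng = np.random.default_rng(2)
start, goal = np.array([0.0, 0.0]), np.array([10.0, 10.0])
obstacles = [np.array([5.0, 5.0])]            # illustrative circular obstacle, radius 1
n_particles, n_waypoints, iters = 30, 3, 200
dim = n_waypoints * 2

def fitness(flat):
    pts = np.vstack([start, flat.reshape(n_waypoints, 2), goal])
    length = np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))
    penalty = sum(max(0.0, 1.0 - np.linalg.norm(p - ob)) for p in pts for ob in obstacles)
    return length + 50.0 * penalty

pos = rng.uniform(0, 10, (n_particles, dim))
vel = np.zeros((n_particles, dim))
pbest = pos.copy()
pbest_f = np.array([fitness(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos += vel
    f = np.array([fitness(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

print("best path cost:", fitness(gbest))
```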

Publication Date
Mon Feb 01 2016
Journal Name
Journal Of Engineering
Valuation of Construction Projects Based on Quantity Scale by Using Expert System

Evaluating the quality of construction projects has become a necessary topic because of the absence of quantitative standards for measuring control works and of quality-evaluation standards in construction projects. At present, evaluation depends on the experience of the workers, which leads to apparent differences in the results.

The idea of this research is to set standards for evaluating project quality in a dedicated system that depends on a quantity scale rather than qualitative descriptions, and to prepare an expert system, "Crystal", that applies this system so that engineers can evaluate the quality of their projects easily and more accurately.
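
As a toy illustration of the quantity-scale idea only (not the "Crystal" expert system itself), the sketch below weights hypothetical control criteria and maps the weighted total to a quality grade; the criteria names, weights, and thresholds are assumptions.

```python
# Toy quantity-scale evaluation: weighted criterion scores mapped to a grade.
def evaluate_project(scores: dict, weights: dict):
    """Return (weighted total score, quality grade)."""
    total = sum(weights[c] * scores[c] for c in weights)
    if total >= 90:
        grade = "excellent"
    elif total >= 75:
        grade = "good"
    elif total >= 60:
        grade = "acceptable"
    else:
        grade = "poor"
    return total, grade

# Hypothetical criteria, weights, and 0-100 scores.
weights = {"concrete works": 0.4, "finishing works": 0.3, "documentation": 0.3}
scores = {"concrete works": 88, "finishing works": 72, "documentation": 95}
print(evaluate_project(scores, weights))
```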

Publication Date
Thu Mar 01 2007
Journal Name
Al-khwarizmi Engineering Journal
Image restoration using regularized inverse filtering and adaptive threshold wavelet denoising

Although Wiener filtering is the optimal trade-off between inverse filtering and noise smoothing, when the blurring filter is singular the Wiener filter actually amplifies the noise. This suggests that a denoising step is needed to remove the amplified noise, and wavelet-based denoising provides a natural technique for this purpose.

In this paper a new image restoration scheme is proposed that contains two separate steps: Fourier-domain inverse filtering and wavelet-domain image denoising. The first stage is Wiener filtering of the input image; the filtered image is then input to an adaptive-threshold wavelet…
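
A minimal sketch of the two-stage scheme, assuming the blur kernel is known: regularized Wiener-style inverse filtering in the Fourier domain followed by soft-threshold wavelet denoising with PyWavelets. The regularization constant and the threshold rule are illustrative, not the paper's adaptive scheme.

```python
# Stage 1: Fourier-domain Wiener deblurring. Stage 2: wavelet soft-thresholding.
import numpy as np
import pywt

def wiener_deblur(blurred, psf, K=0.01):
    """Fourier-domain Wiener filtering with regularization constant K."""
    H = np.fft.fft2(psf, s=blurred.shape)
    G = np.fft.fft2(blurred)
    F_hat = np.conj(H) / (np.abs(H) ** 2 + K) * G
    return np.real(np.fft.ifft2(F_hat))

def wavelet_denoise(img, wavelet="db4", level=2, sigma=0.02):
    """Soft-threshold the detail subbands with a simple universal threshold."""
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    thr = sigma * np.sqrt(2 * np.log(img.size))
    denoised = [coeffs[0]] + [
        tuple(pywt.threshold(d, thr, mode="soft") for d in detail)
        for detail in coeffs[1:]
    ]
    return pywt.waverec2(denoised, wavelet)

# restored = wavelet_denoise(wiener_deblur(degraded_image, blur_kernel))  # hypothetical inputs
```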
