Compressing an image and reconstructing it without degrading its original quality remains a challenge nowadays. A coding system that considers both quality and compression rate is implemented in this work. The implemented system applies a high synthetic entropy coding schema to store the compressed image at the smallest possible size without affecting its original quality. This coding schema is applied with two transform-based techniques, one using the Discrete Cosine Transform and the other using the Discrete Wavelet Transform. The implemented system was tested on different standard color images, and the results obtained with different evaluation metrics are shown. A comparison was made with some previous related works to test the effectiveness of the implemented coding schema.
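As a rough illustration of the transform-based side of such a system, the sketch below applies a 2D DCT to a single 8×8 block, coarsely quantizes the coefficients, and reconstructs the block. The quantization step and block size are generic JPEG-style assumptions, not the specific coding schema of the paper.

```python
import numpy as np

# Minimal sketch: forward DCT of an 8x8 block, uniform quantization,
# then inverse DCT for reconstruction. Illustrative only.
N = 8
k = np.arange(N)
# Orthonormal DCT-II basis matrix (rows = frequencies)
C = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * N))
C[0, :] = np.sqrt(1.0 / N)

def compress_block(block, q=16.0):
    """Quantized DCT coefficients of an 8x8 block (lossy step)."""
    coeffs = C @ block @ C.T
    return np.round(coeffs / q)

def decompress_block(qcoeffs, q=16.0):
    """Reconstruct the spatial block from quantized coefficients."""
    return C.T @ (qcoeffs * q) @ C

rng = np.random.default_rng(0)
block = rng.uniform(0, 255, (N, N))
rec = decompress_block(compress_block(block))
err = np.abs(rec - block).max()   # bounded by the quantization step
```

In a full coder, the quantized coefficients would then be fed to the entropy-coding stage, which is where the compression rate is actually realized.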
Power-electronic converters are essential elements for the effective interconnection of renewable energy sources to the power grid, as well as for the integration of energy storage units, vehicle charging stations, microgrids, etc. Converter models that accurately represent their wideband operation and interconnection with other active and passive grid components and systems are necessary for reliable steady-state and transient analyses during normal or abnormal grid operating conditions. This paper introduces two Laplace domain-based approaches to model buck and boost DC-DC converters for electromagnetic transient studies. The first approach is an analytical one, where the converter is represented by a two-port admittance model via mo…
This paper introduces the Multistep Modified Reduced Differential Transform Method (MMRDTM), applied to approximate the solution of Nonlinear Schrodinger Equations (NLSEs) with power-law nonlinearity. The proposed method has several advantages: an analytical approximation can be generated as a fast-converging series, and the number of computed terms is significantly reduced. In contrast to the RDTM, the nonlinear term in this method is replaced by the related Adomian polynomials prior to the implementation of the multistep approach. As a consequence, only a smaller number of computed NLSE terms are required in the attained approximation. Moreover, the approximation also converges rapidly over a…
Speaker identification is one of the fundamental problems in speech processing and voice modeling. Its applications include authentication in critical security systems, where the accuracy of the selection matters. Large-scale voice recognition applications are a major challenge: quick search in the speaker database requires fast, modern techniques and relies on artificial intelligence to achieve the desired results from the system. Many efforts have been made to achieve this through the establishment of variable-based systems and the development of new methodologies for speaker identification. Speaker identification is the process of recognizing who is speaking using characteristics extracted from the speech waves, like pi…
Modern medical imaging research faces the challenge of detecting brain tumors through Magnetic Resonance Imaging (MRI). MRI images are normally used by experts to produce images of the soft tissue of the human body and to analyze human organs without resorting to surgery. For brain tumor detection, image segmentation is required: the brain is partitioned into two distinct regions. This is considered one of the most important but difficult parts of the process of detecting brain tumors. Hence, it is highly necessary that segmentation of the MRI images be done accurately before asking the computer to make the exact diagnosis. Earlier, a variety of algorithms were developed for segmentation of MRI images by usin…
Porosity plays an essential role in petroleum engineering. It controls fluid storage in aquifers, while the connectivity of the pore structure controls fluid flow through reservoir formations. To quantify the relationships between porosity, storage, transport, and rock properties, however, the pore structure must be measured and quantitatively described. Porosity estimation from digital images using image processing is essential for reservoir rock analysis, since it briefly describes the sample's 2D porosity. The regular procedure uses a binarization process, in which a pixel-value threshold converts color and grayscale images to binary images. The idea is to accommodate the blue regions entirely with pores and transform them to white in r…
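The binarization step described above can be sketched as follows: pixels darker than a threshold are classified as pore space, and the 2D porosity is simply the pore-pixel fraction. The threshold value and the synthetic image here are illustrative assumptions, not the paper's procedure.

```python
import numpy as np

def porosity_2d(gray, threshold=128):
    """Fraction of pixels classified as pores in a grayscale image.

    Pixels below `threshold` are treated as pore space (the dark phase);
    the mean of the boolean mask is the 2D porosity.
    """
    binary = gray < threshold          # True = pore, False = grain
    return binary.mean()

# Synthetic 4x4 "image": a 2x2 dark pore region in a bright grain matrix
img = np.full((4, 4), 200, dtype=np.uint8)
img[1:3, 1:3] = 50
phi = porosity_2d(img)                 # 4 pore pixels out of 16
```

In practice the threshold would be chosen from the image histogram (e.g. by Otsu's method) rather than fixed by hand.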
Deepfake is a type of artificial intelligence used to create convincing image, audio, and video hoaxes; it concerns celebrities and everyone else because such fakes are easy to manufacture. Deepfakes are hard to recognize by people and by current approaches, especially high-quality ones. As a defense against Deepfake techniques, various methods to detect Deepfakes in images have been suggested. Most of them had limitations, such as working only with a single face in an image, or requiring the face to be facing forward with both eyes and the mouth open, depending on the part of the face they worked on. Beyond that, few focus on the impact of pre-processing steps on the detection accuracy of the models. This paper introduces a framework design focused on this asp…
Optical Mark Recognition (OMR) is an important technology for applications that require speedy, high-accuracy processing of a huge volume of hand-filled forms. The aim of this technology is to reduce manual work and human effort, achieve high accuracy in assessment, and minimize the time needed to evaluate answer sheets. This paper proposes OMR using a Modified Bidirectional Associative Memory (MBAM). MBAM has two phases (learning and analysis): it learns on the answer sheets that contain the correct answers by assigning each its own code representing the number of correct answers, and then detects marks on answer sheets using the analysis phase. This proposal is able to detect no selection or the selection of more than one choice; in addition, using M…
... Show MorePrediction of accurate values of residual entropy (SR) is necessary step for the
calculation of the entropy. In this paper, different equations of state were tested for the
available 2791 experimental data points of 20 pure superheated vapor compounds (14
pure nonpolar compounds + 6 pure polar compounds). The Average Absolute
Deviation (AAD) for SR of 2791 experimental data points of the all 20 pure
compounds (nonpolar and polar) when using equations of Lee-Kesler, Peng-
Robinson, Virial truncated to second and to third terms, and Soave-Redlich-Kwong
were 4.0591, 4.5849, 4.9686, 5.0350, and 4.3084 J/mol.K respectively. It was found
from these results that the Lee-Kesler equation was the best (more accurate) one
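The evaluation metric used above can be sketched directly: the AAD is the mean of the absolute differences between predicted and experimental residual entropy values, AAD = (1/n) Σ |S_pred − S_exp|, in J/(mol·K). The numbers below are made-up placeholders, not the paper's data.

```python
def aad(predicted, experimental):
    """Average Absolute Deviation between two equal-length sequences."""
    assert len(predicted) == len(experimental)
    return sum(abs(p - e) for p, e in zip(predicted, experimental)) / len(predicted)

# Hypothetical values for illustration only (not the 2791-point dataset):
s_exp  = [10.0, 12.5, 9.8]    # "experimental" residual entropy, J/(mol*K)
s_pred = [11.0, 12.0, 10.3]   # "equation-of-state" predictions, J/(mol*K)
deviation = aad(s_pred, s_exp)
```

Comparing this single scalar across equations of state is what ranks Lee-Kesler as the most accurate in the study.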
In this study, we present a new steganography method based on quantizing the bands of perceptual color spaces. Four perceptual color spaces are used to test the new method, namely HSL, HSV, Lab, and Luv, where different algorithms are used to calculate the last two color spaces. The results reveal the validity of this method as a steganographic method, and an analysis of the effects of the quantization and embedding processes on the quality of the cover image and the quality of the perceptual color space bands is presented.
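To make the general idea concrete, the sketch below embeds one bit per pixel by quantizing a perceptual band and forcing the parity of the quantized level to encode the bit. The choice of the HSV value band and the step size are illustrative assumptions; the paper's specific quantization scheme and band selection may differ.

```python
import colorsys

# Assumed parameters, not the paper's: quantize the V band of HSV
# (V in [0, 1]) with a fixed step, and let the level's parity carry the bit.
STEP = 1.0 / 32

def embed_bit(rgb, bit):
    """Return an RGB pixel whose quantized HSV value-level parity is `bit`."""
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    level = int(round(v / STEP))
    if level % 2 != bit:               # nudge to an adjacent level of the right parity
        level += 1 if level == 0 else -1
    v = min(level * STEP, 1.0)
    return colorsys.hsv_to_rgb(h, s, v)

def extract_bit(rgb):
    """Recover the embedded bit from the pixel's quantized V level."""
    v = colorsys.rgb_to_hsv(*rgb)[2]
    return int(round(v / STEP)) % 2

pixel = (0.4, 0.6, 0.8)                # RGB components in [0, 1]
stego = embed_bit(pixel, 1)
```

Because the nudge moves V by at most one quantization step, the distortion to the cover image stays small, which is exactly the quality trade-off the analysis in the paper examines.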
In information security, fingerprint verification is one of the most common recent approaches for verifying human identity through a distinctive pattern. The verification process works by comparing a pair of fingerprint templates and identifying the similarity/matching between them. Several research studies have utilized different techniques for the matching process, such as fuzzy vault and image filtering approaches. Yet, these approaches still suffer from the imprecise articulation of the biometrics' interesting patterns. Deep learning architectures such as the Convolutional Neural Network (CNN) have been extensively used for image processing and object detection tasks and have shown an outstanding performance compare…