In this study, an efficient compression system is introduced based on the wavelet transform and two types of three-dimensional (3D) surface representation: Cubic Bezier Interpolation (CBI) and first-order polynomial approximation. Each is applied at a different scale of the image: CBI is applied over wide areas of the image to prune the components that show large-scale variation, while the first-order polynomial is applied on small areas of the residue component (i.e., after subtracting the cubic Bezier surface from the image) to prune the locally smooth components and obtain better compression gain. First, the produced cubic Bezier surface is subtracted from the image signal to obtain the residue component, and the bi-orthogonal wavelet transform is then applied to this Bezier residue. The resulting transform coefficients are quantized using progressive scalar quantization; the first-order polynomial is applied to the quantized LL subband to produce the polynomial surface, which is subtracted from the LL subband to obtain the residue (high-frequency) component. Finally, the quantized values are represented using quad-tree encoding to prune the sparse blocks, followed by a high-order shift-coding algorithm to handle the remaining statistical redundancy and attain efficient compression performance. The conducted tests indicated that the introduced system leads to promising compression gain.
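As an illustration of the pipeline's core idea (a smooth fitted surface is subtracted before the wavelet stage), here is a minimal Python sketch. It assumes NumPy and PyWavelets; a least-squares plane fit stands in for the paper's cubic Bezier and first-order polynomial stages, and the quantization, quad-tree, and shift-coding steps are omitted.

```python
import numpy as np
import pywt  # PyWavelets

def fit_plane(block):
    """Least-squares first-order polynomial (plane) fit z = a*x + b*y + c."""
    h, w = block.shape
    ys, xs = np.mgrid[0:h, 0:w]
    A = np.column_stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    coeffs, *_ = np.linalg.lstsq(A, block.ravel(), rcond=None)
    return (A @ coeffs).reshape(h, w)

# Hypothetical 2D image block
image = np.random.rand(64, 64) * 255

# Stage 1 (stand-in): subtract a smooth fitted surface to get the residue;
# the paper fits a cubic Bezier surface at this step.
residue = image - fit_plane(image)

# Stage 2: bi-orthogonal wavelet transform of the Bezier residue
LL, (LH, HL, HH) = pywt.dwt2(residue, 'bior4.4')

# Stage 3 (stand-in): fit a first-order polynomial surface to the LL
# subband and keep the high-frequency residue.
LL_residue = LL - fit_plane(LL)
```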
This study aimed to explore the manufacture of high-fat pellets for obesity-induction diets in male Wistar rats and to determine their effect on lipid profiles and body mass index. It was an experimental laboratory study with a post-test randomized control group design. Formulation of the high-fat pellets (HFD) and physico-chemical characterization of the pellets were conducted in September 2019. The study used 28 male Wistar white rats, two months old and 150-200 g in body weight. Rats were acclimatized for seven days, then divided into four groups: 7 rats were given the standard feed Confeed PARS CP594 (P0), and three groups (P1, P2, P3) were given high-fat feed (HFD FII) at 30 g/head/day. The results showed that the mean fat content of Formula II pellets …
This study aims to enhance the RC5 algorithm to improve encryption and decryption speeds in devices with limited power and memory resources. These resource-constrained applications, which range from wearables and smart cards to microscopic sensors, frequently operate in settings where traditional cryptographic techniques are impracticable because of their high computational overhead and memory requirements. The Enhanced RC5 (ERC5) algorithm integrates the PKCS#7 padding method to adapt effectively to various data sizes. Empirical investigation reveals significant improvements in encryption speed with ERC5, ranging from 50.90% to 64.18% for audio files and 46.97% to 56.84% for image files, depending on file size. A substantial …
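PKCS#7 is a standard padding scheme, so its mechanics can be shown independently of the RC5 internals. The sketch below is plain Python; the 8-byte default block size matches classic RC5-32 and is an assumption about the ERC5 configuration.

```python
def pkcs7_pad(data: bytes, block_size: int = 8) -> bytes:
    """Append n bytes, each of value n, so the length becomes a
    multiple of block_size (n is always in 1..block_size)."""
    n = block_size - (len(data) % block_size)
    return data + bytes([n]) * n

def pkcs7_unpad(padded: bytes) -> bytes:
    """Validate and strip PKCS#7 padding after decryption."""
    n = padded[-1]
    if n == 0 or n > len(padded) or padded[-n:] != bytes([n]) * n:
        raise ValueError("invalid PKCS#7 padding")
    return padded[:-n]

# Round trip: 5 data bytes are padded with three 0x03 bytes.
assert pkcs7_unpad(pkcs7_pad(b"hello")) == b"hello"
```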
Grabisch and Labreuche have recently proposed a generalization of capacities, called bi-capacities. More recently, the author proposed a new approach for studying bi-capacities through a notion of ternary-element sets. In this paper, we present several results based on this approach, such as the bipolar Möbius transform, the importance index, and the interaction index of bi-capacities.
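For context, the ordinary Möbius transform of a capacity \(v\) on a finite set \(N\), which the bipolar Möbius transform for bi-capacities generalizes, is the standard pair of formulas:

```latex
% Classical Möbius transform of a capacity v on a finite set N;
% the bipolar Möbius transform on bi-capacities generalizes this.
m(A) = \sum_{B \subseteq A} (-1)^{|A \setminus B|}\, v(B),
\qquad
v(A) = \sum_{B \subseteq A} m(B), \qquad A \subseteq N.
```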
Audio classification is the process of classifying different audio types according to their content. It is applied in a large variety of real-world problems; classification applications allow the target subjects to be viewed as a specific type of audio, and since audio types vary, every type has to be treated carefully according to its significant properties. Feature extraction is an important step in audio classification. This work introduces several feature sets according to audio type; two types of audio (datasets) were studied. Two different feature sets are proposed: (i) a first-order gradient feature vector, and (ii) a local roughness feature vector. The experiments showed that the results are competitive to …
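As a sketch of how a first-order gradient descriptor might be computed: this is a hypothetical reading of the paper's feature using NumPy, since the exact statistics the authors aggregate are not stated in this abstract.

```python
import numpy as np

def gradient_features(signal: np.ndarray) -> np.ndarray:
    """First-order gradient feature vector: summary statistics of the
    sample-to-sample differences of the waveform (assumed statistics)."""
    grad = np.diff(signal.astype(np.float64))
    return np.array([grad.mean(), grad.std(),
                     np.abs(grad).mean(), np.abs(grad).max()])

# Hypothetical mono audio buffer (1 second at 16 kHz)
audio = np.random.randn(16000)
features = gradient_features(audio)
```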
In this research, we present the signature as a key for a biometric authentication technique. Moment invariants are used as a tool to decide whether a given signature belongs to a certain person. Eighteen volunteers gave 108 signatures as samples to test the proposed system, with six samples taken from each person. Moment invariants are used to build the feature vector stored in the system. The Euclidean distance measure is used to compute the distance between the signatures of persons saved in the system and new samples acquired from the same persons, in order to make a decision about each new signature. Each signature is acquired by scanner in JPG format at 300 DPI. Matlab was used to implement the system.
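The paper's system was implemented in Matlab; the following Python/OpenCV sketch shows the analogous steps under stated assumptions: Hu's seven moment invariants as the feature vector (a common choice, assumed here since the abstract does not name the invariants) and a Euclidean-distance decision with a hypothetical threshold.

```python
import cv2
import numpy as np

def signature_features(path: str) -> np.ndarray:
    """Seven Hu moment invariants of a scanned signature image,
    log-scaled to compress their dynamic range (common practice)."""
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    hu = cv2.HuMoments(cv2.moments(img)).flatten()
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

def is_genuine(enrolled: np.ndarray, candidate: np.ndarray,
               threshold: float = 0.5) -> bool:
    """Accept if the Euclidean distance to the enrolled template is
    small; the threshold value is hypothetical, not from the paper."""
    return float(np.linalg.norm(enrolled - candidate)) < threshold
```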
Watermarking can be defined as the process of embedding special, wanted, and reversible information in important secure files to protect the ownership or information of the cover file; here it is based on the proposed singular value decomposition (SVD) watermark. The proposed digital watermarking method has a very large domain for constructing the final number, which protects the watermark from conflicts. The cover file is the important image that needs to be protected. The hidden watermark is a unique number extracted from the cover file by performing the proposed related and successive operations, starting by dividing the original image into four parts of unequal size. Each of these four parts is treated as a separate matrix, and SVD is applied …
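A minimal sketch of the extraction idea, assuming NumPy: the 60/40 split positions and the way the leading singular values are combined into a single number are illustrative assumptions, not the paper's exact operations.

```python
import numpy as np

def block_svd_signature(image: np.ndarray) -> float:
    """Split the image into four unequal parts, apply SVD to each, and
    combine the leading singular values into one watermark number."""
    h, w = image.shape
    hs, ws = int(h * 0.6), int(w * 0.6)   # unequal quadrants (assumed split)
    parts = [image[:hs, :ws], image[:hs, ws:],
             image[hs:, :ws], image[hs:, ws:]]
    leading = [np.linalg.svd(p.astype(np.float64), compute_uv=False)[0]
               for p in parts]
    return float(np.sum(leading))          # hypothetical combining rule
```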
Oscillation criteria are obtained for all solutions of first-order linear delay differential equations with positive and negative coefficients, where we establish sufficient conditions under which every solution of (1.1) oscillates. This paper generalizes the results in [11]. Some examples are considered to illustrate our main results.
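Equation (1.1) is not reproduced in this abstract; a commonly studied form for this class of problems (an assumption about the setting, not necessarily the paper's exact equation) is

```latex
% A typical first-order linear delay differential equation with a positive
% and a negative coefficient; the paper's (1.1) may differ in detail.
x'(t) + P(t)\,x(t - \tau) - Q(t)\,x(t - \sigma) = 0, \qquad t \ge t_0,
```

with \(P, Q \in C([t_0, \infty), [0, \infty))\) and constant delays \(\tau, \sigma > 0\); a solution is called oscillatory if it has arbitrarily large zeros.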