Image compression is an important problem in computer storage and transmission. It exploits the redundancy embedded within an image itself and may additionally exploit the limits of human visual perception to discard imperceptible information. Polynomial coding is a modern image compression technique based on a modelling concept that effectively removes the spatial redundancy within the image; it is composed of two parts, the mathematical model and the residual. In this paper, a two-stage technique is proposed: the first stage utilizes a lossy predictor model along with a multiresolution base and thresholding techniques, and the second stage incorporates a near-lossless compression scheme on top of the first stage. The test results of both stages are promising, implicitly enhancing the performance of the traditional polynomial model in terms of compression ratio while preserving image quality.
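The modelling part of polynomial coding can be illustrated with a minimal sketch (the first-order model and the function name here are illustrative assumptions, not the paper's exact scheme): fit a first-order 2-D polynomial to each image block by least squares, then keep the three coefficients plus the residual.

```python
import numpy as np

def polynomial_model(block):
    """Fit a first-order 2-D polynomial a0 + a1*x + a2*y to an image block
    by least squares and return (coefficients, residual)."""
    h, w = block.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Design matrix: constant, x, and y terms for every pixel.
    A = np.column_stack([np.ones(h * w), xs.ravel(), ys.ravel()])
    coeffs, *_ = np.linalg.lstsq(A, block.ravel().astype(float), rcond=None)
    model = (A @ coeffs).reshape(h, w)
    residual = block - model
    return coeffs, residual

# Toy block: a smooth ramp block[y, x] = y + 2x is modelled exactly,
# so the residual is ~0 and only the 3 coefficients need be stored.
block = np.add.outer(np.arange(4), 2 * np.arange(4)).astype(float)
coeffs, residual = polynomial_model(block)
```

In a full coder the residual would then be thresholded and entropy coded; smooth regions cost almost nothing beyond the coefficients.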
In this paper, we prove coincidence point theorems for two pairs of mappings defined on a nonempty subset of a metric space satisfying condition (1.1). As an application, we establish unique common fixed point theorems for these mappings using the concepts of weak compatibility and R-weak commutativity between them.
In this research, the removal of cadmium (Cd) from simulated wastewater using a fixed-bed bio-electrochemical reactor was investigated. The effects of the main controlling factors on the performance of the removal process, such as the applied cell voltage, initial Cd concentration, pH of the catholyte, and mesh number of the cathode, were examined. The results showed that the applied cell voltage had the main impact on the removal efficiency of cadmium: increasing the applied voltage led to higher removal efficiency. Meanwhile, increasing the applied voltage was found to give lower current efficiency and higher energy consumption. No significant effect of the initial Cd concentration on the removal efficiency of cadmium
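The trade-off described above between removal efficiency and energy use can be quantified with the standard electrochemical performance metrics (a generic sketch using textbook formulas; the numbers below are illustrative, not the paper's data):

```python
# Standard electrochemical performance metrics (illustrative, not the paper's data).
F = 96485.0  # Faraday constant, C/mol
z = 2        # electrons transferred per Cd2+ ion

def current_efficiency(mass_removed_g, current_A, time_s, M=112.41):
    """Fraction of the charge passed that actually reduced Cd2+ (Faraday's law)."""
    moles = mass_removed_g / M
    return z * F * moles / (current_A * time_s)

def energy_consumption_kwh_per_kg(voltage_V, current_A, time_s, mass_removed_g):
    """Specific energy use of the cell per kg of cadmium removed."""
    energy_kwh = voltage_V * current_A * time_s / 3.6e6  # J -> kWh
    return energy_kwh / (mass_removed_g / 1000.0)
```

At fixed charge, raising the cell voltage raises the kWh/kg figure even when the mass removed is unchanged, which is the behaviour the abstract reports.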
In this paper, oscillation criteria are investigated for all solutions of third-order half-linear neutral differential equations. Some necessary and sufficient conditions are established for every solution of (a(t)[(x(t) ± p(t)x(τ(t)))'']^α)' + q(t) x^α(σ(t)) = 0, t ≥ t_0, to be oscillatory. Examples are given to illustrate the main results.
A watermarking operation can be defined as the process of embedding special, reversible information in important secure files to protect the ownership or content of the cover file, here based on a proposed singular value decomposition (SVD) watermark. The proposed digital watermarking method has a very large domain for constructing the final number, which protects the watermark from collisions. The cover file is the important image that needs to be protected. The hidden watermark is a unique number extracted from the cover file by performing a sequence of related operations: the original image is first divided into four parts of unequal size, each of these four parts is treated as a separate matrix, and SVD is applied
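The extraction step can be sketched as follows (a minimal illustration: the split point, the use of only the leading singular value, and the summation rule are assumptions, not the paper's exact scheme):

```python
import numpy as np

def extract_watermark(cover, ndigits=4):
    """Derive a watermark number from a cover image: split it into four
    parts of unequal size, take the SVD of each part, and fuse the four
    leading singular values into one number.  Illustrative sketch only;
    the split point and fusion rule are assumptions."""
    h, w = cover.shape
    sh, sw = h // 3, w // 3          # deliberately unequal split point
    parts = [cover[:sh, :sw], cover[:sh, sw:],
             cover[sh:, :sw], cover[sh:, sw:]]
    sigmas = [np.linalg.svd(p.astype(float), compute_uv=False)[0]
              for p in parts]
    # Singular values are stable under small perturbations, so the fused
    # number is robust yet unique to the cover content.
    return round(sum(sigmas), ndigits)
```

Because the number is computed deterministically from the cover content, re-running the extraction on the unmodified image reproduces the same watermark.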
Audio classification is the process of classifying different audio types according to their contents. It is used in a large variety of real-world problems; all classification applications allow the target subjects to be viewed as a specific type of audio, and hence there is a variety of audio types, each of which has to be treated carefully according to its significant properties. Feature extraction is an important process for audio classification. This work introduces several sets of features according to type; two types of audio (datasets) were studied. Two different feature sets are proposed: (i) a first-order gradient feature vector, and (ii) a local roughness feature vector. The experiments showed that the results are competitive to
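The first of the two proposed feature sets can be sketched as follows (a minimal interpretation of a first-order gradient feature vector; the histogram summary and its parameters are assumptions, not the paper's exact definition):

```python
import numpy as np

def gradient_features(signal, n_bins=8):
    """First-order gradient feature vector (sketch): take successive
    differences of the samples and summarise their distribution with a
    fixed-length, normalised histogram."""
    grad = np.diff(signal.astype(float))          # first-order gradient
    hist, _ = np.histogram(grad, bins=n_bins, range=(-1.0, 1.0))
    return hist / max(hist.sum(), 1)              # probability vector

# A slowly rising ramp has a constant, small positive gradient, so all
# of the histogram mass lands in a single bin.
features = gradient_features(np.linspace(0.0, 1.0, 101))
```

A fixed-length vector like this can be fed directly to any standard classifier, regardless of the original clip length.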
Big data of different types, such as texts and images, is rapidly generated from the internet and other applications. Dealing with this data using traditional methods is not practical, since it comes in various sizes and types and with different processing-speed requirements. Data analytics has therefore become an important tool, because only meaningful information is analyzed and extracted, which makes it essential for big data applications. This paper presents several innovative methods that use data analytics techniques to improve the analysis process and data management. Furthermore, it discusses how the revolution of data analytics based on artificial intelligence algorithms might provide