Compressing an image and reconstructing it without degrading its original quality is a challenge that still exists today. A coding system that considers both quality and compression rate is implemented in this work. The implemented system applies a high synthetic entropy coding scheme to store the compressed image at the smallest possible size without affecting its original quality. This coding scheme is applied with two transform-based techniques, one using the Discrete Cosine Transform and the other the Discrete Wavelet Transform. The implemented system was tested on different standard color images, and the results obtained with several evaluation metrics are reported. A comparison was made with previous related works to test the effectiveness of the implemented coding scheme.
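To make the pipeline concrete, here is a minimal sketch of a transform-plus-entropy-coding stage of the kind the abstract describes, assuming 8x8 blocks, a uniform quantization step, and zlib standing in for the paper's entropy coding scheme (none of these specifics come from the paper):

```python
import zlib
import numpy as np
from scipy.fftpack import dctn

def compress_dct(image, block=8, q=20):
    """Toy transform coder: block DCT, uniform quantization, then a
    generic entropy coder (zlib is only a stand-in for the paper's
    scheme). Assumes a grayscale image whose dimensions are
    multiples of the block size."""
    coeffs = np.zeros(image.shape, dtype=np.float64)
    for i in range(0, image.shape[0], block):
        for j in range(0, image.shape[1], block):
            coeffs[i:i+block, j:j+block] = dctn(
                image[i:i+block, j:j+block].astype(np.float64), norm='ortho')
    quantized = np.round(coeffs / q).astype(np.int16)  # lossy step
    return zlib.compress(quantized.tobytes())          # entropy-coding stage
```

Raising q shrinks the output at the cost of reconstruction quality, which is exactly the quality versus compression-rate trade-off the system balances.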
The huge evolution of information technologies, especially in the last few decades, has produced an increase in the volume of data on the World Wide Web, which is still growing significantly. Retrieving the relevant information from the Internet or any data source with a query of only a few words has become a big challenge. To overcome this, query expansion (QE) plays an important role in improving information retrieval (IR): the user's original query is reformulated into a new query by appending related terms of comparable importance. One of the problems of query expansion is choosing suitable terms. This problem leads to a further challenge: how to retrieve the important documents with high precision and high recall.
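As an illustration of the term-appending step, the sketch below expands a query with WordNet synonyms via NLTK; this is just one common source of related terms, not necessarily the one used in the paper:

```python
from nltk.corpus import wordnet  # requires nltk.download('wordnet') once

def expand_query(terms, max_new_per_term=3):
    """Append a few WordNet synonyms to the original query terms."""
    expanded = list(terms)
    for term in terms:
        synonyms = {lemma.name().replace('_', ' ')
                    for synset in wordnet.synsets(term)
                    for lemma in synset.lemmas()} - set(terms)
        expanded.extend(sorted(synonyms)[:max_new_per_term])
    return expanded

print(expand_query(["car", "engine"]))
# e.g. ['car', 'engine', 'auto', 'automobile', ...]
```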
The Internet of Things (IoT) is a network of devices used for interconnection and data transfer. There has been a dramatic increase in IoT attacks due to the lack of security mechanisms. These mechanisms can be enhanced through the analysis and classification of such attacks. The multi-class classification of IoT botnet attacks (IBA) applied here uses a high-dimensional data set, which is a challenge in the classification process because it demands a high number of computational resources. Dimensionality reduction (DR) discards irrelevant information while retaining the essential information from this high-dimensional data set. The DR technique proposed here is a classifier-based feature selection method.
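A minimal sketch of classifier-based feature selection in scikit-learn, using a synthetic data set in place of the (unspecified) IoT botnet data and a random forest as the ranking classifier; both choices are assumptions for illustration only:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectFromModel

# Synthetic stand-in for a high-dimensional, multi-class IBA data set.
X, y = make_classification(n_samples=2000, n_features=100,
                           n_informative=15, n_classes=4, random_state=0)

# Rank features by the classifier's importances; keep only the top 20.
selector = SelectFromModel(
    RandomForestClassifier(n_estimators=100, random_state=0),
    threshold=-np.inf, max_features=20)
X_reduced = selector.fit_transform(X, y)
print(X.shape, "->", X_reduced.shape)  # (2000, 100) -> (2000, 20)
```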
This paper compares the performance of non-Bayesian estimators, represented by the maximum likelihood estimator of the scale parameter and reliability function of the inverse Rayleigh distribution, with Bayesian estimators obtained under two types of loss function, specifically the linear exponential (LINEX) loss function and the entropy loss function, taking into consideration informative and non-informative priors. The performance of these estimators is assessed on the basis of the mean square error (MSE) criterion. Monte Carlo simulation experiments are conducted in order to obtain the required results.
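For reference, under the common one-parameter form of the inverse Rayleigh density (an assumed parameterization, as the abstract does not state one), the maximum likelihood estimator of the scale parameter has a closed form, and the reliability estimator follows by invariance:

```latex
f(x;\theta)=\frac{2\theta}{x^{3}}\,e^{-\theta/x^{2}},\; x>0
\quad\Longrightarrow\quad
\hat{\theta}_{\mathrm{MLE}}=\frac{n}{\sum_{i=1}^{n}x_i^{-2}},
\qquad
\hat{R}_{\mathrm{MLE}}(t)=1-e^{-\hat{\theta}_{\mathrm{MLE}}/t^{2}}.
```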
During the last few decades, many academic and professional groups have given attention to multi-criteria decision-making methods in a variety of decision-making contexts, given the diversity and sophistication of the choices involved. Five different classification methods are tested and assessed in this paper, each with its own set of five attribute selection approaches. Multi-criteria decision-making procedures then use these data to rate the options. The technique for order of preference by similarity to ideal solution (TOPSIS) is designed using a modified fuzzy analytic hierarchy process (MFAHP) that computes the alternative weights for TOPSIS, in order to obtain the confidence value of each class.
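The ranking step can be summarized in a few lines; the sketch below implements standard TOPSIS in NumPy, with placeholder scores and weights (in the paper the weights would come from the MFAHP, which is not reproduced here):

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS.
    matrix  : alternatives x criteria performance scores
    weights : criterion weights (here: produced by MFAHP)
    benefit : True where larger is better, False where smaller is better
    """
    v = (matrix / np.linalg.norm(matrix, axis=0)) * weights  # normalize, weight
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))  # best point
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))   # worst point
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)  # closeness coefficient, higher is better

# Hypothetical classifiers scored on accuracy (benefit) and runtime (cost).
scores = topsis(np.array([[0.90, 120.], [0.85, 60.], [0.80, 30.]]),
                weights=np.array([0.7, 0.3]),
                benefit=np.array([True, False]))
print(scores)
```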
Cloud computing offers a new way of service provision by rearranging various resources over the Internet. The most important and popular cloud service is data storage. In order to preserve the privacy of data holders, data are often stored in the cloud in encrypted form. However, encrypted data introduce new challenges for cloud data deduplication, which is crucial for big data storage and processing in the cloud. Traditional deduplication schemes cannot work on encrypted data. Among such data, digital videos are fairly huge in terms of storage cost and size, and techniques that serve the legal interests of video owners, such as copyright protection, while reducing cloud storage cost and size are always desired. This paper focuses on video deduplication in cloud storage.
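One standard way to reconcile encryption with deduplication, which the abstract's problem statement points toward, is convergent encryption: key and IV are derived from the content itself, so identical plaintexts yield identical ciphertexts. This is a generic sketch, not the paper's actual scheme:

```python
import hashlib
from cryptography.hazmat.primitives import padding
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def convergent_encrypt(data: bytes):
    """Deterministic encryption keyed by the content hash: equal video
    chunks encrypt to equal ciphertexts, so the cloud can deduplicate
    them without ever learning the plaintext."""
    key = hashlib.sha256(data).digest()       # 32-byte content-derived key
    iv = hashlib.sha256(key).digest()[:16]    # content-derived IV
    padder = padding.PKCS7(128).padder()
    padded = padder.update(data) + padder.finalize()
    enc = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
    ciphertext = enc.update(padded) + enc.finalize()
    tag = hashlib.sha256(ciphertext).hexdigest()  # index for dedup lookup
    return key, ciphertext, tag
```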
Improving performance is an important issue in Wireless Sensor Networks (WSNs). WSNs have many limitations, including constrained network performance. The research question is: how can the amount of transmitted data be reduced to improve network performance?
The work uses one of the dictionary compression methods, Lempel-Ziv-Welch (LZW). One problem with the dictionary method is that the token size is fixed. The LZW dictionary method is therefore not very useful with little data, because it loses many bytes.
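A minimal LZW compressor illustrating the mechanism and the drawback: the output is a list of dictionary tokens, and on a tiny input a fixed token width (e.g. 12 bits) can spend more space than the raw bytes would:

```python
def lzw_compress(data: bytes) -> list:
    """Classic LZW: grow a dictionary of byte strings, emit integer tokens."""
    dictionary = {bytes([i]): i for i in range(256)}
    w, tokens = b"", []
    for byte in data:
        wc = w + bytes([byte])
        if wc in dictionary:
            w = wc
        else:
            tokens.append(dictionary[w])
            dictionary[wc] = len(dictionary)  # register the new phrase
            w = bytes([byte])
    if w:
        tokens.append(dictionary[w])
    return tokens

print(lzw_compress(b"TOBEORNOTTOBEORTOBEORNOT"))  # 24 bytes -> 16 tokens
```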
Poly(methyl methacrylate) (PMMA) polymer has been used continually in dental applications in recent years. Yet it is commonly known for its poor strength properties under prolonged pressure. The aim of this research was to improve the performance of PMMA denture base through the addition of different nanoparticles selected from artificial and natural sources. For comparison, nanoparticles of Al2O3 and crushed pistachio shell were utilised, at weight fractions of 1%, 2%, and 3% for both reinforcement types. In this work, compression strength (C.S.) and Young's modulus (Y) were evaluated before and after exposure to special liquids. The newly prepared composites were immersed in these liquids.
In the last decade, 3D models have gained interest in many applications, such as games, the medical field, and manufacturing. It is necessary to protect these models from unauthorized copying, distribution, and editing, and digital watermarking is the best way to solve this problem. This paper introduces a robust watermarking method that embeds the watermark in the low-frequency domain, selecting the coarsest level for embedding based on a strength factor. The invisibility of the watermark under the proposed algorithm is tested using different measurements, such as HD and PSNR. Robustness was tested using different types of attacks; the correlation coefficient was applied for the evaluation.
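As a simplified illustration of low-frequency embedding (treating the model's vertex coordinates as a 1D signal, which is a simplification of how mesh geometry is actually handled in the paper), the sketch below adds watermark bits to the coarsest wavelet approximation band, scaled by a strength factor:

```python
import numpy as np
import pywt  # PyWavelets

def embed_watermark(vertices, bits, alpha=0.01, level=3):
    """Embed +/- alpha per watermark bit into the coarsest (low-frequency)
    wavelet band of the flattened vertex coordinates. Assumes the coarsest
    band has at least len(bits) coefficients."""
    signal = np.asarray(vertices, dtype=np.float64).ravel()
    coeffs = pywt.wavedec(signal, 'haar', level=level)
    coeffs[0][:len(bits)] += alpha * (2 * np.asarray(bits) - 1.0)
    marked = pywt.waverec(coeffs, 'haar')[:signal.size]
    return marked.reshape(np.shape(vertices))

def psnr(original, marked, peak):
    """Peak signal-to-noise ratio: one of the invisibility metrics used."""
    mse = np.mean((np.asarray(original, float) - marked) ** 2)
    return 10 * np.log10(peak ** 2 / mse)
```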
The paper shows how to estimate the three parameters of the generalized exponential Rayleigh distribution using three estimation methods, namely the moments estimation method (MEM), the ordinary least squares estimation method (OLSEM), and the maximum entropy estimation method (MEEM). A simulation technique is used with all these estimation methods to find the parameters of the generalized exponential Rayleigh distribution. In order to find the best method, we use the mean squared error criterion. Finally, to obtain the experimental results, the object-oriented programming language Visual Basic .NET was used.
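The selection criterion itself is simple: over N simulated samples, the empirical mean squared error of an estimator is

```latex
\mathrm{MSE}(\hat{\theta})
= \mathbb{E}\big[(\hat{\theta}-\theta)^{2}\big]
\approx \frac{1}{N}\sum_{r=1}^{N}\big(\hat{\theta}_{r}-\theta\big)^{2},
```

where each estimate comes from one simulated replication; the method (MEM, OLSEM, or MEEM) with the smallest MSE is judged best.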
This study aimed to extract essential oil from peppermint leaves using hydro-distillation. The effect of extraction temperature on the yield of peppermint oil was studied, as were the kinetics of the extraction process. A second-order mechanism was then adopted in the hydro-distillation model to estimate several parameters, such as the initial extraction rate, the extraction capacity, and the extraction rate constant at various temperatures. The same model was also used to estimate the activation energy. The results indicated a spontaneous process, since the Gibbs free energy was negative.
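The second-order model referred to here is commonly written as follows (the standard form for solid-liquid extraction kinetics; the paper's exact notation may differ):

```latex
\frac{dC_t}{dt}=k\,(C_s-C_t)^{2}
\;\Longrightarrow\;
\frac{t}{C_t}=\frac{1}{kC_s^{2}}+\frac{t}{C_s},
\qquad h=kC_s^{2},
\qquad k=A\,e^{-E_a/RT},
```

where C_t is the oil yield at time t, C_s the extraction capacity, k the rate constant, h the initial extraction rate, and E_a the activation energy obtained from the Arrhenius plot of k against 1/T.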