In this study, an analysis of the effect of re-using the JPEG lossy algorithm on the quality of satellite imagery is presented. The standard JPEG compression algorithm is applied using the IrfanView program, with JPEG quality factors ranging from 50 to 100. Based on the measured variation in satellite image quality, the maximum number of re-uses of the JPEG lossy algorithm adopted in this study is 50. The degradation of image quality is analyzed with respect to both the JPEG quality factor and the number of times the JPEG algorithm is re-used to store the satellite image.
F. G. Mohammed, H. M. Al-Dabbas, Iraqi Journal of Science, 2018.
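As a rough illustration of the experiment, the following Python sketch re-encodes an image through the JPEG codec repeatedly and tracks PSNR against the original. It assumes Pillow in place of IrfanView and a hypothetical input file `satellite.png`; it is a minimal sketch under those assumptions, not the paper's exact measurement pipeline.

```python
# Sketch of the re-compression experiment: an image is saved and reloaded
# through the JPEG codec N times at a fixed quality factor, and PSNR
# against the original is tracked. Pillow stands in for IrfanView here;
# the file name and parameters are illustrative.
import io
import numpy as np
from PIL import Image

def psnr(a: np.ndarray, b: np.ndarray) -> float:
    """Peak signal-to-noise ratio between two 8-bit images."""
    mse = np.mean((a.astype(np.float64) - b.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

def recompress(img: Image.Image, quality: int, cycles: int):
    """Re-encode `img` as JPEG `cycles` times; yield PSNR per cycle."""
    original = np.asarray(img.convert("RGB"))
    current = img.convert("RGB")
    for cycle in range(1, cycles + 1):
        buf = io.BytesIO()
        current.save(buf, format="JPEG", quality=quality)
        buf.seek(0)
        current = Image.open(buf).convert("RGB")
        yield cycle, psnr(original, np.asarray(current))

if __name__ == "__main__":
    image = Image.open("satellite.png")   # hypothetical input file
    for q in (50, 75, 100):
        history = list(recompress(image, quality=q, cycles=50))
        print(f"quality={q}: PSNR after 50 cycles = {history[-1][1]:.2f} dB")
```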
Nonlinear time series analysis is one of the most complex problems, especially for the nonlinear autoregressive model with exogenous variables (NARX). Model identification and the determination of the correct orders are considered the most important problems in time series analysis. In this paper, we propose a spline estimation method for model identification and then use three criteria to determine the correct orders. The proposed method estimates additive splines for model identification, and the order determination relies on the additive property to avoid the curse of dimensionality. The proposed method is a nonparametric method, and the simulation results give a ...
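To make the order-determination step concrete, here is a minimal sketch that replaces the paper's spline estimator with ordinary least squares: candidate lag orders (p, q) of a simulated NARX series are fitted and ranked by AIC, one common selection criterion. The simulated coefficients, the candidate grid, and the criterion choice are illustrative assumptions, not the paper's method.

```python
# Illustrative order-selection loop for a NARX model (not the paper's
# spline estimator): candidate lag orders (p, q) for the output y and
# exogenous input x are fitted by least squares and ranked with AIC.
import numpy as np

def narx_design(y, x, p, q):
    """Build the regression matrix with p lags of y and q lags of x."""
    m = max(p, q)
    cols = [y[m - i:len(y) - i] for i in range(1, p + 1)]
    cols += [x[m - j:len(x) - j] for j in range(1, q + 1)]
    return np.column_stack(cols), y[m:]

def fit(y, x, p, q):
    """Least-squares fit; return residuals and parameter count."""
    X, target = narx_design(y, x, p, q)
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    return target - X @ beta, p + q

def aic(residuals, k):
    n = len(residuals)
    return n * np.log(np.mean(residuals ** 2)) + 2 * k

rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = np.zeros(500)
for t in range(2, 500):                      # true orders: p=2, q=1
    y[t] = 0.5 * y[t-1] - 0.3 * y[t-2] + 0.8 * x[t-1] + 0.1 * rng.normal()

scores = {(p, q): aic(*fit(y, x, p, q))
          for p in range(1, 4) for q in range(1, 4)}
print("selected orders (p, q):", min(scores, key=scores.get))
```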
In recent years, Bitcoin has become the most widely used blockchain platform in business and finance. The goal of this work is to find a viable prediction model that incorporates, and perhaps improves on, a combination of available models. Among the techniques utilized in this paper are exponential smoothing, ARIMA, artificial neural network (ANN) models, and prediction-combination models. The study's most notable finding is that artificial intelligence models improve the results of compound prediction models. The second key finding is that a strong combination forecasting model should be used, one that responds to the multiple fluctuations in the Bitcoin time series and improves the error. Based on the results, the prediction accuracy ...
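A minimal sketch of the combination idea, under assumptions: simple exponential smoothing and a random-walk-with-drift forecast stand in for the paper's ARIMA and ANN components, and the two forecasts are combined with weights inversely proportional to in-sample RMSE. The series is simulated; nothing here reproduces the paper's actual models or data.

```python
# Forecast-combination sketch: two simple models are fitted, their
# one-step forecasts combined with inverse-error weights.
import numpy as np

def ses(series, alpha=0.3):
    """Simple exponential smoothing: smoothed levels and one-step forecast."""
    level = series[0]
    fitted = [level]
    for obs in series[1:]:
        level = alpha * obs + (1 - alpha) * level
        fitted.append(level)
    return np.array(fitted), level          # last level = next forecast

def naive_drift(series):
    """Random-walk-with-drift fit: forecast = last value + mean step."""
    drift = np.mean(np.diff(series))
    fitted = np.concatenate([[series[0]], series[:-1] + drift])
    return fitted, series[-1] + drift

prices = np.cumsum(np.random.default_rng(1).normal(0.5, 2.0, 200)) + 100

forecasts, weights = [], []
for model in (ses, naive_drift):
    fitted, fcst = model(prices)
    rmse = np.sqrt(np.mean((prices - fitted) ** 2))
    forecasts.append(fcst)
    weights.append(1.0 / rmse)              # better fit -> larger weight

weights = np.array(weights) / sum(weights)
print(f"combined one-step-ahead forecast: {np.dot(weights, forecasts):.2f}")
```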
In this research, a proposed technique is used to enhance the performance of the frame-difference technique for extracting moving objects from video files. One of the most significant causes of performance degradation is the presence of noise, which may lead to incorrect identification of moving objects, so it was necessary to find a way to diminish this noise effect. Traditional average and median spatial filters can handle such situations, but this work focuses on the spectral domain, using Fourier and wavelet transforms to reduce the noise effect. Experiments and statistical features (entropy, standard deviation) showed that these transforms can overcome such problems in an elegant way.
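The following sketch illustrates the spirit of the approach with the Fourier branch only: each frame is low-pass filtered in the frequency domain before differencing, so high-frequency noise is suppressed rather than flagged as motion. The frames, cutoff fraction, and threshold are illustrative assumptions; the paper also evaluates wavelet transforms.

```python
# Frame differencing with a spectral (Fourier) smoothing step: frames
# are low-pass filtered in the frequency domain before differencing.
import numpy as np

def fourier_lowpass(frame: np.ndarray, keep: float = 0.15) -> np.ndarray:
    """Zero out all but the lowest `keep` fraction of frequencies."""
    spectrum = np.fft.fftshift(np.fft.fft2(frame.astype(np.float64)))
    rows, cols = frame.shape
    r, c = int(rows * keep / 2), int(cols * keep / 2)
    mask = np.zeros_like(spectrum)
    mask[rows // 2 - r: rows // 2 + r, cols // 2 - c: cols // 2 + c] = 1
    return np.real(np.fft.ifft2(np.fft.ifftshift(spectrum * mask)))

def moving_mask(prev, curr, thresh=25):
    """Binary motion mask from the difference of two de-noised frames."""
    diff = np.abs(fourier_lowpass(curr) - fourier_lowpass(prev))
    return (diff > thresh).astype(np.uint8)

rng = np.random.default_rng(2)
f0 = rng.integers(0, 30, (64, 64)).astype(np.float64)  # noisy background
f1 = f0.copy()
f1[20:30, 20:30] += 120                                # a "moving" object
print("moving pixels detected:", int(moving_mask(f0, f1).sum()))
```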
Image enhancement techniques have recently become one of the most significant topics in digital image processing. The basic problem in enhancement is how to remove noise or improve the details of a digital image. In the current research, a method for de-noising a digital image and sharpening/highlighting its details is proposed. The proposed approach uses a fuzzy logic technique to process each pixel of the entire image and then decides whether it is noisy or needs further processing for highlighting. This decision is made by examining the degree of association with neighboring pixels based on a fuzzy algorithm. The proposed de-noising approach was evaluated on some standard images after corrupting them with impulse noise ...
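A minimal sketch of the pixel-wise fuzzy decision, under assumptions: each pixel's deviation from its 3x3 neighbourhood median is mapped to a [0, 1] "noisiness" membership, and the pixel is blended toward the median in proportion to that degree. The membership breakpoints and the median-based reference are illustrative stand-ins for the paper's fuzzy rules.

```python
# Fuzzy-style impulse de-noising: deviation from the local median is
# mapped to a membership degree that controls how strongly the pixel
# is pulled toward the median.
import numpy as np

def membership(deviation, low=10.0, high=60.0):
    """Fuzzy 'is noisy' degree: 0 below low, 1 above high, linear between."""
    return np.clip((deviation - low) / (high - low), 0.0, 1.0)

def fuzzy_denoise(img: np.ndarray) -> np.ndarray:
    out = img.astype(np.float64).copy()
    padded = np.pad(img.astype(np.float64), 1, mode="edge")
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            window = padded[i:i + 3, j:j + 3]   # 3x3 neighbourhood
            med = np.median(window)
            mu = membership(abs(out[i, j] - med))
            # Blend toward the median in proportion to the noisiness degree.
            out[i, j] = (1 - mu) * out[i, j] + mu * med
    return out.astype(img.dtype)

rng = np.random.default_rng(3)
noisy = np.full((32, 32), 128, dtype=np.uint8)
noisy[rng.random(noisy.shape) < 0.05] = 255    # 5% impulse (salt) noise
restored = fuzzy_denoise(noisy)
print("impulse pixels left:", int((restored == 255).sum()))
```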
In this paper, an algorithm is introduced through which we can embed more data than regular spatial-domain methods. We compress the secret data using Huffman coding, and this compressed data is then embedded using the Laplacian sharpening method.

We use Laplacian filters to determine the effective hiding places; then, based on a threshold value, we select the places with the highest values acquired from these filters for embedding the watermark. In this work, our aim is to increase the capacity of the embedded information by using Huffman coding and, at the same time, to increase the security of the algorithm by hiding data in the places that have the highest edge values and are less noticeable. A sketch of this pipeline is given below.

The performance ...
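The pipeline can be sketched as follows, with several assumptions: the Huffman code is built with Python's `heapq`, a 4-neighbour Laplacian scores every pixel, and the payload bits overwrite the LSBs of the highest-scoring (strongest-edge) pixels. The kernel, the top-k selection in place of an explicit threshold, and LSB substitution are illustrative stand-ins for the paper's exact embedding rule.

```python
# Huffman-compress the secret text, then embed its bits in the LSBs of
# the pixels with the largest Laplacian (edge) responses.
import heapq
from collections import Counter
import numpy as np

def huffman_bits(text: str) -> str:
    """Return the Huffman-coded bitstring of `text`."""
    heap = [[w, i, {s: ""}] for i, (s, w) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    while len(heap) > 1:
        lo, hi = heapq.heappop(heap), heapq.heappop(heap)
        codes = {s: "0" + c for s, c in lo[2].items()}
        codes.update({s: "1" + c for s, c in hi[2].items()})
        heapq.heappush(heap, [lo[0] + hi[0], lo[1], codes])
    codebook = heap[0][2]
    return "".join(codebook[s] for s in text)

def laplacian_scores(img: np.ndarray) -> np.ndarray:
    """4-neighbour Laplacian magnitude at every interior pixel."""
    f = img.astype(np.float64)
    lap = np.zeros_like(f)
    lap[1:-1, 1:-1] = np.abs(f[:-2, 1:-1] + f[2:, 1:-1] +
                             f[1:-1, :-2] + f[1:-1, 2:] - 4 * f[1:-1, 1:-1])
    return lap

def embed(img: np.ndarray, text: str) -> np.ndarray:
    bits = huffman_bits(text)
    # Pick the strongest-edge pixels as the least noticeable hiding places.
    order = np.argsort(laplacian_scores(img), axis=None)[::-1][:len(bits)]
    stego = img.copy()
    flat = stego.reshape(-1)                        # view into stego
    for bit, pos in zip(bits, order):
        flat[pos] = (flat[pos] & 0xFE) | int(bit)   # overwrite the LSB
    return stego

cover = np.random.default_rng(4).integers(0, 256, (64, 64), dtype=np.uint8)
stego = embed(cover, "hidden message")
print("pixels changed:", int((stego != cover).sum()))
```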