Image compression plays an important role in reducing data size and storage requirements while significantly increasing transmission speed over the Internet. It has been an active research topic for several decades, and with the recent successes of deep learning in many areas of image processing, its use in image compression is growing steadily. Deep neural networks have also achieved great success in processing and compressing images of various sizes. In this paper, we present an image compression architecture based on a Convolutional AutoEncoder (CAE), inspired by the way the human eye perceives the different colors and features of images. We propose a multi-layer hybrid deep learning system that combines the unsupervised CAE architecture with K-means color clustering to compress images and determine their size and color intensity. The system is implemented on the Kodak and Challenge on Learned Image Compression (CLIC) datasets. Experimental results show that the proposed method outperforms traditional autoencoder-based compression in speed and in the quality measures Peak Signal-to-Noise Ratio (PSNR) and Structural Similarity Index (SSIM), achieving high compression bit rates with a low Mean Squared Error (MSE). The recorded compression ratios ranged from 0.7117 to 0.8707 for the Kodak dataset and from 0.7191 to 0.9930 for the CLIC dataset, while the error coefficient decreased from 0.0126 to 0.0003, making the proposed system more accurate than comparable autoencoder-based deep learning methods.
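A minimal sketch of such a pipeline is given below, assuming a small PyTorch convolutional autoencoder followed by scikit-learn K-means color quantization; the layer sizes, the 128×128 input, and the 16 color clusters are illustrative choices, not the architecture reported in the paper.

```python
# Sketch only: a CAE compresses an image to a small latent feature map, and
# K-means quantizes the colors of the reconstruction. All shapes and the
# number of clusters are assumptions for illustration.
import numpy as np
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

class ConvAutoEncoder(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: 3x128x128 image -> 32x32x32 feature map (the compressed code)
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
        )
        # Decoder mirrors the encoder to reconstruct the image in [0, 1]
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

def kmeans_color_quantize(image, n_colors=16):
    """Cluster the pixel colors of an HxWx3 array into n_colors centroids."""
    h, w, _ = image.shape
    pixels = image.reshape(-1, 3)
    km = KMeans(n_clusters=n_colors, n_init=10, random_state=0).fit(pixels)
    return km.cluster_centers_[km.labels_].reshape(h, w, 3)

if __name__ == "__main__":
    model = ConvAutoEncoder()
    x = torch.rand(1, 3, 128, 128)            # stand-in for a Kodak/CLIC image
    recon = model(x)                           # MSE(x, recon) would drive training
    print("latent shape:", model.encoder(x).shape)
    quantized = kmeans_color_quantize(recon[0].permute(1, 2, 0).detach().numpy())
    print("quantized reconstruction:", quantized.shape)
```

Training such a model against the MSE between input and reconstruction, and then reporting PSNR and SSIM on Kodak or CLIC images, would mirror the evaluation described above.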
This study focused on the biotreatment of soil polluted by petroleum compounds (diesel), which cause serious environmental problems. One of the most effective and promising ways to treat diesel-contaminated soil is bioremediation, an option that offers the potential to destroy harmful pollutants using biological activity. Four bacterial strains were isolated from diesel-contaminated soil samples. The isolates were identified by the Vitek 2 system as Sphingomonas paucimobilis, Pantoea species, Staphylococcus aureus, and Enterobacter cloacae. The potential for biological surfactant production was tested using the Sigma 703D stand-alone tensiometer, which showed that these isolates are biological surfactant producers. The bet…
The Hopfield network is one of the simplest neural network types; its architecture connects every neuron to every other neuron, which is why it is called a fully connected neural network. It is also considered an auto-associative memory, because the network returns a stored pattern immediately upon recognition. The network has many limitations, including memory capacity, discrepancy, orthogonality between patterns, weight symmetry, and local minima. This paper proposes a new strategy for designing the Hopfield network based on the XOR operation; the strategy addresses these limitations through a new algorithm in the Hopfield network design and increases the performance of Hopfield by modifying the architecture of t…
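For reference, a minimal NumPy sketch of the classical Hopfield memory that such a design builds on is shown below (Hebbian training plus asynchronous recall); the XOR-based modification itself is not detailed in the abstract and is not reproduced here.

```python
# Baseline (classical) Hopfield auto-associative memory, i.e. the architecture
# the paper proposes to modify. Patterns are bipolar vectors (+1 / -1).
import numpy as np

def train_hopfield(patterns):
    """Hebbian learning: W is the averaged sum of outer products of patterns."""
    n = patterns.shape[1]
    w = np.zeros((n, n))
    for p in patterns:
        w += np.outer(p, p)
    np.fill_diagonal(w, 0)           # no self-connections (weight symmetry holds)
    return w / patterns.shape[0]

def recall(w, state, steps=20):
    """Asynchronous updates until the state settles into a stored pattern."""
    state = state.copy()
    for _ in range(steps):
        for i in np.random.permutation(len(state)):
            state[i] = 1 if w[i] @ state >= 0 else -1
    return state

if __name__ == "__main__":
    stored = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                       [1, 1, 1, 1, -1, -1, -1, -1]])
    w = train_hopfield(stored)
    noisy = stored[0].copy()
    noisy[0] *= -1                   # flip one bit and let the network repair it
    print("recovered:", recall(w, noisy))
```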
This work studied the extraction of eucalyptus oil from natural plants (Eucalyptus camaldulensis leaves) using the water distillation method with a Clevenger apparatus. The effects of the main operating parameters were studied: time to reach equilibrium, temperature (70 to 100°C), solvent-to-solid ratio (4:1 to 8:1 (v/w)), agitation speed (0 to 900 rpm), and particle size (0.5 to 2.5 cm) of the fresh leaves, in order to find the best processing conditions for achieving maximum oil yield. The results showed that an agitation speed of 900 rpm, a temperature of 100°C, and a solvent-to-solid ratio of 5:1 (v/w) with a particle size of 0.5 cm for 160 minutes gave the highest oil yield (46.25 wt.%). The extracted oil was examined by HPLC.
In the absence of environmental regulation, food continues to be contaminated with heavy metals, which is becoming a major concern for human health. The present research focuses on the environmental and health effects of irrigating a number of crops grown in the soils surrounding the old Al-Rustamia plant with treated wastewater generated by the plant. The physicochemical properties, alkalinity, and electrical conductivity of the samples were evaluated; vegetable samples were tested for Cd, Pb, Ni, and Zn levels; and the transfer factor (TF) from soil to crop, the daily intake of metals (DIM), and the health risk index (HRI) were calculated. The findings showed that the average contents of Zn, Pb, Ni, an…
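As a rough illustration, the commonly used definitions of these indices can be computed as sketched below; the exact formulas, conversion factor, and reference doses used in the study are not given in the abstract, so every value here is an assumption.

```python
# Sketch of the standard index definitions often used in such assessments:
#   TF  = C_plant / C_soil
#   DIM = (C_plant * conversion_factor * daily_intake) / body_weight
#   HRI = DIM / RfD   (oral reference dose of the metal)
# The illustrative numbers below are NOT data from the study.

def transfer_factor(c_plant, c_soil):
    return c_plant / c_soil

def daily_intake_of_metal(c_plant, daily_intake_kg, body_weight_kg,
                          conversion_factor=0.085):
    # conversion_factor converts fresh to dry vegetable weight (assumed value)
    return (c_plant * conversion_factor * daily_intake_kg) / body_weight_kg

def health_risk_index(dim, rfd):
    return dim / rfd

if __name__ == "__main__":
    dim = daily_intake_of_metal(c_plant=1.2, daily_intake_kg=0.345,
                                body_weight_kg=70.0)   # hypothetical Zn example
    print("TF :", transfer_factor(1.2, 4.5))
    print("DIM:", dim)
    print("HRI:", health_risk_index(dim, rfd=0.3))     # assumed RfD for Zn
```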
Time series have gained great importance and have been applied widely in the economic, financial, health, and social fields, where they are used to analyze changes in a phenomenon and forecast its future. One of the most important black-box models is the ARMAX model, a mixed model consisting of autoregression and moving averages with exogenous inputs. The analysis consists of several stages: determining the order of the model, estimating its parameters, and then forecasting the amount of compensation granted to workers in the future, so that the Fund can fulfil its future obligations, using the ordinary least squares method and the frequ…
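The three stages can be illustrated with a short sketch that fits an ARMAX model to synthetic data using statsmodels; the chosen order (2, 0, 1) and the simulated series are assumptions for illustration, not the Fund's data or the study's selected model.

```python
# Fit an ARMAX(p, q) model: autoregression plus a moving-average part with an
# exogenous input, then forecast ahead. All numbers here are synthetic.
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)
n = 200
exog = rng.normal(size=n)                      # external input series
noise = rng.normal(scale=0.5, size=n)
y = np.zeros(n)
for t in range(2, n):                          # simulate an ARMAX-like process
    y[t] = 0.6 * y[t - 1] - 0.2 * y[t - 2] + 0.8 * exog[t] \
           + noise[t] + 0.3 * noise[t - 1]

# Stage 1-2: choose the order and estimate the parameters
model = SARIMAX(y, exog=exog, order=(2, 0, 1)).fit(disp=False)
print(model.summary().tables[1])

# Stage 3: forecast future values given future exogenous inputs
future_exog = rng.normal(size=12).reshape(-1, 1)
print(model.forecast(steps=12, exog=future_exog))
```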
Semi-parametric regression models have been studied in a variety of applications and scientific fields because of their flexibility in dealing with problematic data: the parametric part remains easy to interpret while the non-parametric part retains its flexibility. The response variable or the explanatory variables may contain outliers, and the OLS approach is sensitive to outliers. To address this issue, robust (resistant) methods, which are less sensitive to outliers in the data, were used. This study aims to estimate the partial regression model using a robust estimation method with the wavel…
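A simplified sketch of the idea is shown below: the covariates are partialled out with a smoother and the parametric part is fitted robustly with a Huber M-estimator. A kernel smoother stands in for the wavelet step mentioned in the abstract, so this is an illustration of the approach rather than the study's estimator.

```python
# Partially linear model  y = x * beta + g(t) + error, with outliers in y.
# Speckman-type partialling out, then a robust (Huber) fit for beta.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 300
t = np.sort(rng.uniform(0, 1, n))
x = rng.normal(size=n)
g = np.sin(2 * np.pi * t)                        # unknown smooth function
y = 2.0 * x + g + rng.normal(scale=0.3, size=n)
y[::40] += 8.0                                   # inject outliers

def smooth(values, t, bandwidth=0.05):
    """Kernel (Nadaraya-Watson) smoother, standing in for wavelet smoothing."""
    out = np.empty_like(values, dtype=float)
    for i, ti in enumerate(t):
        w = np.exp(-0.5 * ((t - ti) / bandwidth) ** 2)
        out[i] = np.sum(w * values) / np.sum(w)
    return out

# Remove the non-parametric trend from y and x, then estimate beta robustly
y_res = y - smooth(y, t)
x_res = (x - smooth(x, t)).reshape(-1, 1)
robust_fit = sm.RLM(y_res, x_res, M=sm.robust.norms.HuberT()).fit()
print("robust beta estimate:", robust_fit.params)   # should be near 2.0
```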
In this paper we used frequentist and Bayesian approaches to the linear regression model to predict future unemployment rates in Iraq. Parameters are estimated with the ordinary least squares method for the frequentist approach and with the Markov Chain Monte Carlo (MCMC) method for the Bayesian approach; calculations are done using the R program. The analysis showed that the linear regression model with the Bayesian approach performs better and can be used as an alternative to the frequentist approach. Two criteria, the root mean square error (RMSE) and the median absolute deviation (MAD), were used to compare the performance of the estimates. The results showed that unemployment rates will continue to increase over the next two decades…
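A compact sketch of the comparison is given below, using a hand-rolled random-walk Metropolis sampler in Python (the study itself runs its calculations in R); the synthetic unemployment series, the flat priors, the fixed noise scale, and the proposal scale are illustrative assumptions.

```python
# OLS versus Bayesian linear regression via random-walk Metropolis MCMC,
# compared with the two criteria from the abstract: RMSE and MAD.
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(2004, 2024)
t = years - years[0]
y = 8.0 + 0.35 * t + rng.normal(scale=0.8, size=t.size)   # fake upward trend
X = np.column_stack([np.ones_like(t), t]).astype(float)

# Frequentist: ordinary least squares
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Bayesian: flat priors on (intercept, slope), assumed known noise scale sigma
def log_posterior(beta, sigma=0.8):
    resid = y - X @ beta
    return -0.5 * np.sum((resid / sigma) ** 2)     # Gaussian likelihood, flat prior

beta = beta_ols.copy()
samples = []
for _ in range(20000):                             # random-walk Metropolis
    proposal = beta + rng.normal(scale=0.05, size=2)
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(beta):
        beta = proposal
    samples.append(beta)
beta_bayes = np.mean(samples[5000:], axis=0)       # posterior mean after burn-in

for name, b in [("OLS", beta_ols), ("Bayes", beta_bayes)]:
    resid = y - X @ b
    print(name, "RMSE:", np.sqrt(np.mean(resid ** 2)),
          "MAD:", np.median(np.abs(resid - np.median(resid))))
```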