Image compression plays an important role in reducing data size and storage requirements while significantly increasing the speed of transmission over the Internet. Image compression has been an important research topic for several decades, and recently, with the great successes achieved by deep learning in many areas of image processing, its use in image compression has grown steadily. Deep neural networks have also achieved great success in processing and compressing images of various sizes. In this paper, we present a structure for image compression based on a Convolutional AutoEncoder (CAE), inspired by the way human eyes perceive the different colors and features of images. We propose a multi-layer hybrid deep-learning system that combines the unsupervised CAE architecture with K-means color clustering to compress images and determine their size and color intensity. The system is implemented on the Kodak and Challenge on Learned Image Compression (CLIC) datasets. Experimental results show that our proposed method is superior to traditional autoencoder compression methods, with better performance in speed and in the quality measures Peak Signal-to-Noise Ratio (PSNR) and Structural Similarity Index (SSIM). The results achieved high efficiency with high compression bit rates and a low Mean Squared Error (MSE) rate: the highest compression ratios ranged between 0.7117 and 0.8707 for the Kodak dataset and between 0.7191 and 0.9930 for the CLIC dataset.
The system achieved high accuracy and quality with respect to the error coefficient, which fell from 0.0126 to 0.0003, and is considered more accurate and of higher quality than the autoencoder deep-learning methods it was compared against.
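The abstract names two of its core ingredients, K-means color clustering and the MSE/PSNR quality measures, without implementation detail. As a minimal illustrative sketch (pure Python, not the paper's CAE pipeline), the following code quantizes RGB pixels with K-means and scores the reconstruction; the function names and the toy pixel data are hypothetical.

```python
import math
import random

def kmeans_palette(pixels, k, iters=10, seed=0):
    """Simple K-means over RGB pixels: returns (palette, labels)."""
    rng = random.Random(seed)
    centers = rng.sample(pixels, k)          # initial palette: k random pixels
    labels = [0] * len(pixels)
    for _ in range(iters):
        # assignment step: nearest palette entry (squared RGB distance)
        for i, p in enumerate(pixels):
            labels[i] = min(
                range(k),
                key=lambda c: sum((p[d] - centers[c][d]) ** 2 for d in range(3)),
            )
        # update step: each palette entry becomes the mean of its cluster
        for c in range(k):
            members = [pixels[i] for i in range(len(pixels)) if labels[i] == c]
            if members:
                centers[c] = tuple(
                    sum(m[d] for m in members) / len(members) for d in range(3)
                )
    return centers, labels

def mse_psnr(orig, recon, peak=255.0):
    """Mean squared error over all channels, and the derived PSNR in dB."""
    mse = sum(
        sum((o[d] - r[d]) ** 2 for d in range(3)) for o, r in zip(orig, recon)
    ) / (3 * len(orig))
    psnr = float("inf") if mse == 0 else 10 * math.log10(peak * peak / mse)
    return mse, psnr
```

K-means here plays the role of a color palette: storing one palette index per pixel instead of three channel values is what yields the compression, and PSNR/MSE measure how much that quantization costs.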
This paper studies the link between the juristic evidence founded on the observance of rulings and interests and the three ranks of legitimate purposes (necessity, need, and improvement). The researcher followed the descriptive-analytical approach. The study reached important results, including that analogy (qiyas) relates to all three ranks, though it attaches predominantly to the meanings of necessity and need and only partially to improvement. Consideration of public interest (istislah) is held by the majority of scholars to be authoritative when it relates to necessity or need, while improvement alone is not accepted without corroborating evidence. The excuses relate to Hajji and Tahini, no
In this paper, an approximate solution of a nonlinear two-point boundary variational problem is presented. Boubaker polynomials are utilized to reduce these problems to a quadratic programming problem. The convergence of these polynomials has been verified, and several numerical examples are given to show the applicability and validity of the method.
This research presents a new study in reactive distillation by adopting a consecutive reaction. The adopted consecutive reaction was the saponification of diethyl adipate with NaOH solution. The saponification reaction occurs in two steps. The distillation process had the role of withdrawing the intermediate product, monoethyl adipate, from the reacting mixture before the second conversion to disodium adipate occurred. Monoethyl adipate was found to appear successfully in the distillate liquid. The percentage conversion from diester to monoester was greatly enhanced, reaching 86% compared with only 15.3% for the reaction without distillation, a five-fold enhancement. The presence of two layers in both the
Texture synthesis using genetic algorithms, proposed in previous research, is one way to synthesize texture quickly and easily. In genetic texture synthesis algorithms, the chromosome consists of random blocks selected manually by the user. However, this method of selection depends heavily on the user's experience, and a wrong selection of blocks will greatly affect the synthesized texture. In this paper, a new method is suggested for selecting the blocks automatically without user participation. The results show that this method of selection eliminates some of the blending caused by the previous manual selection method.
The Hopfield network is one of the simplest types of neural network; its architecture connects every neuron to every other neuron, so it is called a fully connected neural network. This type is also considered an auto-associative memory, because the network returns a stored pattern immediately upon recognition. The network has many limitations, including memory capacity, discrepancy, orthogonality between patterns, weight symmetry, and local minima. This paper proposes a new strategy for designing a Hopfield network based on the XOR operation; a new algorithm in the Hopfield network design is suggested to overcome these limitations, and this strategy will increase the performance of the Hopfield network by modifying the architecture of t
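For context on the baseline that the proposed XOR-based redesign modifies, here is a minimal sketch of the classical Hopfield network: Hebbian training (symmetric weights, zero diagonal) and sequential sign updates for recall. This illustrates the standard model only, not the paper's algorithm; the function names are hypothetical.

```python
def train_hopfield(patterns):
    """Hebbian weight matrix for bipolar (+1/-1) patterns; zero diagonal."""
    n = len(patterns[0])
    W = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:                       # no self-connections
                    W[i][j] += p[i] * p[j] / len(patterns)
    return W

def recall(W, state, steps=5):
    """Sequential updates: each neuron takes the sign of its local field."""
    s = list(state)
    n = len(s)
    for _ in range(steps):
        for i in range(n):
            h = sum(W[i][j] * s[j] for j in range(n))
            s[i] = 1 if h >= 0 else -1
    return s
```

With a single stored pattern, flipping one bit of the input still converges back to the stored pattern, which is the auto-associative recall the abstract describes; the limitations it lists (capacity, spurious minima) appear once several correlated patterns are stored.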
The microdrilling and nanodrilling holes are produced by a Q-switched Nd:YAG laser (1064 nm) interacting with 8009 Al alloy using nanoparticles. Two kinds of nanoparticles were used with this alloy: tungsten carbide (WC) and silicon carbide (SiC). In this work, the microholes and nanoholes were investigated with different laser pulse energies (600, 700, and 800 mJ), different repetition rates (5 Hz and 10 Hz), and different nanoparticle concentrations (90%, 50%, and 5%). The results indicate that the microholes and nanoholes were achieved when the laser pulse energy is 600 mJ, the laser repetition rate is 5 Hz, and the concentration of the nanoparticles (for the two types of n
This work was conducted to study the extraction of eucalyptus oil from natural plants (Eucalyptus camaldulensis leaves) using the water distillation method with a Clevenger apparatus. The effects of the main operating parameters were studied: time to reach equilibrium, temperature (70 to 100 °C), solvent-to-solid ratio (4:1 to 8:1 (v/w)), agitation speed (0 to 900 rpm), and particle size (0.5 to 2.5 cm) of the fresh leaves, to find the best processing conditions for achieving maximum oil yield. The results showed that an agitation speed of 900 rpm and a temperature of 100 °C, with a solvent-to-solid ratio of 5:1 (v/w) and a particle size of 0.5 cm for 160 minutes, gave the highest oil yield (46.25 wt.%). The extracted oil was examined by HPLC.
De-waxing of lubricating oil distillate (400-500 ºC) using urea was investigated in the present study. The lubricating oil distillate, produced by vacuum distillation and refined by furfural extraction, was taken from the Al-Daura refinery; it has a pour point of 34 ºC. Two solvents were used to dilute the oil distillate: methyl isobutyl ketone and methylene chloride. The operating conditions of urea adduct formation with n-paraffins in the presence of methyl isobutyl ketone were studied in detail: solvent-to-oil volume ratio within the range 0 to 2, mixer speed 0 to 2000 rpm, urea-to-wax weight ratio 0 to 6.3, adduction time 0 to 71 min, and temperature 30-70 ºC. Pour point of de-waxed oil and yi
In the light of the globalization surrounding the business environment, whose impact is reflected on industrial economic units, the whole world has become a single market whose variables affect all units, each of which contributes to the economy in proportion to its share. The problem addressed by this research is that the use of Pareto analysis enables industrial economic units to diagnose the risks surrounding them; the main objective of the research was therefore to classify risks into internal and external types and identify the risks that require the most attention.
The research was based on the hypothesis that, if Pareto analysis is used, risks can be identified and addressed before they occur.
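Pareto analysis, as used here, means ranking risks by impact and separating the "vital few" that account for most of the cumulative total (classically 80%) from the "trivial many". A small illustrative sketch, with hypothetical risk names and scores rather than the study's data:

```python
def pareto_classify(risks, threshold=0.8):
    """Split risks into (vital_few, trivial_many) by cumulative impact share.

    risks: dict mapping risk name -> impact score (any positive number).
    Risks are added to the vital set, highest impact first, until they
    cover `threshold` of the total impact; the rest are trivial.
    """
    ranked = sorted(risks.items(), key=lambda kv: kv[1], reverse=True)
    total = sum(v for _, v in ranked)
    vital, cum = [], 0.0
    for name, v in ranked:
        if cum / total < threshold:     # still below the cutoff: vital few
            vital.append(name)
            cum += v
    trivial = [n for n, _ in ranked if n not in vital]
    return vital, trivial
```

The vital few are the risks that, per the research hypothesis, should be addressed before they materialize; the trivial many can be monitored with lighter controls.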
Semi-parametric regression models have been studied in a variety of applications and scientific fields due to their high flexibility in dealing with problematic data: they combine the ease of interpretation of the parametric part with the flexibility of the non-parametric part. The response variable or the explanatory variables may contain outliers, and the OLS approach is sensitive to outliers. To address this issue, robust (resistant) methods were used, which are less sensitive to outlier values in the data. This study aims to estimate the partial regression model using a robust estimation method with the wavel
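As a minimal example of what "robust (resistant) estimation" means in contrast to OLS, the following sketch estimates a location parameter with Huber weights via iteratively reweighted least squares; for a location parameter, OLS reduces to the arithmetic mean, which a single outlier can drag far away. This is a generic illustration with a hypothetical function name, not the study's wavelet-based estimator.

```python
def huber_location(xs, c=1.345, iters=25):
    """Robust location estimate via IRLS with Huber weights.

    Points within c of the current estimate get full weight 1;
    points farther away are down-weighted by c / |residual|,
    so a gross outlier contributes almost nothing.
    """
    mu = sorted(xs)[len(xs) // 2]   # start from the median
    for _ in range(iters):
        w = [1.0 if abs(x - mu) <= c else c / abs(x - mu) for x in xs]
        mu = sum(wi * xi for wi, xi in zip(w, xs)) / sum(w)
    return mu
```

On data clustered near 1.0 with one outlier at 100.0, the plain mean jumps above 10 while the Huber estimate stays near 1.0, which is exactly the reduced sensitivity to outliers the abstract attributes to robust methods.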