Image compression plays an important role in reducing data size and storage requirements while significantly increasing the speed of transmission over the Internet. Image compression has been an important research topic for several decades, and recently, with the great successes achieved by deep learning in many areas of image processing, its use in image compression has been increasing gradually. Deep neural networks have also achieved great success in processing and compressing images of various sizes. In this paper, we present a structure for image compression based on a Convolutional AutoEncoder (CAE), inspired by the diverse way the human eye observes the different colors and features of images. We propose a multi-layer hybrid deep learning system that combines the unsupervised CAE architecture with color clustering by the K-means algorithm to compress images and determine their size and color intensity. The system is implemented using the Kodak and Challenge on Learned Image Compression (CLIC) datasets. Experimental results show that the proposed method is superior to traditional autoencoder-based compression methods and performs better in terms of speed and the quality measures Peak Signal-to-Noise Ratio (PSNR) and Structural Similarity Index (SSIM). The results achieved high efficiency at high compression bit rates with a low Mean Squared Error (MSE): the highest compression ratios ranged between 0.7117 and 0.8707 for the Kodak dataset and between 0.7191 and 0.9930 for the CLIC dataset.
The system achieved high accuracy and quality, with the error coefficient decreasing from 0.0126 to 0.0003, and is considered more accurate and of higher quality than existing autoencoder-based deep learning compression methods.
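The MSE and PSNR measures cited above have standard definitions and can be computed directly; a minimal pure-Python sketch is shown below (SSIM requires windowed local statistics and is usually taken from a library such as scikit-image, so it is omitted here).

```python
import math

def mse(original, reconstructed):
    """Mean Squared Error between two equal-length pixel sequences."""
    n = len(original)
    return sum((o - r) ** 2 for o, r in zip(original, reconstructed)) / n

def psnr(original, reconstructed, max_val=255.0):
    """Peak Signal-to-Noise Ratio in dB; higher means a closer reconstruction."""
    err = mse(original, reconstructed)
    if err == 0:
        return math.inf  # identical images
    return 10.0 * math.log10(max_val ** 2 / err)
```

For example, a reconstruction that is off by 10 gray levels at every pixel of an 8-bit image has an MSE of 100 and a PSNR of about 28.1 dB.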
This paper aims to plan the production of the electrical distribution converter (400 KV/11) for one month at Diyala Public Company, with more than one goal for the decision-maker in a fuzzy environment. The fuzzy demand was forecast using a fuzzy time series model. The fuzzy lead time for the raw materials involved in producing the electrical distribution converter (400 KV/11) was addressed using a fuzzy inference matrix implemented in Matlab. Since the decision-maker has more than one goal, a goal programming mathematical model was created that aims to achieve two goals: the first is to reduce the total production costs of the electrical distribution converter (400 KV/11), and th
A new p-type center basin with four mirrors was designed and built, and its effect on all parameters evaluating the performance of a silicon solar cell was studied with and without a cooling system. It was noted that the performance efficiency of the cell increased from 11.94 to 21 without cooling, while with cooling the efficiency increased
The purpose of this research work is to synthesize conjugates of some NSAIDs with sulfamethoxazole as possible mutual prodrugs, to overcome the local gastric irritation caused by NSAIDs with a free carboxyl group by forming an ester linkage that is supposed to remain intact in the stomach and may hydrolyze in the intestine chemically or enzymatically. In addition, the synthesized derivatives are targeted to the colon by forming an azo group that undergoes reduction only by the colonic bacterial azo reductase enzyme, liberating the parent compound to act locally (treatment of inflammation and infections in the colon).
The current paper proposes a new estimator for the linear regression model parameters under Big Data circumstances. The diversity of Big Data variables presents many challenges to researchers seeking new and novel methods to estimate the parameters of the linear regression model. Data were collected by the Central Statistical Organization of Iraq, with child labor in Iraq chosen as the application. Child labor is a vital phenomenon from which both society and education suffer, and it affects the future of the next generation. Two methods have been selected to estimate the parameter
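The abstract is cut off before naming the two estimation methods, so as a neutral point of reference only, here is a closed-form ordinary least squares fit for a simple linear regression, the baseline against which any new estimator for regression parameters is typically compared.

```python
def fit_simple_ols(x, y):
    """Closed-form OLS for y = b0 + b1*x: minimizes the sum of squared residuals."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    # Slope = covariance(x, y) / variance(x); intercept follows from the means.
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    b1 = sxy / sxx
    b0 = mean_y - b1 * mean_x
    return b0, b1
```

On noiseless data generated as y = 2 + 3x, the fit recovers the intercept 2 and slope 3 exactly.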
Combining different treatment strategies successively or simultaneously has become recommended to achieve high purification standards for the treated discharged water. The current work focused on combining electrocoagulation, ion-exchange, and ultrasonication treatment approaches for the simultaneous removal of copper, nickel, and zinc ions from water. The removal of the three studied ions was significantly enhanced by increasing the power density (4–10 mA/cm2) and NaCl salt concentration (0.5–1.5 g/L) at a natural solution pH. The simultaneous removal of these metal ions at 4 mA/cm2 and 1 g NaCl/L was highly improved by introducing 1 g/L of mordenite zeolite as an ion-exchanger. A remarkable removal of heavy metals was reported
Heavy metals contamination in aquatic ecosystems is considered one of the most important threats to aquatic life. The submerged aquatic plant Ceratophyllum demersum, in its non-living form, was used for the removal of trace elements. This article studied the ability of the fine powder of C. demersum to remove some heavy metals (HM), such as copper, cadmium, lead, and chromium, from aqueous solution under variable experimental factors. The study comprised two treatments: the first included different hydrogen ion concentrations (pH) within a range of 4, 5, 6, and 8 with a constant HM concentration (1000 ppm), while the second used variable HM concentrations (250, 500, 750, and 1000 ppm) with a constant pH of 7. In both treatments the a
In this research we estimated the survival function for data affected by the disturbances and confusion of the Iraq Household Socio-Economic Survey (IHSES II 2012), for data from five-year age groups following the Generalized Gamma (GG) distribution. Two methods were used for estimation and fitting: the Principle of Maximum Entropy (POME), and a bootstrap method with nonparametric kernel smoothing to overcome the mathematical problems posed by the integrals contained in this distribution, in particular the incomplete gamma function; these were used alongside the traditional Maximum Likelihood (ML) method. The comparison was made on the basis of the method of the Cen
It has been shown in ionospheric research that calculation of the total electron content (TEC) is an important factor in global navigation systems. In this study, TEC was calculated over Baghdad city, Iraq, using a combination of two numerical methods, the composite Simpson and composite trapezoidal rules. TEC was calculated as the line integral of the electron density derived from the International Reference Ionosphere (IRI2012) and NeQuick2 models from 70 to 2000 km above the Earth's surface. The hour of the day, the day number of the year, and R12 were chosen as inputs for the calculation techniques to take into account latitudinal, diurnal, and seasonal variation of TEC. The results of the latitudinal variation of TE
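The two quadrature rules named above are standard and can be sketched directly. In the sketch below, a hypothetical Chapman-layer profile stands in for the IRI2012/NeQuick2 electron density, and the peak density and peak height values are illustrative only, not taken from the study.

```python
import math

def composite_trapezoid(f, a, b, n):
    """Composite trapezoidal rule over [a, b] with n subintervals."""
    h = (b - a) / n
    return h * ((f(a) + f(b)) / 2 + sum(f(a + i * h) for i in range(1, n)))

def composite_simpson(f, a, b, n):
    """Composite Simpson's rule over [a, b]; n must be even."""
    if n % 2:
        raise ValueError("n must be even")
    h = (b - a) / n
    odd = sum(f(a + i * h) for i in range(1, n, 2))
    even = sum(f(a + i * h) for i in range(2, n, 2))
    return h / 3 * (f(a) + 4 * odd + 2 * even + f(b))

def chapman_density(h_km, nm_f2=1e12, hm_f2=300.0, scale_h=60.0):
    """Hypothetical Chapman-layer electron density (el/m^3); parameter
    values are illustrative stand-ins for an IRI2012/NeQuick2 profile."""
    z = (h_km - hm_f2) / scale_h
    return nm_f2 * math.exp(0.5 * (1 - z - math.exp(-z)))

# TEC = line integral of density from 70 to 2000 km (the *1e3 converts the
# km path element to metres), expressed in TEC units (1 TECU = 1e16 el/m^2).
tec_tecu = composite_simpson(lambda h: chapman_density(h) * 1e3,
                             70.0, 2000.0, 2000) / 1e16
```

With this illustrative profile the two rules agree to several significant figures, which is the usual sanity check when combining them.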