The main aim of image compression is to reduce image size for transmission and storage, so many methods have been developed to compress images; one of these is the Multilayer Perceptron (MLP), an artificial neural network based on the back-propagation algorithm. Since this algorithm depends on the number of neurons in the hidden layer, that factor alone is not enough to reach the desired results, so the standards on which the compression process depends must also be taken into consideration to obtain the best results. In this research we trained a group of TIFF images of size (256*256) and compressed them using the MLP; for each compression run the number of neurons in the hidden layer was varied, and the compression ratio, mean square error, and peak signal-to-noise ratio were calculated and compared to assess how well the original image was recovered. The findings matched the desired results: the compression ratio was less than five with a small mean square error, and hence a large peak signal-to-noise ratio was recorded.
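The three quality measures named above (compression ratio, mean square error, and peak signal-to-noise ratio) have standard definitions that can be computed as follows; this is an illustrative sketch, assuming 8-bit grayscale images, and the function name is ours, not the paper's code:

```python
import numpy as np

def compression_metrics(original, reconstructed, compressed_bits):
    """Compute CR, MSE, and PSNR for an 8-bit grayscale image pair.

    `compressed_bits` is the size of the compressed representation in
    bits; names and layout are illustrative, not the paper's code.
    """
    original = original.astype(np.float64)
    reconstructed = reconstructed.astype(np.float64)
    cr = original.size * 8 / compressed_bits            # compression ratio
    mse = np.mean((original - reconstructed) ** 2)      # mean square error
    # PSNR in dB, with 255 as the peak value of an 8-bit image.
    psnr = 10 * np.log10(255.0 ** 2 / mse) if mse > 0 else float("inf")
    return cr, mse, psnr
```

A small MSE drives the PSNR up logarithmically, which is why the abstract reports the two together.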
Abstract
This research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, different initial parameter values, and different levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution in both the natural and contaminated data cases.
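The estimation pattern compared here, minimizing a negative log-likelihood with the Downhill Simplex (Nelder-Mead) search, can be illustrated on a simpler density; the one-parameter exponential below stands in for the four-parameter compound distribution, whose density is too long to reproduce here, and the use of scipy is our assumption:

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_likelihood(params, data):
    # Exponential(rate) stands in for the exponential Weibull-Poisson
    # density; the same pattern applies with a 4-element params vector.
    (rate,) = params
    if rate <= 0:
        return np.inf
    return -(len(data) * np.log(rate) - rate * np.sum(data))

rng = np.random.default_rng(0)
data = rng.exponential(scale=2.0, size=500)   # true rate = 0.5

# Downhill Simplex search from an arbitrary starting value.
res = minimize(neg_log_likelihood, x0=[1.0], args=(data,),
               method="Nelder-Mead")
rate_hat = res.x[0]
```

For this toy case the closed-form MLE is 1 divided by the sample mean, so the simplex estimate can be checked directly against it; for the compound distribution no closed form exists, which is why a derivative-free search is attractive.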
A robust video-bitrate adaptation scheme on the client side plays a significant role in maintaining a good quality of experience for video streaming. Video quality affects the amount of time playback stalls because the buffer has run empty. Therefore, to keep video streaming continuous under bandwidth fluctuation, a video buffer structure based on adapting the video bitrate is considered in this work. Initially, the video buffer structure is formulated as an optimal control-theoretic problem that combines both the video bitrate and video buffer feedback signals. Protecting the video buffer occupancy from exceeding its limited operating level can provide continuous video streaming
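A buffer-feedback bitrate selector of the kind described can be sketched as follows; the rate ladder, the half-target danger threshold, and the 0.9 safety factor are illustrative assumptions, not the paper's controller:

```python
BITRATE_LADDER = [300, 750, 1500, 3000, 6000]  # kbps, hypothetical rungs

def adapt_bitrate(buffer_level, target_level, bandwidth_est):
    """Buffer-feedback rate selection (a sketch, not the paper's scheme).

    Keep the buffer (seconds of video) near `target_level`: when the
    buffer runs low, drop to the lowest rung to avoid a stall; when it
    is below target but safe, apply a safety margin to the bandwidth
    estimate; otherwise allow the highest sustainable rung.
    """
    if buffer_level < 0.5 * target_level:       # danger zone: avoid stalls
        return BITRATE_LADDER[0]
    safety = 0.9 if buffer_level < target_level else 1.0
    feasible = [r for r in BITRATE_LADDER if r <= safety * bandwidth_est]
    return feasible[-1] if feasible else BITRATE_LADDER[0]
```

The two feedback signals from the abstract appear here as `buffer_level` (buffer occupancy) and the previously selected rung implied by `bandwidth_est`.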
Ensuring reliable data transmission in a Network on Chip (NoC) is one of the most challenging tasks, especially in noisy environments. Crosstalk, interference, and radiation have increased with manufacturers' growing tendency to reduce chip area, raise operating frequencies, and lower voltages. Many Error Control Codes (ECC) have therefore been proposed, with different error detection and correction capacities and various degrees of complexity. The Code with Crosstalk Avoidance and Error Correction (CCAEC) for network-on-chip interconnects uses simple parity check bits as its main technique to achieve a high error correction capacity. In this work, this coding scheme corrects up to 12 random errors, representing a high correction capacity
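Parity-based error correction of the kind CCAEC builds on can be illustrated with the classic Hamming(7,4) code, which corrects a single flipped bit per codeword; this is a textbook example, not the CCAEC scheme itself:

```python
def hamming74_encode(d):
    """Encode 4 data bits as a 7-bit Hamming codeword.

    Parity bits sit at positions 1, 2, and 4 (1-based), each covering
    the positions whose index has the corresponding binary digit set.
    """
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(codeword):
    """Correct up to one flipped bit and return the 4 data bits."""
    c = list(codeword)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 1-based index of the flipped bit
    if syndrome:
        c[syndrome - 1] ^= 1          # flip it back
    return [c[2], c[4], c[5], c[6]]   # recovered data bits
```

Codes such as CCAEC extend the same parity-check idea with many more check bits to push the correction capacity well beyond one error per codeword.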
The evolution of the Internet of Things (IoT) has connected billions of heterogeneous physical devices that improve the quality of human life by collecting data from their environment. However, this creates a need to store huge amounts of data, demanding big storage and high computational capability; cloud computing can be used to store such big data. The data of IoT devices is transferred using two types of protocols: Message Queuing Telemetry Transport (MQTT) and Hypertext Transfer Protocol (HTTP). This paper aims to build a high-performance and more reliable system through efficient use of resources. Thus, load balancing in cloud computing is used to dynamically distribute the workload across nodes to avoid overloading any individual resource
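Dynamic load balancing of this kind can be sketched with a least-loaded dispatcher; the node names and the outstanding-request load metric are illustrative assumptions, not the paper's system:

```python
class LeastLoadedBalancer:
    """Dispatch incoming IoT messages to the least-loaded cloud node.

    A minimal sketch of dynamic load balancing: load is counted as the
    number of outstanding requests per node, an assumed metric.
    """

    def __init__(self, nodes):
        self.load = {n: 0 for n in nodes}

    def dispatch(self):
        node = min(self.load, key=self.load.get)   # pick lightest node
        self.load[node] += 1
        return node

    def complete(self, node):
        self.load[node] -= 1                       # request finished
```

Because dispatch decisions track current load rather than a fixed rotation, a node slowed by a heavy MQTT or HTTP burst automatically receives less new work until it catches up.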
In this paper, some estimators of the unknown shape parameter and reliability function of the Basic Gompertz distribution (BGD) have been obtained, such as the MLE, UMVUE, and MINMSE, in addition to Bayesian estimators under a scale-invariant squared error loss function, assuming an informative prior represented by the Gamma distribution and a non-informative prior using Jeffreys' prior. Using the Monte Carlo simulation method, these estimators of the shape parameter and of R(t) have been compared based on mean squared errors and integrated mean squared errors, respectively.
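For the shape parameter, the MLE admits a closed form; the sketch below assumes the Basic Gompertz density f(t) = θ·e^t·exp(−θ(e^t − 1)), t > 0, a common parameterization that should be checked against the paper's:

```python
import math

def mle_shape(data):
    """Closed-form MLE of the shape parameter theta, assuming the
    Basic Gompertz density f(t) = theta * exp(t) * exp(-theta*(exp(t)-1)).

    Setting d/d(theta) of the log-likelihood to zero gives
    theta_hat = n / sum(exp(t_i) - 1).
    """
    return len(data) / sum(math.exp(t) - 1 for t in data)

def reliability(t, theta):
    """Reliability R(t) = exp(-theta * (exp(t) - 1)) under the same
    assumed parameterization."""
    return math.exp(-theta * (math.exp(t) - 1))
```

Plugging the estimated shape into `reliability` gives the plug-in estimator of R(t) that the other estimators in the paper are compared against.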
Praise be to Allah, Lord of the Worlds, with abundant, blessed praise befitting the majesty of His countenance and the greatness of His authority, and may peace and blessings be upon our master Muhammad, a perpetual blessing until the Day of Judgment, and upon his pure family and his righteous companions, and those who follow them in righteousness until the Day of Judgment. To proceed:
Anyone who looks into the history of nations and peoples and the conditions of human beings will see that naturalization, as a person's affiliation to a particular state, is something that arose only in recent centuries. In ancient times, a person's loyalty was to the tribe to which he belonged; he was integrated into it and attributed to it, and in
Haplotype association analysis has been proposed to capture the collective behavior of sets of variants by testing the association of each set, instead of individual variants, with the disease. Such an analysis typically involves a list of unphased multiple-locus genotypes with potentially sparse frequencies in cases and controls. It starts with inferring haplotypes from genotypes, followed by haplotype co-classification and marginal screening for disease-associated haplotypes. Unfortunately, phasing uncertainty may have strong effects on the haplotype co-classification and therefore on the accuracy of predicting risk haplotypes. Here, to address this issue, we propose an alternative approach: in Stage 1, we select potential risk genotypes instead
Abstract
The current research aims to construct a scale for the nine types of students' personality according to the Rob Fitzel model. To do this, (162) items were formed to represent the nine personality types, with (18) items for each type. To test the validity of the scale, a sample of (584) students of Al-Mustansrya University was chosen, and the data of their responses was analyzed using factor analysis. The analysis revealed (9) factors, one for each personality type, with (12) items for each. The reliability of the scale was then established using the test-retest method and the Cronbach's alpha method.
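The Cronbach's alpha reliability coefficient mentioned above has a standard closed form over an items-by-respondents score matrix; a minimal sketch (the toy score matrices in the checks are hypothetical, not the study's data):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of totals),
    using sample (ddof=1) variances.
    """
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                              # number of items
    item_vars = scores.var(axis=0, ddof=1).sum()     # per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)       # variance of totals
    return k / (k - 1) * (1 - item_vars / total_var)
```

Perfectly consistent items give alpha = 1, and alpha falls as items disagree, which is why it complements the test-retest check of stability over time.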
L-arabinose isomerase from Escherichia coli O157:H7 was immobilized on bentonite (from local markets of Baghdad, Iraq) activated with 10% 3-APTES and treated with 10% aqueous glutaraldehyde. The results showed that the immobilization yield was 89%. The pH optima of the free and immobilized L-arabinose isomerase were 7 and 7.5, respectively, and both were stable at pH 6-8 for 60 min. The optimum temperatures were 30 and 35°C, with stability at 35 and 40°C for 60 min, but the free and immobilized enzymes lost more than 60% and 30% of their original activity, respectively, at 50°C. The immobilized enzyme retained its full activity for 32 days, but retained 73.58% of its original activity after storage for 60 days