The main aim of image compression is to reduce the image size so that it can be transmitted and stored, and many methods have therefore appeared to compress images; one of these methods is the Multilayer Perceptron (MLP), an artificial neural network method based on the Back-Propagation algorithm for compressing the image. If this algorithm depended only on the number of neurons in the hidden layer, that alone would not be enough to reach the desired results, so the standards on which the compression process depends must also be taken into consideration to obtain the best results. In this research we trained a group of TIFF images of size (256*256) and compressed them using MLP; for each compression process the number of neurons in the hidden layer was changed, and the compression ratio, mean square error, and peak signal-to-noise ratio were calculated to compare the results against the original image. The findings of the research matched the desired results: the compression ratio was less than five with a small mean square error, and thus a large value of peak signal-to-noise ratio was recorded.
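The three evaluation measures mentioned above can be computed as in the minimal NumPy sketch below; the function name, the 8-bit peak value, and the bit-count arguments are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def compression_metrics(original, reconstructed, original_bits, compressed_bits, peak=255.0):
    """Compression ratio, MSE and PSNR for an 8-bit grey-level image (assumed)."""
    original = original.astype(np.float64)
    reconstructed = reconstructed.astype(np.float64)
    cr = original_bits / compressed_bits            # compression ratio
    mse = np.mean((original - reconstructed) ** 2)  # mean square error
    psnr = 10.0 * np.log10(peak ** 2 / mse)         # peak signal-to-noise ratio, in dB
    return cr, mse, psnr
```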
People's ability to quickly convey their thoughts or opinions on various services or items has improved as Web 2.0 has evolved, which makes it possible to examine the public perceptions expressed in reviews. Aspect-based sentiment analysis (ABSA) takes a set of texts (e.g., product reviews or online reviews) and identifies the opinion target (aspect) within each review. Contemporary aspect-based sentiment analysis systems, such as those performing aspect categorization, rely predominantly on lexicon-based or manually labelled seeds incorporated into topic models, and they use either handcrafted rules or pre-labelled clues to perform implicit aspect detection. These constraints restrict them to a particular domain or language, which is
This paper aimed to test the random walk hypothesis on the ISX60 market index in order to judge market efficiency at the weak level. The study used the Serial Correlation Test, the Runs Test, the Variance Ratio Test, and the Rescaled Range Test. The study population is the Iraq Stock Exchange. The study concluded by accepting its hypothesis that the returns of the ISX60 market index in the Iraqi market for securities do not follow a random walk in general; as a result, the Iraqi market for securities is inefficient at the weak level of efficiency. The study recommended that the supervisors of the Iraqi market for securities work to activate all means that will communicate information
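As an illustration of one of these procedures, a Wald-Wolfowitz runs test on a return series could be sketched as below (NumPy/SciPy); the normal approximation and the sign split around the median are assumptions for illustration, not the paper's exact implementation.

```python
import numpy as np
from scipy.stats import norm

def runs_test(returns):
    """Runs test on the signs of returns, using the normal approximation."""
    signs = np.sign(returns - np.median(returns))
    signs = signs[signs != 0]                      # drop ties
    n_pos = np.sum(signs > 0)
    n_neg = np.sum(signs < 0)
    n = n_pos + n_neg
    runs = 1 + np.sum(signs[1:] != signs[:-1])     # observed number of runs
    mean_runs = 2.0 * n_pos * n_neg / n + 1.0
    var_runs = (mean_runs - 1.0) * (mean_runs - 2.0) / (n - 1.0)
    z = (runs - mean_runs) / np.sqrt(var_runs)
    p_value = 2.0 * (1.0 - norm.cdf(abs(z)))       # two-sided p-value
    return z, p_value
```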
Ground Penetrating Radar (GPR) is a nondestructive geophysical technique that uses electromagnetic waves to evaluate subsurface information. A GPR unit emits a short pulse of electromagnetic energy and is able to determine the presence or absence of a target by examining the energy reflected from that pulse. GPR is a geophysical approach that uses a band of the radio spectrum. In this research the function of GPR is summarized as surveying different buried objects (iron, plastic (PVC), aluminum) at a specified depth of about 0.5 m using a 250 MHz antenna; the response of each object can be recognized from its shape, and this recognition was performed using image processing
In this paper, an algorithm for binary codebook design is used in the vector quantization technique to improve the acceptability of the absolute moment block truncation coding (AMBTC) method. Vector quantization (VQ) is used to compress the bitmap (the output of the first method, AMBTC). The binary codebook can be generated for many images by randomly choosing code vectors from a set of binary image vectors, and this codebook is then used to compress all the bitmaps of these images. The choice of which image bitmap to compress with this codebook is based on the criterion of the average bitmap replacement error (ABPRE). This approach is suitable for reducing bit rates
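For context, AMBTC itself encodes each image block as a binary bitmap plus two reconstruction levels, and it is this bitmap that the binary codebook then replaces. A minimal sketch of the block-level AMBTC step (the block handling and function names are illustrative assumptions) is:

```python
import numpy as np

def ambtc_encode_block(block):
    """AMBTC for one grey-level block: binary bitmap plus low/high means."""
    mean = block.mean()
    bitmap = block >= mean                          # bitmap later replaced by a code vector
    high = block[bitmap].mean() if bitmap.any() else mean
    low = block[~bitmap].mean() if (~bitmap).any() else mean
    return bitmap, low, high

def ambtc_decode_block(bitmap, low, high):
    """Reconstruct the block from the bitmap and the two means."""
    return np.where(bitmap, high, low)
```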
A hiding technique for dynamic text encryption using an encoding table and a symmetric encryption method (the AES algorithm) is presented in this paper. The encoding table is generated dynamically from the MSBs of the cover-image points and is used as the first phase of encryption. The Harris corner point algorithm is applied to the cover image to generate the corner points, which are used to generate a dynamic AES key for the second phase of text encryption. The embedding process uses the LSBs of the image pixels, excluding the Harris corner points, for greater robustness. Experimental results demonstrate that the proposed scheme achieves good embedding quality, error-free text recovery, and high PSNR values.
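A rough sketch of the embedding phase, in the spirit of the description above, is given below using OpenCV's Harris detector; the threshold, traversal order, and the assumption that the ciphertext is already a bit array are illustrative choices, not the paper's exact scheme.

```python
import cv2
import numpy as np

def embed_lsb_skip_corners(cover_gray, cipher_bits, corner_thresh=0.01):
    """Embed ciphertext bits in pixel LSBs, skipping Harris corner points."""
    response = cv2.cornerHarris(np.float32(cover_gray), 2, 3, 0.04)
    is_corner = response > corner_thresh * response.max()

    stego = cover_gray.copy()
    flat = stego.ravel()                              # view into the stego image
    usable = np.flatnonzero(~is_corner.ravel())       # pixel indices that are not corners
    if len(cipher_bits) > len(usable):
        raise ValueError("cover image too small for this payload")
    idx = usable[:len(cipher_bits)]
    flat[idx] = (flat[idx] & 0xFE) | np.asarray(cipher_bits, dtype=np.uint8)
    return stego
```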
This study was done in Baghdad Teaching Hospital using a developed instrument of type GIOHO and included a number of patients with compressed breast thicknesses of (7, 8, 9, 10) cm.
The relationship between radiation dose and breast thickness was linear. All results were compared with the international standard values measured by the International Nuclear Agency and European sources; they were found to be consistent or to differ only slightly.
The study showed that the mean absorbed dose may be determined by using TLD measurements below 10 mGy; the glandular dose was (1.45 mGy), and this cannot be
Zernike Moments have been widely used in many shape-based image retrieval studies due to their powerful shape representation. However, their strengths and weaknesses have not been clearly highlighted in previous studies, so this powerful shape representation could not be fully utilized. In this paper, a method to fully capture the shape representation properties of Zernike Moments is implemented and tested on a single object for binary and grey-level images. The proposed method works by determining the boundary of the shape object and then resizing the object shape to the boundary of the image. Three case studies were made. Case 1 is the Zernike Moments implementation on the original shape-object image. In Case 2, the centroid of the shape
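The pre-processing step described above (locate the object boundary, then rescale the object to fill the image frame) could be sketched roughly as follows using OpenCV; the function name and the fixed output size are assumptions for illustration only.

```python
import cv2
import numpy as np

def resize_object_to_frame(binary_img, out_size=256):
    """Crop the shape to its bounding box and scale it to the image boundary."""
    contours, _ = cv2.findContours(binary_img, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    x, y, w, h = cv2.boundingRect(np.vstack(contours))   # boundary of the shape object
    cropped = binary_img[y:y + h, x:x + w]
    return cv2.resize(cropped, (out_size, out_size),
                      interpolation=cv2.INTER_NEAREST)
```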
The aim of the research is to compare the ARIMA (Auto-Regressive Integrated Moving Average) and ANN (Artificial Neural Network) models and to select the better one for predicting monthly relative humidity values, based on the standard errors between estimated and observed values. It has been noted that both can be used for estimation, and the better of the two is the ANN: the values of (MAE, RMSE, R2) are (0.036816, 0.0466, 0.91) respectively for the best ARIMA specification (6,0,2)(6,0,1), whereas the corresponding values for the best ANN architecture (5,5,1) are (0.0109, 0.0139, 0.991). The ANN model is therefore superior to ARIMA in this evaluation.
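The comparison criteria used above (MAE, RMSE and R2) can be computed as in this small NumPy sketch; the array names are assumptions.

```python
import numpy as np

def forecast_scores(observed, estimated):
    """MAE, RMSE and R-squared between observed and estimated series."""
    observed = np.asarray(observed, dtype=float)
    estimated = np.asarray(estimated, dtype=float)
    err = observed - estimated
    mae = np.mean(np.abs(err))
    rmse = np.sqrt(np.mean(err ** 2))
    ss_res = np.sum(err ** 2)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    return mae, rmse, r2
```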
ABSTRACT
In this paper, some semi-parametric spatial models were estimated: the semi-parametric spatial error model (SPSEM), which suffers from the problem of spatial error dependence, and the semi-parametric spatial autoregressive model (SPSAR). The maximum likelihood method was used to estimate the spatial error parameter (λ) in the SPSEM model and the spatial dependence parameter (ρ) in the SPSAR model, and non-parametric methods were used to estimate the smoothing function m(x) for these two models; these non-parametric methods are the local linear estimator (LLE), which requires finding the smoothing
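For reference, a local linear estimate of the smoothing function m(x) at a point x0 can be written as a weighted least squares fit, as in the sketch below; the Gaussian kernel and the bandwidth argument are illustrative assumptions, not the paper's specification.

```python
import numpy as np

def local_linear_estimate(x0, x, y, h):
    """Local linear estimator of m(x0) with a Gaussian kernel and bandwidth h."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)           # kernel weights around x0
    X = np.column_stack([np.ones_like(x), x - x0])   # local linear design matrix
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta[0]                                   # intercept = estimate of m(x0)
```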