This research introduces multi-wavelet transform and neural network techniques for recognizing 3-D objects from 2-D images using patches. The proposed techniques were tested on a database of patch features and of the high-energy subband (gp) of the discrete multi-wavelet transform (DMWT) of the patches. The test set has two groups: group (1) contains images, their (gp) patches, and patch features, some matching entries in the data set and some from other images; group (2) contains (gp) patches and patch features that match part of the database but after modifications such as rotation, scaling, and translation. Recognition by a back-propagation (BP) neural network scored 94% and 83% on group (1) using (gp) and features respectively, much better than matching by minimum distance. Recognition using the neural network (NN) scored 94% and 72% on group (2) using (gp) and features respectively, while minimum distance scored only 11% and 33%. The time consumed by recognition using the NN with (gp) is also less than that of minimum distance.
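For illustration, below is a minimal sketch of the minimum-distance baseline on wavelet-subband features. Common libraries expose no discrete multi-wavelet transform, so a single-level dwt2 approximation subband stands in for the paper's DMWT (gp) subband; the function names, wavelet choice, and toy data are all assumptions, not the paper's implementation.

```python
# Sketch of the minimum-distance matching baseline on wavelet-subband
# features. pywt has no discrete multi-wavelet transform, so a standard
# dwt2 approximation subband stands in for the (gp) subband; names and
# data here are illustrative.
import numpy as np
import pywt

def gp_features(patch: np.ndarray) -> np.ndarray:
    """Flatten the high-energy (approximation) subband into a feature vector."""
    cA, (cH, cV, cD) = pywt.dwt2(patch.astype(float), "db2")
    return cA.ravel()

def min_distance_match(query: np.ndarray, database: list[np.ndarray]) -> int:
    """Return the index of the database entry closest in Euclidean distance."""
    dists = [np.linalg.norm(query - ref) for ref in database]
    return int(np.argmin(dists))

# Toy usage: 8x8 patches standing in for object patches.
rng = np.random.default_rng(0)
db_patches = [rng.random((8, 8)) for _ in range(10)]
db_feats = [gp_features(p) for p in db_patches]
query = db_patches[3] + 0.01 * rng.random((8, 8))        # perturbed copy
print(min_distance_match(gp_features(query), db_feats))  # -> 3
```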
The wavelet shrinkage estimator is an attractive technique for estimating nonparametric regression functions, but it is very sensitive to correlation in the errors. In this research, a low-degree polynomial model was used to address the boundary problem in wavelet shrinkage, together with level-dependent threshold values for the case of correlated errors: these treat the coefficients at each level separately, unlike global threshold values that treat all levels simultaneously, such as the VisuShrink, False Discovery Rate, Improvement Thresholding, and SureShrink methods. The study was conducted on real monthly data representing the rates of theft crimes…
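As a point of reference, the global VisuShrink rule mentioned above can be sketched in a few lines with PyWavelets; the signal, wavelet, and decomposition depth below are assumptions, and a level-dependent scheme of the kind the abstract favors would instead estimate a threshold per level.

```python
# Minimal VisuShrink sketch: universal threshold sigma*sqrt(2*log n) with
# soft thresholding of the detail coefficients. A level-dependent scheme,
# as discussed above for correlated errors, would estimate sigma and pick
# a threshold per level. The noisy signal `y` is illustrative.
import numpy as np
import pywt

y = np.sin(np.linspace(0, 4 * np.pi, 1024)) + 0.3 * np.random.randn(1024)

coeffs = pywt.wavedec(y, "sym8", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # MAD noise estimate
lam = sigma * np.sqrt(2 * np.log(len(y)))           # universal threshold
coeffs[1:] = [pywt.threshold(c, lam, mode="soft") for c in coeffs[1:]]
y_hat = pywt.waverec(coeffs, "sym8")
```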
Business organizations have faced many challenges in recent times, the most important of which is information technology, because it is widely spread and easy to use. Its use has led to an unprecedented increase in the amount of data that business organizations deal with. The amount of data available through the internet is a problem that many parties seek to find solutions for: why is it available there in such a huge, haphazard amount? Many forecasts revealed that by 2017 the number of devices connected to the internet would be an estimated three times the population of the Earth, and that in 2015 more than one and a half billion gigabytes of data were transferred every minute globally. Thus, so-called data mining emerged as a…
As a result of the significance of image compression in reducing the volume of data, this compression is permanently necessary so that data can be transferred more quickly over communication channels and stored in less memory space. In this study, an efficient compression system is suggested; it depends on transform coding (the Discrete Cosine Transform or the bi-orthogonal (tap-9/7) wavelet transform) followed by the LZW compression technique. The suggested scheme was applied to color and gray models, with the transform coding applied to decompose each color and gray sub-band individually. A quantization process is then performed, followed by LZW coding to compress the images. The suggested system was applied to a set of seven standard images…
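The pipeline described above, transform, quantize, then entropy-code, can be sketched as follows; the block size, quantization step, and textbook LZW coder are assumptions for illustration, not the paper's exact implementation.

```python
# Sketch of transform coding plus LZW: DCT a block, quantize, then
# LZW-encode the symbol stream. Block size and step size are illustrative.
import numpy as np
from scipy.fft import dctn

def lzw_encode(symbols: list[str]) -> list[int]:
    """Textbook LZW over a list of string symbols."""
    table = {s: i for i, s in enumerate(sorted(set(symbols)))}
    out, w = [], symbols[0]
    for s in symbols[1:]:
        if w + "," + s in table:
            w = w + "," + s            # extend the current phrase
        else:
            out.append(table[w])       # emit code, grow the dictionary
            table[w + "," + s] = len(table)
            w = s
    out.append(table[w])
    return out

block = np.random.default_rng(1).integers(0, 256, (8, 8)).astype(float)
q = 16                                               # illustrative step size
quantized = np.round(dctn(block, norm="ortho") / q).astype(int)
codes = lzw_encode([str(v) for v in quantized.ravel()])
```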
The current research presents a comparative analysis of the estimation of the Meixner process parameters via the wavelet packet transform. Of particular relevance, it compares the moment method and the wavelet packet estimator for the four parameters of the Meixner process. The research focuses on finding the best threshold value using the square-root-log and modified square-root-log methods with wavelet packets in the presence of noise, to enhance the efficiency and effectiveness of denoising the financial asset market signal. In this regard, a simulation study compares the performance of moment estimation and wavelet packets for different sample sizes. The results show that wavelet packets…
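The wavelet-packet denoising step with a square-root-log-style threshold can be sketched as below; the toy signal, wavelet, and depth are assumptions standing in for the paper's Meixner data.

```python
# Sketch of wavelet-packet denoising with a sqrt(2*log n) threshold, in
# the spirit of the square-root-log rule named above. Signal and depth
# are illustrative, not the paper's data.
import numpy as np
import pywt

x = np.cumsum(np.random.randn(1024))          # toy "asset price" path
wp = pywt.WaveletPacket(data=x, wavelet="db4", mode="symmetric", maxlevel=4)

lam = np.sqrt(2 * np.log(len(x)))             # square-root-log threshold
for node in wp.get_level(4, order="natural"):
    node.data = pywt.threshold(node.data, lam, mode="soft")
x_hat = wp.reconstruct(update=True)
```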
This article proposes a new technique for determining the rate of contamination. First, a parallel-processing generative adversarial network (GAN) is constructed and trained using real and secret images. Then, after the model has stabilized, the real image is passed to the generator. Finally, the generator creates an image that is visually similar to the secret image, thus achieving the same effect as transmitting the secret image. Experimental results show that this technique improves the security of secret-information transmission and increases the information-hiding capacity. The signal-to-noise metric and the structural similarity index measure were used to determine the success of colour image-hiding techniques…
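The two quality metrics named above are commonly computed with scikit-image, as in the sketch below; the cover/stego images here are random placeholders, not output of the paper's GAN.

```python
# Sketch of the PSNR and SSIM metrics used to judge hiding quality; the
# images are placeholders standing in for cover and stego images.
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

rng = np.random.default_rng(2)
cover = rng.integers(0, 256, (64, 64, 3)).astype(np.uint8)
stego = np.clip(cover.astype(int) + rng.integers(-2, 3, cover.shape),
                0, 255).astype(np.uint8)

print("PSNR:", peak_signal_noise_ratio(cover, stego))
print("SSIM:", structural_similarity(cover, stego, channel_axis=-1))
```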
Research on the automated extraction of essential data from electrocardiography (ECG) recordings has been a significant topic for a long time. The main focus of digital processing is to measure the fiducial points that determine the beginning and end of the P, QRS, and T waves based on their waveform properties. The presence of unavoidable noise during ECG data collection, together with inherent physiological differences among individuals, makes it challenging to identify these reference points accurately, resulting in suboptimal performance. This is done through several primary stages that rely on preliminary processing of the ECG electrical signal through a set of steps (preparing raw data and converting them into files that…
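A classical preprocessing front end for fiducial-point detection, in the Pan-Tompkins style, is sketched below; the sampling rate and synthetic signal are placeholders, and this is a generic stand-in rather than the paper's exact pipeline.

```python
# Sketch of a Pan-Tompkins-style QRS front end: band-pass, differentiate,
# square, moving-window integrate, peak-pick. Signal and rate are toys.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

fs = 360                                    # Hz, e.g. MIT-BIH sampling rate
t = np.arange(0, 10, 1 / fs)
ecg = np.sin(2 * np.pi * 1.2 * t) ** 63     # crude spiky stand-in for ECG

b, a = butter(3, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
filtered = filtfilt(b, a, ecg)
squared = np.diff(filtered) ** 2                  # emphasise QRS slopes
window = int(0.15 * fs)                           # ~150 ms integration
integrated = np.convolve(squared, np.ones(window) / window, mode="same")

peaks, _ = find_peaks(integrated, height=0.3 * integrated.max(),
                      distance=int(0.25 * fs))    # refractory ~250 ms
print(f"{len(peaks)} QRS candidates detected")
```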
Image compression is very important in reducing the costs of data storage and transmission over relatively slow channels. The wavelet transform has received significant attention because its multiresolution decomposition allows efficient image analysis. This paper attempts to give an understanding of the wavelet transform using two of the more popular wavelet techniques, Haar and Daubechies, and compares their effects on image compression.
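Such a comparison can be sketched with PyWavelets as below: decompose, keep only the largest coefficients, reconstruct, and compare reconstruction error. The test image, the 5% retention rate, and the db4 member of the Daubechies family are illustrative choices.

```python
# Sketch of a Haar-vs-Daubechies compression comparison: keep the top 5%
# of coefficients by magnitude and compare reconstruction MSE.
import numpy as np
import pywt

def compress(img: np.ndarray, wavelet: str, keep: float = 0.05) -> np.ndarray:
    coeffs = pywt.wavedec2(img, wavelet, level=3)
    arr, slices = pywt.coeffs_to_array(coeffs)
    thresh = np.quantile(np.abs(arr), 1 - keep)   # keep top 5% by magnitude
    arr[np.abs(arr) < thresh] = 0
    kept = pywt.array_to_coeffs(arr, slices, output_format="wavedec2")
    return pywt.waverec2(kept, wavelet)

img = np.random.default_rng(3).random((128, 128))
for w in ("haar", "db4"):
    err = np.mean((img - compress(img, w)) ** 2)
    print(f"{w}: MSE = {err:.5f}")
```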
Permeability determination in carbonate reservoirs is a complex problem because they tend to be tight and heterogeneous, and core samples are usually available for only a few wells. Predicting permeability with low cost and reliable accuracy is therefore an important issue, which makes permeability predictive models very desirable.
This paper develops a permeability predictive model for one Iraqi carbonate reservoir from core and well-log data using the principle of Hydraulic Flow Units (HFUs). An HFU is a function of the Flow Zone Indicator (FZI), which is a good parameter for determining HFUs.
Histogram analysis, probability analysis, and a log-log plot of the Reservoir Quality Index (RQI)…
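The abstract does not spell out the HFU relations, but the standard FZI formulation (Amaefule et al., 1993) can be sketched as below; the core values are placeholders, not data from the studied reservoir.

```python
# Sketch of the standard FZI relations behind the HFU method:
#   RQI = 0.0314 * sqrt(k / phi),  phi_z = phi / (1 - phi),  FZI = RQI / phi_z
# with k in millidarcies and phi as a fraction. Core values are placeholders.
import numpy as np

k = np.array([12.0, 85.0, 240.0])      # core permeability, mD
phi = np.array([0.11, 0.18, 0.24])     # core effective porosity, fraction

rqi = 0.0314 * np.sqrt(k / phi)        # Reservoir Quality Index, microns
phi_z = phi / (1 - phi)                # normalised porosity
fzi = rqi / phi_z                      # Flow Zone Indicator, microns

for row in zip(k, phi, fzi):
    print("k=%6.1f mD  phi=%.2f  FZI=%.2f" % row)
```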