In many video and image processing applications, frames are partitioned into blocks that are extracted and processed sequentially. In this paper, we propose a fast algorithm for calculating features of overlapping image blocks. We assume the features are projections of the block onto separable 2D basis functions (usually orthogonal polynomials), where we benefit from the symmetry with respect to the spatial variables. The main idea is the construction of auxiliary matrices that virtually extend the original image and make it possible to avoid time-consuming computation in loops. These matrices can be pre-calculated, stored and reused, since they are independent of the image itself. We validated experimentally that the speed-up of the proposed method over traditional approaches reaches up to 20 times, depending on the block parameters.
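The projection onto separable 2D basis functions mentioned above can be sketched as two one-sided matrix products, F = BᵀXB, where X is the image block and the columns of B are 1D basis functions. The basis and block below are illustrative assumptions, not the paper's actual setup:

```python
# Sketch: features of a block as projections onto a separable 2D basis.
# Basis vectors and block values here are illustrative, not from the paper.

def features(block, basis):
    """Project an n x n block onto a separable 2D basis: F = B^T X B."""
    n = len(block)
    k = len(basis)  # number of 1D basis functions, each of length n
    # tmp = B^T X  (k x n): apply the basis along the rows
    tmp = [[sum(basis[p][i] * block[i][j] for i in range(n)) for j in range(n)]
           for p in range(k)]
    # F = tmp B  (k x k): apply the basis along the columns
    return [[sum(tmp[p][j] * basis[q][j] for j in range(n)) for q in range(k)]
            for p in range(k)]

# First two (unnormalized) discrete orthogonal polynomials on 4 samples:
basis = [[1, 1, 1, 1],      # constant term
         [-3, -1, 1, 3]]    # linear term
block = [[1, 2, 3, 4]] * 4  # horizontal ramp, constant along columns
F = features(block, basis)  # F[1][0] vanishes: no vertical variation
```

Because the basis is separable, the two passes cost O(kn²) each instead of O(k²n²) for a direct 2D projection, which is the symmetry the abstract exploits.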
This research aims to improve the radiation shielding properties of polymer-based materials by mixing PVC with locally available building materials. Specifically, two key parameters of fast-neutron attenuation (the removal cross-section and the half-value layer) were studied for composite materials comprising PVC reinforced with common building materials (cement, sand, gypsum and marble) in different proportions (10%, 30% and 50% by weight). To assess their effectiveness as protection against fast neutrons, the macroscopic neutron cross-section was calculated for each composite. Results show that the neutron cross-section values are significantly affected by the reinforcement ratios, and that the composite material PVC + 50% gypsum is an effective shield against fast neutrons.
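The two attenuation parameters are directly related: for exponential attenuation I(x) = I₀·exp(−Σ_R·x), the half-value layer follows from the macroscopic removal cross-section as HVL = ln 2 / Σ_R. A minimal sketch (the cross-section value below is hypothetical, not one of the paper's measurements):

```python
import math

def half_value_layer(sigma_r):
    """Thickness (cm) that halves the fast-neutron flux, given the
    macroscopic removal cross-section sigma_r in cm^-1."""
    return math.log(2) / sigma_r

sigma_r = 0.10                   # hypothetical removal cross-section, cm^-1
hvl = half_value_layer(sigma_r)  # roughly 6.93 cm for this example value
```

A larger removal cross-section therefore means a thinner half-value layer, which is why the abstract treats the two quantities as the key figures of merit.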
This paper presents the design of an algorithm that represents the design stages of a fixturing system, increasing the flexibility and automation of fixturing-system planning for uniform polyhedral parts. The system requires a manufacturing-feature-recognition algorithm to describe its inputs (the configuration of the workpiece) and a database system representing the production plan and the existing fixturing systems. A knowledge-based system was also developed to find the best fixturing analysis for a workpiece: its setup, its constraints, and the arrangement of the contacts on it.
Traffic classification is the task of categorizing traffic flows into application-aware classes such as chat, streaming, VoIP, etc. Most network traffic identification systems are based on features; these features may be static signatures, port numbers, statistical characteristics, and so on. Although current methods of data-flow classification are effective, they still lack inventive approaches that meet vital needs such as real-time traffic classification, low power consumption, low Central Processing Unit (CPU) utilization, etc. Our novel Fast Deep Packet Header Inspection (FDPHI) traffic classification proposal employs a 1-Dimensional Convolutional Neural Network (1D-CNN) to automatically learn more representational characteristics.
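The core operation of such a 1D-CNN layer can be sketched as a one-dimensional convolution over the packet-header bytes followed by a non-linearity. The kernel values and input below are illustrative assumptions; an FDPHI-style model would learn many such kernels by backpropagation:

```python
# Minimal sketch of the 1D convolution at the heart of a 1D-CNN layer,
# applied to a vector of packet-header bytes. Kernel and input values are
# illustrative only; a trained model learns the kernel weights.

def conv1d_relu(x, kernel, bias=0.0):
    """Valid 1D convolution (cross-correlation) followed by ReLU."""
    k = len(kernel)
    out = []
    for i in range(len(x) - k + 1):
        s = bias + sum(x[i + j] * kernel[j] for j in range(k))
        out.append(max(0.0, s))  # ReLU keeps only positive responses
    return out

header = [0x45, 0x00, 0x00, 0x3c, 0x1c, 0x46]   # first bytes of an IPv4 header
feature_map = conv1d_relu(header, [1.0, -1.0])  # simple difference detector
```

Stacking several such layers with pooling lets the network learn header features directly, instead of relying on hand-crafted signatures or port numbers.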
A frequently used approach to denoising is shrinkage of the coefficients of the noisy signal's representation in a transform domain. This paper proposes an algorithm based on a hybrid transform (a stationary wavelet transform followed by a slantlet transform): the slantlet transform is applied to the approximation subband of the stationary wavelet transform. The BlockShrink thresholding technique is then applied to the hybrid transform coefficients; this technique can decide the optimal block size and threshold for every wavelet subband by minimizing Stein's unbiased risk estimate (SURE). The proposed algorithm was implemented in MATLAB R2010a with natural images contaminated by white Gaussian noise. Numerical results show that our algorithm co
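The block-thresholding idea can be sketched as follows: coefficients within a block are kept or shrunk together according to the block's total energy, rather than one coefficient at a time. The threshold and block size below are fixed illustrative values; the paper selects both per subband by minimizing SURE:

```python
# Hedged sketch of block thresholding in the BlockShrink spirit. The shrink
# factor and block size are illustrative; the paper chooses them per subband
# via Stein's unbiased risk estimate (SURE).

def block_shrink(coeffs, block_size, lam, sigma):
    """James-Stein-style block shrinkage of transform coefficients."""
    out = []
    for start in range(0, len(coeffs), block_size):
        blk = coeffs[start:start + block_size]
        energy = sum(c * c for c in blk)
        # Shrink the whole block; kill it when its energy is below threshold.
        factor = max(0.0, 1.0 - lam * len(blk) * sigma ** 2 / energy) if energy else 0.0
        out.extend(c * factor for c in blk)
    return out

noisy = [5.0, -4.0, 0.3, -0.2, 0.1, 0.4]   # two strong, four weak coefficients
den = block_shrink(noisy, block_size=2, lam=1.0, sigma=1.0)
```

High-energy blocks (likely signal) survive nearly intact, while low-energy blocks (likely noise) are zeroed as a group, which is what makes block thresholding less prone to spurious isolated coefficients than term-by-term shrinkage.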
The Gumbel distribution has been treated with great care by researchers and statisticians. There are traditional methods for estimating the two parameters of the Gumbel distribution, known as Maximum Likelihood, the Method of Moments, and, more recently, the re-sampling method called the Jackknife. However, these methods suffer from some mathematical difficulties when solved analytically. Accordingly, there are other, non-traditional methods, such as the nearest-neighbors principle used in computer science and, especially, in artificial-intelligence algorithms, including the genetic algorithm, the artificial-neural-network algorithm, and others that may be classified as meta-heuristic methods. Moreover, this nearest-neighbors principle has useful statistical features.
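Of the traditional methods named above, the Method of Moments has a closed form worth recalling: for Gumbel(μ, β), the mean is μ + γβ and the variance is (π²/6)β², with γ the Euler-Mascheroni constant, giving β̂ = s·√6/π and μ̂ = x̄ − γβ̂. A sketch on made-up sample data:

```python
import math

# Method of Moments estimators for the Gumbel distribution. The sample below
# is invented for illustration; it is not data from the paper.

EULER_GAMMA = 0.5772156649  # Euler-Mascheroni constant

def gumbel_moments(sample):
    """Return (mu_hat, beta_hat) from sample mean and (population) variance."""
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / n
    beta = math.sqrt(6.0 * var) / math.pi   # from var = (pi^2 / 6) * beta^2
    mu = mean - EULER_GAMMA * beta          # from mean = mu + gamma * beta
    return mu, beta

mu, beta = gumbel_moments([2.1, 3.4, 2.8, 5.0, 3.1, 4.2])
```

Unlike Maximum Likelihood, which requires solving a nonlinear equation iteratively, these estimates are direct, which is one reason the Method of Moments remains a common baseline despite its lower efficiency.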
The use of remote sensing techniques for managing and monitoring environmental areas is increasing, owing to improvements in the sensors carried by Earth-observation satellites. Resolution merging combines a high-resolution single-band image with a low-resolution multi-band image to produce an image with both high spatial and high spectral resolution. In this work, different merging methods were tested to evaluate their capability to enhance the extraction of different environmental areas: Principal Component Analysis (PCA), Brovey, the modified Intensity-Hue-Saturation (IHS) method, and the High-Pass Filter method were tested and subjected to visual and statistical comparison. Both visu
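Of the merging methods listed, the Brovey transform is the simplest to state: each low-resolution band is rescaled pixel-wise by the ratio of the high-resolution panchromatic band to the sum of the multispectral bands. A minimal sketch with invented pixel values:

```python
# Hedged sketch of the Brovey resolution merge. Pixel values are illustrative;
# real inputs would be co-registered satellite bands resampled to a common grid.

def brovey(bands, pan):
    """bands: list of per-band pixel lists; pan: panchromatic pixel list.
    Each output pixel is band * pan / (sum of all bands) at that location."""
    n = len(pan)
    total = [sum(b[i] for b in bands) for i in range(n)]
    return [[b[i] * pan[i] / total[i] if total[i] else 0.0 for i in range(n)]
            for b in bands]

r, g, b = [80, 100], [60, 90], [40, 70]   # two pixels of three spectral bands
fused = brovey([r, g, b], pan=[180, 300]) # injects the pan band's detail
```

The ratio preserves the relative proportions between bands (hence hue) while taking brightness from the panchromatic band, which is also why Brovey is known to distort spectral statistics more than PCA or High-Pass Filter merging.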