In this paper, a fast lossless compression method for medical images is introduced. It is based on splitting the image into blocks according to their nature, using polynomial approximation to decompose the image signal, and then applying run-length coding to the residual part of the image, which represents the error caused by the polynomial approximation. Finally, Huffman coding is applied as a last stage to encode the polynomial coefficients and the run-length codes. The test results indicate that the suggested method achieves promising performance.
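The run-length stage described above can be illustrated with a minimal sketch; this is a generic run-length encoder/decoder pair, not the paper's exact implementation, and the function names are chosen for illustration:

```python
def rle_encode(values):
    """Run-length encode a flat sequence of residual values as (value, count) pairs."""
    runs = []
    for v in values:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1          # extend the current run
        else:
            runs.append([v, 1])       # start a new run
    return [(v, n) for v, n in runs]

def rle_decode(runs):
    """Invert rle_encode, recovering the original sequence losslessly."""
    return [v for v, n in runs for _ in range(n)]
```

Residuals from a good polynomial fit are dominated by long runs of small values, which is exactly where run-length coding pays off before the final Huffman stage.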
A nonlinear filter for smoothing color and gray images corrupted by Gaussian noise is presented in this paper. The proposed filter is designed to reduce the noise in the R, G, and B bands of color images while preserving edges. The filter is applied in order to prepare images for further processing such as edge detection and image segmentation. The results of computer simulations show that the proposed filter gives satisfactory results when compared with conventional filters such as the Gaussian low-pass filter and the median filter, using the Cross-Correlation Coefficient (CCC) criterion.
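The CCC criterion used for the comparison can be sketched as the standard normalized cross-correlation between the reference image and the filtered image, both treated as flat lists; the function name is illustrative:

```python
import math

def cross_correlation_coefficient(a, b):
    """Normalized cross-correlation between two equal-size images (flat pixel lists).
    Returns 1.0 for identical images, values near 0 for unrelated ones."""
    n = len(a)
    mean_a = sum(a) / n
    mean_b = sum(b) / n
    num = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    den = math.sqrt(sum((x - mean_a) ** 2 for x in a)
                    * sum((y - mean_b) ** 2 for y in b))
    return num / den
```

A filter that scores closer to 1.0 against the clean reference has preserved more of the original structure.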
A database is an organized and distributed collection of data that allows the user to access the stored information in a simple and convenient way. However, in the era of big data, traditional data analytics methods may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the MapReduce technique to handle big data distributed on the cloud. The approach was evaluated on a Hadoop server and applied to EEG big data as a case study. The proposed approach showed clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r
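The MapReduce pattern used above can be sketched in-process as map, shuffle-by-key, then reduce; the EEG-channel example data below is hypothetical and stands in for records a Hadoop job would read from distributed storage:

```python
from collections import defaultdict

def map_reduce(records, mapper, reducer):
    """Minimal in-process MapReduce: map each record to (key, value) pairs,
    shuffle values by key, then reduce each key's value list."""
    groups = defaultdict(list)
    for rec in records:                     # map phase
        for key, value in mapper(rec):
            groups[key].append(value)       # shuffle phase
    return {key: reducer(key, values)       # reduce phase
            for key, values in groups.items()}

# Hypothetical example: mean amplitude per EEG channel.
samples = [("C3", 1.0), ("C4", 3.0), ("C3", 2.0)]
averages = map_reduce(
    samples,
    mapper=lambda rec: [(rec[0], rec[1])],
    reducer=lambda key, vals: sum(vals) / len(vals),
)
```

On Hadoop the same mapper/reducer pair runs in parallel across cluster nodes, which is the source of the response-time reduction the abstract reports.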
It is well known that sonography is not the first choice for detecting early breast tumors. Improving the resolution of the breast sonographic image is the goal of many workers, in order to make sonography a first-choice examination, since it is a safe, easy, and cost-effective procedure. In this study, infrared light exposure of the breast prior to ultrasound examination was implemented to observe its effect on the resolution of the sonographic image. Results showed that significant improvement was obtained in 60% of cases.
The present research studies the symmetries of interior design in fast-food restaurant spaces in terms of form, as symmetry is an important element that plays a direct role in the spatial configuration, which is designed in its performance, aesthetic, and expressive aspects. Since the choice of shapes is a complex subject with many aspects imposed by functional and aesthetic correlations, the research problem is represented by the following question: to what extent can the symmetries of interior design be used in the spaces of fast-food restaurants?
The research acquires its importance by contributing knowledge for researchers, scholars, companies, and the specialized public.
In many video and image processing applications, the frames are partitioned into blocks, which are extracted and processed sequentially. In this paper, we propose a fast algorithm for calculating the features of overlapping image blocks. We assume the features are projections of the block on separable 2D basis functions (usually orthogonal polynomials), where we benefit from the symmetry with respect to the spatial variables. The main idea is based on constructing auxiliary matrices that virtually extend the original image and make it possible to avoid time-consuming computation in loops. These matrices can be pre-calculated, stored, and used repeatedly, since they are independent of the image itself. We validated experimentally th
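The separability exploited above can be illustrated with a minimal sketch: projecting a block on a basis function that factors as u(x)·v(y) lets the 2D double sum collapse into two 1D passes. This is a generic illustration of separable projection, not the paper's auxiliary-matrix algorithm:

```python
def block_projection(block, u, v):
    """Projection c = sum_x sum_y u[x] * v[y] * block[x][y] of a 2D block
    on a separable basis function; separability factors the double sum
    into a row pass followed by a column pass."""
    # first pass: collapse each row against v
    row_sums = [sum(vy * pxy for vy, pxy in zip(v, row)) for row in block]
    # second pass: collapse the row sums against u
    return sum(ux * r for ux, r in zip(u, row_sums))
```

For an N-by-N block this costs O(N^2) multiplications instead of forming the full 2D basis explicitly, and the paper's auxiliary matrices push the saving further when blocks overlap.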
The aim of this paper is to discuss several high-performance training algorithms that fall into two main categories. The first category uses heuristic techniques, which were developed from an analysis of the performance of the standard gradient descent algorithm. The second category of fast algorithms uses standard numerical optimization techniques such as quasi-Newton methods. A further aim is to address the drawbacks of these training algorithms and propose an efficient training algorithm for FFNN.
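The standard gradient descent baseline that both categories improve upon can be sketched as follows; this is the textbook update rule on a toy one-dimensional objective, not any of the paper's proposed algorithms:

```python
def gradient_descent(grad, w0, lr=0.1, steps=100):
    """Plain gradient descent: repeatedly apply w <- w - lr * grad(w)."""
    w = w0
    for _ in range(steps):
        w = w - lr * grad(w)
    return w

# Toy objective f(w) = (w - 3)^2 with gradient 2 * (w - 3);
# the minimizer is w = 3.
w_star = gradient_descent(lambda w: 2 * (w - 3), w0=0.0)
```

Heuristic variants adapt the learning rate or add momentum to this loop, while quasi-Newton methods replace the raw gradient step with a search direction built from an approximation of the inverse Hessian.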
Analyzing economic, financial, and other phenomena requires building an appropriate model that represents the causal relations between factors. Building the model depends on capturing the surrounding conditions and factors in a mathematical formula, and researchers aim to construct that formula appropriately. Classical linear regression models are an important statistical tool, but they are used in a limited way, since it is assumed that the relationship between the explanatory variables and the response variable is fixed and identifiable. To expand the representation of the relationships between the variables describing the phenomenon under discussion, we used varying coefficient models.
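The varying coefficient model mentioned above generalizes the classical linear form by letting each coefficient be a smooth function of an effect-modifying variable rather than a constant; in standard notation (symbols here follow the common textbook formulation, not necessarily the paper's):

```latex
y_i = \sum_{j=1}^{p} \beta_j(u_i)\, x_{ij} + \varepsilon_i
```

When every $\beta_j(\cdot)$ is a constant this reduces to ordinary linear regression, so the model strictly expands the class of relationships that can be represented.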