This paper determines the difference between a first, healthy image and a second, infected image by using logic gates. The proposed algorithm was applied first to binary images, second to grayscale images, and third to color images. The algorithm begins by preprocessing the images: each image is zero-padded and convolved to expose more detail and features, edges are enhanced with a Laplacian edge-detection filter, and the result is smoothed with a mean filter. To determine the change between the original image and the injured one, logic gates, specifically the XOR gate, are applied. Applying the technique to dental images, this comparison can locate an injury; the difference may indicate tooth decay, a broken bone, cancerous cells, or infected gums. The XOR gate proved the best-suited gate for this technique. A simulation program written in Visual Basic was used to produce the final results.
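The core comparison described above can be sketched in a few lines. This is a minimal illustration, not the paper's Visual Basic implementation: it assumes two same-sized binary images and marks every pixel where they disagree using XOR.

```python
import numpy as np

# Illustrative sketch: XOR two binary images to locate changed pixels.
# Assumes both images have the same shape and pixel values of 0 or 1.
def xor_difference(original, infected):
    a = np.asarray(original, dtype=np.uint8)
    b = np.asarray(infected, dtype=np.uint8)
    return np.bitwise_xor(a, b)  # 1 wherever the two images differ

original = np.array([[0, 1, 1], [0, 0, 1]], dtype=np.uint8)
infected = np.array([[0, 1, 0], [1, 0, 1]], dtype=np.uint8)
diff = xor_difference(original, infected)
# nonzero entries of diff are candidate injury locations
```

For grayscale or color images, the same bitwise XOR can be applied per channel after thresholding or directly on the 8-bit values.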
Leishmania species are the causative agents of a tropical disease known as leishmaniasis. Previous studies on the Old World species Leishmania major showed that the amastigote form, which resides inside the macrophages of the vertebrate host, utilizes the host's sphingolipids for survival and proliferation. In this study, gene expression of serine palmitoyltransferase (SPT) subunit two (MmLCB2), the first enzyme in de novo sphingolipid biosynthesis, was measured in the mouse macrophage cell line (RAW264.7) in both infected and non-infected macrophages. Expression was assayed under conditions where available sphingolipid was reduced, following infection with the New World species Leishmania mexicana. qPCR analysis showed that there was no difference …
Image compression is one of the data compression types applied to digital images in order to reduce the high cost of their storage and/or transmission. Image compression algorithms can exploit the visual sensitivity and statistical properties of image data to deliver results superior to generic data compression schemes used for other digital data. In the first approach, the input image is divided into blocks of 16 x 16, 32 x 32, or 64 x 64 pixels. Each block is first converted into a string and then encoded by a lossless entropy-coding algorithm known as arithmetic coding. Pixel values that occur more often are coded in fewer bits than pixel values that occur less often …
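The block-splitting and frequency-counting steps above can be sketched as follows. This is a hedged illustration with made-up block size and image data; it builds only the symbol-frequency model that an arithmetic coder would consume, not the coder itself.

```python
import numpy as np
from collections import Counter

# Sketch: split an image into fixed-size blocks, flatten each block into a
# symbol string, and count symbol frequencies. These frequencies form the
# probability model that drives arithmetic coding.
def blocks(image, size):
    h, w = image.shape
    for y in range(0, h, size):
        for x in range(0, w, size):
            yield image[y:y + size, x:x + size]

image = np.arange(16, dtype=np.uint8).reshape(4, 4) % 4  # toy 4x4 image
freq = Counter()
for block in blocks(image, 2):
    freq.update(block.flatten().tolist())
# more frequent pixel values receive larger probability intervals,
# and therefore fewer bits, in arithmetic coding
```

In a full pipeline each block's string would then be passed to the arithmetic coder along with this model.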
Background and Aim: Due to the rapid growth of data communication and multimedia system applications, security has become a critical issue in the communication and storage of images. This study aims to improve encryption and decryption for various types of images by decreasing time consumption and strengthening security. Methodology: An algorithm is proposed for encrypting images based on the Carlisle Adams and Stafford Tavares (CAST) block cipher combined with 3D and 2D logistic maps. A chaotic function that increases the randomness in the encrypted data and images, thereby breaking sequential relations through the encryption procedure, is introduced. Time is decreased by using three secure and private S-boxes rather than using si…
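The chaotic component can be illustrated with the simplest member of the logistic-map family. This is a simplified sketch, not the paper's algorithm: it uses the 1D logistic map x_{n+1} = r·x_n·(1 − x_n) to derive a keystream (the paper's 2D and 3D maps extend the same idea), and the parameters r and x0 are illustrative, not taken from the study.

```python
# Simplified sketch: a chaotic byte stream from the 1D logistic map.
# r near 4 keeps the map in its chaotic regime; x0 acts as a secret seed.
def logistic_keystream(x0, r, n):
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(int(x * 255) & 0xFF)  # quantize state to one byte
    return out

stream = logistic_keystream(0.61, 3.99, 8)
pixels = [10, 20, 30, 40]
cipher = [p ^ k for p, k in zip(pixels, stream)]   # encrypt by XOR
plain = [c ^ k for c, k in zip(cipher, stream)]    # decrypt by XOR again
```

In the proposed scheme such chaotic values would additionally drive the CAST rounds and S-box selection rather than acting as a plain XOR stream.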
Building a system to identify individuals through their speech recordings finds application in diverse areas, such as telephone shopping, voice mail, and security control. However, building such systems is a difficult task because of the vast range of variation in the human voice. Selecting strong features therefore becomes crucial for the recognition system. Accordingly, a speaker recognition system based on new spin-image descriptors (SISR) is proposed in this paper. In the proposed system, circular windows (spins) are extracted from the frequency domain of the spectrogram image of the sound, and a run-length matrix is then built for each spin to serve as a base for the feature extraction tasks. Five different descriptors are generated fro…
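The run-length matrix step can be sketched for horizontal runs over a small quantized patch. This is a hedged illustration of the general gray-level run-length matrix, not the SISR descriptors themselves: the patch values, number of levels, and maximum run length are all made up, and the paper would take the patch from a circular window of the spectrogram.

```python
import numpy as np

# Sketch: gray-level run-length matrix for horizontal runs.
# rlm[g, r-1] counts how many runs of gray level g have length r.
def run_length_matrix(patch, levels, max_run):
    rlm = np.zeros((levels, max_run), dtype=int)
    for row in patch:
        run, prev = 1, row[0]
        for v in row[1:]:
            if v == prev:
                run += 1
            else:
                rlm[prev, min(run, max_run) - 1] += 1
                run, prev = 1, v
        rlm[prev, min(run, max_run) - 1] += 1  # close the final run
    return rlm

patch = np.array([[0, 0, 1], [1, 1, 1]])
rlm = run_length_matrix(patch, levels=2, max_run=3)
# rlm[0, 1] counts the run of two 0s; rlm[1, 2] the run of three 1s
```

Scalar statistics computed from such a matrix (short-run emphasis, long-run emphasis, and so on) are the kind of features a descriptor could be built from.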
We explore the transform coefficients of fractal coding and develop a new method to improve the compression capabilities of these schemes. In most standard encoder/decoder systems, quantization and de-quantization are managed as a separate step; here we introduce a new method in which they are handled simultaneously. This method achieves additional compression with high image quality, as shown later.