This paper presents a combination of enhancement techniques for fingerprint images affected by different types of noise. These techniques were applied to improve image quality and achieve acceptable image contrast. The proposed method comprised five enhancement techniques: Normalization, Histogram Equalization, Binarization, Skeletonization, and Fusion. The Normalization process standardized the pixel intensities, which facilitated the subsequent enhancement stages. The Histogram Equalization technique then increased the contrast of the images. Furthermore, the Binarization and Skeletonization techniques were applied to separate the ridge and valley structures and to reduce the ridges to one-pixel-wide lines. Finally, the Fusion technique merged the results of the Histogram Equalization process with those of the Skeletonization process to obtain the new high-contrast images. The proposed method was tested on images of varying quality from the National Institute of Standards and Technology (NIST) Special Database 14. The experimental results are very encouraging, and the enhancement method proved effective at improving images of varying quality.
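The first three stages of such a pipeline can be sketched in a few lines of NumPy. This is an illustrative sketch, not the paper's implementation: the normalization follows the common target-mean/target-variance form, the equalization is a plain global histogram remap, and the binarization uses a simple global threshold (real fingerprint pipelines typically threshold adaptively):

```python
import numpy as np

def normalize(img, target_mean=100.0, target_var=100.0):
    """Map pixel intensities to a prescribed mean and variance
    (the common target-mean/target-variance normalization)."""
    m, v = img.mean(), img.var()
    dev = np.sqrt(target_var * (img - m) ** 2 / max(v, 1e-12))
    return np.where(img > m, target_mean + dev, target_mean - dev)

def hist_equalize(img):
    """Global histogram equalization for an 8-bit grayscale image."""
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    cdf = hist.cumsum().astype(np.float64)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())   # rescale to [0, 1]
    lut = np.round(cdf * 255).astype(np.uint8)          # lookup table
    return lut[img]

def binarize(img, threshold=None):
    """Global threshold to separate ridges (1) from valleys (0);
    a simplification -- adaptive thresholding is the usual choice."""
    t = img.mean() if threshold is None else threshold
    return (img > t).astype(np.uint8)
```

Skeletonization (thinning the binary ridges to one-pixel width) would follow on the binarized output, typically via an iterative morphological thinning routine.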
Background: Radiation therapy can destroy healthy cells in addition to cancer cells in the treated area. However, when radiation is combined with doxorubicin, it becomes more effective in treating breast cancer. Objective: This study aims to clarify the effect of X-rays from a LINAC combined with amygdalin and doxorubicin on breast cancer treatment, and the possibility of using amygdalin with X-rays instead of doxorubicin for breast cancer treatment. Method: Two cell lines were used in this study: the MCF-7 breast cancer cell line and the WRL-68 normal cell line. These cells were preserved in liquid nitrogen, then prepared, cultured, and tested in the (place). The effect of three X-ray doses combined with a
In this research, combined regression estimates employing robust variance-covariance matrix estimates were used to estimate the population mean in stratified random sampling, and were compared with combined regression estimates employing the traditional variance-covariance matrix estimates when estimating the regression parameter, using two criteria: relative efficiency (RE) and mean squared error (MSE). We found that the robust estimates significantly improved the quality of the combined regression estimates by reducing the effect of outliers through the robust variance-covariance matrix estimates (MCD, MVE) used when estimating the regression parameter. In addition, the results of the simulation study proved
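The classical combined regression estimator that the robust variant builds on can be sketched as follows. This is a minimal NumPy illustration of the textbook (non-robust) estimator, not the paper's MCD/MVE version; the pooled slope uses the usual W_h^2 s_h / n_h weighting:

```python
import numpy as np

def combined_regression_estimate(strata, X_bar):
    """Combined regression estimator of the population mean under
    stratified random sampling (classical, non-robust form).
    `strata` is a list of (W_h, x_sample, y_sample) tuples, where
    the stratum weights W_h sum to 1 and X_bar is the known
    population mean of the auxiliary variable x."""
    x_st = sum(w * x.mean() for w, x, _ in strata)   # stratified mean of x
    y_st = sum(w * y.mean() for w, _, y in strata)   # stratified mean of y
    # pooled slope: b = (sum W_h^2 s_xy,h / n_h) / (sum W_h^2 s_x,h^2 / n_h)
    num = sum(w**2 * np.cov(x, y, ddof=1)[0, 1] / len(x) for w, x, y in strata)
    den = sum(w**2 * np.var(x, ddof=1) / len(x) for w, x, _ in strata)
    b = num / den
    return y_st + b * (X_bar - x_st)
```

The robust version studied in the paper replaces the sample covariance and variance terms in the slope with their MCD or MVE counterparts, which dampens the influence of outlying observations.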
Cryptography algorithms play a critical role in information technology, defending against the various attacks witnessed in the digital era. Many studies and algorithms have been devised to address the security needs of information systems. Traditional cryptography algorithms are characterized by computationally expensive operations. Lightweight algorithms, on the other hand, are the way to solve most of the security issues that arise when applying traditional cryptography on constrained devices, and symmetric ciphers are widely applied to secure data communication on such devices. In this study, we proposed a hybrid algorithm based on two cryptography algorithms, PRESENT and Salsa20. Also, a 2D logistic map of a chaotic system is a
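To illustrate how a chaotic map can drive lightweight symmetric encryption, here is a minimal stream-cipher-style sketch using a 1D logistic map. This is a simplification for illustration only: the study pairs PRESENT and Salsa20 with a 2D logistic map, and the map form, seed, and byte-extraction rule below are assumptions, not the proposed algorithm:

```python
def logistic_keystream(seed, n, r=3.99):
    """Keystream bytes from the 1D logistic map x -> r*x*(1-x).
    For 0 < seed < 1 and r < 4 the orbit stays in (0, 1)."""
    x = seed
    out = bytearray()
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 10**6) % 256)   # crude byte extraction
    return bytes(out)

def xor_cipher(data, seed):
    """XOR the data with the chaotic keystream; applying the same
    function twice with the same seed recovers the plaintext."""
    ks = logistic_keystream(seed, len(data))
    return bytes(d ^ k for d, k in zip(data, ks))
```

A 2D map, as in the study, couples two state variables, enlarging the key space and improving the statistical properties of the keystream over this 1D toy version.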
This paper is concerned with the design and implementation of an image compression method based on the biorthogonal tap-9/7 discrete wavelet transform (DWT) and quadtree coding. As a first step, color correlation is handled by using the YUV color representation instead of RGB. Then the chromatic sub-bands are downsampled, and the data of each color band are transformed using the wavelet transform. The produced wavelet sub-bands are quantized using a hierarchical scalar quantization method. The quantized detail coefficients are coded using quadtree coding followed by Lempel-Ziv-Welch (LZW) encoding, while the approximation coefficients are coded using delta coding followed by LZW encoding. The test results indicated that the compression results are com
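The wavelet decomposition step can be illustrated with a one-level 2D Haar transform, a simpler stand-in for the biorthogonal tap-9/7 filter bank used in the paper. It produces the same LL/LH/HL/HH sub-band split (approximation plus horizontal, vertical, and diagonal details) and is perfectly invertible:

```python
import numpy as np

def haar_dwt2(img):
    """One-level 2D Haar DWT. Returns (LL, LH, HL, HH) sub-bands,
    each half the size of the input in both dimensions."""
    a = img.astype(np.float64)
    lo = (a[:, 0::2] + a[:, 1::2]) / 2          # row-wise average
    hi = (a[:, 0::2] - a[:, 1::2]) / 2          # row-wise difference
    ll = (lo[0::2] + lo[1::2]) / 2              # column pass on each half
    lh = (lo[0::2] - lo[1::2]) / 2
    hl = (hi[0::2] + hi[1::2]) / 2
    hh = (hi[0::2] - hi[1::2]) / 2
    return ll, lh, hl, hh

def haar_idwt2(ll, lh, hl, hh):
    """Inverse of haar_dwt2 (perfect reconstruction)."""
    h, w = ll.shape
    lo = np.empty((2 * h, w)); hi = np.empty((2 * h, w))
    lo[0::2], lo[1::2] = ll + lh, ll - lh       # undo column pass
    hi[0::2], hi[1::2] = hl + hh, hl - hh
    out = np.empty((2 * h, 2 * w))
    out[:, 0::2], out[:, 1::2] = lo + hi, lo - hi   # undo row pass
    return out
```

In the compression method described, the quantizer and entropy coders then operate on these sub-bands: quadtree + LZW on the detail bands (LH, HL, HH) and delta + LZW on the approximation band (LL).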
This work aimed at the design and testing of a computer program for eyeQ improvement, photographic memory enhancement, and speed reading, intended to raise a reading speed of 150-250 words per minute (WPM) toward the mind's processing and eye-snapshot capacity of 5000 WPM. The package was designed in Visual Basic 6. The efficiency of the designed program was tested on 10 persons of different ages and levels of education, and the results show an increase in their reading speed of approximately 25% in the first month of training, with noticeable enhancement in memory as well as an increased ability to read for longer periods without feeling nervous or bored. A nonlinear, continuous increase in reading speed is assured after the first mo
In this study, industrial fibers and polymer mixtures were used for high-speed impact (ballistic) applications, examining the effects of a polymer (epoxy), a polymeric mixture (epoxy + unsaturated polyester), a synthetic rubber (polyurethane), Kevlar fiber, ultra-high-molecular-weight polyethylene fiber, and carbon fiber. Four successive systems of samples were prepared. The first system was made of epoxy, 2% graphene, and 20 layers of fiber; in the ballistic test, carried out with a Glock pistol of 9 x 19 mm caliber, the sample was successful from a distance of 7 m or more. The second system consisted of epoxy, 2% graphene, 36 layers of fiber, and one layer of hard rubber; it succeeded