Iris detection is considered a challenging image-processing task. In this study, an efficient method is suggested to detect and recognize the iris. The method depends on a seed-filling algorithm and circular-area detection: the color image is converted to a gray image, and the gray image is then converted to a binary image. Seed filling is applied to the binary image, and the detected binary region of interest (ROI) is localized in terms of its center coordinates and radii (i.e., the inner and outer radius). The localization efficiency of the suggested method is evaluated using the coefficient of variation (CV) of the iris radius. The test results indicate that the suggested method performs well for iris detection.
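A minimal sketch of such a pipeline in Python with OpenCV is shown below; the function name, seed point, and threshold are illustrative assumptions rather than the paper's actual values, and only a single (outer) radius is estimated from the filled area.

```python
# Illustrative pipeline: color -> gray -> binary -> seed filling -> ROI
# center and radius. Seed point and threshold are assumed values.
import cv2
import numpy as np

def localize_iris(bgr_image, seed=(160, 120), threshold=60):
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    # Dark iris/pupil pixels become foreground in the binary image.
    _, binary = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY_INV)
    # Seed filling: flood-fill the connected region containing the seed.
    mask = np.zeros((binary.shape[0] + 2, binary.shape[1] + 2), np.uint8)
    cv2.floodFill(binary, mask, seed, 128)
    region = (binary == 128).astype(np.uint8)
    # Center coordinates from the region's image moments.
    m = cv2.moments(region)
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
    # Outer radius from the area, assuming a roughly circular region.
    radius = float(np.sqrt(m["m00"] / np.pi))
    return (cx, cy), radius
```

The CV metric mentioned above would then be computed over the radii detected across a test set, e.g. `np.std(radii) / np.mean(radii)`.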
In this work, a fragile watermarking scheme is presented. The scheme is applied to digital color images in the spatial domain. The image is divided into blocks, and each block has its own authentication mark embedded in it, so it is possible to determine which parts of the image are authentic and which have been modified. Authentication is carried out without needing the original image. The results show that the quality of the watermarked image remains very good and that the watermark survives some types of unintended modification, such as processing by familiar compression software like WinRAR and ZIP.
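A minimal sketch of a generic block-wise fragile watermark of this kind (not the paper's exact scheme, whose mark construction is unspecified here): each block's LSB plane carries a hash of its own upper bit planes, so tampering breaks the mark locally and verification needs no original image.

```python
# Generic block-wise fragile watermark sketch on one uint8 channel.
# Block size and hash choice are assumptions, not the paper's values.
import hashlib
import numpy as np

BLOCK = 8  # assumed block size

def embed(channel):
    out = channel.copy()
    h, w = channel.shape
    for y in range(0, h - h % BLOCK, BLOCK):
        for x in range(0, w - w % BLOCK, BLOCK):
            block = out[y:y+BLOCK, x:x+BLOCK] & 0xFE      # clear LSBs
            digest = hashlib.sha256(block.tobytes()).digest()
            bits = np.unpackbits(np.frombuffer(digest, np.uint8))[:BLOCK*BLOCK]
            # Authentication mark: hash bits written into the LSB plane.
            out[y:y+BLOCK, x:x+BLOCK] = block | bits.reshape(BLOCK, BLOCK)
    return out

def verify(channel):
    """Boolean map: True where a block's mark is still intact."""
    h, w = channel.shape
    ok = np.ones((h // BLOCK, w // BLOCK), bool)
    for y in range(0, h - h % BLOCK, BLOCK):
        for x in range(0, w - w % BLOCK, BLOCK):
            block = channel[y:y+BLOCK, x:x+BLOCK]
            digest = hashlib.sha256((block & 0xFE).tobytes()).digest()
            bits = np.unpackbits(np.frombuffer(digest, np.uint8))[:BLOCK*BLOCK]
            ok[y // BLOCK, x // BLOCK] = np.array_equal(
                block & 1, bits.reshape(BLOCK, BLOCK))
    return ok
```

Because WinRAR and ZIP compress losslessly, pixel values, and hence such embedded marks, are unchanged after archiving, which is consistent with the robustness reported above.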
Because of the significance of image compression in reducing the volume of data, such compression is permanently necessary: compressed images can be transferred more quickly over communication channels and stored in less memory space. In this study, an efficient compression system is suggested; it depends on transform coding (the Discrete Cosine Transform or the bi-orthogonal tap-9/7 wavelet transform) together with the LZW compression technique. The suggested scheme is applied to color and gray models, and transform coding is then applied to decompose each color and gray sub-band individually. A quantization process is performed, followed by LZW coding, to compress the images. The suggested system was applied to a set of seven standard …
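A rough sketch of the transform-quantize-code chain is given below; SciPy's DCT stands in for the transform stage and a textbook LZW encoder for the dictionary-coding stage, with the quantization step (q = 16) chosen arbitrarily.

```python
# Transform coding -> uniform quantization -> LZW, on one gray block.
import numpy as np
from scipy.fftpack import dct

def dct2(block):
    # Separable 2-D type-II DCT with orthonormal scaling.
    return dct(dct(block, axis=0, norm="ortho"), axis=1, norm="ortho")

def lzw_encode(data: bytes):
    # Textbook LZW: grow a dictionary of byte strings, emit their codes.
    table = {bytes([i]): i for i in range(256)}
    w, codes = b"", []
    for byte in data:
        wc = w + bytes([byte])
        if wc in table:
            w = wc
        else:
            codes.append(table[w])
            table[wc] = len(table)
            w = bytes([byte])
    if w:
        codes.append(table[w])
    return codes

gray = np.random.randint(0, 256, (8, 8)).astype(float)  # stand-in block
coeffs = dct2(gray)
quantized = np.round(coeffs / 16).astype(np.int16)      # uniform step q = 16
codes = lzw_encode(quantized.tobytes())
```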
The advancement of digital technology has increased the deployment of wireless sensor networks (WSNs) in our daily life. However, locating sensor nodes is a challenging task in WSNs. Sensed data without an accurate location are worthless, especially in critical applications. The pioneering technique among range-free localization schemes is the sequential Monte Carlo (SMC) method, which utilizes network connectivity to estimate sensor locations without additional hardware. This study presents a comprehensive survey of state-of-the-art SMC localization schemes. We present the schemes as a thematic taxonomy of the localization operation in SMC. Moreover, the critical characteristics of each existing scheme are analyzed to identify its advantages.
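An illustrative single step of the baseline SMC (Monte Carlo localization) scheme that such surveys cover: predict candidate positions within the node's maximum speed, then filter out candidates inconsistent with the anchors currently heard. All parameter values here are assumptions.

```python
# One predict-and-filter round of range-free Monte Carlo localization.
import math
import random

def mcl_step(samples, heard_anchors, v_max=10.0, radio_range=25.0, n=50):
    accepted, tries = [], 0
    while len(accepted) < n and tries < 100_000:
        tries += 1
        px, py = random.choice(samples)
        # Prediction: the node moved at most v_max since the last round.
        ang = random.uniform(0.0, 2.0 * math.pi)
        d = random.uniform(0.0, v_max)
        x, y = px + d * math.cos(ang), py + d * math.sin(ang)
        # Filtering via connectivity: every heard anchor must be in range.
        if all(math.hypot(x - ax, y - ay) <= radio_range
               for ax, ay in heard_anchors):
            accepted.append((x, y))
    return accepted  # the location estimate is the centroid of these
```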
In this research work, some low-complexity and efficient cryptanalysis approaches are proposed to recover passwords (encryption keys). Passwords are still one of the most common means of securing computer systems. Most organizations rely on password authentication systems, and it is therefore very important for them to enforce strong passwords; however, they usually ignore how usable those passwords are for the users. The more complex passwords are, the more they frustrate users, who end up with coping strategies such as adding "123" at the end of their passwords or repeating a word to make their passwords longer. This reduces the security of the password, and, more importantly, there is no scientific basis …
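A small illustration of why such coping strategies add little security: a dictionary attack that applies the same mangling rules (append "123", repeat the word) recovers them at negligible extra cost. The function names and hash choice are illustrative, not from the paper.

```python
# Dictionary attack with two mangling rules matching the coping
# strategies described above.
import hashlib

def candidates(wordlist):
    for w in wordlist:
        yield w
        yield w + "123"   # common-suffix rule
        yield w + w       # repeated-word rule

def crack(target_hash, wordlist):
    for guess in candidates(wordlist):
        if hashlib.sha256(guess.encode()).hexdigest() == target_hash:
            return guess
    return None

target = hashlib.sha256(b"dragon123").hexdigest()
print(crack(target, ["password", "dragon", "letmein"]))  # -> "dragon123"
```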
Sansevieria trifasciata was studied as a potential biosorbent for chromium, copper, and nickel removal from electroplating and tannery effluents in a batch process. Different parameters influencing the biosorption process, such as pH, contact time, and the amount of biosorbent, were optimized while using 80 mm-sized particles of the biosorbent. As much as 91.3 % Ni and 92.7 % Cu were removed at pH 6 and 4.5, respectively, while optimum Cr removal of 91.34 % from electroplating and 94.6 % from tannery effluents was found at pH 6.0 and 4.0, respectively. The pseudo-second-order model was found to best fit the kinetic data for all the metals, as evidenced by their greater R² values. FTIR characterization of the biosorbent revealed the presence of carboxyl a…
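For reference, the pseudo-second-order model is usually fitted in its linearized form t/qt = 1/(k₂qe²) + t/qe; the sketch below shows how R² values like those cited above are typically obtained. The data points are placeholders, not the paper's measurements.

```python
# Linearized pseudo-second-order kinetic fit with illustrative data.
import numpy as np

t = np.array([10.0, 20.0, 40.0, 60.0, 90.0])   # contact time, min
qt = np.array([3.1, 4.6, 5.9, 6.4, 6.8])       # uptake, mg/g (placeholder)

slope, intercept = np.polyfit(t, t / qt, 1)
qe = 1.0 / slope                    # equilibrium uptake, mg/g
k2 = slope**2 / intercept           # rate constant, g/(mg*min)

residuals = t / qt - (slope * t + intercept)
r2 = 1.0 - residuals.var() / (t / qt).var()    # goodness of fit
```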
Semantic segmentation is effective in numerous object-classification tasks such as autonomous vehicles and scene understanding. With the advent of the deep-learning domain, many efforts have been directed at applying deep-learning algorithms to semantic segmentation. Most of the algorithms gain the required accuracy while compromising on their storage and computational requirements. This work showcases the implementation of a Convolutional Neural Network (CNN) using the Discrete Cosine Transform (DCT), where the DCT exhibits exceptional energy-compaction properties. The proposed Adaptive Weight Wiener Filter (AWWF) rearranges the DCT coefficients by truncating the high-frequency coefficients. The AWWF-DCT model reinstates the convolutional l…
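A toy version of the coefficient-truncation idea (not the proposed AWWF itself, whose adaptive weighting is not reproduced here): keep only the low-frequency corner of each block's DCT, exploiting the energy compaction noted above.

```python
# Zonal truncation of high-frequency DCT coefficients on one block.
import numpy as np
from scipy.fftpack import dct, idct

def truncate_high_freq(block, keep=4):
    coeffs = dct(dct(block, axis=0, norm="ortho"), axis=1, norm="ortho")
    mask = np.zeros_like(coeffs)
    mask[:keep, :keep] = 1.0          # low-frequency zone only
    kept = coeffs * mask
    # Reconstruct: most of the block's energy survives the truncation.
    return idct(idct(kept, axis=0, norm="ortho"), axis=1, norm="ortho")
```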
This article considers a shrunken estimator of Al-Hermyari and Al-Gobuii (1) to estimate the mean (θ) of a normal distribution N(θ, σ²) with known variance (σ²), when a guess value (θ₀) is available for the mean (θ) as an initial estimate. This estimator is shown to be more efficient than the classical estimators, especially when θ is close to θ₀. General expressions for the bias and MSE of the considered estimator are given, with some examples. Numerical results, comparisons, and conclusions are reported.
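For context, a shrinkage estimator of this general form (the particular weight k used in the cited estimator is left unspecified here) has the following bias and MSE:

```latex
% Generic shrinkage-toward-a-guess form (illustrative; the specific
% weight k of Al-Hermyari and Al-Gobuii is not reproduced here).
\[
  \hat{\theta}_s = k\,\bar{X} + (1-k)\,\theta_0, \qquad 0 \le k \le 1,
\]
\[
  \mathrm{Bias}(\hat{\theta}_s) = (1-k)(\theta_0 - \theta), \qquad
  \mathrm{MSE}(\hat{\theta}_s) = k^{2}\,\frac{\sigma^{2}}{n}
    + (1-k)^{2}(\theta - \theta_0)^{2},
\]
% so the MSE falls below the classical \sigma^{2}/n of \bar{X}
% whenever \theta is close to \theta_0 and k < 1.
```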
Big data analysis is essential for modern applications in areas such as healthcare, assistive technology, intelligent transportation, and environment and climate monitoring. Traditional algorithms in data mining and machine learning do not scale well with data size. Mining and learning from big data require time- and memory-efficient techniques, albeit at the cost of a possible loss in accuracy. We have developed a data aggregation structure to summarize data with a large number of instances, as well as data generated from multiple sources. Data are aggregated at multiple resolutions, and the resolution provides a trade-off between efficiency and accuracy. The structure is built once, updated incrementally, and serves as a common data input for multiple mining and learning tasks.
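A minimal sketch of multi-resolution aggregation under assumptions of my own (the paper's actual structure is richer): each level halves the number of cells, and each cell keeps additive summaries (count, sum, sum of squares) so means and variances can be answered at any resolution and updated incrementally.

```python
# Multi-resolution summaries: coarser levels merge adjacent cells.
import numpy as np

def build_levels(values, depth=3):
    # Level 0: one (count, sum, sum-of-squares) row per instance.
    levels = [np.stack([np.ones_like(values), values, values**2], axis=1)]
    for _ in range(depth):
        prev = levels[-1]
        n = len(prev) // 2 * 2
        # Merge adjacent cells: summaries are additive, so updates are cheap.
        levels.append(prev[:n:2] + prev[1:n:2])
    return levels

data = np.random.rand(64)
levels = build_levels(data)
count, total, sq = levels[-1][0]        # coarsest-resolution cell
mean = total / count
var = sq / count - mean**2              # variance from the summary alone
```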