In recent years, Wireless Sensor Networks (WSNs) have attracted increasing attention in many fields, as they are used in a wide range of applications such as environmental monitoring, the Internet of Things, industrial operation control, electricity distribution, and the oil industry. One of the major concerns in these networks is their limited energy supply, and clustering and routing algorithms are among the critical factors that directly determine power consumption in WSNs. Therefore, optimization techniques and routing protocols for such networks have to be studied and developed. This paper focuses on the most recent studies and algorithms that address energy-efficient clustering and routing in WSNs. In addition, the prime issues in these networks are discussed and summarized in comparison tables covering the main features, limitations, and the simulation tools used. Energy efficiency is compared across several techniques: for the distributed clustering mode with uniform cluster-head (CH) distribution, HEED and EECS perform best, while for non-uniform clustering both DDAR and THC are efficient; for the centralized clustering mode with uniform CH distribution, the LEACH-C protocol is more effective.
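As background for the distributed clustering mode discussed above, the following is a minimal sketch of the classic LEACH-style probabilistic cluster-head election rule on which protocols in this family elaborate; the parameter names and the plain random draw are illustrative assumptions, not code from any of the surveyed protocols.

```python
import random

def leach_ch_election(node_ids, p=0.05, round_no=0, eligible=None):
    """Illustrative LEACH-style cluster-head (CH) election for one round.

    p         -- desired fraction of cluster heads per round
    round_no  -- current round index r
    eligible  -- nodes that have not served as CH in the last 1/p rounds
    """
    if eligible is None:
        eligible = set(node_ids)
    period = int(1 / p)
    # election threshold T(n) = p / (1 - p * (r mod 1/p)) for eligible nodes, else 0
    t = p / (1 - p * (round_no % period))
    return [n for n in node_ids if n in eligible and random.random() < t]

# toy usage: 100 nodes, about 5% expected to elect themselves CH in round 0
print(leach_ch_election(range(100)))
```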
The occurrence of invasive candidiasis has increased over the previous few decades, and Candida albicans is considered one of the most common species causing acquired fungal infections. Candida albicans is an opportunistic fungal pathogen: the yeast is present in healthy individuals as a lifelong commensal and can reside harmlessly in the human body. However, in immunocompromised individuals, the fungus can invade tissues, producing superficial infections and, in severe cases, life-threatening systemic infections. This review will emphasize the virulence factors of C. albicans, including adhesion, invasion, Candida proteinases, phenotypic switching, and biofilm formation.
This paper presents the application of a framework for fast and efficient compressive sampling based on the concept of random sampling of a sparse audio signal. It provides four important features. (i) It is universal for a variety of sparse signals. (ii) The number of measurements required for exact reconstruction is nearly optimal and much less than the sampling frequency, below the Nyquist rate. (iii) It has very low complexity and fast computation. (iv) It is developed on a provable mathematical model from which we are able to quantify trade-offs among streaming capability, computation/memory requirements, and quality of reconstruction of the audio signal. Compressed sensing (CS) is an attractive compression scheme due to its universality.
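To make the measurement-and-reconstruction idea concrete, here is a minimal, self-contained sketch of compressed sensing with a random Gaussian measurement matrix and orthogonal matching pursuit (OMP) for recovery; the signal sizes, sparsity level, and the OMP routine are illustrative assumptions, not the framework proposed in the paper.

```python
import numpy as np

def omp(Phi, y, k):
    """Orthogonal Matching Pursuit: recover a k-sparse x from y = Phi @ x."""
    residual, support = y.copy(), []
    for _ in range(k):
        # pick the column most correlated with the current residual
        j = int(np.argmax(np.abs(Phi.T @ residual)))
        if j not in support:
            support.append(j)
        # least-squares fit on the selected support, then update the residual
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x_hat = np.zeros(Phi.shape[1])
    x_hat[support] = coef
    return x_hat

# toy demo: random Gaussian measurements of a sparse "audio-like" signal
rng = np.random.default_rng(0)
n, m, k = 256, 64, 5                      # signal length, measurements, sparsity
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
x_rec = omp(Phi, Phi @ x, k)
print(np.allclose(x, x_rec, atol=1e-6))   # exact recovery is expected at this sparsity
```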
In this paper, a new method is investigated that uses evolutionary algorithms (EAs) to cryptanalyze one of the nonlinear stream cipher cryptosystems that depends on the Linear Feedback Shift Register (LFSR) unit, using a ciphertext-only attack. A Genetic Algorithm (GA) and Ant Colony Optimization (ACO) are used for attacking one of the nonlinear cryptosystems, called the "shrinking generator", using different lengths of ciphertext and different lengths of combined LFSRs. GA and ACO proved their good performance in finding the initial values of the combined LFSRs. This work can be considered a warning for stream cipher designers to avoid such weak points.
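For readers unfamiliar with the target cipher, the following is a minimal sketch of a shrinking generator built from two toy LFSRs: the output bit of the data register A is kept only when the selection register S produces a 1. The register lengths, tap positions, and seeds are illustrative assumptions, far shorter than anything attacked in practice.

```python
def lfsr(seed_bits, taps):
    """Simple Fibonacci LFSR: output is the last bit, feedback is the XOR of the taps."""
    state = list(seed_bits)
    while True:
        out = state[-1]
        fb = 0
        for t in taps:
            fb ^= state[t]
        state = [fb] + state[:-1]
        yield out

def shrinking_generator(a_seed, a_taps, s_seed, s_taps, n_bits):
    """Toy shrinking generator keystream: keep A's bit only when S outputs 1."""
    A, S = lfsr(a_seed, a_taps), lfsr(s_seed, s_taps)
    out = []
    while len(out) < n_bits:
        a, s = next(A), next(S)
        if s == 1:
            out.append(a)
    return out

# toy usage with very short registers and arbitrary taps/seeds
print(shrinking_generator([1, 0, 1, 1], [0, 3], [0, 1, 1], [0, 2], 16))
```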
In this paper, we used four classification methods to classify objects and compared among these methods: K-Nearest Neighbors (KNN), Stochastic Gradient Descent learning (SGD), Logistic Regression (LR), and Multi-Layer Perceptron (MLP). We used the MCOCO dataset for classification and detection of objects; the dataset images were randomly divided into training and testing sets at a ratio of 7:3, respectively. The randomly selected training and testing images were converted from color to gray level, enhanced using the histogram equalization method, and resized to 20 x 20. Principal component analysis (PCA) was used for feature extraction, and finally the four classification methods were applied.
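A minimal sketch of this preprocessing and classification pipeline is shown below, assuming scikit-learn is available and that images arrive as HxWx3 uint8 arrays; the PCA dimensionality, nearest-neighbour resize, and default model hyperparameters are illustrative assumptions rather than the settings used in the paper.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression, SGDClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier

def equalize(gray):
    """Histogram equalization for an 8-bit grayscale image (NumPy only)."""
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = hist.cumsum()
    lut = np.round(255 * (cdf - cdf.min()) / max(cdf.max() - cdf.min(), 1)).astype(np.uint8)
    return lut[gray]

def prepare(images):
    """Grayscale -> histogram equalization -> 20x20 resize -> flat feature vectors."""
    feats = []
    for img in images:                                 # img: HxWx3 uint8 array
        gray = img.mean(axis=2).astype(np.uint8)
        eq = equalize(gray)
        ys = np.linspace(0, eq.shape[0] - 1, 20).astype(int)   # crude nearest-neighbour
        xs = np.linspace(0, eq.shape[1] - 1, 20).astype(int)   # resize to 20x20
        feats.append(eq[np.ix_(ys, xs)].ravel())
    return np.asarray(feats, dtype=float)

def run(images, labels):
    X = prepare(images)
    X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0)
    pca = PCA(n_components=50).fit(X_tr)               # feature extraction (needs >= 50 samples)
    X_tr, X_te = pca.transform(X_tr), pca.transform(X_te)
    models = {
        "KNN": KNeighborsClassifier(),
        "SGD": SGDClassifier(),
        "LR": LogisticRegression(max_iter=1000),
        "MLP": MLPClassifier(max_iter=500),
    }
    return {name: m.fit(X_tr, y_tr).score(X_te, y_te) for name, m in models.items()}
```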
In cyber security, the most crucial subject in information security is user authentication. Robust text-based password methods may offer a certain level of protection, but strong passwords are hard to remember, so people who use them frequently write them down on paper or store them in a file on the computer. Numerous computer systems, networks, and Internet-based environments have experimented with graphical authentication techniques for user authentication in recent years. The two main characteristics of all graphical passwords are their security and usability. Regretfully, none of these methods could adequately address both of these factors concurrently. The ISO usability standards and associated characteristics for graphical
Text-based image clustering (TBIC) is an insufficient approach for clustering related web images. It is a challenging task to abstract the visual features of images with the support of textual information in a database. In content-based image clustering (CBIC), image data are clustered on the basis of specific features such as texture, colors, boundaries, and shapes. In this paper, an effective CBIC technique is presented, which uses texture and statistical features of the images. The statistical features, or moments of colors (mean, skewness, standard deviation, kurtosis, and variance), are extracted from the images. These features are collected in a one-dimensional array, and then a genetic algorithm (GA) is applied for image clustering.
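As an illustration of the feature-extraction step, here is a small sketch that computes the five colour moments per channel and flattens them into one feature vector; it assumes NumPy and SciPy and omits the texture features and the GA clustering stage described in the paper.

```python
import numpy as np
from scipy.stats import kurtosis, skew

def color_moments(img):
    """Statistical colour moments (mean, standard deviation, skewness, kurtosis,
    and variance) per channel, flattened into a single 1-D feature vector."""
    feats = []
    for ch in range(img.shape[2]):                   # img: HxWx3 RGB array
        vals = img[:, :, ch].astype(float).ravel()
        feats += [vals.mean(), vals.std(), skew(vals), kurtosis(vals), vals.var()]
    return np.asarray(feats)

# usage: feats = color_moments(rgb_array) yields a 15-element vector for an RGB image
```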
The conventional FCM algorithm does not fully utilize the spatial information in the image. In this research, we use an FCM algorithm that incorporates spatial information into the membership function for clustering. The spatial function is the summation of the membership functions in the neighborhood of each pixel under consideration. The advantages of the method are that it is less
sensitive to noise than other techniques, and it yields regions more homogeneous than those of other methods. This technique is a powerful method for noisy image segmentation.
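A minimal NumPy sketch of this spatially weighted FCM update is given below for a grayscale image; the window size, fuzziness exponent, and the p and q exponents used to combine the spectral and spatial terms are illustrative assumptions, not the settings used in this research.

```python
import numpy as np

def spatial_fcm(img, c=3, m=2.0, p=1, q=1, n_iter=30, win=3, seed=0):
    """Sketch of fuzzy c-means with a spatial membership term (2-D grayscale image)."""
    rng = np.random.default_rng(seed)
    x = img.astype(float).ravel()                    # pixel intensities as a vector
    u = rng.random((c, x.size))
    u /= u.sum(axis=0)                               # random initial fuzzy memberships
    pad = win // 2
    for _ in range(n_iter):
        um = u ** m
        v = (um @ x) / um.sum(axis=1)                # cluster centres
        d = np.abs(x[None, :] - v[:, None]) + 1e-9   # distances to centres
        u = 1.0 / d ** (2.0 / (m - 1.0))             # standard FCM membership update
        u /= u.sum(axis=0)
        # spatial function: sum of memberships over each pixel's win x win neighbourhood
        h = np.empty_like(u)
        for k in range(c):
            uk = np.pad(u[k].reshape(img.shape), pad, mode="edge")
            acc = np.zeros(img.shape)
            for dy in range(win):
                for dx in range(win):
                    acc += uk[dy:dy + img.shape[0], dx:dx + img.shape[1]]
            h[k] = acc.ravel()
        u = (u ** p) * (h ** q)                      # combine spectral and spatial terms
        u /= u.sum(axis=0)
    return u.argmax(axis=0).reshape(img.shape), v    # hard labels and cluster centres

# usage: labels, centres = spatial_fcm(noisy_gray_image)
```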
With the rapid development of computers and network technologies, the security of information on the internet becomes compromised and many threats may affect the integrity of such information. Many researchers have focused their work on providing solutions to this threat. Machine learning and data mining are widely used in anomaly-detection schemes to decide whether or not a malicious activity is taking place on a network. In this paper a hierarchical classification scheme for anomaly-based intrusion detection is proposed. Two levels of feature selection and classification are used. In the first level, the global feature vector for detecting the basic attacks (DoS, U2R, R2L, and Probe) is selected. In the second level, four local feature vectors are selected.
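A minimal sketch of such a two-level scheme is shown below, assuming scikit-learn: a first-level classifier works on a global feature subset to separate the broad classes, and a per-class second-level classifier refines the decision on its own local feature subset. The random-forest models, class/feature-index bookkeeping, and method names are illustrative assumptions, not the classifiers or features chosen in the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

class HierarchicalIDS:
    """Two-level anomaly IDS sketch: level 1 uses a global feature subset to detect
    the basic attack classes; level 2 uses a local feature subset per class to
    refine the final label."""

    def __init__(self, global_feats, local_feats):
        self.global_feats = global_feats          # column indices for level 1
        self.local_feats = local_feats            # {class: column indices} for level 2
        self.level1 = RandomForestClassifier(random_state=0)
        self.level2 = {}

    def fit(self, X, y_coarse, y_fine):
        y_coarse, y_fine = np.asarray(y_coarse), np.asarray(y_fine)
        self.level1.fit(X[:, self.global_feats], y_coarse)
        for cls, cols in self.local_feats.items():
            mask = y_coarse == cls                # train each local model on its class only
            clf = RandomForestClassifier(random_state=0)
            clf.fit(X[mask][:, cols], y_fine[mask])
            self.level2[cls] = clf

    def predict(self, X):
        coarse = self.level1.predict(X[:, self.global_feats])
        fine = coarse.astype(object)
        for cls, clf in self.level2.items():
            mask = coarse == cls
            if mask.any():                        # refine only records routed to this class
                fine[mask] = clf.predict(X[mask][:, self.local_feats[cls]])
        return fine
```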
One of the main causes for concern is the widespread presence of pharmaceuticals in the environment, which may be harmful to living things. They are often referred to as emerging chemical pollutants in water bodies because they are either still unregulated or undergoing regulation. Pharmaceutical pollution of the environment may have detrimental effects on ecosystem viability, human health, and water quality. In this study, the amount of remaining pharmaceutical compounds in environmental waters was examined through a straightforward review. Pharmaceutical production and consumption have increased due to medical advancements, leading to concerns about their environmental impact and potential harm to living things.