In this paper, we investigate and characterize the effects of multi-channel and rendezvous protocols on the connectivity of dynamic spectrum access networks using percolation theory. In particular, we focus on the scenario where the secondary nodes have plenty of vacant channels to choose from, a phenomenon we define as channel abundance. To cope with the existence of multiple channels, we use two types of rendezvous protocols: naive ones, which do not guarantee a common channel, and advanced ones, which do. We show that, with more channel abundance, even when either type of rendezvous protocol is used, it becomes difficult for two nodes to agree on a common channel, so they may remain invisible to each other. We model this invisibility as a Poisson thinning process and show that invisibility is even more pronounced with channel abundance. Following the disk graph model, we represent the multiple channels as parallel edges in a graph and build a multi-layered graph (MLG) in R^2. To study connectivity, we show how percolation occurs in the MLG by coupling it with a typical discrete percolation. Using a Boolean model and the MLG, we study both the absence and the presence of primary users. For both cases, we define and characterize the connectivity of the secondary network in terms of the available number of channels, deployment densities, number of simultaneous transmissions per node, and communication range. When primary users are absent, we derive the critical number of channels that maintains supercriticality of the secondary network. When primary users are present, we characterize and analyze connectivity in all three regions: channel abundance, optimal, and channel deprivation. For each region, we show the requirements and the outcome of using either type of rendezvous technique. Moreover, we find the tradeoff between deployment density and rendezvous probability that results in a connected network.
Our results can be used to assess the quality of any channel rendezvous algorithm by computing the expected resulting connectivity. They also provide a guideline for achieving connectivity with minimal resources.
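The Poisson thinning model mentioned above can be illustrated with a short simulation (a sketch with hypothetical parameter values, not the authors' code): if nodes form a homogeneous Poisson point process of intensity lam and each node independently remains "visible" with the rendezvous probability p_keep, the surviving nodes again form a Poisson process, with reduced intensity p_keep * lam.

```python
import math
import random

def sample_poisson(mean, rng):
    """Knuth's method for a Poisson-distributed count
    (mean must be modest, since exp(-mean) underflows for mean >~ 700)."""
    limit = math.exp(-mean)
    k, prod = 0, 1.0
    while True:
        prod *= rng.random()
        if prod < limit:
            return k
        k += 1

def poisson_points(lam, side, rng):
    """Homogeneous Poisson point process of intensity lam on [0, side]^2."""
    n = sample_poisson(lam * side * side, rng)
    return [(rng.uniform(0, side), rng.uniform(0, side)) for _ in range(n)]

def thin(points, p_keep, rng):
    """Independent thinning: keep each point with probability p_keep."""
    return [pt for pt in points if rng.random() < p_keep]

# The thinned process is again Poisson, with intensity p_keep * lam.
rng = random.Random(7)
lam, side, p_keep = 0.5, 10.0, 0.3     # hypothetical density and probability
trials = 400
total = sum(len(thin(poisson_points(lam, side, rng), p_keep, rng))
            for _ in range(trials))
est = total / (trials * side * side)
print(round(est, 3))                   # close to p_keep * lam = 0.15
```

As channel abundance grows, the effective rendezvous probability p_keep falls, so the intensity of mutually visible nodes drops toward the percolation threshold.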
Finding communities of connected individuals in complex networks is challenging, yet crucial for understanding real-world societies and their interactions. Recently, attention has turned to discovering the dynamics of such communities. However, detecting accurate community structures that evolve over time adds further challenges. Almost all state-of-the-art algorithms are designed on seemingly the same principle, treating the problem as a coupled optimization model that simultaneously identifies community structures and their evolution over time. Unlike these studies, the current work aims to individually consider these three measures, i.e., intra-community score, inter-community score, and evolution of community over
Multi-walled carbon nanotubes from the Cheap Tubes company (MWCNT-CP) were purified by an alcohol / H2O2 / separating-funnel procedure, which is a simple, easy, and scalable technique. The purification steps were characterized by X-ray diffraction, Raman spectroscopy, scanning electron microscopy (SEM) with energy-dispersive X-ray spectroscopy (EDX), and surface area measurements. The technique succeeded in removing most of the trace elements from the MWCNT-CP, which increased the surface area. The impurity ratios were reduced to less than 0.6% after the three-step treatment, with a loss of less than 5% of the MWCNT-CP.
Multi-point forming (MPF) is an advanced flexible manufacturing technology based on the idea that the whole die is divided into small punches whose heights can be adjusted. Applying this idea to the traditional rigid blank holder yields the flexible blank-holder (FBH) concept. In this work, the performance of a multi-point die with pins arranged in a square matrix and a suitable blank holder is investigated. Each pin in the punch holder can move significantly according to the die height and under different loads applied through springs, depending on the spring stiffness. The results show a reduction in setting time of about 90% compared with the traditional single-point incremental forming process, and also show duri
In an unpredictable industrial environment, the ability to adapt quickly and effectively to change is key to gaining a competitive advantage in the global market. Agile manufacturing evolves new ways of running factories that react quickly and effectively to changing markets driven by customized requirements. Agility in manufacturing can be achieved through the integration of information systems, people, technologies, and business processes. This article presents a conceptual model of agility in three dimensions: driving factors, enabling technologies, and evaluation of agility in manufacturing systems. The conceptual model was developed based on a review of the literature. Then, the paper demonstrates the agility
Researchers are increasingly using multimodal biometrics to strengthen the security of biometric applications. In this study, a strong multimodal human identification model was developed to address the growing problem of spoofing attacks on biometric security systems. Using metaheuristic optimization methods such as the Genetic Algorithm (GA), Ant Colony Optimization (ACO), and Particle Swarm Optimization (PSO) for feature selection, this model incorporates three biometric modalities: face, iris, and fingerprint. The system's workflow consists of four main steps: image pre-processing, feature extraction, selection of critical image features, and multibiometric recognition. To determine its performance, the model wa
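GA-based feature selection of the kind named above encodes a feature subset as a bit string and evolves it toward a fitness optimum. The following is a minimal sketch, not the study's pipeline: the feature count, the set of informative features, and the toy fitness function are all assumptions standing in for a real classifier-accuracy score.

```python
import random

N_FEATURES = 10
INFORMATIVE = {0, 1, 2, 3}      # hypothetical: which features carry signal

def fitness(mask):
    """Toy objective: reward informative features, penalize noisy ones.
    A real system would score a classifier (e.g. recognition accuracy)."""
    hits = sum(1 for i in INFORMATIVE if mask[i])
    noise = sum(mask) - hits
    return hits - 0.5 * noise

def ga_select(pop_size=30, generations=60, p_mut=0.05, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(N_FEATURES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def pick():                       # binary tournament selection
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = [max(pop, key=fitness)[:]]  # elitism: carry the best forward
        while len(nxt) < pop_size:
            pa, pb = pick(), pick()
            cut = rng.randrange(1, N_FEATURES)        # single-point crossover
            child = pa[:cut] + pb[cut:]
            child = [1 - g if rng.random() < p_mut else g
                     for g in child]                  # bit-flip mutation
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = ga_select()
print(best, fitness(best))
```

ACO and PSO variants differ only in how the next candidate subsets are generated; the bit-mask encoding and the fitness evaluation stay the same.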
Big data analysis has important applications in many areas, such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms such a
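The core step of entropy discretization is choosing a cut point on a numeric attribute that minimizes the weighted class entropy of the two induced bins (equivalently, maximizes information gain). The sketch below shows that step on a tiny hand-made data set; it is an illustration of the general technique, not the paper's summarization-based algorithm.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label multiset."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_split(values, labels):
    """Cut point on a numeric attribute minimizing the weighted
    class entropy of the two bins (maximizing information gain)."""
    pairs = sorted(zip(values, labels))
    xs = [v for v, _ in pairs]
    ys = [l for _, l in pairs]
    n = len(pairs)
    best_cut, best_h = None, float("inf")
    for i in range(1, n):
        if xs[i] == xs[i - 1]:
            continue                      # no boundary between equal values
        left, right = ys[:i], ys[i:]
        h = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
        if h < best_h:
            best_cut, best_h = (xs[i - 1] + xs[i]) / 2, h
    gain = entropy(ys) - best_h
    return best_cut, gain

# Tiny example: the labels switch cleanly around 3.0, so the cut lands there.
vals = [1.0, 1.5, 2.0, 2.5, 3.5, 4.0, 4.5, 5.0]
labs = ["a", "a", "a", "a", "b", "b", "b", "b"]
cut, gain = best_split(vals, labs)
print(cut, round(gain, 3))
```

Applied recursively to each bin (with a stopping criterion such as MDL), this yields a full discretization; running it over a summarization structure instead of the raw records is what makes it scale to large data sets.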