In this paper, we investigate and characterize the effects of multi-channel and rendezvous protocols on the connectivity of dynamic spectrum access networks using percolation theory. In particular, we focus on the scenario where the secondary nodes have plenty of vacant channels to choose from, a phenomenon we define as channel abundance. To cope with the existence of multiple channels, we use two types of rendezvous protocols: naive ones, which do not guarantee a common channel, and advanced ones, which do. We show that, with more channel abundance, even when either type of rendezvous protocol is used, it becomes difficult for two nodes to agree on a common channel, so that they potentially remain invisible to each other. We model this invisibility as a Poisson thinning process and show that invisibility is even more pronounced with channel abundance. Following the disk graph model, we represent the multiple channels as parallel edges in a graph and build a multi-layered graph (MLG) in ℝ². In order to study connectivity, we show how percolation occurs in the MLG by coupling it with a typical discrete percolation. Using a Boolean model and the MLG, we study both the absence and the presence of primaries. For both cases, we define and characterize the connectivity of the secondary network in terms of the available number of channels, deployment densities, number of simultaneous transmissions per node, and communication range. When primary users are absent, we derive the critical number of channels which maintains supercriticality of the secondary network. When primary users are present, we characterize and analyze connectivity for all three regions: channel abundance, optimal, and channel deprivation. For each region, we show the requirements and the outcome of using either type of rendezvous technique. Moreover, we find the tradeoff between deployment density and rendezvous probability that results in a connected network. Our results can be used to judge the quality of any channel rendezvous algorithm by computing the expected resulting connectivity. They also provide a guideline for achieving connectivity using minimal resources.
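As a rough illustration of the Poisson thinning idea, the following minimal sketch (not taken from the paper) assumes secondary nodes form a homogeneous Poisson point process and that each node independently remains visible with some rendezvous probability p; the density, region size, and p below are toy values. The thinned process is again Poisson with density reduced by the factor p, which is why poor rendezvous behaves like a lower effective deployment density.

```python
import numpy as np

rng = np.random.default_rng(0)

def poisson_points(lam, side):
    """Sample a homogeneous Poisson point process of density lam
    on a side x side square."""
    n = rng.poisson(lam * side * side)
    return rng.uniform(0.0, side, size=(n, 2))

def thin(points, p):
    """Independent (Poisson) thinning: keep each point with probability p.
    The result is again a Poisson process, with density lam * p."""
    keep = rng.uniform(size=len(points)) < p
    return points[keep]

lam, side = 5.0, 20.0      # toy density and region side length
p_rendezvous = 0.3         # toy prob. a node can rendezvous on a common channel
pts = poisson_points(lam, side)
visible = thin(pts, p_rendezvous)
# Empirical densities: ~lam for the full process, ~lam * p after thinning.
print(len(pts) / side**2, len(visible) / side**2)
```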
In recent years, research on the congestion problem in 4G and 5G networks has grown, especially work based on artificial intelligence (AI). Although 4G with LTE is seen as a mature technology, continuous infrastructure improvements have led to the emergence of 5G networks. The large-scale services provided in industry, Internet of Things (IoT) applications, and smart cities, which involve large amounts of exchanged data, large numbers of connected devices per area, and high data rates, have brought their own problems and challenges, especially congestion. In this context, artificial intelligence (AI) models can be considered one of the main techniques for solving network congestion…
Finding communities of connected individuals in complex networks is challenging, yet crucial for understanding different real-world societies and their interactions. Recently, attention has turned to discovering the dynamics of such communities. However, detecting accurate community structures that evolve over time adds further challenges. Almost all state-of-the-art algorithms are designed around seemingly the same principle, treating the problem as a coupled optimization model that simultaneously identifies community structures and their evolution over time. Unlike these studies, the current work aims to consider three measures individually, i.e., the intra-community score, the inter-community score, and the evolution of communities over time…
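To make the first two measures concrete, here is a minimal sketch with illustrative definitions (not the paper's exact measures): the intra-community score as the fraction of edges falling inside communities and the inter-community score as the fraction crossing between them, computed on a standard test graph with an off-the-shelf partition.

```python
import networkx as nx

def intra_inter_scores(G, communities):
    """Toy intra/inter-community scores for a partition:
    fraction of edges inside communities vs. between them.
    Illustrative definitions only, not the paper's measures."""
    label = {n: i for i, c in enumerate(communities) for n in c}
    intra = sum(1 for u, v in G.edges if label[u] == label[v])
    m = G.number_of_edges()
    return intra / m, (m - intra) / m

G = nx.karate_club_graph()                          # stand-in network
parts = nx.community.louvain_communities(G, seed=0) # stand-in partition
print(intra_inter_scores(G, parts))
```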
The objective of this work is to combine human biometric characteristics with unique attributes of the computer in order to protect computer networks and resource environments through the development of authentication and authorization techniques. On the biometric side, the best methods and algorithms were studied, with the conclusion that the fingerprint is the best, although it has some flaws. The fingerprint algorithm was therefore improved so that it enhances the clarity of the ridge and valley structures of fingerprint images, taking into account the estimated orientation and frequency of nearby ridges. On the computer side, a computer and its components, like a human, have unique…
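The general idea of binding a biometric to a specific machine can be sketched as follows; this is a hypothetical illustration, not the paper's actual scheme, and it assumes a fingerprint template is already available as bytes and uses the MAC address (via uuid.getnode()) as a stand-in hardware attribute.

```python
import hashlib
import uuid

def auth_token(fingerprint_template: bytes) -> str:
    """Hypothetical sketch: derive an authentication token that binds a
    biometric template to this machine by hashing it together with a
    hardware identifier. A real system would use a stronger, tamper-resistant
    machine attribute (e.g., a TPM key) rather than the MAC address."""
    machine_id = uuid.getnode().to_bytes(6, "big")  # MAC address as toy machine attribute
    return hashlib.sha256(fingerprint_template + machine_id).hexdigest()

# Placeholder bytes standing in for an extracted minutiae feature vector.
print(auth_token(b"example-minutiae-feature-vector"))
```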
Currently, breast cancer is one of the most common cancers and a main cause of women's deaths worldwide, particularly in developing countries such as Iraq. Our work aims to predict whether a tumor is benign or malignant through models built using logistic regression and neural networks, and we hope it will help doctors detect the type of breast tumor. Four models were built using binary logistic regression and two different types of artificial neural networks, namely the multilayer perceptron (MLP) and the radial basis function (RBF) network. Evaluation of the validated and trained models was done using several performance metrics such as accuracy, sensitivity, specificity, and AUC (area under the receiver operating characteristic curve)…
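A minimal sketch of the logistic regression variant of such a benign/malignant classifier is below. It uses the public Wisconsin breast cancer dataset bundled with scikit-learn as a stand-in for the study's own clinical data, and reports two of the metrics named above (accuracy and AUC).

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, roc_auc_score
from sklearn.model_selection import train_test_split

# Stand-in data: the paper evaluates on its own clinical dataset,
# not this public Wisconsin dataset.
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

clf = LogisticRegression(max_iter=5000).fit(X_tr, y_tr)
prob = clf.predict_proba(X_te)[:, 1]   # predicted probability of the positive class
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
print("AUC:", roc_auc_score(y_te, prob))
```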
Nowadays, 3D content is becoming an essential part of multimedia applications; when 3D content is not protected, hackers may attack and steal it. This paper introduces a proposed scheme that provides high protection for 3D content by implementing multiple levels of security while preserving the original size using a weight factor (w). The first level of security encrypts the texture map based on a 2D logistic chaotic map. The second level shuffles the vertices (confusion) based on a 1D tent chaotic map. The third level modifies the vertex values (diffusion) based on a 3D Lorenz chaotic map. Results illustrate that the proposed scheme completely deforms the entire 3D content according to…
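The confusion step can be illustrated with a short sketch: iterate a 1D tent map from a secret key (x0, mu) and permute the vertex list by the ranking of the resulting chaotic sequence. This is a generic tent-map shuffle under assumed toy parameters, not the paper's exact construction.

```python
import numpy as np

def tent_sequence(x0, mu, n):
    """Iterate the 1D tent map: x -> mu*x if x < 0.5, else mu*(1 - x)."""
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = mu * x if x < 0.5 else mu * (1.0 - x)
        xs[i] = x
    return xs

def shuffle_vertices(vertices, x0=0.37, mu=1.99):
    """Confusion sketch: permute vertices by ranking a chaotic sequence
    keyed by (x0, mu). Invertible given the same key."""
    perm = np.argsort(tent_sequence(x0, mu, len(vertices)))
    return vertices[perm], perm

verts = np.random.rand(10, 3)   # toy 3D vertex list
shuffled, perm = shuffle_vertices(verts)
restored = np.empty_like(shuffled)
restored[perm] = shuffled       # invert the permutation with the same key
assert np.allclose(restored, verts)
```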
Researchers are increasingly using multimodal biometrics to strengthen the security of biometric applications. In this study, a strong multimodal human identification model was developed to address the growing problem of spoofing attacks on biometric security systems. Using metaheuristic optimization methods such as the genetic algorithm (GA), ant colony optimization (ACO), and particle swarm optimization (PSO) for feature selection, this model incorporates three biometric modalities: face, iris, and fingerprint. Image pre-processing, feature extraction, critical image feature selection, and multibiometric recognition are the four main steps in the system's workflow. To determine its performance, the model was evaluated…
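To show what metaheuristic feature selection looks like in this setting, here is a simplified binary PSO sketch (not the paper's implementation): each particle is a binary mask over features, fitness is cross-validated classifier accuracy, and velocities pass through a sigmoid to give per-bit selection probabilities. The digits dataset and k-NN classifier are stand-ins for the extracted face/iris/fingerprint features and the recognition stage.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)
X, y = load_digits(return_X_y=True)   # stand-in for multimodal biometric features
d = X.shape[1]

def fitness(mask):
    """Cross-validated accuracy of a k-NN classifier on the selected features."""
    if mask.sum() == 0:
        return 0.0
    return cross_val_score(KNeighborsClassifier(3), X[:, mask], y, cv=3).mean()

n_particles, iters = 8, 10
pos = rng.random((n_particles, d)) < 0.5          # binary feature masks
vel = rng.normal(0, 1, (n_particles, d))
pbest, pbest_fit = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, d))
    # Simplified binary velocity update: attraction toward bits that
    # differ from the personal/global bests (XOR as the "distance").
    vel = 0.7 * vel + 1.5 * r1 * (pbest ^ pos) + 1.5 * r2 * (gbest ^ pos)
    pos = rng.random((n_particles, d)) < 1.0 / (1.0 + np.exp(-vel))
    fit = np.array([fitness(p) for p in pos])
    better = fit > pbest_fit
    pbest[better], pbest_fit[better] = pos[better], fit[better]
    gbest = pbest[pbest_fit.argmax()].copy()

print("selected features:", gbest.sum(), "accuracy:", pbest_fit.max())
```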
Multi-walled carbon nanotubes from the Cheap Tubes company (MWCNT-CP) were purified by an alcohol / H2O2 / separation-funnel procedure, which is a simple, easy, and scalable technique. The purification steps were characterized by X-ray diffraction, Raman spectroscopy, scanning electron microscopy (SEM) with energy-dispersive X-ray spectroscopy (EDX), and surface area measurements. The technique succeeded in removing most of the trace elements from the MWCNT-CP, which increased the surface area. The impurity ratio was reduced to less than 0.6% after the three-step treatment, with a loss of less than 5% of the MWCNT-CP.