In this paper, we investigate and characterize the effects of multi-channel and rendezvous protocols on the connectivity of dynamic spectrum access networks using percolation theory. In particular, we focus on the scenario where the secondary nodes have plenty of vacant channels to choose from, a phenomenon which we define as channel abundance. To cope with the existence of multiple channels, we use two types of rendezvous protocols: naive ones, which do not guarantee a common channel, and advanced ones, which do. We show that, with more channel abundance, even with the use of either type of rendezvous protocol, it becomes difficult for two nodes to agree on a common channel, thereby potentially remaining invisible to each other. We model this invisibility as a Poisson thinning process and show that invisibility is even more pronounced with channel abundance. Following the disk graph model, we represent the multiple channels as parallel edges in a graph and build a multi-layered graph (MLG) in ℝ². In order to study connectivity, we show how percolation occurs in the MLG by coupling it with a typical discrete percolation. Using a Boolean model and the MLG, we study both cases of the primaries' absence and presence. For both cases, we define and characterize the connectivity of the secondary network in terms of the available number of channels, deployment densities, number of simultaneous transmissions per node, and communication range. When primary users are absent, we derive the critical number of channels which maintains supercriticality of the secondary network. When primary users are present, we characterize and analyze connectivity for all three regions: channel abundance, optimal, and channel deprivation. For each region, we show the requirements and the outcome of using either type of rendezvous technique. Moreover, we find the tradeoff between deployment density and rendezvous probability which results in a connected network. Our results can be used to judge the goodness of any channel rendezvous algorithm by computing the expected resultant connectivity. They also provide a guideline for achieving connectivity using minimal resources.
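As a rough illustration of the thinning effect described above, the sketch below (Python, with illustrative values for the node density, the number of vacant channels C, and the channels sensed per node k, none of which come from the paper) computes the probability that a naive rendezvous succeeds when each node independently picks k of C channels, and the resulting thinned effective density:

```python
from math import comb

def naive_rendezvous_prob(C: int, k: int) -> float:
    """Probability that two nodes, each independently picking k of C
    vacant channels, share at least one channel (naive rendezvous)."""
    if 2 * k > C:
        return 1.0  # pigeonhole: overlap is guaranteed
    # P(no overlap) = C(C-k, k) / C(C, k); take the complement.
    return 1.0 - comb(C - k, k) / comb(C, k)

# Poisson thinning: only neighbor pairs that rendezvous stay "visible",
# so the effective density is p * lam.
lam = 5.0  # illustrative secondary-node density (nodes per unit area)
for C in (4, 16, 64, 256):
    p = naive_rendezvous_prob(C, k=2)
    print(f"C={C:3d}  p_rendezvous={p:.3f}  effective density={p * lam:.2f}")
```

As C grows with k fixed, the rendezvous probability, and hence the thinned density, falls toward zero, which is the channel-abundance effect the abstract describes.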
In this study, phytoplankton density, chlorophyll-a, and selected physico-chemical parameters were investigated in the Erbil wastewater channel. The surveys were carried out monthly from May 2003 to April 2004. Sampling was established at three sites from the headwaters to the mouth. The results showed that pH was on the alkaline side of neutrality, with significant differences (P<0.05) between sites 1 and 3. TSS concentration decreased from site 1 toward site 2 (mean values, 80.15 to 25.79 mg.l-1). A clear gradual increase in mineral content (TDS) was observed from site 1 of the channel towards the mouth. Soluble reactive phosphate reached a maximum mean concentration of 48.4 µg.l-1, recorded at site 2. A high positive relat...
This work presents the use of a laser diode in fiber distributed data interface (FDDI) networks. FDDI uses optical fiber as the transmission medium, which solves the problems resulting from EMI and noise and, in addition, increases the security of transmission. A network with a ring topology consisting of three computers was designed and implemented. The timed token protocol was used to achieve and control the process of communication over the ring. Non-return-to-zero inverted (NRZI) modulation was carried out as a part of the physical (PHY) sublayer. The optical system consists of a laser diode with a wavelength of 820 nm and 2.5 mW maximum output power as the source, an optical fiber as the channel, and a positive intrinsic negative (PIN) photodiode...
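A minimal sketch of NRZI line coding as used in FDDI's PHY sublayer: a 1 bit is encoded as a transition in the line level and a 0 bit as no transition. The bit pattern and initial level below are arbitrary examples, not taken from the implementation described:

```python
def nrzi_encode(bits, initial_level=0):
    """NRZI: invert the line level on every 1 bit, hold it on 0.
    FDDI pairs this with 4B/5B coding so that long runs without
    transitions are avoided (not shown here)."""
    level = initial_level
    out = []
    for b in bits:
        if b == 1:
            level ^= 1  # a transition encodes a 1
        out.append(level)
    return out

print(nrzi_encode([1, 0, 1, 1, 0, 0, 1]))  # -> [1, 1, 0, 1, 1, 1, 0]
```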
Multiple linear regression is concerned with studying and analyzing the relationship between a dependent variable and a set of explanatory variables, and from this relationship the values of the dependent variable are predicted. In this paper, a multiple linear regression model with three covariates was studied in the presence of autocorrelated errors, with the random errors following an exponential distribution. Three methods were compared: generalized least squares, M-robust, and the Laplace robust method. We employed simulation studies and calculated the mean squared error for sample sizes of 15, 30, 60, and 100. Further, we applied the best method to real experimental data representing the varieties of...
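A minimal sketch of the kind of Monte Carlo comparison described, assuming AR(1)-autocorrelated errors driven by centred exponential innovations and illustrative coefficient values (the paper's actual data and robust estimators are not reproduced; only OLS and a known-rho GLS transform are contrasted here):

```python
import numpy as np

rng = np.random.default_rng(0)
n, rho = 60, 0.6
beta = np.array([1.0, 2.0, -1.0, 0.5])  # illustrative true coefficients

# Design matrix with intercept + three covariates.
X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])

# AR(1) errors with centred exponential innovations.
e = rng.exponential(scale=1.0, size=n) - 1.0
u = np.empty(n)
u[0] = e[0]
for t in range(1, n):
    u[t] = rho * u[t - 1] + e[t]
y = X @ beta + u

# OLS ignores the autocorrelation.
b_ols = np.linalg.lstsq(X, y, rcond=None)[0]

# GLS via Cochrane-Orcutt quasi-differencing for known rho
# (the first observation is dropped).
Xg = X[1:] - rho * X[:-1]
yg = y[1:] - rho * y[:-1]
b_gls = np.linalg.lstsq(Xg, yg, rcond=None)[0]

print("OLS coefficient MSE:", np.mean((b_ols - beta) ** 2))
print("GLS coefficient MSE:", np.mean((b_gls - beta) ** 2))
```

Averaging such runs over many replications at each sample size yields the MSE comparison the abstract refers to.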
Medium access control (MAC) protocol design plays a crucial role in increasing the performance of wireless communications and networks. The channel access mechanism is provided by the MAC layer so that multiple stations can share the medium. Different types of wireless networks have different design requirements, such as throughput, delay, power consumption, fairness, reliability, and network density; therefore, the MAC protocol for these networks must satisfy their requirements. In this work, we proposed two multiplexing methods for modern wireless networks: massive multiple-input multiple-output (MIMO) and power-domain non-orthogonal multiple access (PD-NOMA). The first method, namely massive MIMO, uses a massive numbe...
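As a sketch of the power-domain NOMA idea mentioned above: two users' symbols are superposed with unequal powers on one resource, and the stronger (far-user) signal is decoded and subtracted first via successive interference cancellation (SIC). The power split, BPSK symbols, and noiseless channel below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two users share one resource block; the far user gets more power.
p_near, p_far = 0.2, 0.8                 # power split (p_near + p_far = 1)
s_near = rng.choice([-1, 1], 64)         # BPSK symbols, illustrative
s_far = rng.choice([-1, 1], 64)

x = np.sqrt(p_far) * s_far + np.sqrt(p_near) * s_near  # superposition

# The near user applies SIC: decode the strong (far) signal first,
# subtract it, then decode its own signal from the residual.
far_hat = np.sign(x)
residual = x - np.sqrt(p_far) * far_hat
near_hat = np.sign(residual)

print("far-user bits recovered :", np.all(far_hat == s_far))
print("near-user bits recovered:", np.all(near_hat == s_near))
```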
Technically, a mobile P2P network system architecture can be considered a distributed system (like a community), where the nodes or users can share all or some of their own software and hardware resources, such as application stores, processing time, storage, and network bandwidth, with the other nodes (users) through the Internet, and these resources can be accessed directly by the nodes in that system without the need for a central coordinating node. The main characteristic of our proposed network architecture is that all the nodes are symmetric in their functions. In this work, the security issues of the mobile P2P network system architecture, such as web threats, attacks, and encryption, will be discussed in depth, and then we prop...
Traditionally, wireless networks and optical fiber networks have been independent of each other. Wireless networks are designed to meet specific service requirements while dealing with weak physical transmission, and to maximize system resources to ensure cost effectiveness and satisfaction for the end user. In optical fiber networks, on the other hand, research efforts have instead concentrated on simplicity, low cost, future-proofness against legacy constraints, and rich services and applications through optical transparency. The ultimate goal of providing access to information whenever, and in whatever form, it is needed not only raises the requirements but also drives the technological convergence of wireless and optical networks...
There are large numbers of weaknesses in the keys generated by security algorithms. This paper presents a new algorithm to generate a 5120-bit key for a newly proposed 10-round cryptographic algorithm that combines neural networks and chaos theory (the 1D logistic map). Two neural network (NN) models, Adaline and Hopfield, are employed, and their results are combined through several sequential operations, carefully integrating high-quality random number generators from neural networks and chaos theory to obtain a key with suitable randomness and complexity.
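A minimal sketch of the chaotic half of such a generator: iterating the 1D logistic map in its chaotic regime and thresholding the orbit to harvest key bits. The parameter r = 3.99, the seed, and the 0.5 threshold are illustrative assumptions; the paper's combination with the Adaline and Hopfield outputs is not reproduced here:

```python
def logistic_key_bits(x0: float, r: float = 3.99, n_bits: int = 128):
    """Iterate the logistic map x -> r*x*(1-x) in its chaotic regime
    (r close to 4) and threshold the orbit to extract key bits."""
    x, bits = x0, []
    for _ in range(n_bits):
        x = r * x * (1.0 - x)
        bits.append(1 if x >= 0.5 else 0)
    return bits

# x0 acts as (part of) the secret seed; a full generator would mix this
# stream with the NN-derived bits before assembling the 5120-bit key.
print("".join(map(str, logistic_key_bits(0.612345))))
```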
In this paper we give definitions, properties, and examples of the notion of a type-N topological space. Throughout this paper, N is a finite positive number, N ≥ 2. The task of this paper is to study and investigate some properties of such spaces, together with a relation between this space and artificial neural networks (NNs); that is, we apply the definition of this space in the computer field, and especially in parallel processing.
Steganography is a technique for concealing secret data within other quotidian files of the same or a different type. Hiding data has become essential to digital information security. This work aims to design a stego method that can effectively hide a message inside the frames of a video file. In this work, a video steganography model is proposed by training a model to hide a video (or images) within another video using convolutional neural networks (CNNs). By using a CNN in this approach, two main goals of any steganographic method can be achieved: increased security (difficulty of being observed and broken by a steganalysis program), which was achieved in this work because the weights and architecture are randomized. Thus,...
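A minimal sketch, in PyTorch, of the hiding half of such a model: a small CNN that takes a cover frame and a secret frame and emits a stego frame of the same shape. The layer sizes and the jointly trained reveal network are assumptions for illustration, not the proposed architecture:

```python
import torch
import torch.nn as nn

class HideNet(nn.Module):
    """Toy encoder: concatenates cover and secret frames channel-wise
    and emits a stego frame. A paired RevealNet (not shown) would be
    trained jointly to recover the secret from the stego frame."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(6, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, cover, secret):
        return self.net(torch.cat([cover, secret], dim=1))

cover = torch.rand(1, 3, 64, 64)   # one RGB cover frame
secret = torch.rand(1, 3, 64, 64)  # one RGB secret frame
stego = HideNet()(cover, secret)
print(stego.shape)  # torch.Size([1, 3, 64, 64])
```

Randomly initializing the weights and varying the architecture per deployment, as the abstract notes, makes the embedding function itself part of the secret.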