In this paper, we investigate and characterize the effects of multiple channels and rendezvous protocols on the connectivity of dynamic spectrum access networks using percolation theory. In particular, we focus on the scenario where the secondary nodes have plenty of vacant channels to choose from, a phenomenon which we define as channel abundance. To cope with the existence of multiple channels, we use two types of rendezvous protocols: naive ones, which do not guarantee a common channel, and advanced ones, which do. We show that, with more channel abundance, even when either type of rendezvous protocol is used, it becomes difficult for two nodes to agree on a common channel, so they may remain invisible to each other. We model this invisibility as a Poisson thinning process and show that invisibility is even more pronounced with channel abundance. Following the disk graph model, we represent the multiple channels as parallel edges in a graph and build a multi-layered graph (MLG) in R^2. In order to study connectivity, we show how percolation occurs in the MLG by coupling it with a typical discrete percolation. Using a Boolean model and the MLG, we study both the absence and the presence of primary users. For both cases, we define and characterize the connectivity of the secondary network in terms of the available number of channels, deployment densities, number of simultaneous transmissions per node, and communication range. When primary users are absent, we derive the critical number of channels which maintains supercriticality of the secondary network. When primary users are present, we characterize and analyze connectivity for all three regions: channel abundance, optimal, and channel deprivation. For each region, we show the requirements and the outcome of using either type of rendezvous technique. Moreover, we find the tradeoff between deployment density and rendezvous probability which results in a connected network. Our results can be used to assess the quality of any channel rendezvous algorithm by computing the expected resulting connectivity. They also provide a guideline for achieving connectivity using minimal resources.
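To make the thinning argument concrete, the following is a brief worked sketch under assumptions not stated in the abstract: secondary nodes form a homogeneous Poisson point process, and a naive rendezvous rule gives a common-channel probability of 1/C over C channels (an illustrative form only; the paper's exact model and thresholds may differ).

```latex
% Illustrative sketch, not the paper's exact model.
% Secondary nodes: homogeneous Poisson point process of intensity \lambda in R^2,
% communication range r (Boolean/disk model). With a naive rendezvous protocol in
% which each node hops uniformly at random over C channels, two neighbors agree on
% a common channel with probability p(C) = 1/C (assumed form; an advanced protocol
% would give a larger p(C)). Independent (Poisson) thinning of the link process gives
\[
  \lambda_{\mathrm{eff}}(C) \;=\; p(C)\,\lambda \;=\; \frac{\lambda}{C},
\]
% so the mean number of visible neighbors is \lambda_{\mathrm{eff}}(C)\,\pi r^2, and
% supercriticality of the secondary network requires
\[
  \frac{\lambda}{C}\,\pi r^2 \;>\; d_c
  \quad\Longleftrightarrow\quad
  C \;<\; \frac{\lambda\,\pi r^2}{d_c},
\]
% where d_c denotes the critical mean degree of the Poisson Boolean model. This is
% consistent with the qualitative claim that excessive channel abundance pushes the
% network below the percolation threshold.
```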
In this research, the results of applying Artificial Neural Networks with a modified activation function to perform online and offline identification of a four Degrees of Freedom (4-DOF) Selective Compliance Assembly Robot Arm (SCARA) manipulator robot are described. The proposed identification strategy consists of a feed-forward neural network with a modified activation function that operates in parallel with the SCARA robot model. Feed-Forward Neural Networks (FFNN) trained both online and offline have been used, without requiring any previous knowledge about the system to be identified. The activation function used in the hidden layer of the FFNN is a modified version of the wavelet function.
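As an illustrative sketch only (the paper's exact network architecture, wavelet modification, and SCARA dynamics are not given here), the code below shows a small feed-forward network whose hidden layer uses a Mexican-hat (Ricker) wavelet activation, trained online in parallel with a plant to imitate its input-output map. The toy plant, layer sizes, and learning rate are assumptions, not the paper's settings.

```python
# Illustrative sketch only: a feed-forward network with a wavelet ("Mexican hat")
# hidden activation, trained online to identify a plant running in parallel.
# The plant below is a toy 2-input nonlinear map standing in for the SCARA model.
import numpy as np

rng = np.random.default_rng(0)

def wavelet(z):
    # Mexican-hat (Ricker) wavelet used as the hidden activation.
    return (1.0 - z**2) * np.exp(-0.5 * z**2)

def wavelet_grad(z):
    # d/dz [(1 - z^2) exp(-z^2/2)] = (z^3 - 3z) exp(-z^2/2)
    return (z**3 - 3.0 * z) * np.exp(-0.5 * z**2)

def plant(u):
    # Toy stand-in for the robot model: a smooth nonlinear 2-in/1-out map.
    return np.sin(u[0]) + 0.5 * u[1]**2

n_in, n_hid = 2, 10
W1 = rng.normal(scale=0.5, size=(n_hid, n_in)); b1 = np.zeros(n_hid)
W2 = rng.normal(scale=0.5, size=n_hid); b2 = 0.0
lr = 0.02

for step in range(20000):                     # online identification loop
    u = rng.uniform(-1.0, 1.0, size=n_in)     # excitation fed to plant and network
    y = plant(u)                              # measured plant output
    z = W1 @ u + b1                           # hidden pre-activation
    h = wavelet(z)                            # modified (wavelet) activation
    y_hat = W2 @ h + b2                       # network output
    e = y_hat - y                             # identification error
    # Online gradient-descent (backpropagation) update.
    dz = e * W2 * wavelet_grad(z)
    W2 -= lr * e * h; b2 -= lr * e
    W1 -= lr * np.outer(dz, u); b1 -= lr * dz

# After training, the network approximates the plant on the excited region:
u_test = np.array([0.3, -0.7])
print(plant(u_test), W2 @ wavelet(W1 @ u_test + b1) + b2)
```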
With the widespread use of the internet, and especially the web of social media, an enormous quantity of information is produced, spanning study fields such as psychology, entertainment, sociology, business, news, politics, and other cultural fields of nations. Data mining methodologies applied to social media can produce valuable insight into human behaviour and interaction. This paper demonstrates the application and precision of sentiment analysis using a traditional feedforward network and two recurrent neural networks (the gated recurrent unit (GRU) and long short-term memory (LSTM)) to find the differences between them. In order to test the system's performance, a set of tests is applied on two public datasets. The first…
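A minimal sketch of how the GRU and LSTM classifiers being compared might be built, assuming a Keras/TensorFlow environment and an already tokenized and padded dataset; the vocabulary size, layer widths, and training settings are illustrative, not the paper's configuration.

```python
# Minimal sketch of a GRU-vs-LSTM sentiment comparison (assumed hyperparameters).
import tensorflow as tf
from tensorflow.keras import layers

VOCAB_SIZE = 20000          # assumed tokenizer setting

def build_model(rnn_layer):
    # Same embedding and output head; only the recurrent cell differs.
    return tf.keras.Sequential([
        layers.Embedding(VOCAB_SIZE, 128),
        rnn_layer,
        layers.Dense(1, activation="sigmoid"),   # binary sentiment
    ])

models = {
    "GRU":  build_model(layers.GRU(64)),
    "LSTM": build_model(layers.LSTM(64)),
}
for name, model in models.items():
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    # model.fit(x_train, y_train, validation_data=(x_val, y_val), epochs=5)
    # model.evaluate(x_test, y_test)   # compare accuracies across the two cells
```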
... Show More<p>Vehicular ad-hoc networks (VANET) suffer from dynamic network environment and topological instability that caused by high mobility feature and varying vehicles density. Emerging 5G mobile technologies offer new opportunities to design improved VANET architecture for future intelligent transportation system. However, current software defined networking (SDN) based handover schemes face poor handover performance in VANET environment with notable issues in connection establishment and ongoing communication sessions. These poor connectivity and inflexibility challenges appear at high vehicles speed and high data rate services. Therefore, this paper proposes a flexible handover solution for VANET networks by integrating SDN and
The OpenStreetMap (OSM) project aims to establish a free geospatial database of the entire world that is editable by international volunteers. The OSM database contains a wide range of types of geographical data and characteristics, including highways, buildings, and land use regions. The varying scientific backgrounds of the volunteers can affect the quality of the spatial data that is produced and shared on the internet as an OSM dataset. This study aims to compare the completeness and attribute accuracy of the OSM road networks with data supplied by a digitizing process for areas in the Baghdad and Thi-Qar governorates. The analyses are primarily based on calculating the proportion of commission (extra roads) and…
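As a sketch of one common way to compute such commission and completeness figures by buffer overlay (not necessarily the study's exact workflow; the file names, coordinate system, and 10 m matching tolerance below are assumptions):

```python
# Illustrative sketch: commission (extra OSM roads absent from the reference
# digitized network) and completeness estimated by buffer overlay.
import geopandas as gpd

osm = gpd.read_file("osm_roads.shp").to_crs(epsg=32638)        # assumed UTM zone 38N (meters)
ref = gpd.read_file("digitized_roads.shp").to_crs(epsg=32638)

buffer_m = 10.0                                    # assumed matching tolerance
osm_union = osm.geometry.unary_union
ref_union = ref.geometry.unary_union

extra = osm_union.difference(ref_union.buffer(buffer_m))       # OSM roads with no reference match
commission_pct = 100.0 * extra.length / osm_union.length

missing = ref_union.difference(osm_union.buffer(buffer_m))     # reference roads absent from OSM
completeness_pct = 100.0 * (1.0 - missing.length / ref_union.length)

print(f"commission: {commission_pct:.1f}%  completeness: {completeness_pct:.1f}%")
```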
This paper sets forth the spatial suitability of the informal settlements that the Iraqi government intends to distribute to poor people. The Iraqi government identified 9 locations of informal settlement in Baghdad city and accepted them as a reality, as a way of helping people obtain housing. This paper examines the suitability of those locations and determines which ones are more suitable than others for living. The analysis was carried out in a GIS environment using spatial analysis. According to the results, the most suitable areas for housing development were identified using several criteria (distance from the city center, proximity to transport routes, proximity to high v…
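The following is a generic weighted multi-criteria scoring sketch of the kind of GIS suitability ranking the abstract describes; the site values, the two criteria used, and the weights are invented for illustration and are not the paper's data or method.

```python
# Hypothetical sketch: ranking candidate settlement sites by a weighted
# multi-criteria score (illustrative values only, not the study's data).
import numpy as np

# rows = candidate sites; columns = (distance to city center [km],
#                                    distance to transport routes [km])
sites = np.array([
    [12.0, 1.5],
    [ 6.0, 0.7],
    [18.0, 3.2],
])
weights = np.array([0.6, 0.4])          # assumed relative importance of the criteria

# Normalize each criterion to [0, 1]; here smaller distance is better for both,
# so the normalized values are inverted before weighting.
norm = (sites - sites.min(axis=0)) / (sites.max(axis=0) - sites.min(axis=0))
score = (1.0 - norm) @ weights

best = int(np.argmax(score))
print("suitability scores:", score.round(2), "-> best site index:", best)
```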
The approach of this research is to simulate residual chlorine decay through the potable water distribution network of Gukook city. EPANET software was used for estimating and predicting chlorine concentration at different points of the water network. The data required as program inputs (pipe properties) were taken from the Baghdad Municipality, and the factors that affect residual chlorine concentration (pH, temperature, pressure, and flow rate) were measured. Twenty-five samples were tested from November 2016 to July 2017. The residual chlorine values varied between 0.2 and 2 mg/L, the pH values varied between 7.6 and 8.2, and the pressure was very weak in this region. Statistical analyses were used to evaluate errors. The concentrations calculated by the calib…
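For intuition, the sketch below evaluates a first-order bulk decay reaction along a single pipe, which is one of the standard reaction models EPANET supports for chlorine; the inlet concentration, decay coefficient, and pipe data are assumed example values, not the study's calibrated parameters.

```python
# Illustrative sketch: first-order bulk decay of residual chlorine along one pipe.
import math

C0 = 1.2            # chlorine concentration at the pipe inlet, mg/L (assumed)
k_b = 0.5           # first-order bulk decay coefficient, 1/day (assumed)
length_m = 800.0    # pipe length (assumed)
velocity_mps = 0.3  # flow velocity (assumed)

travel_time_days = length_m / velocity_mps / 86400.0
C_end = C0 * math.exp(-k_b * travel_time_days)    # C(t) = C0 * exp(-k_b * t)
print(f"residual chlorine after {length_m:.0f} m: {C_end:.3f} mg/L")
```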
This work proposes a new video buffer framework (VBF) to achieve a favorable quality of experience (QoE) for video streaming in cellular networks. The proposed framework consists of three main parts: a client selection algorithm, a categorization method, and a distribution mechanism. The client selection algorithm, named the independent client selection algorithm (ICSA), is proposed to select the best clients, i.e., those with the least interfering effect on video quality, and to recognize clients' urgency based on buffer occupancy level. In the categorization method, each frame in the video buffer is given a specific number for better estimation of the playout outage probability, so that the framework can efficiently handle many frames from different video…
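Below is a hypothetical sketch of buffer-occupancy-driven client prioritization, meant only to illustrate the idea the abstract describes; the scoring rule, threshold, and interference estimate are invented for illustration and are not the ICSA algorithm.

```python
# Hypothetical sketch: prioritize clients whose buffers are nearly empty (urgent),
# then prefer clients that interfere least with others. NOT the paper's ICSA.
from dataclasses import dataclass

@dataclass
class Client:
    name: str
    buffer_s: float        # seconds of video currently buffered
    interference: float    # 0 (none) .. 1 (strong), estimated effect on other clients

URGENT_THRESHOLD_S = 4.0   # assumed: below this, playout outage is imminent

def schedule(clients, slots):
    # Urgent clients first (low buffer), then lower interference, then lower buffer.
    ranked = sorted(
        clients,
        key=lambda c: (c.buffer_s >= URGENT_THRESHOLD_S, c.interference, c.buffer_s),
    )
    return ranked[:slots]

clients = [
    Client("A", buffer_s=2.1, interference=0.7),
    Client("B", buffer_s=9.5, interference=0.1),
    Client("C", buffer_s=3.0, interference=0.2),
]
print([c.name for c in schedule(clients, slots=2)])   # urgent, low-interference first
```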
In recent years, the number of applications utilizing mobile wireless sensor networks (WSNs) has increased, with the intent of localization for the purposes of monitoring and obtaining data from hazardous areas. The location of an event is critical in a WSN, as sensing data is almost meaningless without location information. In this paper, two Monte Carlo based localization schemes, termed MCL and MSL*, are studied. MCL obtains its location through anchor nodes, whereas MSL* uses both anchor nodes and normal nodes. The use of normal nodes increases accuracy and reduces dependency on anchor nodes, but increases communication costs. For this reason, we introduce a new approach, called low communication cost schemes, to reduce communication…
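For reference, here is a simplified sketch of the basic anchor-based Monte Carlo Localization (MCL) step the abstract refers to: predict candidate positions within the node's maximum speed, then keep only samples consistent with the anchors currently heard. The radio range, speed, field size, and anchor positions are assumed example values, and the resampling strategy is simplified relative to the published schemes.

```python
# Simplified sketch of one anchor-based MCL predict-and-filter iteration.
import random, math

R, V_MAX, N = 50.0, 10.0, 200          # radio range, max speed per step, sample count

def within(p, q, d):
    return math.hypot(p[0] - q[0], p[1] - q[1]) <= d

def mcl_step(samples, heard_anchors):
    """Resample candidate positions until N samples consistent with the anchors remain."""
    new = []
    while len(new) < N:
        x, y = random.choice(samples)
        # Prediction: the node moved at most V_MAX since the last step.
        ang, dist = random.uniform(0, 2 * math.pi), random.uniform(0, V_MAX)
        cand = (x + dist * math.cos(ang), y + dist * math.sin(ang))
        # Filtering: candidate must lie within range R of every heard anchor.
        if all(within(cand, a, R) for a in heard_anchors):
            new.append(cand)
    return new

# Initial guess: samples spread over the deployment area (assumed 100 x 100 field).
samples = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(N)]
heard = [(30.0, 40.0), (60.0, 45.0)]          # anchors heard this round (assumed)
samples = mcl_step(samples, heard)
estimate = (sum(x for x, _ in samples) / N, sum(y for _, y in samples) / N)
print("estimated position:", estimate)
```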