Finding the shortest route in wireless mesh networks is an important problem. Many techniques have been used to solve it, such as dynamic programming, evolutionary algorithms, weighted-sum techniques, and others. In this paper, we use dynamic programming techniques to find the shortest path in wireless mesh networks because of their generality, their reduction of complexity and facilitation of numerical computation, the ease with which constraints can be incorporated, and their conformity to the stochastic nature of some problems. The routing problem is a multi-objective optimization problem with constraints such as path capacity and end-to-end delay. Single-constraint routing problems and solutions based on the Dijkstra, Bellman-Ford, and Floyd-Warshall algorithms are proposed in this work, with a discussion of the differences between them. These algorithms find the shortest route by finding the optimal rate between two nodes in the wireless network subject to a bounded end-to-end delay. The Dijkstra-based algorithm is especially favorable in terms of processing time. We also present a comparison between our proposed single-constraint Dijkstra-based routing algorithm and the mesh routing algorithm (MRA) from the literature to clarify the merits of the former.
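The abstract does not spell out the authors' exact formulation, so the following is only a minimal sketch of the general idea it describes: among routes whose end-to-end delay stays under a bound, pick the one with the highest bottleneck rate, using repeated Dijkstra runs on delay. The graph, rates, delays, and delay bound below are all hypothetical.

```python
import heapq

# Minimal sketch (not the paper's exact algorithm): find the route whose
# bottleneck rate is maximal among all routes satisfying an end-to-end
# delay bound. Each directed edge carries a hypothetical (rate, delay) pair.
# Strategy: try candidate rate thresholds from high to low; for each, keep
# only the links that are fast enough and run Dijkstra on delay.

def dijkstra_delay(graph, src, dst, min_rate):
    """Shortest end-to-end delay using only edges with rate >= min_rate."""
    dist, prev = {src: 0.0}, {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, rate, delay in graph.get(u, []):
            if rate < min_rate:
                continue
            nd = d + delay
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    if dst not in dist:
        return None, float("inf")
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[dst]

def best_rate_path(graph, src, dst, delay_bound):
    """Highest bottleneck rate whose min-delay path meets the delay bound."""
    rates = sorted({r for edges in graph.values() for _, r, _ in edges}, reverse=True)
    for r in rates:
        path, delay = dijkstra_delay(graph, src, dst, r)
        if path is not None and delay <= delay_bound:
            return r, delay, path
    return None

# Hypothetical 4-node mesh: node -> [(neighbour, rate_Mbps, delay_ms), ...]
mesh = {
    "A": [("B", 54, 2.0), ("C", 11, 1.0)],
    "B": [("D", 48, 2.5)],
    "C": [("D", 11, 1.5)],
    "D": [],
}
print(best_rate_path(mesh, "A", "D", delay_bound=5.0))  # -> (48, 4.5, ['A', 'B', 'D'])
```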
This study aims to predict Total Dissolved Solids (TDS) as a water quality indicator over the spatial and temporal distribution of the Tigris River, Iraq, using an Artificial Neural Network (ANN) model. The study was conducted on the reach of the river between Mosul and Amarah in Iraq at five stations along the river for the period from 2001 to 2011. In the ANN model calibration, a multiple linear regression program is used to obtain a set of coefficients for a linear model. The input parameters of the ANN model were the discharge of the Tigris River, the year, the month, and the distance of the sampling stations from the upstream end of the river. The sensitivity analysis indicated that the distance and discharge
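As a rough illustration of the linear baseline mentioned above, the sketch below fits a multiple linear regression from (discharge, year, month, distance) to TDS. The sample values are invented for the example; the study's data and its full ANN model are not reproduced here.

```python
import numpy as np

# Hypothetical samples: [discharge m3/s, year, month, distance km], target TDS mg/L.
# These numbers are made up purely for illustration.
X = np.array([
    [520, 2001,  3,  50],
    [480, 2003,  7, 210],
    [390, 2006, 11, 480],
    [300, 2009,  5, 700],
    [260, 2011,  9, 880],
], dtype=float)
y = np.array([420, 510, 640, 760, 830], dtype=float)

# Ordinary least squares with an intercept column, as in a multiple linear regression.
A = np.hstack([np.ones((X.shape[0], 1)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("intercept and coefficients:", coef)

# Predict TDS for a new hypothetical observation.
new = np.array([1, 350, 2010, 6, 600], dtype=float)
print("predicted TDS (mg/L):", new @ coef)
```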
Assimilation is defined by many phoneticians, such as Schane, Roach, and others, as a phonological process in which one sound changes into another because of neighboring sounds. This study investigates phoneme assimilation as a phonological process in English and Arabic and is concerned specifically with the differences and similarities between the two languages. The study also reviews the different terms used in Arabic to refer to this phenomenon and, in this way, shows whether the term 'assimilation' has the same meaning as 'idgham' in Arabic or not. Besides, in Arabic, this phenomenon is discussed from
In their growth stages, cities become an aggregation of different urban contexts as a result of development or investment projects with divergent goals, which creates urban tension at several levels. Previous studies presented different approaches and methods to address specific aspects of urban stress, and contemporary visions and propositions therefore varied, which opened a field for research. From a review of these proposals, the research problem emerged as the need to study the indicators and trends of balanced urban development that address the tensions between different social, economic, and urban contexts. Accordingly, the objective of the research is determined as "Building a comprehe
In this paper, we investigate and characterize the effects of multiple channels and rendezvous protocols on the connectivity of dynamic spectrum access networks using percolation theory. In particular, we focus on the scenario where the secondary nodes have plenty of vacant channels to choose from, a phenomenon we define as channel abundance. To cope with the existence of multiple channels, we use two types of rendezvous protocols: naive ones, which do not guarantee a common channel, and advanced ones, which do. We show that, with more channel abundance, even with the use of either type of rendezvous protocol, it becomes difficult for two nodes to agree on a common channel, so they potentially remain invisible to each other. We model this in
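The simulation below is only a toy illustration of the channel-abundance effect for a naive rendezvous protocol: two nodes hop to uniformly random channels in each slot and meet only when they choose the same one, so the per-slot meeting probability falls roughly as 1/m with the number of vacant channels m. The hopping model and parameters are assumptions, not the paper's model.

```python
import random

def naive_meeting_probability(num_channels, slots=1, trials=100_000):
    """Fraction of trials in which two nodes pick a common channel within `slots` slots."""
    meetings = 0
    for _ in range(trials):
        for _ in range(slots):
            # Each node independently hops to a uniformly random vacant channel.
            if random.randrange(num_channels) == random.randrange(num_channels):
                meetings += 1
                break
    return meetings / trials

for m in (2, 5, 10, 50, 100):
    print(f"{m:3d} vacant channels -> P(meet in one slot) ~ {naive_meeting_probability(m):.3f}")
# As the number of vacant channels grows, the meeting probability (about 1/m)
# shrinks, so the two nodes can remain invisible to each other.
```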
In this research, the damping properties of composite materials were evaluated using the logarithmic decrement method to study the effect of reinforcements on the damping ratio of the epoxy matrix. Three stages of composites were prepared. The first stage included preparing binary blends of epoxy (EP) and different weight percentages of polysulfide rubber (PSR) (0%, 2.5%, 5%, 7.5%, and 10%). It was found that a weight percentage of 5% polysulfide gives the best mechanical properties for the blend matrix. The advantage of this blend matrix is that it mediates between the brittle properties of epoxy and the flexible properties of the blend with the highest percentage of PSR. The second stage
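For reference, the logarithmic decrement method estimates the damping ratio from successive peak amplitudes of a free-vibration decay: delta = (1/n) ln(x0/xn) and zeta = delta / sqrt((2*pi)^2 + delta^2). The short sketch below applies these formulas to hypothetical peak amplitudes, not to the study's measurements.

```python
import math

def damping_ratio(peak_amplitudes):
    """Damping ratio from successive decay peaks:
    delta = (1/n) * ln(x0 / xn),  zeta = delta / sqrt((2*pi)^2 + delta^2)."""
    x0, xn = peak_amplitudes[0], peak_amplitudes[-1]
    n = len(peak_amplitudes) - 1          # number of cycles between the two peaks
    delta = math.log(x0 / xn) / n         # logarithmic decrement
    return delta / math.sqrt((2 * math.pi) ** 2 + delta ** 2)

peaks = [1.00, 0.72, 0.52, 0.37, 0.27]    # hypothetical decaying peak amplitudes
print(f"damping ratio ~ {damping_ratio(peaks):.4f}")
```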
In this paper, we study the Bayesian method using the modified exponential growth model, which is widely used to represent growth phenomena. We focus on three prior functions (the informative prior, the natural conjugate prior, and the prior that depends on previous experiments) for use in the Bayesian method. Since most observations of growth phenomena depend on one another, a correlation arises between those observations; this problem, called autocorrelation, must be treated, and the Bayesian method has been used to address it.
The goal of this study is to determine the effect of autocorrelation on estimation using the Bayesian method. F
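As a very rough sketch of the kind of model involved (not the paper's estimator), one common form of the modified exponential growth model is y_t = a + b*c^t. The code below simulates such a series with AR(1), i.e. autocorrelated, errors and computes a grid approximation of the posterior of the growth parameter c under a simple informative normal prior; a full treatment would also model the autocorrelation inside the likelihood. All parameter values and the prior choice are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
a_true, b_true, c_true = 10.0, 5.0, 1.08
t = np.arange(1, 31)

# AR(1) noise e_t = rho * e_{t-1} + w_t, producing autocorrelated observations.
rho, e = 0.6, np.zeros(t.size)
for i in range(1, t.size):
    e[i] = rho * e[i - 1] + rng.normal(scale=0.5)
y = a_true + b_true * c_true ** t + e

# Grid posterior for c: posterior proportional to likelihood * prior.
# For brevity a and b are treated as known and an iid normal likelihood is used.
c_grid = np.linspace(1.0, 1.2, 401)
sigma = 1.0
log_post = np.empty_like(c_grid)
for j, c in enumerate(c_grid):
    resid = y - (a_true + b_true * c ** t)
    log_like = -0.5 * np.sum((resid / sigma) ** 2)
    log_prior = -0.5 * ((c - 1.05) / 0.05) ** 2   # informative normal prior on c
    log_post[j] = log_like + log_prior
post = np.exp(log_post - log_post.max())
post /= post.sum()
print("posterior mean of c:", float(np.sum(c_grid * post)))
```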
With the continuous downscaling of semiconductor processes, the growing power density and thermal issues in multicore processors become more and more challenging, so reliable dynamic thermal management (DTM) is required to prevent severe losses in system performance. The accuracy of the thermal profile delivered to the DTM manager plays a critical role in the efficiency and reliability of DTM; different sources of noise and variation in deep submicron (DSM) technologies severely affect the thermal data and can lead to significant degradation of DTM performance. In this article, we propose a novel fault-tolerance scheme exploiting approximate computing to mitigate the DSM effects on DTM efficiency. Approximate computing in hardw
To ensure that a software/hardware product is of sufficient quality and functionality, it is essential to conduct thorough testing and evaluation of the numerous individual software components that make up the application. Many different approaches exist for testing software, including combinatorial testing and covering arrays. Because of the difficulty of dealing with problems such as the two-way combinatorial explosion, another issue arises: time. Using a client-server architecture, this research introduces a parallel implementation of the TWGH algorithm. Many studies have been conducted to demonstrate the efficiency of this technique. The findings of this experiment were used to determine the increase in speed and co
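For context only, the sketch below shows the coverage goal behind 2-way (pairwise) combinatorial testing that covering-array algorithms such as TWGH target. It uses a naive greedy generator over made-up parameters; it is not the TWGH algorithm or its parallel client-server implementation.

```python
from itertools import combinations, product

# Hypothetical configuration space for illustration.
parameters = {
    "os":      ["linux", "windows", "mac"],
    "browser": ["firefox", "chrome"],
    "db":      ["mysql", "postgres", "sqlite"],
}
names = list(parameters)

# Every pair of parameters and every combination of their values must be covered.
uncovered = {
    (p1, v1, p2, v2)
    for p1, p2 in combinations(names, 2)
    for v1 in parameters[p1]
    for v2 in parameters[p2]
}

tests = []
while uncovered:
    # Greedily pick the full test case that covers the most still-uncovered pairs.
    best, best_gain = None, -1
    for values in product(*(parameters[n] for n in names)):
        case = dict(zip(names, values))
        gain = sum(
            (p1, case[p1], p2, case[p2]) in uncovered
            for p1, p2 in combinations(names, 2)
        )
        if gain > best_gain:
            best, best_gain = case, gain
    tests.append(best)
    uncovered -= {(p1, best[p1], p2, best[p2]) for p1, p2 in combinations(names, 2)}

print(f"{len(tests)} test cases cover all 2-way combinations:")
for t in tests:
    print(t)
```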