Computer and Information Science 2009. Chapter: The Stochastic Network Calculus Methodology. Deah J. Kadhim, Saba Q. Jobbar, Wei Liu & Wenqing Cheng. Part of the Studies in Computational Intelligence book series (SCI, volume 208).
Abstract: The stochastic network calculus is an evolving methodology for the backlog and delay analysis of networks that can account for statistical multiplexing gain. This paper advances the stochastic network calculus by deriving a network service curve, which expresses the service given to a flow by the network as a whole in terms of a probabilistic bound. The presented network service curve permits the calculation of statistical end-to-end delay and backlog bounds for broad classes of arrival and service distributions. The benefits of the derived service curve are illustrated for the Exponentially Bounded Burstiness (EBB) traffic model. It is shown that end-to-end performance measures computed with a network service curve are bounded by O(H log H), where H is the number of nodes traversed by a flow, whereas the currently available techniques that compute end-to-end bounds by adding single-node results give performance measures bounded by O(H³).
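For context, the EBB traffic model referenced above is commonly stated as an exponential bound on the tail of the arrival process; the exact constants below are the standard textbook form, not quoted from this chapter:

```latex
% Standard statement of Exponentially Bounded Burstiness (EBB):
% the arrivals A(s,t) in the interval (s,t] exceed a rate envelope
% only with exponentially decaying probability.
\[
  P\{\, A(s,t) \ge \rho\,(t-s) + \sigma \,\} \;\le\; M e^{-\alpha\sigma},
  \qquad \forall\, 0 \le s \le t,\ \sigma \ge 0 ,
\]
```

where ρ is the long-term rate, α controls the decay of the burst tail, and M is a prefactor; bounds of this shape are what allow the per-node service curves to be composed into the O(H log H) end-to-end result.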
The concept of the active contour model has been extensively utilized in the segmentation and analysis of images. The technique has been effectively employed to identify contours in object recognition, computer graphics and vision, and biomedical image processing of both ordinary and medical images such as Magnetic Resonance Images (MRI), X-rays and ultrasound imaging. Kass, Witkin and Terzopoulos introduced this energy-minimizing "Active Contour Model" (also known as the Snake) in 1987. Snakes are curves defined within an image domain that can be set in motion by internal forces arising from the curve itself and external forces derived from the image data. …
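For reference, the energy that the snake minimizes is the classical functional of Kass, Witkin and Terzopoulos; it is not spelled out in this excerpt, so the form below is the standard one:

```latex
\[
  E_{\mathrm{snake}} \;=\; \int_0^1 \Big[
     \tfrac{1}{2}\big(\alpha\,|v'(s)|^2 + \beta\,|v''(s)|^2\big)
     + E_{\mathrm{ext}}\big(v(s)\big) \Big]\, ds ,
\]
```

where v(s) is the contour, the first two terms are the internal elasticity and bending energies, and E_ext is derived from the image (for example −|∇I(v(s))|²), so minimization pulls the curve toward image edges.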
This study looks into the many methods used in the risk assessment procedures employed in the construction industry today. As a result of the slow adoption of novel assessment methods, professionals frequently resort to strategies that have previously been validated as successful. A precise analytical tool that uses the cost of risk as its measurement and draws on the knowledge of professionals could help bridge the gap between theory and practice. This step examines the relevant literature, sorts articles by year of publication, and identifies domains and qualities. Consequently, the most significant findings have been presented in a manner …
The aim of this paper is to present a new methodology for finding the private key of RSA. A new initial value, generated from a new equation, is selected to speed up the process; after this value is found, a brute-force attack is used to discover the private key. In addition, for the proposed equation, the multiplier of the Euler totient function used to find both the public key and the private key is set to 1. This implies that the equation estimating the new initial value is suitable for the small multiplier. The experimental results show that if all prime factors of the modulus are larger than 3 and the multiplier is 1, the distance between the initial value and the private key …
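The abstract does not give the new equation or the initial value it produces, so the sketch below only illustrates the brute-force stage it describes: starting from a hypothetical initial estimate d0, candidate private exponents are tested outward until one correctly inverts the public operation. The modulus, exponent and d0 are toy values, not the paper's.

```python
def recover_private_exponent(n, e, d0, max_steps=1_000_000):
    """Brute-force the RSA private exponent d, scanning outward from an
    initial estimate d0 (illustrative only; not the paper's actual equation)."""
    probe = 2                            # test message used to verify candidates
    target = pow(probe, e, n)            # "ciphertext" of the probe message
    for offset in range(max_steps):
        for d in (d0 + offset, d0 - offset):
            # A real attack would check several probes to rule out false positives.
            if d > 0 and pow(target, d, n) == probe:
                return d
    return None

# Toy example with tiny primes p=11, q=13: n=143, phi=120, e=7, true d=103.
print(recover_private_exponent(143, 7, d0=100))   # -> 103
```

The closer the initial estimate d0 is to the true exponent, the fewer candidates the loop has to test, which is exactly the speed-up the abstract claims for its initial-value equation.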
This study was aimed at investigating response surface methodology (RSM) to evaluate the effects of various experimental conditions on the removal of levofloxacin (LVX) from aqueous solution by the electrocoagulation (EC) technique with stainless steel electrodes. The EC process was achieved successfully with an LVX removal efficiency of 90%. The results of the regression analysis showed that the experimental data are better fitted by a second-order polynomial model, with a predicted correlation coefficient (pred. R²) of 0.723, an adjusted correlation coefficient (adj. R²) of 0.907 and a correlation coefficient (R²) of 0.952. This shows that the predicted model and the experimental values are in good agreement …
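The abstract reports a second-order fit but not the experimental factors themselves, so the sketch below only shows how such a quadratic response surface is typically fitted by ordinary least squares; the two factors x1, x2 and the synthetic response are placeholders, not the study's data.

```python
import numpy as np

# Hypothetical factors (e.g. current density and electrolysis time) and a
# synthetic removal-efficiency response standing in for the real design matrix.
rng = np.random.default_rng(0)
x1, x2 = rng.uniform(0, 1, 30), rng.uniform(0, 1, 30)
y = 50 + 20*x1 + 10*x2 - 15*x1**2 - 5*x2**2 + 8*x1*x2 + rng.normal(0, 1, 30)

# Second-order (quadratic) RSM model:
# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

y_hat = X @ coef
r2 = 1 - np.sum((y - y_hat)**2) / np.sum((y - y.mean())**2)
print("coefficients:", np.round(coef, 2), " R^2:", round(r2, 3))
```

The R² printed here plays the same role as the correlation coefficients quoted in the abstract: it measures how closely the fitted quadratic surface reproduces the measured removal efficiencies.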
Information from 54 Magnetic Resonance Imaging (MRI) brain tumor images (27 benign and 27 malignant) was collected and subjected to the multilayer perceptron artificial neural network available in the well-known IBM SPSS 17 software (Statistical Package for the Social Sciences). After many attempts, an automatic architecture was adopted in this research work. Thirteen shape and statistical characteristics of the images were considered. The neural network achieved 89.1% correct classification on the training sample and 100% correct classification on the test sample. The normalized importance of the considered characteristics showed that kurtosis accounted for 100%, which means that this variable has a substantial effect …
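The study uses SPSS's multilayer perceptron; the sketch below reproduces the same workflow with scikit-learn on placeholder data. The 54×13 feature matrix, the labels, the hidden-layer size and the train/test split are assumptions for illustration, not the study's values.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Placeholder stand-in for the 54 x 13 matrix of shape/statistical features
# (kurtosis, etc.) and the benign(0)/malignant(1) labels used in the study.
rng = np.random.default_rng(1)
X = rng.normal(size=(54, 13))
y = np.repeat([0, 1], 27)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          stratify=y, random_state=1)

clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                                  random_state=1))
clf.fit(X_tr, y_tr)
print("train accuracy:", clf.score(X_tr, y_tr))
print("test accuracy: ", clf.score(X_te, y_te))
```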
In this paper we study the effect of the number of training samples on artificial neural networks (ANNs), which is essential for the training process of a feed-forward neural network. We also design 5 ANNs and train 41 ANNs, illustrating how well the training samples represent the actual function being approximated.
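The abstract does not name the target function or the 41 trained configurations, so the sketch below only illustrates the general experiment: train the same feed-forward network on increasingly many samples of an assumed function and observe how the approximation error changes with the sample count.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
f = lambda x: np.sin(2 * np.pi * x)                 # assumed target function
x_test = np.linspace(0, 1, 200).reshape(-1, 1)

for n in (5, 10, 20, 40, 80):                       # varying training-set sizes
    x_tr = rng.uniform(0, 1, n).reshape(-1, 1)
    net = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000, random_state=0)
    net.fit(x_tr, f(x_tr).ravel())
    mse = np.mean((net.predict(x_test) - f(x_test).ravel()) ** 2)
    print(f"{n:3d} samples -> test MSE {mse:.4f}")
```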
This research applies quantitative analysis to assess the efficiency of the transport network in Sadr City. The study area suffers from heavy traffic owing to the variability of traffic flow and intensity at peak hours, generated both inside the area and outside it, especially in densely populated neighborhoods with concentrated economic activity.
Deep submicron technologies continue to develop according to Moore's law, allowing hundreds of processing elements and memory modules to be integrated on a single chip and forming multi-/many-processor systems-on-chip (MPSoCs). The network on chip (NoC) arose as an interconnect for this large number of processing modules. However, the aggressive scaling of transistors makes NoCs more vulnerable to both permanent and transient faults. Permanent faults persistently affect circuit functionality from the time of their occurrence. The router represents the heart of the NoC; thus, this research focuses on tolerating permanent faults in the router's input buffer component, particularly the virtual channel state fields. These fields track packets …
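The excerpt stops before describing the actual mechanism, so the sketch below only illustrates what per-virtual-channel state fields typically look like and one generic way a fault in them could be detected and masked (a parity check with remapping to a spare entry). Field names and the protection scheme are assumptions, not the paper's design.

```python
from dataclasses import dataclass

@dataclass
class VCState:
    """Typical per-virtual-channel state in a NoC router input buffer
    (field names are illustrative, not taken from the paper)."""
    global_state: int   # e.g. IDLE / ROUTING / ACTIVE
    out_port: int       # output port chosen by route computation
    out_vc: int         # virtual channel allocated downstream
    credits: int        # free buffer slots at the downstream router

def parity(vc: VCState) -> int:
    # Simple parity over the packed fields, enough to detect a single stuck bit.
    word = (vc.global_state << 12) | (vc.out_port << 8) | (vc.out_vc << 4) | vc.credits
    return bin(word).count("1") & 1

# Generic detect-and-remap idea: if the stored parity no longer matches,
# retire the faulty entry and steer the packet to a spare virtual channel.
vc = VCState(global_state=2, out_port=1, out_vc=3, credits=4)
stored_parity = parity(vc)
vc.credits = 12          # a permanent fault corrupts the credits field
if parity(vc) != stored_parity:
    print("fault detected in VC state; remap packet to a spare VC entry")
```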
The research aims to analyze the impact of exchange rate fluctuations (EXM and EXN) and inflation (INF) on the gross domestic product (GDP) in Iraq over the period 1988-2020. The research is important because it quantifies the macroeconomic effects of these variables, especially on GDP, as well as the effects of exchange rates on economic activity. The results of the econometric analysis using the ARDL model showed a long-run equilibrium relationship, according to the bounds test methodology, running from the explanatory (independent) variables to the dependent variable, while the error-correction coefficient was negative and statistically significant at a level below 1%. The relationship …
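The abstract does not print the estimated equation, so the following is only the generic error-correction form implied by the ARDL bounds-test approach; the lag orders p and q and the regressor vector are placeholders:

```latex
\[
  \Delta \mathrm{GDP}_t \;=\; \alpha
   + \sum_{i=1}^{p} \beta_i\, \Delta \mathrm{GDP}_{t-i}
   + \sum_{j=0}^{q} \gamma_j\, \Delta X_{t-j}
   + \lambda\, \mathrm{ECT}_{t-1} + \varepsilon_t ,
\]
```

where X_t collects the exchange-rate and inflation regressors and a negative, significant λ (as reported, below the 1% level) indicates adjustment back toward the long-run equilibrium.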
Because Ultra Wide Band (UWB) technology has attractive features such as robustness to multipath fading, high data rate, low cost and low power consumption, it is widely used to implement cognitive radio networks. One of the most important tasks required of a cognitive network is spectrum sensing. This paper presents a framework for implementing spectrum sensing for a UWB cognitive network. Since information about the primary licensed users is known to the cognitive radios, the best spectrum-sensing scheme for the UWB cognitive network is matched-filter detection. Simulation results verified and demonstrated the use of matched-filter spectrum sensing in a cognitive radio network with UWB and …
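The abstract does not give the pulse shape, noise level or detection threshold used in the simulations, so the sketch below is only a generic matched-filter detector: correlate the received samples with the known primary-user waveform and compare the statistic with a threshold. All parameters here are placeholders.

```python
import numpy as np

rng = np.random.default_rng(3)
pilot = rng.choice([-1.0, 1.0], size=128)        # known primary-user waveform
noise_std = 1.0

def matched_filter_detect(received, template, threshold):
    """Return True if the primary user is declared present."""
    statistic = np.dot(received, template)       # correlate with the template
    return statistic > threshold

# Threshold chosen for a rough false-alarm target under noise-only samples
# (a placeholder choice, not the paper's design).
threshold = 3 * noise_std * np.sqrt(len(pilot))

present = pilot + rng.normal(0, noise_std, len(pilot))   # H1: signal + noise
absent = rng.normal(0, noise_std, len(pilot))            # H0: noise only
print("H1 decision:", matched_filter_detect(present, pilot, threshold))
print("H0 decision:", matched_filter_detect(absent, pilot, threshold))
```

Because the primary user's waveform is assumed known, the matched filter maximizes the detection SNR, which is why the abstract identifies it as the preferred sensing scheme for this scenario.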