The Stochastic Network Calculus Methodology. Deah J. Kadhim, Saba Q. Jobbar, Wei Liu & Wenqing Cheng. In: Computer and Information Science 2009, Studies in Computational Intelligence, vol. 208. Abstract: The stochastic network calculus is an evolving methodology for backlog and delay analysis of networks that can account for statistical multiplexing gain. This paper advances the stochastic network calculus by deriving a network service curve, which expresses the service given to a flow by the network as a whole in terms of a probabilistic bound. The presented network service curve permits the calculation of statistical end-to-end delay and backlog bounds for broad classes of arrival and service distributions. The benefits of the derived service curve are illustrated for the Exponentially Bounded Burstiness (EBB) traffic model. It is shown that end-to-end performance measures computed with a network service curve are bounded by O(H log H), where H is the number of nodes traversed by a flow, whereas with currently available techniques that compute end-to-end bounds by adding single-node results, the corresponding performance measures are bounded by O(H^3).
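For context, the EBB model referenced in this abstract is conventionally defined by an exponentially decaying tail bound on traffic bursts; the formulation below is the standard textbook definition, with generic symbols that are not taken from this paper:

```latex
% Exponentially Bounded Burstiness (EBB): standard form of the definition.
% A(s,t) is the traffic arriving in [s,t]; \rho is the sustainable rate;
% \Lambda, \alpha > 0 are burstiness parameters (notation assumed here).
\[
  \Pr\bigl\{ A(s,t) \ge \rho\,(t-s) + \sigma \bigr\} \;\le\; \Lambda\, e^{-\alpha \sigma},
  \qquad \text{for all } 0 \le s \le t,\ \sigma \ge 0 .
\]
```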
Microcontrollers have recently been widely used in monitoring and data acquisition. This development has given rise to various architectures for deploying and interfacing microcontrollers in a network environment. Some existing architectures suffer from redundancy in resources, extra processing, high cost, and delay in response. This paper presents a flexible, concise architecture for building a distributed networked microcontroller system. The system consists of only one server, which works through the Internet, and a set of microcontrollers distributed across different sites. Each microcontroller is connected through Ethernet to the Internet. In this system, a client's request for data from a certain site is accomplished through just one server that is in
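As a rough illustration of the single-server relay idea described above, the sketch below forwards a client request to one Ethernet-attached microcontroller over TCP; the site registry, addresses, ports, and request format are all assumptions made for illustration, not details from the paper.

```python
# Minimal sketch (assumed details): the one Internet-facing server looks up the
# requested site and relays the client's request to that microcontroller over TCP.
import socket

MICROCONTROLLERS = {"site-a": ("192.168.1.50", 5000)}   # hypothetical site registry

def fetch_from_site(site_id: str, request: bytes, timeout: float = 2.0) -> bytes:
    """Forward a client request to the microcontroller at `site_id` and return its reply."""
    host, port = MICROCONTROLLERS[site_id]
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(request)
        return sock.recv(1024)

# Example (hypothetical): reply = fetch_from_site("site-a", b"READ temperature\n")
```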
Estimating an individual's age from a photograph of their face is critical in many applications, including intelligence and defense, border security, human-machine interaction, and soft biometric recognition. Recent progress in this discipline centers on deep learning: such solutions require the creation and training of deep neural networks for the sole purpose of resolving this issue. In addition, pre-trained deep neural networks are utilized for facial recognition and fine-tuned for accurate outcomes. The purpose of this study was to offer a method for estimating human age from the frontal view of the face in a manner that is as accurate as possible and takes
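The abstract does not name the pre-trained backbone or the head used for age estimation; the sketch below only illustrates the general fine-tuning pattern it alludes to, with an assumed ResNet50 backbone, input size, and regression head.

```python
# Minimal sketch of fine-tuning a pre-trained CNN for age estimation.
# The backbone (ResNet50), input size, and head are assumptions for illustration;
# the paper's actual architecture is not specified in the abstract.
import tensorflow as tf
from tensorflow.keras import layers, models

base = tf.keras.applications.ResNet50(
    weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False  # freeze pre-trained features, fine-tune only the head first

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(128, activation="relu"),
    layers.Dense(1),  # regression output: predicted age in years
])
model.compile(optimizer="adam", loss="mae", metrics=["mae"])
# model.fit(train_faces, train_ages, validation_data=(val_faces, val_ages), epochs=10)
```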
Artificial Neural Networks (ANNs) are powerful and effective tools in time-series applications. The first aim of this paper is to identify the better and more efficient ANN models (Back Propagation, Radial Basis Function (RBF) neural networks, and recurrent neural networks) for capturing linear and nonlinear time-series behavior. The second aim is to find accurate estimators, since convergence sometimes gets stuck in local minima, one of the problems that can bias tests of the robustness of ANNs in time-series forecasting. To determine the best, or optimal, ANN models, the forecast Skill (SS) is employed to measure the efficiency of the performance of the ANN models. The mean square error and
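The abstract names the forecast Skill (SS) and mean square error as evaluation criteria but does not give their formulas; a common MSE-based skill score, SS = 1 − MSE_model / MSE_reference, is sketched below as an assumed illustration.

```python
# Hypothetical illustration of an MSE-based forecast skill score,
# SS = 1 - MSE_model / MSE_reference (the exact formula used in the paper is not given).
import numpy as np

def mse(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return float(np.mean((y_true - y_pred) ** 2))

def skill_score(y_true, y_model, y_reference):
    """SS > 0 means the ANN forecast beats the reference (e.g. persistence) forecast."""
    return 1.0 - mse(y_true, y_model) / mse(y_true, y_reference)

# Example: compare an ANN forecast against a naive persistence forecast.
actual       = [1.0, 1.2, 0.9, 1.1]
ann_forecast = [1.1, 1.1, 1.0, 1.0]
persistence  = [1.0, 1.0, 1.2, 0.9]
print(skill_score(actual, ann_forecast, persistence))
```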
Fiber-to-the-Home (FTTH) has long been recognized as a technology that provides future-proof bandwidth [1], but it has generally been too expensive to implement on a wide scale. However, reductions in the cost of electro-optic components and improvements in the handling of optical fiber now make FTTH a cost-effective solution in many situations. The transition to FTTH in the access network also benefits both consumers and service providers, because it opens up the near-limitless capacity of the core long-haul network to the local user. In this paper, individual passive optical components, transceivers, and fibers have been put together to form a complete FTTH network. Then the implementation of the under-construction Baghdad/Al
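When passive optical components, transceivers, and fiber are combined into an FTTH link as described above, a basic feasibility check is the optical power budget; the sketch below is a generic back-of-the-envelope calculation with placeholder component losses, not figures from the paper.

```python
# Back-of-the-envelope FTTH power-budget check (illustrative only; the component
# values below are placeholders, not figures from the paper).
def link_margin_db(tx_power_dbm, rx_sensitivity_dbm, fiber_km,
                   fiber_loss_db_per_km=0.35, splitter_loss_db=17.5,
                   connector_loss_db=1.0):
    total_loss = fiber_km * fiber_loss_db_per_km + splitter_loss_db + connector_loss_db
    return (tx_power_dbm - rx_sensitivity_dbm) - total_loss

# e.g. +3 dBm transmitter, -27 dBm receiver sensitivity, 10 km of fiber, 1:32 splitter
print(f"margin = {link_margin_db(3.0, -27.0, 10.0):.1f} dB")
```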
The paper presents an intensive study of neural synchronization in order to address the challenges that prevent its adoption as an alternative key exchange algorithm. The results obtained from the implementation of neural synchronization in the proposed system address two challenges: verifying that synchronization has been established between the two neural networks, and publicly initializing the input vector for each party. Solutions are presented and a mathematical model is developed. Since the proposed system focuses on stream ciphers, a system of LFSRs (linear feedback shift registers) with a balanced memory has been used to generate the key. The initializations of these LFSRs are the neural weights obtained after achieving synchronization.
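The abstract does not describe the LFSR configuration; the sketch below is only a generic Fibonacci LFSR keystream generator, with the register length, tap positions, and weight-derived seed all assumed for illustration.

```python
# Generic Fibonacci LFSR keystream sketch. The tap positions, register length, and
# the way synchronized neural weights seed the register are assumptions; the paper's
# concrete LFSR system is not described in the abstract.
def lfsr_stream(seed_bits, taps, n_bits):
    """Output n_bits from a Fibonacci LFSR; 'taps' are 0-based positions XORed into the feedback bit."""
    state = list(seed_bits)
    out = []
    for _ in range(n_bits):
        out.append(state[-1])              # emit the last register bit
        fb = 0
        for t in taps:
            fb ^= state[t]                 # feedback = XOR of tapped bits
        state = [fb] + state[:-1]          # shift right, insert feedback
    return out

# Example: 8-bit register seeded (hypothetically) from synchronized neural weights.
seed = [1, 0, 1, 1, 0, 0, 1, 0]
print(lfsr_stream(seed, taps=[0, 2, 3, 4], n_bits=16))
```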
The Artificial Neural Network methodology is an important and relatively new approach that builds models for analysis, data evaluation, forecasting, and control without depending on an older model or a classical statistical method to describe the behavior of the statistical phenomenon. The methodology works by simulating the data to reach a robust optimal model that represents the statistical phenomenon, and the model can then be used at any time and in any state. The Box-Jenkins (ARMAX) approach is used for comparison. This paper relies on the received power to build a robust model for forecasting, analyzing, and controlling the said power; the received power comes from
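For reference, the Box-Jenkins ARMAX comparison mentioned above corresponds to an ARMA model with exogenous regressors; the sketch below fits such a model on synthetic data with statsmodels, where the model order, the synthetic series, and the exogenous input are placeholders rather than the paper's received-power data.

```python
# ARMAX-style fit on synthetic data (orders, series, and exogenous input are
# placeholders; the paper's received-power data set is not reproduced here).
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)
n = 200
exog = rng.normal(size=(n, 1))          # stand-in exogenous driver
y = np.zeros(n)
for t in range(1, n):                   # AR(1) dynamics plus an exogenous effect
    y[t] = 0.7 * y[t - 1] + 2.0 * exog[t, 0] + rng.normal(scale=0.5)

model = SARIMAX(y, exog=exog, order=(1, 0, 1))       # ARMA(1,1) + exogenous term = ARMAX
result = model.fit(disp=False)
print(result.params)
forecast = result.forecast(steps=5, exog=exog[-5:])  # illustrative 5-step-ahead forecast
```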
In today's world, most businesses, regardless of size, believe that access to the Internet is imperative if they are going to compete effectively. Yet connecting a private computer (or a network) to the Internet can expose critical or confidential data to malicious attack from anywhere in the world, since unprotected connections to the Internet (or any network topology) leave the user's computer vulnerable to hacker attacks and other Internet threats. Therefore, to provide a high degree of protection to the network and its users, a firewall needs to be used.
A firewall provides a barrier between the user's computer and the Internet (i.e., it prevents unauthorized
In this paper, a Modified Weighted Low Energy Adaptive Clustering Hierarchy (MW-LEACH) protocol is implemented to improve the Quality of Service (QoS) in a Wireless Sensor Network (WSN) with a mobile sink node. Quality of Service is measured in terms of Throughput Ratio (TR), Packet Loss Ratio (PLR), and Energy Consumption (EC). The protocol is implemented in a Python-based simulation. Simulation results showed that the proposed protocol provides better Quality of Service than the Weighted Low Energy Adaptive Clustering Hierarchy (W-LEACH) protocol by 63%.
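The abstract does not spell out how TR, PLR, and EC are computed; the helper functions below show one common way to derive them from raw simulation counters, with the counter names and units assumed for illustration.

```python
# Sketch of the three QoS metrics named in the abstract, computed from raw counters.
# Counter names and units are assumptions made for illustration.
def throughput_ratio(packets_received, packets_sent):
    return packets_received / packets_sent if packets_sent else 0.0

def packet_loss_ratio(packets_received, packets_sent):
    return 1.0 - throughput_ratio(packets_received, packets_sent)

def energy_consumption(initial_energy_j, residual_energy_j):
    """Total energy drained from the sensor nodes, in joules."""
    return initial_energy_j - residual_energy_j

print(throughput_ratio(930, 1000),      # TR
      packet_loss_ratio(930, 1000),     # PLR
      energy_consumption(50.0, 41.2))   # EC in joules
```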
Lung cancer is one of the most serious and prevalent diseases, causing many deaths each year. Though CT scan images are mostly used in the diagnosis of cancer, the assessment of scans is an error-prone and time-consuming task. Machine learning and AI-based models can identify and classify types of lung cancer quite accurately, which helps in the early-stage detection of lung cancer and can increase the survival rate. In this paper, a Convolutional Neural Network is used to classify adenocarcinoma, squamous cell carcinoma, and normal-case CT scan images from the Chest CT Scan Images Dataset, using different combinations of hidden layers and parameters in the CNN models. The proposed model was trained on 1000 CT scan images of cancerous and non-cancerous cases.
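The abstract evaluates different hidden-layer combinations without listing them; the sketch below is just one plausible small CNN for the three-class CT classification task, with the layer sizes and input shape assumed for illustration.

```python
# Minimal CNN sketch for three-class chest-CT classification (adenocarcinoma,
# squamous cell carcinoma, normal). Layer sizes and input shape are illustrative;
# the hidden-layer combinations evaluated in the paper are not reproduced here.
import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(128, 128, 1)),          # grayscale CT slices (assumed size)
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(3, activation="softmax"),      # 3 output classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.summary()
```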