The Stochastic Network Calculus Methodology
Deah J. Kadhim, Saba Q. Jobbar, Wei Liu & Wenqing Cheng
Computer and Information Science 2009; Studies in Computational Intelligence, vol. 208

The stochastic network calculus is an evolving methodology for backlog and delay analysis of networks that can account for statistical multiplexing gain. This paper advances the stochastic network calculus by deriving a network service curve, which expresses the service given to a flow by the network as a whole in terms of a probabilistic bound. The presented network service curve permits the calculation of statistical end-to-end delay and backlog bounds for broad classes of arrival and service distributions. The benefits of the derived service curve are illustrated for the Exponentially Bounded Burstiness (EBB) traffic model. It is shown that end-to-end performance measures computed with a network service curve are bounded by O(H log H), where H is the number of nodes traversed by a flow. Using currently available techniques that compute end-to-end bounds by adding single-node results, the corresponding performance measures are bounded by O(H³).
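For orientation, a minimal sketch of the standard network-calculus construction underlying this result, assuming each node h offers a service curve S_h to the flow (this is the deterministic skeleton; the paper's contribution is the probabilistic version with error terms):

```latex
% Min-plus convolution of two service curves:
\[
(f \otimes g)(t) \;=\; \inf_{0 \le s \le t} \bigl\{ f(s) + g(t - s) \bigr\},
\]
% so the network as a whole offers the concatenated service curve
\[
S_{\mathrm{net}} \;=\; S_1 \otimes S_2 \otimes \cdots \otimes S_H,
\]
% and an end-to-end delay bound follows as the horizontal deviation
% between an arrival envelope $A$ and $S_{\mathrm{net}}$.
```

Working with S_net directly, rather than summing H single-node bounds, is what drives the improvement from O(H³) to O(H log H) scaling.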
Autism Spectrum Disorder (ASD) is a neurodevelopmental disorder that impairs speech, social interaction, and behavior. Machine learning is a field of artificial intelligence focused on creating algorithms that learn patterns from input data and use them to classify ASD. The results of using machine learning algorithms to categorize ASD have been inconsistent, and more research is needed to improve classification accuracy. To address this, deep learning, such as a 1D CNN, has been proposed as an alternative for ASD detection. The proposed techniques are evaluated on three publicly available ASD datasets (children, adolescents, and adults). Results strongly suggest that 1D CNN …
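As a concrete illustration of the kind of model being described, here is a minimal 1D-CNN sketch for tabular screening data; the feature count, placeholder data, and layer sizes are assumptions for illustration, not the paper's architecture:

```python
# Minimal 1D-CNN sketch for ASD screening vectors (hypothetical setup).
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

N_FEATURES = 20  # assumption: AQ-10-style screening feature vectors

model = tf.keras.Sequential([
    layers.Input(shape=(N_FEATURES, 1)),    # treat each record as a 1-D signal
    layers.Conv1D(32, kernel_size=3, activation="relu"),
    layers.MaxPooling1D(pool_size=2),
    layers.Conv1D(64, kernel_size=3, activation="relu"),
    layers.GlobalAveragePooling1D(),
    layers.Dense(64, activation="relu"),
    layers.Dropout(0.3),
    layers.Dense(1, activation="sigmoid"),  # binary ASD / non-ASD output
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])

# Usage sketch with random placeholder data standing in for a real dataset:
X = np.random.rand(100, N_FEATURES, 1).astype("float32")
y = np.random.randint(0, 2, size=(100, 1))
model.fit(X, y, epochs=2, batch_size=16, verbose=0)
```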
This research aims to overcome the curse of dimensionality using a non-linear regression method that reduces the root mean square error (RMSE): projection pursuit regression (PPR), one of the dimension-reduction methods developed for this problem. PPR is a statistical technique for finding the most informative projections in multi-dimensional data; with each projection found, the data are reduced by linear combinations along that projection. The process is repeated, producing good projections until the best ones are obtained. The main idea of PPR is to model …
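For reference, a minimal statement of the PPR model being described, in its standard Friedman–Stuetzle form (generic symbols, not necessarily the paper's notation):

```latex
\[
\hat{y}(x) \;=\; \bar{y} \;+\; \sum_{m=1}^{M} g_m\!\left(\alpha_m^{\top} x\right),
\]
% where each $\alpha_m$ is a unit-length projection direction and each
% $g_m$ a smooth ridge function estimated from the projected data;
% direction--ridge pairs are added greedily until the RMSE stops improving.
```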
The economy is heavily reliant on agricultural productivity, so plant disease detection is a vital task in agriculture, offering a promising advance toward increased agricultural production. In this work, a framework for potato disease classification based on a feed-forward neural network is proposed. The objective is to present a system that can detect and classify four kinds of potato tuber diseases from their images: black dot, common scab, potato virus Y, and early blight. The presented PDCNN framework comprises three levels: the first level is pre-processing, based on the K-means clustering algorithm to detect the infected area in the potato image. The second …
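A minimal sketch of K-means colour segmentation of the kind used in such a pre-processing level; the file name, cluster count, and darkest-cluster heuristic are assumptions, not the paper's exact pipeline:

```python
# K-means pixel clustering to isolate a suspected lesion region.
import numpy as np
from PIL import Image
from sklearn.cluster import KMeans

img = np.asarray(Image.open("potato.jpg").convert("RGB"), dtype=np.float64)
h, w, _ = img.shape
pixels = img.reshape(-1, 3)                 # one row per pixel (R, G, B)

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(pixels)
labels = kmeans.labels_.reshape(h, w)

# Heuristic: treat the darkest cluster centre as the lesion candidate.
lesion_cluster = np.argmin(kmeans.cluster_centers_.sum(axis=1))
mask = (labels == lesion_cluster)           # boolean lesion mask
print("suspected infected area: %.1f%% of pixels" % (100 * mask.mean()))
```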
TM UniFi is one of the high-speed fibre internet services available in Malaysia, and it became a familiar medium for the Small Office Home Office (SOHO) arrangement during the COVID-19 pandemic. Most communication vendors offer a variety of network services to meet customers' needs and satisfaction during the pandemic, and Quality of Service is a frequent user concern as the number of users increases over time. It is therefore important to know how network performance varies with the number of devices connected to the TM UniFi network. The main objective of this research is to analyze TM UniFi performance under the impact of multiple device connections and users' services. The study was conducted …
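One simple way such a study can collect latency samples while the device count is varied is sketched below; the target host, probe count, and use of a Unix-like `ping` are assumptions, not the paper's tooling:

```python
# Sample round-trip latency with ping and summarize it.
import re
import statistics
import subprocess

def sample_rtt_ms(host="8.8.8.8", count=10):
    out = subprocess.run(["ping", "-c", str(count), host],
                         capture_output=True, text=True).stdout
    # Lines look like: "64 bytes from ...: icmp_seq=1 ttl=117 time=9.31 ms"
    return [float(m) for m in re.findall(r"time=([\d.]+) ms", out)]

rtts = sample_rtt_ms()
if rtts:
    print(f"mean {statistics.mean(rtts):.1f} ms, "
          f"max {max(rtts):.1f} ms over {len(rtts)} probes")
```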
Electrical distribution system loads are never fixed; they vary in magnitude and nature over time. Accurate consumer load data and models are therefore required for system planning, operation, and analysis studies. Moreover, realistic consumer load data are vital for load management, service, and billing purposes. In this work, a realistic aggregate electric load model is developed and proposed for an operative sample substation in the Baghdad distribution network. The model involves the aggregation of hundreds of thousands of individual component devices such as motors, appliances, and lighting fixtures. The Sana'a substation in the Al-Kadhimiya area supplies mainly residential-grade loads. Measurement-based …
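The bottom-up aggregation step can be pictured with a minimal sketch: sum per-unit demands over component classes. The component mix, unit counts, and demand figures below are hypothetical placeholders, not Sana'a substation data:

```python
# Bottom-up aggregation of component loads into a substation demand.
components = {
    # class: (number of units, per-unit demand in kW at this hour)
    "induction motors": (1_200, 1.5),
    "air conditioners": (8_000, 1.2),
    "lighting":         (50_000, 0.04),
    "other appliances": (20_000, 0.3),
}

aggregate_kw = sum(n * p for n, p in components.values())
print(f"aggregate demand: {aggregate_kw / 1000:.1f} MW")
for name, (n, p) in components.items():
    share = 100 * n * p / aggregate_kw
    print(f"  {name:18s} {share:5.1f} %")
```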
As Internet of Things (IoT) applications and devices increase, their access capacity is frequently stressed, which can create a significant bottleneck for network performance at different layers of an end-point-to-end-point (P2P) communication route. Appropriate characterization (i.e., classification) of time-varying traffic has been used to address this issue, yet it remains a largely open challenge: most existing solutions depend on machine learning (ML) methods that carry a high computational cost and do not take into account the fine-grained flow classification that IoT devices need. Therefore, this paper presents a new model based …
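To make the ML baseline concrete, here is a minimal flow-classification sketch; the feature layout, class labels, and random placeholder data are assumptions for illustration, not the paper's model or dataset:

```python
# Random-forest classification of flows from simple per-flow features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Hypothetical features: mean/std packet size, mean inter-arrival, duration
X = rng.random((1000, 4))
y = rng.integers(0, 3, size=1000)   # e.g. 0=sensor, 1=camera, 2=bulk

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```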
Cyber-attacks keep growing, so stronger ways to protect images are needed. This paper presents DGEN, a Dynamic Generative Encryption Network that combines Generative Adversarial Networks with a context-adaptive key system. Unlike a fixed cipher such as AES, the method can potentially adjust itself when new threats appear, and it aims to resist brute-force, statistical, and quantum attacks. The design adds randomness, uses learning, and derives keys that depend on each image, targeting strong security, flexibility, and low computational cost. Tests were run on several public image sets, and the results show DGEN outperforming AES, chaos-based schemes, and other GAN approaches. Entropy reached 7.99 bits per pix…
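The "keys that depend on each image" idea can be illustrated with a minimal key-derivation sketch; this hashes image bytes with fresh randomness and a context tag, and is only an illustration of per-image keying, not the authors' DGEN scheme:

```python
# Image-dependent key material via a salted hash (illustrative only).
import hashlib
import os

def derive_key(image_bytes: bytes, context: bytes = b"session-1") -> bytes:
    nonce = os.urandom(16)                 # fresh randomness per encryption
    digest = hashlib.sha256(nonce + context + image_bytes).digest()
    return nonce + digest                  # nonce must travel with ciphertext

with open("lena.png", "rb") as f:          # hypothetical input image
    key_material = derive_key(f.read())
print(key_material.hex()[:32], "...")
```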