Video streaming is now the most popular medium on the internet, and it consumes a large share of traffic: nearly 70% of internet usage goes to video streaming. Interactive media suffers from constraints such as increased bandwidth usage and latency. The need for real-time delivery of live video streams has led to the adoption of fog computing, an intermediary layer between the cloud and the end user. This technology alleviates those problems by providing fast real-time response and computational resources close to the client at the network edge. This paper proposes a priority weighted round robin (PWRR) algorithm for scheduling streaming operations in the fog architecture. PWRR gives preemptive priority to live-video streaming requests so that they are delivered with very short response times and real-time communication. Experiments with PWRR in the proposed architecture show reduced latency and good quality for live-video requests under bandwidth changes, while all other client requests are still met.
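A minimal sketch of the PWRR round structure, assuming one preemptive live class and two ordinary traffic classes ('vod' and 'bulk'); the class names, weights, and queue model are illustrative, not taken from the paper:

```python
from collections import deque

class PWRRScheduler:
    """Priority weighted round robin sketch: live-video requests are drained
    preemptively; the remaining classes share service in proportion to their
    weights (classic weighted round robin)."""

    def __init__(self, weights):
        self.weights = weights                       # {class_name: weight}
        self.queues = {name: deque() for name in weights}

    def submit(self, cls, request):
        self.queues[cls].append(request)

    def next_batch(self):
        # One scheduling round: every pending live request first, then up to
        # `weight` requests from each other class.
        batch = list(self.queues['live'])
        self.queues['live'].clear()
        for cls, w in self.weights.items():
            if cls == 'live':
                continue
            for _ in range(min(w, len(self.queues[cls]))):
                batch.append(self.queues[cls].popleft())
        return batch

# Hypothetical usage: the live frame jumps ahead of earlier requests.
sched = PWRRScheduler({'live': 0, 'vod': 3, 'bulk': 1})  # live weight unused
sched.submit('vod', 'chunk-17')
sched.submit('bulk', 'backup-3')
sched.submit('live', 'frame-901')
print(sched.next_batch())   # ['frame-901', 'chunk-17', 'backup-3']
```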
A multidimensional systolic-array realization of the LMS algorithm is designed by mapping a regular algorithm onto a processor array. It is based on an appropriately selected 1-D systolic array filter that relies on an inner-product-sum systolic implementation. Various arrays may be derived that exhibit a regular arrangement of the cells (processors) and a local interconnection pattern, both of which are important for VLSI implementation. The design reduces latency and increases throughput compared with classical 1-D systolic arrays. The 3-D multilayered array consists of 2-D layers that are connected to each other only by edges. Such arrays for an LMS-based adaptive FIR filter may be opposed to the fundamental requirements of fa…
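For orientation, a sequential reference of the LMS adaptive FIR computation that each systolic cell pipelines, namely the inner-product sum followed by the coefficient update; the tap count, step size, and test system are illustrative assumptions:

```python
import numpy as np

def lms_fir(x, d, taps=4, mu=0.05):
    # LMS adaptive FIR filter: y[n] = w . x[n], e[n] = d[n] - y[n],
    # w <- w + mu * e[n] * x[n]. The inner product is the systolic core.
    w = np.zeros(taps)
    e = np.zeros(len(x))
    for n in range(taps - 1, len(x)):
        xn = x[n - taps + 1:n + 1][::-1]   # tap-delay line [x[n], ..., x[n-3]]
        e[n] = d[n] - w @ xn               # error against the desired signal
        w = w + mu * e[n] * xn             # coefficient update
    return w, e

# Hypothetical usage: identify an unknown 4-tap FIR system.
rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
h = np.array([0.5, -0.3, 0.2, 0.1])        # "unknown" system to identify
d = np.convolve(x, h)[:len(x)]             # desired signal
w, _ = lms_fir(x, d)
print(np.round(w, 2))                      # approaches h after convergence
```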
Nowadays, the power industry is changing from a centralized and vertically integrated form into regional, competitive, and functionally separate units. This is done with the future aims of increasing efficiency through better management and better use of existing equipment, and of lowering the price of electricity for all types of customers while retaining a reliable system. This research aims to solve the optimal power flow (OPF) problem. The OPF is used to minimize the total generation fuel-cost function. Optimal power flow may be a single-objective or a multi-objective function. In this thesis, an attempt is made to minimize the objective function while keeping the voltage magnitudes of all load buses, the real output…
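A hedged sketch of the economic-dispatch core of this objective: minimising a quadratic total fuel cost subject to a demand-balance constraint. The coefficients, generator limits, and 400 MW demand are invented illustrative values; a full OPF would add the power-flow equations and the load-bus voltage limits mentioned above:

```python
import numpy as np
from scipy.optimize import minimize

a = np.array([100.0, 120.0, 90.0])     # fixed cost, $/h
b = np.array([10.0, 9.0, 11.0])        # linear cost, $/MWh
c = np.array([0.010, 0.012, 0.008])    # quadratic cost, $/MW^2 h
demand = 400.0                         # total load, MW
bounds = [(50.0, 200.0)] * 3           # generator output limits, MW

def total_cost(P):
    # Total fuel cost: sum_i a_i + b_i * P_i + c_i * P_i^2
    return np.sum(a + b * P + c * P ** 2)

res = minimize(total_cost, x0=np.full(3, demand / 3), bounds=bounds,
               constraints={'type': 'eq', 'fun': lambda P: P.sum() - demand})
print(np.round(res.x, 1), round(res.fun, 1))   # dispatch (MW) and cost ($/h)
```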
Task scheduling is an important element in a distributed system: it is vital that jobs are correctly assigned to each computer's processor to improve performance. Existing approaches attempt to reduce the expense of optimizing CPU use, but they mostly lack planning and need to be more comprehensive. To address this shortcoming, a hybrid optimization scheduling technique is proposed that combines First-Come First-Served (FCFS) and Shortest Job First (SJF). In addition, we propose to apply the Simulated Annealing (SA) algorithm as an optimization technique to find the optimal job-execution sequence, considering both a job's entrance time and its execution time and balancing them to reduce the job…
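A sketch of the hybrid idea under stated assumptions: simulated annealing is seeded with the better of the FCFS and SJF orders, then swaps jobs to minimise total flow time, an illustrative objective standing in for the paper's exact balance of entrance and execution times:

```python
import math
import random

def flow_time(seq, jobs):
    # Total flow time (completion - arrival) of a job sequence.
    t, total = 0, 0
    for j in seq:
        arrival, burst = jobs[j]
        t = max(t, arrival) + burst
        total += t - arrival
    return total

def anneal(jobs, start, temp=100.0, cooling=0.995, steps=5000):
    cur, cost = list(start), flow_time(start, jobs)
    best, best_cost = cur[:], cost
    for _ in range(steps):
        i, k = random.sample(range(len(cur)), 2)
        cand = cur[:]
        cand[i], cand[k] = cand[k], cand[i]          # swap two jobs
        c = flow_time(cand, jobs)
        if c < cost or random.random() < math.exp((cost - c) / temp):
            cur, cost = cand, c                      # accept (possibly worse) move
            if c < best_cost:
                best, best_cost = cand[:], c
        temp *= cooling
    return best, best_cost

jobs = [(0, 8), (1, 4), (2, 9), (3, 5)]              # (entrance, execution) times
fcfs = sorted(range(len(jobs)), key=lambda j: jobs[j][0])
sjf = sorted(range(len(jobs)), key=lambda j: jobs[j][1])
start = min((fcfs, sjf), key=lambda s: flow_time(s, jobs))
print(anneal(jobs, start))
```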
Color image compression is a good way to encode digital images by decreasing the number of bits needed to represent the image. The main objectives are to reduce storage space, reduce transmission costs, and maintain good quality. In the current research work, a simple and effective methodology is proposed for compressing color art digital images and obtaining a low bit rate: the matrix resulting from the scalar quantization process (reducing the number of bits from 24 to 8) is compressed using displacement coding, and the remainder is then compressed using the Lempel-Ziv-Welch (LZW) algorithm. The proposed methodology maintains the quality of the reconstructed image. Macroscopic and…
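A minimal sketch of the quantisation-plus-LZW stages, assuming a 3-3-2 bit split for the 24-to-8-bit scalar quantization; the bit allocation is an assumption rather than the paper's scheme, and the intermediate displacement-coding step is omitted:

```python
def quantize_rgb(pixel):
    # Scalar-quantize a 24-bit (R, G, B) pixel to one 8-bit code:
    # 3 bits for R, 3 for G, 2 for B (an illustrative allocation).
    r, g, b = pixel
    return ((r >> 5) << 5) | ((g >> 5) << 2) | (b >> 6)

def lzw_compress(data):
    # Plain Lempel-Ziv-Welch: grow a dictionary of byte strings, emit codes.
    table = {bytes([i]): i for i in range(256)}
    w, out = b'', []
    for byte in data:
        wc = w + bytes([byte])
        if wc in table:
            w = wc                      # extend the current match
        else:
            out.append(table[w])        # emit code for the longest match
            table[wc] = len(table)      # add the new string to the dictionary
            w = bytes([byte])
    if w:
        out.append(table[w])
    return out

pixels = [(200, 30, 90), (200, 30, 90), (200, 30, 90), (10, 200, 255)]
codes = bytes(quantize_rgb(p) for p in pixels)   # 24 bits -> 8 bits per pixel
print(list(codes), lzw_compress(codes))
```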
In recent years, encryption technology has developed rapidly and many image encryption methods have been put forward. The chaos-based image encryption technique is a modern encryption system for images. To encrypt images, it uses chaotic random sequences, which are an efficient way to address the intractable problem of simple yet highly protected image encryption. There are, however, some shortcomings in chaos-based image encryption, such as the limited-accuracy issue. The approach in this paper, built on a chaotic system, constructs a dynamic IP permutation and S-box substitution through the following steps. First, a new IP table is used for more diffusion of al…
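A hedged sketch of one common way to build a key-dependent S-box from a chaotic sequence, here the logistic map; the map, its parameters, and the sorting construction are illustrative assumptions, not necessarily the paper's exact procedure:

```python
def logistic_stream(x0, r=3.99, n=256, skip=100):
    # Iterate the logistic map x <- r*x*(1-x); discard the transient first.
    x = x0
    for _ in range(skip):
        x = r * x * (1 - x)
    out = []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(x)
    return out

def chaotic_sbox(x0):
    # Rank the positions 0..255 by the chaotic values to obtain a
    # key-dependent permutation usable as a substitution table.
    stream = logistic_stream(x0)
    return sorted(range(256), key=lambda i: stream[i])

sbox = chaotic_sbox(0.3141)              # the key is the initial condition
pixels = [12, 255, 12, 7]
print([sbox[p] for p in pixels])         # substituted (encrypted) pixel values
```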
Many accurate inertial-guided missile systems need more complex mathematical calculations and require high-speed processing to ensure real-time operation. This gives rise to the need to develop an efficient…
This paper proposes a new method for functional non-parametric regression data analysis with conditional expectation in the case where the covariates are functional, and Principal Component Analysis is utilized to de-correlate the multivariate response variables. It utilizes the Nadaraya-Watson (K-Nearest Neighbour, KNN) estimator for prediction, with different types of semi-metrics (based on the second derivative and on Functional Principal Component Analysis (FPCA)) for measuring the closeness between curves. Root Mean Square Error is used for the evaluation of this model, which is then compared to the independent-response method. The R program is used for analysing the data. Then, when the cov…
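A minimal sketch of the functional KNN Nadaraya-Watson estimator, using the plain L2 distance between discretised curves as a stand-in for the second-derivative and FPCA-based semi-metrics named above; the kernel choice and toy data are illustrative:

```python
import numpy as np

def knn_nw_predict(X_curves, Y, x_new, k=5):
    # Nadaraya-Watson with a KNN bandwidth: h is the distance to the k-th
    # nearest curve; the estimate is sum_i K(d_i/h) Y_i / sum_i K(d_i/h).
    d = np.sqrt(((X_curves - x_new) ** 2).sum(axis=1))   # semi-metric d(X_i, x)
    h = np.sort(d)[k - 1]                                # KNN bandwidth
    w = np.maximum(1 - (d / (h + 1e-12)) ** 2, 0.0)      # quadratic kernel
    return (w @ Y) / w.sum()

# Hypothetical data: 40 sine curves with random frequencies, scalar response.
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 50)
X = np.sin(2 * np.pi * np.outer(rng.uniform(0.5, 2.0, 40), t))
Y = X.max(axis=1)
print(knn_nw_predict(X, Y, np.sin(2 * np.pi * 1.2 * t)))
```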
Support vector machines (SVMs) are supervised learning models that analyze data for classification or regression. For classification, the SVM is widely used: it selects an optimal hyperplane that separates two classes. SVM has very good accuracy and is extremely robust compared with some other classification methods such as logistic regression, random forest, k-nearest neighbour, and the naïve model. However, working with large datasets can cause problems such as long training times and inefficient results. In this paper, the SVM is modified by using a stochastic gradient descent process. The modified method, stochastic gradient descent SVM (SGD-SVM), is checked using two simulated datasets. Since the classification of different ca…
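A hedged sketch of a linear SVM trained by stochastic gradient descent on the regularised hinge loss, using a Pegasos-style step size; this is one common SGD-SVM variant, not necessarily the paper's exact formulation, and the toy dataset is invented:

```python
import numpy as np

def sgd_svm(X, y, lam=0.01, epochs=20, seed=0):
    # Minimise lam/2 * ||w||^2 + mean(max(0, 1 - y_i (w.x_i + b)))
    # one sample at a time, with step size eta_t = 1 / (lam * t).
    rng = np.random.default_rng(seed)
    w, b, t = np.zeros(X.shape[1]), 0.0, 0
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            t += 1
            eta = 1.0 / (lam * t)
            margin = y[i] * (X[i] @ w + b)
            w *= 1 - eta * lam                  # shrink from the L2 penalty
            if margin < 1:                      # inside the margin: hinge active
                w += eta * y[i] * X[i]
                b += eta * y[i]
    return w, b

# Hypothetical usage on two Gaussian blobs with labels in {-1, +1}.
rng = np.random.default_rng(42)
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.array([-1] * 50 + [1] * 50)
w, b = sgd_svm(X, y)
print(np.mean(np.sign(X @ w + b) == y))         # training accuracy
```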
Proposing nonlinear models is one of the most important methods in time-series analysis, with wide potential for predicting various phenomena, including physical, engineering, and economic ones, by studying the characteristics of random disturbances in order to arrive at accurate predictions.
Here, a threshold autoregressive model with an exogenous variable is built as the first method, using two proposed approaches to determine the best cutting point: forward predictability (forecasting) and within-series predictability (prediction), via the threshold-point indicator. Box-Jenkins seasonal models are used as a second method, based on the principle of the same two proposed approaches in determining…
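A minimal sketch of a two-regime threshold ARX(1) model whose cutting point is chosen by in-sample RMSE, a simple stand-in for the two proposed selection approaches; the simulated series and the grid of candidate thresholds are illustrative:

```python
import numpy as np

def fit_tarx(y, x, th):
    # Two-regime threshold ARX(1), least squares per regime:
    # y_t = a + b*y_{t-1} + c*x_t, regime set by y_{t-1} <= th or > th.
    Y, lag, ex = y[1:], y[:-1], x[1:]
    coefs = {}
    for name, mask in (('low', lag <= th), ('high', lag > th)):
        A = np.column_stack([np.ones(mask.sum()), lag[mask], ex[mask]])
        coefs[name] = np.linalg.lstsq(A, Y[mask], rcond=None)[0]
    return coefs

def best_threshold(y, x, candidates):
    # Pick the cutting point with the smallest in-sample RMSE.
    Y, lag, ex = y[1:], y[:-1], x[1:]
    def rmse(th):
        m = fit_tarx(y, x, th)
        low = m['low'][0] + m['low'][1] * lag + m['low'][2] * ex
        high = m['high'][0] + m['high'][1] * lag + m['high'][2] * ex
        pred = np.where(lag <= th, low, high)
        return np.sqrt(np.mean((Y - pred) ** 2))
    return min(candidates, key=rmse)

rng = np.random.default_rng(7)
x = rng.standard_normal(300)
y = np.zeros(300)
for t in range(1, 300):                       # simulate: true threshold at 0
    slope = 0.8 if y[t - 1] <= 0 else -0.5
    y[t] = slope * y[t - 1] + 0.3 * x[t] + rng.normal(0, 0.1)

cands = np.quantile(y[:-1], np.linspace(0.15, 0.85, 15))  # both regimes stay populated
print(best_threshold(y, x, cands))            # should land near 0
```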
The research discusses the need to find innovative structures and methodologies for developing Human Capital (HC) in Iraqi universities. One of the most important of these structures is Communities of Practice (CoPs), which contribute to developing HC through learning, teaching, and training by speeding the conversion of knowledge and creativity into practice. This research uses the comparative approach, employing the Data Envelopment Analysis (DEA) methodology (via Excel 2010 Solver) as field evidence to prove the role of CoPs in developing HC. In light of this, the researcher relied on archived preliminary data on (23) colleges at Mosul University as a purposive sample for t…
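A hedged sketch of the DEA efficiency score in the CCR multiplier form, computed with scipy's linear-programming routine as a stand-in for the Excel 2010 Solver workflow used in the research; the college inputs and outputs below are invented for illustration:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y, o):
    # CCR (constant returns) efficiency of unit o, multiplier form:
    # maximise u.y_o  subject to  v.x_o = 1  and  u.y_j - v.x_j <= 0 for all j.
    n, m = X.shape                                    # units x inputs
    s = Y.shape[1]                                    # outputs
    c = np.concatenate([-Y[o], np.zeros(m)])          # linprog minimises -u.y_o
    A_ub = np.hstack([Y, -X])                         # u.y_j - v.x_j <= 0
    A_eq = np.concatenate([np.zeros(s), X[o]])[None]  # v.x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n), A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m))
    return -res.fun                                   # efficiency in (0, 1]

# Hypothetical data: 4 colleges, inputs (staff, budget), output (graduates).
X = np.array([[20, 300], [30, 200], [25, 250], [40, 400]], dtype=float)
Y = np.array([[60], [60], [70], [80]], dtype=float)
print([round(dea_ccr_efficiency(X, Y, o), 3) for o in range(4)])
```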