The Internet of Things (IoT) contributes to improving the quality of life, as it supports many applications, especially healthcare systems. Data generated by IoT devices is sent to Cloud Computing (CC) for processing and storage, despite the latency caused by the distance. Because of the rapid growth in IoT devices, the volume of data sent to the CC has been increasing; as a result, congestion on the cloud network has been added to the latency problem. Fog Computing (FC) was used to solve these problems because of its proximity to IoT devices, while filtering the data sent on to the CC. FC is a middle layer located between the IoT devices and the CC layer. To handle the massive data generated by IoT devices on the FC, the Dynamic Weighted Round Robin (DWRR) algorithm was used: a load balancing (LB) algorithm that schedules and distributes data among fog servers by reading the CPU and memory values of these servers in order to improve system performance. The results proved that the DWRR algorithm provides high throughput, reaching 3290 req/sec at 919 users. Much research is concerned with distributing the workload using LB techniques without paying much attention to Fault Tolerance (FT), which means that the system continues to operate even when a fault occurs. Therefore, we proposed a replication FT technique for FC called primary-backup replication, based on a dynamic checkpoint interval. Checkpointing is used to replicate new data from a primary server to a backup server dynamically by monitoring the CPU values of the primary fog server, so that a checkpoint occurs only when the CPU value is larger than 0.2, in order to reduce overhead. The results showed that the execution time of the data filtering process on the FC with a dynamic checkpoint is less than the time spent with a static checkpoint that is independent of the CPU status.
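The dynamic-checkpoint rule described above can be sketched in a few lines. Everything here (the in-memory FogServer class, the utilisation argument, the 0.2 threshold applied to a normalised CPU value) is a simplification assumed for illustration, not the paper's actual implementation:

```python
# Sketch: primary-backup replication with a dynamic checkpoint.
# State changes on the primary are copied to the backup only when the
# primary's CPU utilisation exceeds the threshold (0.2, per the abstract),
# so that checkpoint overhead is avoided when the server is lightly loaded.

CPU_THRESHOLD = 0.2  # checkpoint only when CPU utilisation > 0.2

class FogServer:
    """Minimal stand-in for a fog server holding filtered IoT data."""
    def __init__(self):
        self.data = {}

def dynamic_checkpoint(primary, backup, cpu_utilisation):
    """Replicate primary state to the backup only under sufficient load."""
    if cpu_utilisation > CPU_THRESHOLD:
        backup.data = dict(primary.data)   # take a snapshot of the state
        return True                        # checkpoint taken
    return False                           # checkpoint skipped

primary, backup = FogServer(), FogServer()
primary.data["sensor-1"] = 37.5
dynamic_checkpoint(primary, backup, cpu_utilisation=0.15)  # skipped
dynamic_checkpoint(primary, backup, cpu_utilisation=0.45)  # taken
```

A static checkpoint would call the replication step on a fixed timer regardless of `cpu_utilisation`; the guard above is the only difference the dynamic scheme introduces.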
Delays occur commonly in construction projects, and assessing the impact of a delay is sometimes a contentious issue. Several delay analysis methods are available, but no one method can be used universally over another in all situations. The selection of the proper analysis method depends upon a variety of factors, including the information available, the time of analysis, the capabilities of the methodology, and the time, funds, and effort allocated to the analysis. This paper presents a computerized schedule analysis program that uses the daily windows analysis method, as it is recognized as one of the most credible methods and is one of the few techniques much more likely to be accepted by courts than any other method. A simple case study has been implemented.
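The core bookkeeping of a daily windows analysis can be illustrated with a toy sketch: the schedule is re-examined one day (window) at a time, and each day's slippage of the projected completion date is attributed to the party responsible for that day's events. The data layout and names below are hypothetical, not the paper's program:

```python
# Hypothetical daily-windows sketch: attribute each day's slip in the
# projected completion date to the party responsible for that day's events.

def daily_windows(daily_completion_dates, responsibilities):
    """daily_completion_dates[d]: projected completion (in project days)
    after re-analysing the schedule at the end of day d; index 0 is the
    baseline. responsibilities[d]: party responsible for day d+1's events."""
    attribution = {}
    for d in range(1, len(daily_completion_dates)):
        slip = daily_completion_dates[d] - daily_completion_dates[d - 1]
        if slip:
            party = responsibilities[d - 1]
            attribution[party] = attribution.get(party, 0) + slip
    return attribution

# Baseline finish day 100; day 1 slips it to 101 (owner event),
# day 3 slips it to 103 (contractor event).
result = daily_windows([100, 101, 101, 103],
                       ["owner", "contractor", "contractor"])
```

A real analysis would recompute the critical path inside each window rather than read a precomputed completion date, but the day-by-day attribution loop is the part that distinguishes the method.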
In this paper, an ARIMA model was used for estimating missing data (air temperature, relative humidity, wind speed) for mean monthly variables in different time series at three stations (Sinjar, Baghdad, AL.Hai), which represent different parts of Iraq from north to south, respectively.
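As a minimal illustration of ARIMA-style imputation, the sketch below fits an AR(1) model (a special case of ARIMA(1,0,0)) to the observed part of a series by least squares and fills each gap from the previous value. This is a simplified stand-in, assuming a short univariate series, not the fitting procedure used in the paper:

```python
import numpy as np

def ar1_impute(series):
    """Fill NaN gaps using a fitted AR(1) model, x_t = c + phi * x_{t-1}
    (a special case of ARIMA(1,0,0)). Assumes the first value is observed."""
    x = np.asarray(series, dtype=float)
    obs = ~np.isnan(x)
    pair = obs[:-1] & obs[1:]          # consecutive observed pairs
    phi, c = np.polyfit(x[:-1][pair], x[1:][pair], 1)
    for t in range(1, len(x)):
        if np.isnan(x[t]):
            x[t] = c + phi * x[t - 1]  # one-step-ahead forecast fills the gap
    return x

# Monthly mean temperatures with one missing value (hypothetical data).
temps = [20.0, 21.0, np.nan, 23.0, 24.0]
filled = ar1_impute(temps)
```

In practice a library fit (e.g. a full ARIMA(p,d,q) with model-order selection) would replace the two-coefficient least-squares step.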
Concrete structures are exposed to aggressive environmental conditions that lead to corrosion of the embedded reinforcement and pre-stressing steel. Consequently, the safety of concrete structures may be compromised, and significant budgets are required to repair and maintain critical infrastructure. Prediction of structural safety can lead to significant reductions in maintenance costs by maximizing the impact of investments. The aim of this paper is to establish a framework to assess the reliability of existing post-tensioned concrete bridges. A time-dependent reliability analysis of an existing post-tensioned structure, the Ynys-y-Gwas bridge, is presented in this study. The main cause of failure of this bridge was corrosion.
The increase in cloud computing services and the large-scale construction of data centers has led to excessive power consumption. Data centers contain a large number of servers, where the major power consumption takes place. An efficient virtual machine placement algorithm is essential to minimize energy consumption and improve resource utilization by reducing the number of operating servers. In this paper, an enhanced discrete particle swarm optimization (EDPSO) is proposed. The enhancement of the discrete PSO algorithm is achieved by modifying the velocity update equation to bound the resultant particles and ensure feasibility. Furthermore, EDPSO is assisted by two heuristic algorithms, including random first fit (RFF).
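The bounding idea in a discrete PSO for VM placement can be sketched as follows. A particle's position assigns each VM a server index; after the standard velocity update, the new position is rounded and clamped to the valid server range. The rounding-and-clamping step is a generic stand-in for the paper's modified update equation, and all parameter values here are assumptions:

```python
import random

def update_particle(x, v, pbest, gbest, n_servers,
                    w=0.7, c1=1.5, c2=1.5):
    """One discrete-PSO step for a VM-placement particle.
    x[i] is the server index hosting VM i. The velocity update is the
    standard PSO rule; positions are then rounded and clamped to the
    feasible server range so every particle stays a valid placement."""
    new_x, new_v = [], []
    for i in range(len(x)):
        r1, r2 = random.random(), random.random()
        vi = (w * v[i]
              + c1 * r1 * (pbest[i] - x[i])   # pull toward personal best
              + c2 * r2 * (gbest[i] - x[i]))  # pull toward global best
        xi = round(x[i] + vi)
        xi = max(0, min(n_servers - 1, xi))   # bound to feasible indices
        new_x.append(xi)
        new_v.append(vi)
    return new_x, new_v
```

A full EDPSO would also check per-server CPU/memory capacity when scoring a placement; this sketch only shows how bounding keeps every updated particle a legal assignment.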
The concept of implementing e-government systems is growing widely all around the world and is becoming of interest to all governments. However, governments are still seeking effective ways to implement e-government systems properly and successfully. As e-government services increase and citizens' demands expand, e-government systems become more costly to satisfy the growing needs. Cloud computing is a technique that has been discussed lately as a solution to overcome some of the problems that an e-government implementation or expansion goes through. This paper proposes a new model for e-government on the basis of cloud computing: the E-Government Public Cloud Model (EGPCM).
This article deals with the estimation of system reliability for one-component, two-component, and s-out-of-k stress-strength system models with non-identical component strengths that are subjected to a common stress, using the Exponentiated Exponential distribution with a common scale parameter. Based on simulation, comparison studies are made between the ML, PC, and LS estimators of these system reliabilities when the scale parameter is known.
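For the one-component case, stress-strength reliability is R = P(stress < strength). A small Monte Carlo sketch using inverse-CDF sampling from the Exponentiated Exponential distribution, F(x) = (1 - exp(-lam*x))^alpha, illustrates the quantity being estimated; the function names and sample size are choices made here for illustration, not the article's estimators:

```python
import math
import random

def rvs_exp_exp(alpha, lam):
    """One draw from the Exponentiated Exponential distribution,
    F(x) = (1 - exp(-lam*x))**alpha, via the inverse CDF."""
    u = random.random()
    return -math.log(1.0 - u ** (1.0 / alpha)) / lam

def mc_reliability(alpha_strength, alpha_stress, lam, n=200_000):
    """Monte Carlo estimate of R = P(stress < strength) for one component,
    with strength ~ ExpExp(alpha_strength, lam), stress ~ ExpExp(alpha_stress, lam)."""
    hits = sum(
        rvs_exp_exp(alpha_stress, lam) < rvs_exp_exp(alpha_strength, lam)
        for _ in range(n)
    )
    return hits / n
```

With a common scale parameter the closed form is R = alpha_strength / (alpha_strength + alpha_stress), which makes a convenient sanity check for the simulation; the ML, PC, and LS estimators compared in the article plug estimated shape parameters into this kind of expression.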
Twitter data analysis is an emerging field of research that utilizes data collected from Twitter to address many issues such as disaster response, sentiment analysis, and demographic studies. The success of data analysis relies on collecting accurate and representative data about the studied group or phenomenon to get the best results. Various Twitter analysis applications rely on collecting the locations of the users sending the tweets, but this information is not always available. There are several attempts at estimating location-based aspects of a tweet. However, there is a lack of work investigating the data collection methods that are focused on location. In this paper, we investigate two methods for obtaining location-based data.
In Automatic Speech Recognition (ASR), the non-linear data projection provided by a one-hidden-layer Multilayer Perceptron (MLP) trained to recognize phonemes has, in previous experiments, provided feature enhancement that substantially increased ASR performance, especially in noise. Previous attempts to apply an analogous approach to speaker identification have not succeeded in improving performance, except by combining MLP-processed features with other features. We present test results for the TIMIT database which show that the advantage of MLP preprocessing for open-set speaker identification increases with the number of speakers used to train the MLP, and that improved identification is obtained as this number increases beyond sixty.
In the current Windows version (Vista), as in all previous versions, creating a user account without setting a password is possible. For a personal PC this might carry little risk, although it is not recommended, even by Microsoft itself. However, for business computers it is necessary to restrict access to the computers, starting with defining a different password for every user account. For earlier versions of Windows, many resources can be found giving advice on how to construct passwords for user accounts. To some extent, they contain remarks concerning the suitability of their solutions for Windows Vista, but all these resources are not very precise about what kind of passwords the user must use.