Cloud computing is a large-scale platform for serving high volumes of data to many devices and technologies. Cloud tenants demand fast, uninterrupted access to their data, so cloud providers must ensure that every piece of data is secure and always accessible. An appropriate replication strategy capable of selecting essential data is therefore required in cloud replication environments. This paper proposes a Crucial File Selection Strategy (CFSS) to address poor response time in a cloud replication environment. The CloudSim simulator is used to conduct the experiments, and the results demonstrate the improvement in replication performance. The analytical graphs are discussed thoroughly, showing that the proposed CFSS algorithm outperforms an existing algorithm with a 10.47% improvement in average response time for multiple jobs per round.
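The abstract does not disclose how CFSS decides which files are crucial, so the following Java sketch only illustrates the general idea under an assumed criticality score (access frequency weighed against file size); the class names and weights are hypothetical, not taken from the paper.

import java.util.*;

// Illustrative sketch only: the abstract does not publish CFSS's scoring
// rule, so criticality is assumed here to be a weighted mix of access
// frequency and replication cost (approximated by file size).
public class CrucialFileSelector {

    record CloudFile(String name, long accesses, long sizeMb) {}

    // Hypothetical criticality score: frequently accessed files score
    // higher; very large files score slightly lower (replicas cost storage).
    static double score(CloudFile f) {
        return 0.8 * f.accesses() - 0.2 * f.sizeMb();
    }

    // Select the top-k files whose replicas should be created first.
    static List<CloudFile> selectCrucial(List<CloudFile> files, int k) {
        return files.stream()
                .sorted(Comparator.comparingDouble(CrucialFileSelector::score).reversed())
                .limit(k)
                .toList();
    }

    public static void main(String[] args) {
        List<CloudFile> files = List.of(
                new CloudFile("genome.dat", 950, 2048),
                new CloudFile("logs.txt", 120, 16),
                new CloudFile("index.db", 780, 64));
        System.out.println(selectCrucial(files, 2));
    }
}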
As cities across the world grow and populations become more mobile, the number of vehicles on roads has increased correspondingly, creating numerous road traffic management challenges for authorities: congestion, more accidents, and pollution. Accidents remain a major cause of death, despite the development of sophisticated traffic management systems and other vehicle-related technologies. A common system for accident management is therefore needed. Traffic congestion in most urban areas, for instance, can be alleviated by real-time route planning. However, designing an efficient…
Cyber security is a term describing a collection of technologies, processes, and practices that aim to protect the online environment of a user or an organization. Medical images are among the most important and sensitive data types in computer systems, and medical requirements dictate that all patient data, including images, be encrypted by healthcare companies before being transferred over computer networks. This paper presents a new direction in encryption research: encrypting an image based on features extracted from the image itself to generate the encryption key. The encryption process starts by applying edge detection. After dividing the bits of the edge image into 3×3 windows, the diffusion…
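As a rough illustration of the feature-domain idea, the Java sketch below detects edges with a Sobel operator, binarizes the result, and hashes the edge bits into key material. The exact diffusion over 3×3 windows falls in the truncated text and is not reproduced here; SHA-256 stands in as an assumed, generic key-derivation step.

import java.security.MessageDigest;

// Sketch of the idea only: detect edges, binarize, and derive key bytes
// from the edge bits. The paper's diffusion over 3x3 windows is not shown;
// SHA-256 over the bit-string is an assumed stand-in for key derivation.
public class EdgeKeySketch {

    // Sobel gradient magnitude, thresholded to a binary edge map.
    static int[][] edgeMap(int[][] gray, int threshold) {
        int h = gray.length, w = gray[0].length;
        int[][] edges = new int[h][w];
        for (int y = 1; y < h - 1; y++) {
            for (int x = 1; x < w - 1; x++) {
                int gx = -gray[y-1][x-1] - 2*gray[y][x-1] - gray[y+1][x-1]
                        + gray[y-1][x+1] + 2*gray[y][x+1] + gray[y+1][x+1];
                int gy = -gray[y-1][x-1] - 2*gray[y-1][x] - gray[y-1][x+1]
                        + gray[y+1][x-1] + 2*gray[y+1][x] + gray[y+1][x+1];
                edges[y][x] = Math.hypot(gx, gy) > threshold ? 1 : 0;
            }
        }
        return edges;
    }

    // Flatten the edge bits and hash them into a 256-bit encryption key.
    static byte[] deriveKey(int[][] edges) throws Exception {
        StringBuilder bits = new StringBuilder();
        for (int[] row : edges) for (int b : row) bits.append(b);
        return MessageDigest.getInstance("SHA-256")
                .digest(bits.toString().getBytes());
    }
}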
Cloud computing (CC) is a fast-growing technology that offers computing, networking, and storage services that can be accessed and used over the internet. Cloud services save users money because they are pay-per-use, and they save time because they are on-demand and elastic, a distinctive aspect of cloud computing. However, several security issues must be addressed before users store data in the cloud. Because users have no direct control over data outsourced to the cloud, particularly personal and sensitive data (health, financial, military, etc.), and do not know where the data is stored, they must be able to verify that the cloud stores and maintains the outsourced data appropriately. The study's primary goals are to make…
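One common baseline for checking that the cloud maintains outsourced data appropriately is a hash comparison; the Java sketch below shows that baseline only and is not the study's own verification scheme.

import java.security.MessageDigest;
import java.util.Arrays;

// Minimal sketch of a generic integrity check (not the study's scheme):
// the owner keeps a digest of the data before outsourcing it, then
// compares it against a digest of what the cloud later returns.
public class IntegrityCheck {

    static byte[] sha256(byte[] data) throws Exception {
        return MessageDigest.getInstance("SHA-256").digest(data);
    }

    public static void main(String[] args) throws Exception {
        byte[] original = "sensitive patient record".getBytes();
        byte[] localDigest = sha256(original);   // kept by the data owner

        byte[] retrieved = original.clone();     // what the cloud returns
        boolean intact = Arrays.equals(localDigest, sha256(retrieved));
        System.out.println("Data intact: " + intact);
    }
}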
Regression testing is expensive and therefore calls for optimization. Typically, test-case optimization means selecting a reduced subset of test cases or prioritizing them so that potential faults are detected earlier. Many earlier studies employed heuristic mechanisms to attain optimality while reducing or prioritizing test cases; however, those studies lacked systematic procedures for handling tied test cases. Moreover, evolutionary algorithms such as genetic algorithms often help reduce test suites while simultaneously decreasing computational runtime. However, when fault detection capability must be examined along with other parameters, such methods fall short…
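For concreteness, here is a Java sketch of the classic greedy "additional coverage" prioritization that such studies typically compare against; the deterministic alphabetical tie-break is only a placeholder for the systematic tie-handling the abstract says earlier work lacked, not the paper's actual rule.

import java.util.*;

// Greedy "additional coverage" prioritization: repeatedly pick the test
// that covers the most not-yet-covered elements (branches, faults, ...).
public class TestPrioritizer {

    static List<String> prioritize(Map<String, Set<Integer>> coverage) {
        Map<String, Set<Integer>> remaining = new TreeMap<>();
        coverage.forEach((t, c) -> remaining.put(t, new HashSet<>(c)));
        List<String> order = new ArrayList<>();
        while (!remaining.isEmpty()) {
            String best = null;
            for (String t : remaining.keySet())
                if (best == null || remaining.get(t).size() > remaining.get(best).size())
                    best = t;   // ties broken alphabetically via TreeMap order
            Set<Integer> justCovered = remaining.remove(best);
            order.add(best);
            remaining.values().forEach(c -> c.removeAll(justCovered));
        }
        return order;
    }

    public static void main(String[] args) {
        System.out.println(prioritize(Map.of(
                "t1", Set.of(1, 2), "t2", Set.of(2, 3, 4), "t3", Set.of(4))));
        // -> [t2, t1, t3]: t2 covers most; t1 then adds {1}; t3 adds nothing new
    }
}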
This paper presents a fully computerized method to back up the router configuration file. The method consists of a friendly graphical interface programmed in the Java programming language.
The proposed method is compared with two existing methods, namely the TFTP server method and the copy/paste method. The comparison reveals that the proposed method has many advantages over the existing ones. The proposed method has been implemented on Cisco routers (series 2500, 2600, and 2800).
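The paper's GUI and router-communication code are not reproduced here; the Java sketch below covers only the archiving step, assuming the running-config text has already been retrieved from the device.

import java.nio.file.*;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

// Sketch of the file-archiving step only; it assumes the running-config
// text was already retrieved from the router (the paper's GUI handles
// that part) and writes it to a timestamped backup file.
public class ConfigBackup {

    static Path backup(String hostname, String runningConfig) throws Exception {
        String stamp = LocalDateTime.now()
                .format(DateTimeFormatter.ofPattern("yyyyMMdd-HHmmss"));
        Path file = Paths.get(hostname + "-" + stamp + ".cfg");
        return Files.writeString(file, runningConfig);
    }

    public static void main(String[] args) throws Exception {
        System.out.println("Saved to " + backup("router-2600", "hostname R1\n..."));
    }
}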
The aim of this research is to examine knowledge management activities, both the primary activities (acquisition, selection, generation, assimilation, and emission of knowledge) and the support activities (measurement, control, coordination, and leadership) that are manipulated and controlled to achieve knowledge management in an organization, which together lead to the knowledge chain model. The research then determines the degree to which these activities belong to the knowledge chain model in a sample of knowledge-driven Iraqi organizations (universities). The research relies on a checklist to gather the required data; this checklist was designed to diagnose the research dimensions and measurement…
A new algorithm is proposed to compress speech signals using the wavelet transform and linear predictive coding. Compression is based on selecting a small number of approximation coefficients after wavelet decomposition (Haar and db4) at a suitably chosen level while ignoring the detail coefficients; the approximation coefficients are then windowed by a rectangular window and fed to the linear predictor. The Levinson-Durbin algorithm is used to compute the LP coefficients, reflection coefficients, and prediction error. The compressed files contain the LP coefficients and the previous sample, and they are very small compared to the original signals. The compression ratio is calculated from the size of the…
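The Levinson-Durbin recursion named in the abstract is standard; a compact Java version is sketched below, taking an autocorrelation sequence r[0..p] and returning the LP coefficients while tracking the reflection coefficients and prediction error.

// Levinson-Durbin recursion: solves the Toeplitz normal equations for the
// LP coefficients given the autocorrelation sequence r[0..p].
public class LevinsonDurbin {

    // Returns LP coefficients a[1..p] (a[0] = 1 is implied).
    static double[] solve(double[] r, int p) {
        double[] a = new double[p + 1];
        double err = r[0];                        // zeroth-order prediction error
        for (int i = 1; i <= p; i++) {
            double acc = r[i];
            for (int j = 1; j < i; j++) acc -= a[j] * r[i - j];
            double k = acc / err;                 // reflection coefficient k_i
            double[] prev = a.clone();
            a[i] = k;
            for (int j = 1; j < i; j++) a[j] = prev[j] - k * prev[i - j];
            err *= (1 - k * k);                   // updated prediction error
        }
        return a;
    }

    public static void main(String[] args) {
        // Toy autocorrelation values; real r[] would come from the windowed
        // approximation coefficients described in the abstract.
        double[] a = solve(new double[]{1.0, 0.5, 0.2, 0.1}, 3);
        System.out.println(java.util.Arrays.toString(a));
    }
}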
The rapid and enormous growth of the Internet of Things, as well as its widespread adoption, has resulted in massive quantities of data that must be processed and sent to the cloud. The delays in processing these data and transmitting them to the cloud led to the emergence of fog computing, a new generation of cloud in which the fog serves as an extension of cloud services at the edge of the network, reducing latency and traffic. Distributing computational resources so as to minimize makespan and running costs remains one of the challenges of fog computing. This paper provides a new approach for improving the task scheduling problem in a cloud-fog environment…
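The paper's own approach lies in the truncated text, so the Java sketch below shows only a baseline minimum-completion-time greedy rule for assigning tasks to heterogeneous fog and cloud nodes; the node speeds and task lengths are made-up illustrative values.

import java.util.*;

// Illustrative baseline only (not the paper's algorithm): a greedy
// minimum-completion-time rule that assigns each task to whichever node
// would finish it earliest, which tends to shrink the overall makespan.
public class GreedyScheduler {

    static double schedule(double[] taskLengths, double[] nodeSpeeds) {
        double[] finish = new double[nodeSpeeds.length]; // per-node ready time
        for (double len : taskLengths) {
            int best = 0;
            for (int n = 1; n < nodeSpeeds.length; n++)
                if (finish[n] + len / nodeSpeeds[n] < finish[best] + len / nodeSpeeds[best])
                    best = n;
            finish[best] += len / nodeSpeeds[best];
        }
        return Arrays.stream(finish).max().getAsDouble(); // makespan
    }

    public static void main(String[] args) {
        // Two fog nodes (speed 8) and one cloud node (speed 20); units are
        // arbitrary relative task lengths and processing speeds.
        System.out.println(schedule(new double[]{40, 10, 25, 60}, new double[]{8, 8, 20}));
    }
}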
Intrusion detection systems detect attacks inside computers and networks, and the detection must be fast and achieve a high detection rate. Various proposed methods have achieved high detection rates, either by improving an algorithm or by hybridizing it with another. However, they suffer from long processing times, especially after the algorithm has been enhanced and when dealing with large volumes of traffic data. On the other hand, past research has successfully applied DNA sequence detection approaches to intrusion detection systems; the achieved detection rates were very low, but the processing time was fast. Feature selection was also used to reduce computation and complexity, speeding up the system…
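The specific DNA encoding used in such systems varies; as a toy illustration, the Java sketch below assumes a 2-bits-per-nucleotide mapping (00→A, 01→C, 10→G, 11→T) so that numeric traffic features become DNA-like strings that a sequence matcher can scan.

// Toy sketch of the encoding idea only: map each 2-bit pair of a feature
// byte to a nucleotide, turning network records into DNA-like strings.
// The A/C/G/T assignment for 00/01/10/11 is an assumption, not the paper's.
public class DnaEncoder {

    static final char[] BASES = {'A', 'C', 'G', 'T'};

    static String encode(byte[] features) {
        StringBuilder dna = new StringBuilder();
        for (byte b : features)
            for (int shift = 6; shift >= 0; shift -= 2)
                dna.append(BASES[(b >> shift) & 0b11]);
        return dna.toString();
    }

    public static void main(String[] args) {
        System.out.println(encode(new byte[]{(byte) 0x1B, 0x42})); // ACGTCAAG
    }
}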
Over the last decade, rapid economic and population growth, accompanied by depletion of energy resources, has had serious impacts on the environment and humanity. This development has been coupled with intensive construction, which in some cases ignores its impact on the environment and human activities. The principle of sustainability is therefore required in order to reduce this negative impact. In developing countries, there appears to be a huge gap between current construction practices and sustainability principles, which needs more attention to clarify and define the problems and to find suitable solutions before they become more difficult and expensive to address. The study aims to choose one of the develo…