Cloud computing is a recently developed concept that aims to provide computing resources in the most effective and economical manner. The fundamental idea of cloud computing is to share computing resources among a user group. Cloud computing security is a collection of control-based techniques and strategies that intend to comply with regulatory compliance rules and to protect cloud-related information, data, applications, and infrastructure. Data integrity, on the other hand, is a guarantee that digital data are not corrupted and that only authorized people can access or modify them (i.e., it maintains data consistency, accuracy, and confidence). This review presents an overview of cloud computing concepts, their importance in many applications, and the tools that can be used to provide integrity and security for data located in the cloud environment.
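The integrity guarantee described above can be illustrated with a minimal sketch: storing a cryptographic checksum alongside a record lets any later reader detect corruption or unauthorized modification. The record contents and field names below are hypothetical, chosen only for illustration.

```python
import hashlib

def checksum(data: bytes) -> str:
    """Return a SHA-256 digest that fingerprints the data."""
    return hashlib.sha256(data).hexdigest()

def verify(data: bytes, expected: str) -> bool:
    """True only if the data is byte-for-byte unchanged."""
    return checksum(data) == expected

record = b"patient_id=42;balance=100.00"   # hypothetical cloud-stored record
digest = checksum(record)                  # stored alongside the data

tampered = b"patient_id=42;balance=999.00"
print(verify(record, digest))    # True
print(verify(tampered, digest))  # False
```

A checksum alone detects corruption but not a malicious rewrite of both data and digest; schemes that bind the digest to a secret key (e.g. HMAC) address that stronger threat.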
The study discusses “The Security Intellectual Proposals of the Paris and the Welsh Schools”, which are considered among the most important contemporary European security studies schools that emerged in the nineties of the twentieth century. It examines how they approached the concept of security, criticizing the traditional trend that prevailed during the Cold War period of limiting the concept of security to the state or to the military aspect (national security), and attempting to expand the concept to economic, social, and environmental dimensions, in addition to the political and military dimensions. The most important proposals that the Welsh School provided are “security as an emancipation policy”, “individual security”, and “The ro
Web application protection lies at two levels: the first is the responsibility of the server administration, and the second is the responsibility of the site's programmer (the latter is the scope of this research). This research proposes developing a secure web application based on a three-tier architecture (client, server, and database). The security of this system is described as follows: multilevel access by authorization is used, meaning that access to pages is allowed depending on the authorized level; passwords are encrypted using Message Digest Five (MD5) with a salt; Secure Socket Layer (SSL) protocol authentication is used; and PHP code is written according to a set of rules that hide the source code to ensure that it cannot be stolen, with verification of input before it is s
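The salted-MD5 password scheme described above can be sketched as follows. This is a minimal illustration in Python rather than the paper's PHP implementation, and the function names are assumptions; note that MD5 is shown only for fidelity to the text, while modern systems should prefer a deliberately slow hash such as bcrypt or scrypt.

```python
import hashlib
import os

def hash_password(password, salt=None):
    """Store MD5(salt + password) with a random per-user salt,
    as in the salted-MD5 scheme described above."""
    if salt is None:
        salt = os.urandom(16)  # unique salt defeats precomputed tables
    digest = hashlib.md5(salt + password.encode()).hexdigest()
    return salt, digest

def check_password(password, salt, stored):
    """Recompute the salted hash and compare with the stored digest."""
    return hashlib.md5(salt + password.encode()).hexdigest() == stored

salt, stored = hash_password("s3cret")
print(check_password("s3cret", salt, stored))  # True
print(check_password("wrong", salt, stored))   # False
```

The salt is stored in the clear next to the digest; its role is only to make identical passwords hash differently across users.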
Data compression offers an attractive approach to reducing communication costs by using the available bandwidth effectively, so it makes sense to pursue research on developing algorithms that can use the available network most effectively. It is also important to consider security, since the data being transmitted are vulnerable to attacks. The basic aim of this work is to develop a module that combines compression and encryption on the same set of data, performing the two operations simultaneously. This is achieved by embedding encryption into compression algorithms, since cryptographic ciphers and entropy coders bear a certain resemblance in the sense of secrecy. First, in the secure compression module, the given text is p
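The combination of compression and encryption can be sketched as below. This is an illustrative compress-then-encrypt pipeline under assumed primitives (zlib for entropy coding, a SHA-256 counter-mode keystream as a toy stream cipher), not the paper's actual module, which interleaves the two operations inside the coder.

```python
import hashlib
import zlib

def keystream(key, n):
    """Derive n pseudo-random bytes from key via SHA-256 in counter mode
    (a toy stream cipher for illustration only)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def compress_encrypt(plaintext, key):
    packed = zlib.compress(plaintext)                # entropy coding first
    ks = keystream(key, len(packed))
    return bytes(a ^ b for a, b in zip(packed, ks))  # then XOR encryption

def decrypt_decompress(ciphertext, key):
    ks = keystream(key, len(ciphertext))
    return zlib.decompress(bytes(a ^ b for a, b in zip(ciphertext, ks)))

msg = b"attack at dawn " * 20
ct = compress_encrypt(msg, b"shared-key")
print(decrypt_decompress(ct, b"shared-key") == msg)  # True
print(len(ct) < len(msg))                            # True: redundancy removed
```

Compressing before encrypting is the natural order: ciphertext is statistically random, so compressing after encryption gains nothing.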
In the current Windows version (Vista), as in all previous versions, it is possible to create a user account without setting a password. For a personal PC this might carry little risk, although it is not recommended, even by Microsoft itself. For business computers, however, it is necessary to restrict access to the computers, starting with defining a different password for every user account. For the earlier versions of Windows, many resources can be found giving advice on how to construct passwords for user accounts. To some extent they contain remarks concerning the suitability of their solutions for Windows Vista, but all these resources are not very precise about what kind of passwords the user must use. To assess the protection of pa
The concealment of data has emerged as an area of deep and wide research interest that endeavours to hide secret data in a covert and stealthy manner, avoiding detection by embedding the secret data into cover media that appear inconspicuous. These cover media may be images or videos used to conceal the messages while still retaining their visual quality. Over the past ten years, there have been numerous studies on various image steganography methods, emphasising payload and image quality. Nevertheless, a trade-off exists between these two indicators, and mediating a more favourable reconciliation between them is a daunting and problematic task. Additionally, the current
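The payload-versus-quality trade-off mentioned above is easiest to see in the classic least-significant-bit (LSB) technique, sketched below on a raw byte buffer standing in for image pixel data. This is a generic textbook method for illustration, not any particular scheme surveyed in the text.

```python
def embed(cover, message):
    """Hide message bits in the least significant bit of each cover byte.
    Capacity is one message bit per cover byte; each cover byte changes
    by at most 1, which is why visual quality is largely preserved."""
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    if len(bits) > len(cover):
        raise ValueError("cover too small for this payload")
    stego = bytearray(cover)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit  # overwrite only the LSB
    return bytes(stego)

def extract(stego, n_bytes):
    """Read back n_bytes of hidden message from the LSBs."""
    bits = [b & 1 for b in stego[:n_bytes * 8]]
    return bytes(
        sum(bit << (7 - i) for i, bit in enumerate(bits[k:k + 8]))
        for k in range(0, len(bits), 8)
    )

cover = bytes(range(256)) * 2      # stand-in for image pixel data
stego = embed(cover, b"hi")
print(extract(stego, 2))           # b'hi'
```

Raising the payload (e.g. using two or three LSBs per byte) increases capacity but degrades the cover, which is precisely the trade-off the abstract describes.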
The study aims to provide a suggested model for the application of a Virtual Private Network (VPN), a tool used to protect data transmitted through a web-based information system. The research uses a case-study methodology to collect data about the research area (Al-Rasheed Bank), using Visio to design and draw the diagrams of the suggested models and drawing on data collected through interviews with the bank's employees; the research used the modelling of these data in order to find solutions to the research problem.
The importance of the study lies in dealing with one of the vital topics of the moment, namely, how to make the information transmitted via
The electrical activity of the heart and the electrocardiogram (ECG) signal are fundamentally related. In the published literature, the ECG signal has been examined and used for a number of applications; the monitoring of heart rate and the analysis of heart rhythm patterns, the detection and diagnosis of cardiac diseases, the identification of emotional states, and biometric identification are a few examples of applications in the field. The analysis of ECG data may involve several phases, depending on the type of study being done; preprocessing, feature extraction, feature selection, feature modification, and classification are frequently included in these stages. Ever
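The preprocessing and feature-extraction stages named above can be sketched minimally as follows, using a synthetic spike train in place of a real ECG recording; the smoothing window, threshold, and sampling rate are assumptions for illustration, and real R-peak detectors (e.g. Pan-Tompkins) are considerably more elaborate.

```python
def moving_average(signal, window=3):
    """Preprocessing: light smoothing to suppress high-frequency noise."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        chunk = signal[max(0, i - half):i + half + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def detect_peaks(signal, threshold):
    """Feature extraction: local maxima above a threshold,
    a crude stand-in for R-peak detection."""
    return [
        i for i in range(1, len(signal) - 1)
        if signal[i] > threshold
        and signal[i] >= signal[i - 1]
        and signal[i] > signal[i + 1]
    ]

fs = 100                              # assumed sampling rate, Hz
ecg = [0.0] * 500
for r in (50, 150, 250, 350, 450):    # synthetic R-peaks, one per second
    ecg[r] = 1.0

smoothed = moving_average(ecg)
peaks = detect_peaks(smoothed, threshold=0.2)
rr = [(b - a) / fs for a, b in zip(peaks, peaks[1:])]  # RR intervals, s
bpm = 60 / (sum(rr) / len(rr))
print(len(peaks), round(bpm))         # 5 60
```

The RR intervals computed here are the kind of feature that later selection and classification stages would consume.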
Environmental pollution is regarded as a major problem, and traditional strategies such as chemical or physical remediation are not sufficient to overcome it. Petroleum-contaminated soil causes ecological problems and represents a danger to human health. Bioremediation has received remarkable attention; it is a procedure that uses a biological agent to remove toxic waste from contaminated soil. This approach is easy to handle, inexpensive, and environmentally friendly, and its results are highly satisfactory. Bioremediation is a biodegradation process in which organic contaminants are completely mineralized to inorganic compounds, carbon dioxide, and water. This review discusses the bioremediation of petroleum-
A procedure for the mutual derivatization and determination of thymol and Dapsone was developed and validated in this study. Dapsone was used as the derivatizing agent for the determination of thymol, and thymol was used as the derivatizing agent for the determination of Dapsone. An optimization study was performed for the derivatization reaction, i.e., the diazonium coupling reaction. Linear regression calibration plots for thymol and Dapsone in the direct reaction were constructed at 460 nm, within the concentration ranges of 0.3–7 μg ml⁻¹ for thymol and 0.3–4 μg ml⁻¹ for Dapsone, with limits of detection of 0.086 and 0.053 μg ml⁻¹, respectively. Corresponding plots for the cloud point extraction of thymol and Dapsone were constructed
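The calibration plot and limit of detection (LOD) reported above follow a standard workflow, sketched below with hypothetical concentrations and absorbances (not the paper's data): fit an ordinary least-squares line to the standards, then estimate the LOD as 3.3 times the residual standard deviation divided by the slope.

```python
def linear_fit(x, y):
    """Ordinary least squares: returns (slope, intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

# Hypothetical calibration standards and readings at 460 nm
conc = [0.3, 1.0, 2.0, 4.0, 7.0]                    # μg/ml
absorbance = [0.031, 0.102, 0.198, 0.405, 0.701]    # arbitrary units

slope, intercept = linear_fit(conc, absorbance)

# Residual standard deviation of the fit, then LOD = 3.3 * s / slope
residuals = [y - (slope * x + intercept) for x, y in zip(conc, absorbance)]
s = (sum(r * r for r in residuals) / (len(conc) - 2)) ** 0.5
lod = 3.3 * s / slope
print(round(slope, 3), round(lod, 3))
```

The factor 3.3 corresponds to the common convention for the detection limit (with 10 used for the quantitation limit); reported LODs such as those above depend on the noise of the blank and the steepness of the calibration slope.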