Portable devices such as smartphones, tablet PCs, and PDAs are a useful combination of hardware and software aimed at mobile workers. While they enable users to review documents, communicate via electronic mail, and manage appointments and meetings, they usually lack a variety of essential security features. To address concerns about sensitive data, many individuals and organizations, aware of the associated threats, mitigate them through stronger user authentication, content encryption, malware protection, firewalls, intrusion prevention, and similar measures. However, no standards have yet been developed to determine whether such mobile data management systems adequately provide the fundamental security functions demanded by organizations, or whether those functions have been securely implemented. Therefore, this paper proposes a security framework for mobile data that combines core security mechanisms to address these problems and protect sensitive information without spending time and money deploying several new applications.
The influx of data in bioinformatics is primarily in the form of DNA, RNA, and protein sequences, which places a significant burden on scientists and computers. Some genomics studies depend on clustering techniques to group similarly expressed genes into one cluster. Clustering is a type of unsupervised learning that can be used to divide data with unknown cluster labels into clusters; it partitions an input space into several homogeneous zones and can be achieved using a variety of algorithms, such as k-means and fuzzy c-means (FCM). This study used three models to cluster a brain tumor dataset. The first model uses FCM, whic
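As a minimal sketch of the FCM algorithm mentioned above (a NumPy toy on synthetic two-dimensional blobs, not the study's actual pipeline or brain-tumor data; parameters m, iteration count, and seeds are illustrative assumptions):

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, seed=0):
    """Minimal fuzzy c-means: returns (cluster centers, membership matrix U)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)            # memberships sum to 1 per sample
    for _ in range(n_iter):
        Um = U ** m                              # fuzzified memberships
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # distances from every sample to every center (small epsilon avoids /0)
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1))
        U = inv / inv.sum(axis=1, keepdims=True) # standard FCM membership update
    return centers, U

# two well-separated synthetic blobs
X = np.vstack([np.random.default_rng(1).normal(0, 0.3, (50, 2)),
               np.random.default_rng(2).normal(5, 0.3, (50, 2))])
centers, U = fuzzy_c_means(X, c=2)
labels = U.argmax(axis=1)                        # hard labels from soft memberships
```

Unlike k-means, each sample keeps a graded membership in every cluster; the hard label is only taken at the end via argmax.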
The cross section evaluation for the (α,n) reaction was calculated according to the available International Atomic Energy Agency (IAEA) data and other published experimental data. These cross sections are the most recent data, used alongside the well-known international libraries such as ENDF, JENDL, and JEFF. We considered an energy range from threshold to 25 MeV in 1 MeV intervals. The average weighted cross sections for all available experimental and theoretical (JENDL) data and for all the considered isotopes were calculated. The cross section of an element is then calculated from the cross sections of the isotopes of that element, taking their abundances into account. A mathematical representative equation for each of the element
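The abundance-weighted elemental cross section described above can be sketched as follows; the isotope names, abundances, and cross-section values below are made up purely for illustration, not taken from the evaluation:

```python
def element_cross_section(iso_xs, iso_abundance):
    """Abundance-weighted elemental cross section at a single energy:
    sigma_el = sum_i a_i * sigma_i, with abundances a_i given as fractions."""
    return sum(iso_abundance[i] * iso_xs[i] for i in iso_xs)

# illustrative (hypothetical) values for a two-isotope element, in millibarns
xs = {"X-100": 12.0, "X-102": 8.0}   # sigma_i at one energy point
ab = {"X-100": 0.75, "X-102": 0.25}  # natural abundance fractions (sum to 1)

sigma = element_cross_section(xs, ab)  # 0.75*12.0 + 0.25*8.0 = 11.0 mb
```

In practice this weighting is applied at every energy point of the 1 MeV grid, yielding the elemental excitation function from the isotopic ones.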
Amplitude variation with offset (AVO) analysis is an efficient tool for hydrocarbon detection and for identifying elastic rock properties and fluid types. It has been applied in the present study using reprocessed pre-stack 2D seismic data (1992, Caulerpa) from the north-west of the Bonaparte Basin, Australia. The AVO response along the 2D pre-stack seismic data in the Laminaria High on the NW shelf of Australia was also investigated. Three hypotheses were suggested to investigate the AVO behaviour of the amplitude anomalies, in which three different factors were tested: fluid substitution, porosity, and thickness (wedge model). The AVO models with the synthetic gathers were analysed using log information to find which of these is the
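AVO behaviour is commonly described with approximations such as the two-term Shuey equation; the abstract does not specify which approximation the study uses, so the sketch below is a generic illustration with made-up intercept and gradient values:

```python
import math

def shuey_two_term(A, B, theta_deg):
    """Two-term Shuey approximation of the P-wave reflection coefficient:
    R(theta) ~ A + B * sin^2(theta), valid for small incidence angles.
    A is the intercept (normal-incidence reflectivity), B the gradient."""
    t = math.radians(theta_deg)
    return A + B * math.sin(t) ** 2

# illustrative Class III gas-sand style response: negative intercept and
# negative gradient, so the amplitude grows more negative with offset
for ang in (0, 10, 20, 30):
    print(ang, round(shuey_two_term(-0.08, -0.15, ang), 4))
```

Cross-plotting the fitted A and B values from real gathers against such modelled responses is one standard way to classify amplitude anomalies.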
The aim of this research is to understand how business organizations achieve competitive advantage and sustain it by constructing a green (environmentally friendly) strategy, which is reflected in sustaining their competitive advantages. The problem of this study is presented through an attempt to answer several questions, the most important of which are:
1- Can business organizations today build green strategies that support their competitive advantage?
2- Is there a framework or mechanism that business organizations can rely on to manage the strategic risks of losing their competit
Economic units operating in the Iraqi industrial sector face many pressures when seeking to measure and evaluate their performance because of the variables of today's corporate environment. This situation calls for a methodology that can evaluate performance more holistically, rather than being limited to traditional measures that are no longer sufficient to keep pace with rapid changes in the corporate environment; it requires that performance measures be derived from the unit's strategy and suited to the specific characteristics of the Iraqi environment. The research discusses the development of performance measurement systems to suit business strategies and the directions of change
The Diffie-Hellman protocol is a key exchange protocol that provides a way to establish shared secret keys between two parties, even if those parties have never communicated before. This paper suggests a new way to transfer keys through public or non-secure channels that depends on video files sent over the channel, from which the keys are then extracted. The proposed method of key generation depends on the video file content, using the entropy values of the video frames. The proposed system addresses weaknesses of the Diffie-Hellman key exchange algorithm, namely the MIMA (man-in-the-middle attack) and the DLA (discrete logarithm attack). When the method used high-definition videos with a vast amount of data, the keys generated with a large number up to 5
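The textbook Diffie-Hellman exchange that the paper builds on can be sketched as follows; the prime and generator here are toy values chosen only for the demonstration (real deployments use large standardized groups such as the RFC 3526 MODP primes, with authentication to block man-in-the-middle attacks):

```python
import secrets

# toy group parameters for illustration only
p = 2**127 - 1   # a Mersenne prime; far too small for real security
g = 3

a = secrets.randbelow(p - 2) + 2   # Alice's private exponent
b = secrets.randbelow(p - 2) + 2   # Bob's private exponent

A = pow(g, a, p)   # Alice sends A over the public channel
B = pow(g, b, p)   # Bob sends B over the public channel

shared_alice = pow(B, a, p)        # Alice computes g^(ab) mod p
shared_bob   = pow(A, b, p)        # Bob computes the same value
assert shared_alice == shared_bob  # both sides derive the same secret
```

The security of this baseline rests on the discrete logarithm problem, which is exactly the attack surface (DLA, and MIMA in the unauthenticated setting) that the proposed video-entropy scheme aims to sidestep.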
This paper introduces an innovative method for image encryption called "Two-Fold Cryptography," which leverages the Henon map in a dual-layer encryption framework. By applying two distinct encryption processes, this approach offers enhanced security for images. Key parameters generated by the Henon map dynamically shape both stages of encryption, creating a sophisticated and robust security system. The findings reveal that Two-Fold Cryptography provides a notable improvement in image protection, outperforming traditional single-layer encryption techniques.
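The Henon map at the core of such schemes can be sketched as a keystream generator; the parameters a = 1.4, b = 0.3 and the seed values below are the classic chaotic settings, not necessarily the paper's, and this shows only a single XOR layer rather than the full two-fold design:

```python
def henon_keystream(n, a=1.4, b=0.3, x=0.1, y=0.3):
    """Generate n pseudo-random bytes from the Henon map
    x_{k+1} = 1 - a*x_k^2 + y_k,  y_{k+1} = b*x_k.
    Quantizing the chaotic state yields a key-dependent byte stream."""
    out = []
    for _ in range(n):
        x, y = 1 - a * x * x + y, b * x      # one Henon iteration
        out.append(int(abs(x) * 10**6) % 256)  # quantize state to a byte
    return bytes(out)

plain = b"top secret image bytes"           # stand-in for image pixel data
ks = henon_keystream(len(plain))
cipher = bytes(p ^ k for p, k in zip(plain, ks))

# XOR with the same keystream (same map parameters and seed) decrypts
assert bytes(c ^ k for c, k in zip(cipher, ks)) == plain
```

A two-fold variant would run a second pass with a different parameter set, so an attacker must recover both chaotic trajectories.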
With the increasing rate of unauthorized access and attacks, the security of confidential data is of utmost importance. Cryptography only encrypts the data; because communication takes place in the presence of third parties, the encrypted text can still be intercepted, decrypted, or destroyed. Steganography, on the other hand, hides confidential data in a cover source so that even the existence of the data is concealed, which does not arouse suspicion about the communication taking place between the two parties. This paper presents a method for transferring secret data embedded into a master file (cover image) to obtain a new image (stego-image) that is practically indistinguishable from the original image, so that no one other than the intended us
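A common way to realize such embedding is least-significant-bit (LSB) substitution; the sketch below is a generic illustration over raw bytes standing in for pixel values, not necessarily the paper's exact technique:

```python
def embed_lsb(cover, message):
    """Hide message bytes in the least-significant bits of cover bytes
    (e.g. pixel values). Needs at least 8 cover bytes per message byte."""
    bits = [(byte >> i) & 1 for byte in message for i in range(8)]
    assert len(cover) >= len(bits), "cover too small for message"
    stego = bytearray(cover)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit   # overwrite only the lowest bit
    return bytes(stego)

def extract_lsb(stego, n_bytes):
    """Recover n_bytes of hidden data from the stego bytes."""
    out = bytearray()
    for b in range(n_bytes):
        byte = 0
        for i in range(8):
            byte |= (stego[b * 8 + i] & 1) << i
        out.append(byte)
    return bytes(out)

cover = bytes(range(200))          # stand-in for image pixel data
stego = embed_lsb(cover, b"hi")
assert extract_lsb(stego, 2) == b"hi"
```

Each stego byte differs from the cover by at most 1, which is why the stego-image is visually indistinguishable from the original.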
Cloud computing is a recently developed concept that aims to provide computing resources in the most effective and economical manner. Its fundamental idea is to share computing resources among a group of users. Cloud computing security is a collection of control-based techniques and strategies intended to comply with regulatory rules and to protect cloud-related information, data, applications, and infrastructure. Data integrity, in turn, is a guarantee that digital data are not corrupted and that only authorized people can access or modify them (i.e., that data consistency, accuracy, and confidence are maintained). This review presents an overview of cloud computing concepts and its importance in many
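Integrity in this sense is often checked with cryptographic digests; the following is a minimal sketch of the idea (real cloud systems use richer schemes such as HMACs, Merkle trees, or provable-data-possession protocols, and the data values here are made up):

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """SHA-256 digest used as an integrity fingerprint for stored data."""
    return hashlib.sha256(data).hexdigest()

original = b"customer records v1"
stored_digest = fingerprint(original)   # kept by the owner before upload

# later: verify that data retrieved from the cloud has not been altered
retrieved = b"customer records v1"
assert fingerprint(retrieved) == stored_digest   # intact

tampered = b"customer records v2"
assert fingerprint(tampered) != stored_digest    # modification detected
```

Because the owner keeps only the short digest, the check detects corruption or unauthorized modification without storing a full local copy of the data.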