Data compression offers an attractive approach to reducing communication costs by using the available bandwidth effectively, so it makes sense to pursue research on algorithms that use the available network most effectively. It is also important to consider security, since the data being transmitted is vulnerable to attacks. The basic aim of this work is to develop a module that combines compression and encryption on the same set of data, performing the two operations simultaneously. This is achieved by embedding encryption into the compression algorithm, since cryptographic ciphers and entropy coders bear a certain resemblance in the sense of secrecy. First, in the secure compression module, the given text is preprocessed and transformed into an intermediate form that can be compressed with better efficiency and security. This addresses a problem common to conventional encryption methods, which generally manipulate an entire data set: most encryption algorithms make the transfer of information more costly in terms of time and sometimes bandwidth.
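The abstract's central idea is combining compression and encryption on the same data. As a minimal sketch of the general compress-then-encrypt pattern (not the authors' specific module), the code below pairs `zlib` with a toy XOR keystream derived from iterated SHA-256; the keystream construction is an illustrative assumption only, not a production cipher.

```python
import hashlib
import zlib

def keystream(key: bytes, n: int) -> bytes:
    # Toy keystream: iterated SHA-256 counter blocks (illustrative, NOT secure).
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def compress_encrypt(plaintext: bytes, key: bytes) -> bytes:
    # Compress first: ciphertext looks random and would not compress,
    # so the order of the two operations matters.
    packed = zlib.compress(plaintext)
    return bytes(a ^ b for a, b in zip(packed, keystream(key, len(packed))))

def decrypt_decompress(blob: bytes, key: bytes) -> bytes:
    packed = bytes(a ^ b for a, b in zip(blob, keystream(key, len(blob))))
    return zlib.decompress(packed)

msg = b"attack at dawn " * 20
sealed = compress_encrypt(msg, b"secret")
assert decrypt_decompress(sealed, b"secret") == msg
assert len(sealed) < len(msg)  # the compression gain survives encryption
```

Compressing before encrypting preserves the bandwidth saving while the XOR layer adds secrecy, which is the cost trade-off the abstract targets.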
Research on the science of training and the physiology of sport draws on several important disciplines. Physical effort has attracted the attention of scientists for centuries: they studied how the body performs its functions during physical exertion, observed the changes that occur in it, and recorded and studied, in particular, the positive effects of daily sporting practice. The aim of the study was to investigate the effect of plank exercises on the lipid component and the basal metabolic rate (BMR) of female students of the Higher Institute for Security and Management Development. In the third chapter, the two researchers used the experimental method on a sample of the female s
This article discusses estimation methods for the parameters of a generalized inverted exponential distribution using progressive type-I interval censored data. In addition to conventional maximum likelihood estimation, the mid-point method, the probability plot method, and the method of moments are suggested for parameter estimation. To obtain maximum likelihood estimates, we utilize the Newton-Raphson, expectation-maximization, and stochastic expectation-maximization methods. Furthermore, approximate confidence intervals for the parameters are obtained via the inverse of the observed information matrix. Monte Carlo simulations are used to provide numerical comparisons of the proposed estimators. In ad
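The censored likelihood of the generalized inverted exponential distribution is lengthy, so as a generic illustration of the Newton-Raphson iteration the abstract relies on, the sketch below maximizes a plain exponential likelihood instead (an assumption made purely to keep the example short); the update is the standard score divided by observed information.

```python
def newton_raphson_mle(data, lam=1.0, tol=1e-10, max_iter=100):
    # Newton-Raphson for the MLE of an exponential rate lambda.
    # Score: U(l) = n/l - sum(x); observed information: I(l) = n/l**2.
    n, s = len(data), sum(data)
    for _ in range(max_iter):
        score = n / lam - s
        info = n / lam ** 2          # negative second derivative of log-likelihood
        step = score / info
        lam += step                  # Newton update: lam <- lam + U(lam)/I(lam)
        if abs(step) < tol:
            break
    return lam

x = [0.5, 1.2, 0.8, 2.0, 1.5]
lam_hat = newton_raphson_mle(x)
# The exponential MLE has the closed form n / sum(x), which the iteration recovers.
assert abs(lam_hat - len(x) / sum(x)) < 1e-8
```

The same score/information pattern extends to the GIED case, and the inverse of the observed information at convergence yields the approximate confidence intervals mentioned in the abstract.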
The present research deals with the factors that influence the perceptual approach of the graphic designer in designing European health logos. The research comprises four chapters. In chapter one, the researcher reviews the methodological frame of the research. The second chapter reviews the theoretical frame and previous studies in three sections: the first covers the concept of perception and its types; the second covers the factors influencing the designer's perceptual approach and their divisions; and the third covers perception in graphic design through perceived shapes, the figure-ground relationship, and the colors used to express the i
The e-health care system is one of the great technology enhancements enabled by medical devices with sensors worn on or implanted in the patient's body. A Wireless Body Area Network (WBAN) offers remarkable help through wireless transmission of the patient's data over an agreed distance, keeping the patient's status under constant control by regularly transmitting vital-sign indications to the receiver. Security and privacy are major concerns for the data sent from the WBAN and its biological sensors. Several algorithms have been proposed under many hypotheses in order to find optimum solutions. In this paper, an encryption algorithm is proposed based on the hyper-chaotic Zhou system, which provides high security, privacy, efficiency and
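The abstract names the hyper-chaotic Zhou system; numerically integrating that system is beyond a short sketch, so the toy below substitutes a logistic map as the chaotic keystream source. The map, the seed values, and the byte quantization are all illustrative assumptions, not the paper's cipher; the point is only the general pattern of chaos-based stream encryption for sensor readings.

```python
def chaotic_keystream(x0: float, r: float, n: int) -> bytes:
    # Logistic map x <- r*x*(1-x) as a stand-in chaotic source.
    x, out = x0, bytearray()
    for _ in range(200):              # burn-in: discard the transient iterates
        x = r * x * (1.0 - x)
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(int(x * 256) % 256)  # quantize the state to one key byte
    return bytes(out)

def xor_cipher(data: bytes, key: tuple) -> bytes:
    # key = (initial condition, map parameter), shared by sender and receiver.
    ks = chaotic_keystream(key[0], key[1], len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

reading = b"HR=72;SpO2=98;T=36.8"      # hypothetical vital-sign payload
sealed = xor_cipher(reading, (0.31, 3.9999))
assert xor_cipher(sealed, (0.31, 3.9999)) == reading  # XOR is its own inverse
```

Because the map is deterministic, both ends regenerate the identical keystream from the shared initial condition and parameter, which is the property that makes chaotic systems attractive for lightweight WBAN encryption.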
Attacks on data transferred over a network happen millions of times a day. To address this problem, a scheme is proposed for securing data transferred over a network. The proposed scheme uses two techniques to guarantee secure transfer of a message: the message is first encrypted, and then it is hidden in a video cover. The encryption technique is the RC4 stream cipher, used to increase the message's confidentiality, while the least significant bit (LSB) embedding algorithm is improved by adding an additional layer of security. The improvement to the LSB method comes from replacing the usual sequential selection with random selection of the frames and the pixels wit
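The abstract names RC4 and random (rather than sequential) selection of embedding positions. The sketch below shows textbook RC4 (key scheduling plus keystream generation) and a seeded shuffle standing in for the random pixel selection; the stego-key seed and index range are hypothetical, and the video-embedding step itself is omitted.

```python
import random

def rc4(key: bytes, data: bytes) -> bytes:
    # Key-scheduling algorithm (KSA): permute S under the key.
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # Pseudo-random generation algorithm (PRGA), XORed with the data.
    i = j = 0
    out = bytearray()
    for byte in data:
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)

secret = rc4(b"Key", b"covert payload")
assert rc4(b"Key", secret) == b"covert payload"  # same operation decrypts

# Random embedding positions: both sides derive the identical order from a
# shared seed (a hypothetical stego-key), replacing sequential LSB writes.
positions = random.Random(42).sample(range(10_000), k=len(secret) * 8)
assert len(positions) == len(set(positions))     # one pixel per ciphertext bit
```

Seeding the PRNG on both sides lets the receiver recompute the pixel order without transmitting it, which is what makes the non-sequential LSB layer practical.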
Medical fields and computer science often come together to produce impressive results in both areas, through the applications, programs, and algorithms provided by data mining. The title of the present research contains the term hygiene, which may be described as the principle of maintaining cleanliness of the external body, while environmental hygienic hazards can present themselves in various media, e.g. air, water, soil, etc. The influence they exert on our health is very complex and may be modulated by our genetic makeup, psychological factors, and our perception of the risks they present. Our main concern in this research is not to improve general health directly, but rather to propose a data mining approach
The influx of data in bioinformatics is primarily in the form of DNA, RNA, and protein sequences, a condition that places a significant burden on scientists and computers. Some genomics studies depend on clustering techniques to group similarly expressed genes into one cluster. Clustering is a type of unsupervised learning that divides unlabeled data into clusters; the k-means and fuzzy c-means (FCM) algorithms are two examples. Clustering is thus a common approach that divides an input space into several homogeneous zones, and it can be achieved using a variety of algorithms. This study used three models to cluster a brain tumor dataset. The first model uses FCM, whic
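The abstract contrasts k-means and fuzzy c-means as clustering algorithms. As a minimal, self-contained sketch of the hard-assignment case (Lloyd's k-means; FCM differs by weighting each point's membership across all clusters), on toy 2-D points rather than the study's brain tumor dataset:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    # Lloyd's algorithm: alternate assignment and mean-update steps.
    rng = random.Random(seed)
    centers = rng.sample(points, k)           # initialize from the data
    for _ in range(iters):
        # Assignment step: each point joins its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers]
            clusters[d.index(min(d))].append(p)
        # Update step: each center moves to its cluster's mean.
        centers = [
            tuple(sum(xs) / len(xs) for xs in zip(*cl)) if cl else centers[i]
            for i, cl in enumerate(clusters)
        ]
    return centers, clusters

pts = [(0.0, 0.1), (0.2, 0.0), (5.0, 5.1), (5.2, 4.9)]
centers, clusters = kmeans(pts, k=2)
assert sorted(len(c) for c in clusters) == [2, 2]  # both groups recovered
```

FCM replaces the hard assignment with fractional memberships raised to a fuzzifier exponent, which is why it suits gene-expression data where a gene may plausibly belong to more than one group.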