Encryption translates data into another form or symbol so that only people with access to a secret key or password can read it. Encrypted data are generally referred to as ciphertext, while unencrypted data are known as plaintext. Entropy can be used as a measure of the number of bits needed to code the data of an image: as the pixel values of an image are dispersed across more gray levels, the entropy increases. The aim of this research is to compare CAST-128 with a proposed adaptive key against the RSA encryption method for video frames, to determine which method yields the higher entropy. The first method applies CAST-128 and the second applies RSA. CAST-128 uses a pair of sub-keys per round: a 5-bit quantity serves as the rotation key of the round, and a 32-bit quantity serves as the masking key of the round. The proposed adaptive 128-bit key is extracted from the main diagonal of each frame before encryption. RSA is a public-key, or asymmetric, cryptographic technique; the asymmetry of the key rests on the difficulty of factoring the product of two large prime numbers. A comparison applied to several videos showed that CAST-128 achieved the highest degree of entropy even when the frames contained much distorted data or unclear image pixels. For example, the entropy value of a sample of a girl video is 2581.921 with CAST-128 versus 2271.329 with RSA, and the entropy value of a sample of a scooter video is 2569.814 with CAST-128 versus 2282.844 with RSA.
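As a concrete illustration of the entropy measure used above, the sketch below computes the Shannon entropy (in bits per pixel) of an 8-bit grayscale image from its gray-level histogram. The function name and test arrays are ours, not the paper's.

```python
import numpy as np

def image_entropy(pixels: np.ndarray) -> float:
    """Shannon entropy (bits per pixel) of an 8-bit grayscale image."""
    hist = np.bincount(pixels.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]  # drop empty bins to avoid log(0)
    return float(-(p * np.log2(p)).sum())

# A uniform pixel distribution reaches the 8-bit maximum of 8 bits/pixel;
# a constant image carries 0 bits/pixel.
flat = np.arange(256, dtype=np.uint8).repeat(4)   # every gray level equally likely
const = np.zeros(1024, dtype=np.uint8)            # a single gray level
print(image_entropy(flat))   # 8.0
print(image_entropy(const))  # 0.0
```

Encrypted frames should push this measure toward the uniform-distribution maximum, which is why entropy is used to compare the two ciphers.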
Information security in data storage and transmission is increasingly important. At the same time, images are used in many procedures, so preventing unauthorized access to image data is crucial, and images are encrypted to protect sensitive data or privacy. The methods and algorithms for masking or encoding images range from simple spatial-domain methods to frequency-domain methods, which are the most complex and reliable. In this paper, a new cryptographic system is proposed, based on a random-key-generator hybridization methodology that exploits the properties of the Discrete Cosine Transform (DCT) to generate an indefinite set of random keys, and that uses the low-frequency-region coefficients after the DCT stage to pass them to
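The idea of harvesting low-frequency DCT coefficients as key material can be sketched as follows. The `dct2` helper, block size, and byte quantization are illustrative assumptions, not the paper's actual key generator, which would feed such material into a proper key-derivation step.

```python
import numpy as np

def dct2(block: np.ndarray) -> np.ndarray:
    """2-D type-II DCT with orthonormal scaling, built from a NumPy basis matrix."""
    n = block.shape[0]
    k = np.arange(n)
    C = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    C *= np.sqrt(2.0 / n)
    C[0, :] = np.sqrt(1.0 / n)
    return C @ block @ C.T

def low_freq_key_material(img: np.ndarray, size: int = 4) -> bytes:
    """Take the top-left (low-frequency) DCT coefficients as raw key material."""
    coeffs = dct2(img.astype(float))
    low = coeffs[:size, :size]
    # crude quantization to bytes; a real design would pass this through a KDF
    return np.abs(low).astype(np.uint64).tobytes()[:16]

rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(8, 8))
key = low_freq_key_material(frame)
print(len(key))  # 16 bytes = 128 bits of raw material
```

The low-frequency corner concentrates most of the image energy, so small content changes perturb exactly the coefficients being sampled.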
The aim of the present study was to distinguish between healthy children and those with epilepsy using electroencephalography (EEG). Two biomarkers, the Hurst exponent (H) and Tsallis entropy (TE), were used to investigate the background EEG activity of 10 healthy children and 10 children with epilepsy. EEG artifacts were removed using a Savitzky-Golay (SG) filter. As hypothesized, there were significant changes in irregularity and complexity in epileptic EEG compared with healthy control subjects using a t-test (p < 0.05). The increases in complexity observed in the H and TE results of epileptic subjects make them suggested EEG biomarkers associated with epilepsy and a reliable tool for the detection and identification of this disorder.
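A minimal sketch of the Tsallis entropy biomarker, assuming an amplitude-histogram estimate with entropic index q = 2; the bin count, signal names, and test signals are our own choices, not the study's settings.

```python
import numpy as np

def tsallis_entropy(signal: np.ndarray, q: float = 2.0, bins: int = 32) -> float:
    """Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1) of the amplitude histogram."""
    hist, _ = np.histogram(signal, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

rng = np.random.default_rng(1)
regular = np.zeros(1000)                 # perfectly regular signal: zero complexity
irregular = rng.standard_normal(1000)    # irregular activity: high complexity
print(tsallis_entropy(regular), tsallis_entropy(irregular))
```

In the limit q → 1, S_q reduces to the ordinary Shannon entropy; q ≠ 1 weights rare versus common amplitude values differently, which is what makes TE sensitive to complexity changes.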
In this paper, deterministic and stochastic models are proposed to study the interaction of the coronavirus (COVID-19) with host cells inside the human body. In the deterministic model, the value of the basic reproduction number R0 determines the persistence or extinction of COVID-19. If R0 < 1, one infected cell will transmit the virus to less than one cell, and as a result the person carrying the coronavirus will get rid of the disease. If R0 > 1, the infected cells will be able to infect all cells that contain ACE receptors. The stochastic model proves that if the noise intensities are sufficiently large, the disease may ultimately go extinct even when R0 > 1; this fact is also confirmed by computer simulation.
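The role of the basic reproduction number can be illustrated with a generic within-host target-cell model integrated by Euler steps. The equations, parameter names, and values below are hypothetical stand-ins for illustration, not the paper's model.

```python
# Generic target-cell-limited model: healthy cells T, infected cells I, virus V.
# All parameters are illustrative, not fitted values from the paper.
def final_virus_load(beta, delta, p, c, lam=10.0, d=0.01, T0=1e4, V0=1.0,
                     dt=0.01, days=60.0):
    T, I, V = T0, 0.0, V0
    for _ in range(int(days / dt)):
        dT = lam - d * T - beta * T * V   # healthy target cells
        dI = beta * T * V - delta * I     # infected cells
        dV = p * I - c * V                # free virus
        T, I, V = T + dt * dT, I + dt * dI, V + dt * dV
    return V

beta, delta, p, c, T0 = 1e-5, 0.5, 1.0, 5.0, 1e4
R0 = beta * p * T0 / (c * delta)   # R0 = 0.04 here: each infected cell infects <1 cell
print(R0, final_virus_load(beta, delta, p, c, T0=T0))
```

With these subcritical parameters the simulated virus load decays toward zero, matching the R0 < 1 extinction case described above.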
In this paper, we illustrate a gamma regression model, assuming that the dependent variable (Y) follows a gamma distribution and that its mean (μ) is related to a linear predictor through the identity link function g(μ) = μ. The model also contains a shape parameter that is not constant but depends on its own linear predictor through a log link function. We estimate the parameters of the gamma regression using two estimation methods, maximum likelihood and Bayesian, and compare the methods using the standard criterion of mean squared error (MSE); the two methods were applied to real data
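A sketch of the maximum-likelihood side of the comparison: a gamma regression with identity link E[y] = b0 + b1·x, holding the shape parameter fixed for brevity (the paper additionally models the shape through a log link). The data, names, and starting values are synthetic.

```python
import numpy as np
from math import lgamma

rng = np.random.default_rng(42)

# Synthetic gamma-regression data with identity link and fixed shape k.
k, b_true = 5.0, np.array([2.0, 1.5])
x = rng.uniform(1.0, 3.0, 400)
y = rng.gamma(shape=k, scale=(b_true[0] + b_true[1] * x) / k)
X = np.column_stack([np.ones_like(x), x])

def nll(b):
    """Negative log-likelihood of Gamma(shape=k, mean=mu) with mu = X @ b."""
    mu = X @ b
    return -np.sum(k * np.log(k / mu) + (k - 1) * np.log(y) - k * y / mu - lgamma(k))

# Plain gradient descent on the negative log-likelihood gives the MLE.
b = np.array([1.0, 1.0])
for _ in range(5000):
    mu = X @ b
    b -= 1e-4 * (X.T @ (k / mu - k * y / mu**2))
print(b)  # roughly recovers [2.0, 1.5]
```

A Bayesian fit would instead place priors on b and summarize the posterior; MSE against held-out data is then the comparison criterion the abstract describes.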
The aim of this paper is to present a new methodology for finding the private key of RSA. A new initial value, generated from a new equation, is selected to speed up the process; once this value is found, a brute-force attack is used to discover the private key. In addition, for the proposed equation, the multiplier of Euler's totient function used to find both the public key and the private key is set to 1, which implies that the equation estimating the new initial value is suitable for this small multiplier. The experimental results show that if all prime factors of the modulus are larger than 3 and the multiplier is 1, the distance between an initial value and the private key
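The brute-force step can be sketched with toy parameters: starting from an initial value, candidates for the private exponent d are tried until e·d ≡ 1 (mod φ(n)). The primes below are tiny purely for illustration; real keys use primes of hundreds of digits, where such a search is feasible only if the initial value lands very close to d.

```python
# Toy RSA setup; p, q, and the offset of the initial value are illustrative.
p, q, e = 1009, 1013, 65537
n, phi = p * q, (p - 1) * (q - 1)
d = pow(e, -1, phi)              # true private exponent: e*d = 1 (mod phi)

def find_d(start: int) -> int:
    """Brute-force upward from an initial value until the key equation holds."""
    cand = start
    while (e * cand) % phi != 1:
        cand += 1
    return cand

initial = d - 50                 # pretend the estimate lands 50 below the key
print(find_d(initial) == d)      # True
```

The closer the estimated initial value is to d, the fewer candidates the loop must test, which is exactly the quantity (the "distance") the paper measures.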
Control charts are one of the technical statistical tools used to control production. A chart always consists of three lines, a central line plus upper and lower control limits, and represents a set of numbers from which one concludes whether the production process is under control or not, depending on the actual observations. Sometimes the calculated control charts are not accurate or conclusive, so fuzzy control charts are used instead of crisp process control charts; this approach is more sensitive, accurate, and economical, helping the decision maker control the operating system at an early stage. In this project, a set of data fr
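For reference, the crisp (non-fuzzy) baseline looks like the x̄ chart below, with a centre line and 3-sigma limits estimated from subgroup means; the fuzzy variant the abstract advocates replaces each observation with a fuzzy number and judges membership degrees instead. The data here are simulated.

```python
import numpy as np

rng = np.random.default_rng(7)

# 25 subgroups of 5 measurements from an in-control process (illustrative data).
samples = rng.normal(loc=50.0, scale=2.0, size=(25, 5))
xbar = samples.mean(axis=1)                       # subgroup means
cl = xbar.mean()                                  # centre line
sigma_xbar = samples.std(ddof=1) / np.sqrt(samples.shape[1])
ucl, lcl = cl + 3 * sigma_xbar, cl - 3 * sigma_xbar

out_of_control = (xbar > ucl) | (xbar < lcl)      # points signalling an alarm
print(cl, lcl, ucl, out_of_control.sum())
```

With in-control data almost no subgroup mean should breach the 3-sigma limits; a fuzzy chart replaces this binary in/out decision with graded degrees of being "in control".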
This research studies the linear regression model when autocorrelation spreads through the random errors that are assumed normally distributed in linear regression analysis of the relationship between variables; through this relationship, the value of one variable can be predicted from the values of the others. Four methods were compared (least squares, the unweighted average method, Theil's method, and Laplace's method) using mean squared error (MSE) and simulation, and the study included four sample sizes (15, 30, 60, 100). The results showed that the least-squares method is best. The four methods were then applied to data on buckwheat production and the cultivated area of the provinces of Iraq for the years (2010), (2011), (2012),
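Two of the compared estimators can be sketched directly: ordinary least squares versus Theil's median-of-pairwise-slopes estimator, here on synthetic data with one gross outlier (not the buckwheat series). This shows the kind of behavior such a comparison probes.

```python
import numpy as np

rng = np.random.default_rng(3)

# Straight line y = 2 + 0.5*x with small noise and one contaminated point.
x = np.arange(30, dtype=float)
y = 2.0 + 0.5 * x + rng.normal(0, 0.2, 30)
y[-1] += 40.0                                       # gross outlier

b_ols = np.polyfit(x, y, 1)[0]                      # least-squares slope

i, j = np.triu_indices(len(x), k=1)                 # all pairs i < j
b_theil = np.median((y[j] - y[i]) / (x[j] - x[i]))  # Theil slope

print(b_ols, b_theil)  # OLS is pulled toward the outlier; Theil stays near 0.5
```

Under clean, normally distributed errors (the setting the abstract reports) least squares is efficient and wins on MSE, while Theil's estimator trades some efficiency for robustness.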
Linear discriminant analysis and logistic regression are among the most widely used multivariate statistical methods for analyzing data with categorical outcome variables. Both are appropriate for developing linear classification models. Linear discriminant analysis requires that the explanatory variables follow a multivariate normal distribution, while logistic regression makes no assumptions about the distribution of the explanatory data. Hence, logistic regression is assumed to be the more flexible and robust method in case of violations of these assumptions.
In this paper we focus on the comparison between three forms for classifying data belonging
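A toy comparison of the two classifiers under LDA's own normality assumption (equal-covariance Gaussian classes); both implementations below are minimal from-scratch sketches, not the paper's models, so the details are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)

# Two equal-covariance Gaussian classes: the setting where LDA is optimal.
n = 200
X0 = rng.normal([0, 0], 1.0, (n, 2))
X1 = rng.normal([2, 2], 1.0, (n, 2))
X = np.vstack([X0, X1])
y = np.repeat([0, 1], n)

# Fisher LDA: w = Sw^-1 (mu1 - mu0), threshold at the midpoint projection.
mu0, mu1 = X0.mean(0), X1.mean(0)
Sw = np.cov(X0.T) + np.cov(X1.T)
w = np.linalg.solve(Sw, mu1 - mu0)
thr = w @ (mu0 + mu1) / 2
pred_lda = (X @ w > thr).astype(int)

# Logistic regression fitted by gradient descent on the cross-entropy loss.
Xb = np.hstack([X, np.ones((2 * n, 1))])
beta = np.zeros(3)
for _ in range(2000):
    prob = 1 / (1 + np.exp(-Xb @ beta))
    beta -= 0.01 * Xb.T @ (prob - y) / len(y)
pred_lr = (Xb @ beta > 0).astype(int)

print((pred_lda == y).mean(), (pred_lr == y).mean())  # both near the ~0.92 Bayes rate
```

When the normality assumption holds, as here, the two methods perform almost identically; logistic regression's advantage appears when that assumption is violated.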
Nowadays, advances in information and communication technologies open the door wide to realizing the digital world's dream. Within the clear scientific scope of all fields, especially medicine, it has become necessary to harness every scientific capability to serve people, particularly in medical-related services. Medical images form the basis of clinical diagnosis and are the source of telehealth and teleconsultation processes. The exchange of these images faces several challenges, such as transmission bandwidth, delivery time, fraud, tampering, modification, and privacy. This paper introduces an algorithm consisting of a combination of compression and encryption techniques to meet such challenges
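A dependency-free sketch of the compress-then-encrypt idea. The SHA-256 XOR keystream below is a stand-in so the example runs with the standard library only; it is not a secure cipher, and the paper's actual techniques may differ. A real system would use an authenticated cipher such as AES-GCM.

```python
import hashlib
import zlib

def keystream(key: bytes, n: int) -> bytes:
    """Counter-mode style keystream from repeated hashing (illustrative only)."""
    out, counter = bytearray(), 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:n])

def protect(data: bytes, key: bytes) -> bytes:
    packed = zlib.compress(data, level=9)             # 1) shrink before encrypting
    ks = keystream(key, len(packed))
    return bytes(a ^ b for a, b in zip(packed, ks))   # 2) encrypt

def recover(blob: bytes, key: bytes) -> bytes:
    ks = keystream(key, len(blob))
    return zlib.decompress(bytes(a ^ b for a, b in zip(blob, ks)))

image = b"\x00" * 4096 + bytes(range(256))            # stand-in for image bytes
blob = protect(image, b"secret-key")
print(len(blob) < len(image), recover(blob, b"secret-key") == image)  # True True
```

Compressing first reduces the bandwidth and delivery-time cost; encrypting second addresses privacy and tampering, which is the order the abstract's pipeline implies.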
The purpose of this research is to find an estimator of the average proportion of defectives based on attribute samples that have been curtailed, either with rejection of a lot on finding the kth defective or with acceptance on finding the kth non-defective.
The MLE (maximum likelihood estimator) is derived, and the ASN (average sample number) in single curtailed sampling is also derived, for which we obtain a simplified formula. All the notation needed is explained.
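The ASN and the proportion estimate can be illustrated by Monte-Carlo simulation of curtailed single sampling: inspection stops with rejection at the k-th defective or with acceptance at the k-th non-defective. The stopping parameters below are illustrative, not the paper's plan, and the pooled ratio shown is the simple likelihood-based estimate, not the paper's derived formula.

```python
import random

random.seed(11)

def curtailed_run(p: float, k_def: int, k_good: int):
    """Inspect items one by one until the k_def-th defective (reject)
    or the k_good-th non-defective (accept); return items inspected and defects."""
    defects = goods = n = 0
    while defects < k_def and goods < k_good:
        n += 1
        if random.random() < p:
            defects += 1
        else:
            goods += 1
    return n, defects

p_true, k_def, k_good, reps = 0.1, 3, 48, 20000
total_n = total_d = 0
for _ in range(reps):
    n, d = curtailed_run(p_true, k_def, k_good)
    total_n += n
    total_d += d

asn = total_n / reps          # average sample number under curtailment
p_hat = total_d / total_n     # pooled proportion estimate (the MLE form D/N)
print(round(asn, 1), round(p_hat, 3))
```

Curtailment keeps the ASN well below the maximum possible sample size (k_def + k_good − 1 = 50 here), which is its economic appeal.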