Encryption translates data into another form or representation so that only people with access to the secret key or password can read it. Encrypted data are generally referred to as ciphertext, while unencrypted data are known as plaintext. Entropy can be used as a measure of the number of bits needed to code the data of an image; as the pixel values of an image are dispersed over more gray levels, the entropy increases. The aim of this research is to compare the CAST-128 method with a proposed adaptive key against the RSA encryption method on video frames, to determine which method yields the higher entropy. The first method applies CAST-128 and the second applies RSA. CAST-128 uses a pair of subkeys per round: a 5-bit quantity serves as the rotation key of the round, and a 32-bit quantity serves as the masking key of the round. The proposed adaptive 128-bit key is extracted from the main diagonal of each frame before encryption. RSA is a public-key cryptographic technique, also known as asymmetric cryptography; the asymmetry of the key rests on the difficulty of factoring the product of two large prime numbers. A comparison was applied to several videos, and the results showed that the CAST-128 method achieved the highest degree of entropy even when the frames contained a lot of distorted data or unclear image pixels. For example, the entropy value of a sample from a girl video is 2581.921 with CAST-128, while it is 2271.329 with RSA; likewise, the entropy value of a sample from a scooter video is 2569.814 with CAST-128, while it is 2282.844 with RSA.
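The entropy measure described above is conventionally computed as Shannon entropy over the gray-level histogram of a frame. The following is a minimal sketch (not the paper's code; the per-video totals quoted above come from the paper's own measure), treating a frame as a flat list of 8-bit pixel values:

```python
import math
import random
from collections import Counter

def shannon_entropy(pixels):
    """Shannon entropy in bits per pixel over the gray-level histogram."""
    counts = Counter(pixels)
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random.seed(0)
noise = [random.randrange(256) for _ in range(4096)]  # cipher-like frame
flat = [128] * 4096                                   # constant frame

# A well-encrypted 8-bit frame approaches the maximum of 8 bits/pixel,
# while a constant frame has entropy 0.
```

On this scale a good cipher pushes the histogram toward uniformity, which is why higher entropy is taken as evidence of stronger encryption.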
Classifying overlapping objects is one of the main challenges faced by researchers working in object detection and recognition. Most of the available algorithms that have been developed can only classify or recognize objects that are individually separated from each other, or a single object in a scene, but not overlapping kitchen-utensil objects. In this project, the Faster R-CNN and YOLOv5 algorithms were proposed to detect and classify overlapping objects in a kitchen area. YOLOv5 and Faster R-CNN were applied to overlapping objects, where the filters (kernels) in the dedicated layers of the applied models are expected to separate the overlapping objects. A kitchen utensil benchmark image database and
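Both detector families score candidate boxes by their mutual overlap; a standard ingredient (not specific to this project) is the intersection-over-union measure used in non-maximum suppression, which quantifies how strongly two detections overlap. A minimal sketch, with boxes given as (x1, y1, x2, y2):

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

# Two 2x2 squares offset by one unit share 2 of their 6 units of union.
half_overlap = iou((0, 0, 2, 2), (1, 0, 3, 2))  # 1/3
```

Overlapping utensils produce detections with high mutual IoU, which is exactly the regime where naive suppression merges or drops objects.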
This paper deals with defining the Burr-XII distribution and how to obtain its p.d.f. and CDF, since this distribution is a failure distribution compounded from two failure models, the Gamma model and the Weibull model. Some equipment may have many important parts, and the probability distributions representing them may be of different types, so Burr, through its different compound formulas, was found to be the best model to study, and its parameters were estimated to compute the mean time to failure. Burr-XII is considered here rather than other models because it is used to model a wide variety of phenomena, including crop prices, household income, option market price distributions, risk, and travel time. It has two shape parameters.
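In the usual two-shape-parameter form (a sketch of the standard textbook formulas, not the paper's compound derivation), the CDF is F(x) = 1 - (1 + x^c)^(-k) and the p.d.f. is its derivative:

```python
def burr12_cdf(x, c, k):
    """CDF of Burr-XII: F(x) = 1 - (1 + x**c)**(-k), for x >= 0."""
    return 1.0 - (1.0 + x ** c) ** (-k)

def burr12_pdf(x, c, k):
    """p.d.f.: f(x) = c*k * x**(c-1) * (1 + x**c)**(-(k+1)), for x > 0."""
    return c * k * x ** (c - 1) * (1.0 + x ** c) ** (-(k + 1))

# Example: with c = 2, k = 3, F(1) = 1 - 2**-3 = 0.875.
```

Here c and k are the two shape parameters mentioned above; the pdf should agree with the numerical derivative of the CDF at any interior point.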
Quantum channels enable communication tasks inaccessible to their classical counterparts. The most famous example is the distribution of secret keys. Unfortunately, the rate of secret-key generation by direct transmission is fundamentally limited by distance. This limit can be overcome by implementing a quantum repeater. In order to boost the performance of the repeater, a quantum repeater based on a cut-off with two different types of quantum memories is suggested, which reduces the effect of decoherence during the storage of a quantum state.
A condensed study was conducted to compare ordinary estimators, in particular the maximum likelihood estimator and the robust estimator, for the parameters of the mixed model of order one, namely the ARMA(1,1) model.
A simulation study was carried out for several variants of the model using small, moderate, and large sample sizes, and some new results were obtained. MAPE (mean absolute percentage error) was used as the statistical criterion for comparison.
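As a sketch of such a simulation setup (illustrative only; parameter values are assumptions, and the actual ML and robust fitting procedures are not reproduced here), an ARMA(1,1) series can be generated recursively and estimator quality scored with MAPE:

```python
import random

def simulate_arma11(phi, theta, n, seed=0):
    """Generate n observations from x_t = phi*x_{t-1} + e_t + theta*e_{t-1},
    with e_t standard Gaussian white noise."""
    rng = random.Random(seed)
    x, x_prev, e_prev = [], 0.0, 0.0
    for _ in range(n):
        e = rng.gauss(0.0, 1.0)
        x_t = phi * x_prev + e + theta * e_prev
        x.append(x_t)
        x_prev, e_prev = x_t, e
    return x

def mape(true_params, estimates):
    """Mean absolute percentage error between true and estimated parameters."""
    return 100.0 * sum(abs((t - h) / t)
                       for t, h in zip(true_params, estimates)) / len(true_params)

series = simulate_arma11(phi=0.5, theta=0.3, n=200)  # a "moderate" sample
```

Repeating the generation many times per sample size and averaging MAPE over the replications gives the comparison criterion described above.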
Diabetes mellitus caused by insulin resistance is promoted by obesity. The neuropeptide nesfatin-1 has been identified in several organs, including the central nervous system and pancreatic islet cells. The nesfatin-1 peptide appears to be involved in hypothalamic circuits that regulate energy homeostasis and control food intake. Adiponectin is a plasma collagen-like protein produced by adipocytes that has been linked to the development of insulin resistance (IR), type 2 diabetes mellitus (DMT2), and cardiovascular disease (CVD). Resistin was first identified as an adipose tissue-specific hormone linked to obesity and diabetes. The aim of this study was to estimate the relationship between human serum nesfatin-1, adiponectin
The Weibull distribution is considered one of the most widely applied distributions in real life. It is similar to the normal distribution in its range of applications, and it can be applied in many fields, such as industrial engineering to represent replacement and manufacturing times, weather forecasting, and other scientific uses in reliability studies and survival analysis in the medical and communication engineering fields.
In this paper, the scale parameter of the Weibull distribution has been estimated using the Bayesian method based on Jeffreys prior information as a first method, then enhanced by improving the Jeffreys prior information and used as a second method
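For the case of a known shape β and the parametrization f(x) = (β/θ) x^(β-1) e^(-x^β/θ), the Jeffreys prior p(θ) ∝ 1/θ yields an inverse-gamma posterior for θ with mean Σ x_i^β / (n - 1). The following is a sketch under exactly those assumptions (the paper's improved-Jeffreys variant is not reproduced):

```python
import math
import random

def weibull_scale_bayes(data, beta):
    """Posterior-mean (squared-error loss) estimate of the Weibull scale theta
    under the Jeffreys prior p(theta) ~ 1/theta, with known shape beta and
    parametrization f(x) = (beta/theta) * x**(beta-1) * exp(-x**beta/theta).
    The posterior is inverse-gamma(n, T), T = sum(x_i**beta), mean T/(n-1)."""
    n = len(data)
    T = sum(x ** beta for x in data)
    return T / (n - 1)

def sample_weibull(theta, beta, n, seed=0):
    """Inverse-transform sampling: x = (-theta * ln U)**(1/beta), U in (0, 1]."""
    rng = random.Random(seed)
    return [(-theta * math.log(1.0 - rng.random())) ** (1.0 / beta)
            for _ in range(n)]
```

With a large simulated sample the posterior mean concentrates near the true scale, which is the behavior the paper's simulation comparison exploits.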
In this research, the semiparametric Bayesian method is compared with the classical method for estimating the reliability function of three systems: a k-out-of-n system, a series system, and a parallel system. Each system consists of three components: the first is the parametric component, whose failure times are exponentially distributed, whereas the second and third are nonparametric components, whose reliability estimates depend on the kernel method, with two methods used to estimate the bandwidth parameter h, and on the Kaplan-Meier method. To indicate the better method for estimating the system reliability function, it has be
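Given component reliabilities at a fixed mission time (however they are estimated), the three system structures combine them in the standard way; series is the all-must-work case and parallel the any-one-suffices case of k-out-of-n. A sketch by direct state enumeration, with component values purely illustrative:

```python
from itertools import product

def k_out_of_n_reliability(k, rels):
    """Probability that at least k of the independent components work,
    by enumerating all 2^n working/failed states."""
    total = 0.0
    for states in product((0, 1), repeat=len(rels)):
        if sum(states) >= k:
            p = 1.0
            for up, r in zip(states, rels):
                p *= r if up else (1.0 - r)
            total += p
    return total

rels = [0.9, 0.8, 0.7]                        # illustrative reliabilities
series_r = k_out_of_n_reliability(3, rels)    # all three must work
parallel_r = k_out_of_n_reliability(1, rels)  # any one suffices
two_of_three = k_out_of_n_reliability(2, rels)
```

Enumeration is exponential in the number of components, which is harmless for the three-component systems considered here.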
The question of estimation has attracted great interest in engineering and statistical applications and in various applied and human sciences; the methods it provides have helped to identify many random processes accurately.
In this paper, methods were used to estimate the reliability function, the hazard (risk) function, and the distribution parameters, namely the Moment Method and the Maximum Likelihood Method. An experimental study was conducted using simulation for the purpose of comparing the methods, to show which of them is the most competent in practical application, based on observations generated from the Rayleigh logarithmic distribution (RL) with sample sizes
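For the plain Rayleigh distribution (shown here as a baseline sketch; the Rayleigh-logarithmic estimators derived in the paper are not reproduced), both methods give closed forms: the moment estimator inverts E[X] = σ√(π/2), and the ML estimator is σ̂² = Σx²/(2n):

```python
import math
import random

def rayleigh_mle(data):
    """Maximum likelihood estimate of the Rayleigh scale sigma:
    sigma_hat**2 = sum(x_i**2) / (2n)."""
    return math.sqrt(sum(x * x for x in data) / (2 * len(data)))

def rayleigh_moment(data):
    """Moment estimate, inverting E[X] = sigma * sqrt(pi/2)."""
    return (sum(data) / len(data)) * math.sqrt(2.0 / math.pi)

def sample_rayleigh(sigma, n, seed=0):
    """Inverse-transform sampling: x = sigma * sqrt(-2 ln U), U in (0, 1]."""
    rng = random.Random(seed)
    return [sigma * math.sqrt(-2.0 * math.log(1.0 - rng.random()))
            for _ in range(n)]
```

Running both estimators over repeated simulated samples of each size, and comparing their errors, mirrors the experimental design described above.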
The problem of the study and its significance:
Due to the continually increasing pressures of life, the constant pursuit of material necessities, and the frustrations that confront us daily, a growing number of cases of organic disease with psychological roots have emerged, caused by the severity of these stresses and by a lack of response to conventional (drug) treatments. This creates in patients a number of emotional disorders resulting from worry about the risk of the disease.
This has interested psychologists and doctors searching
In 2020, one of the researchers in this paper, in his first research, derived the Modified Weighted Pareto Distribution of Type I using the Azzalini method for weighted distributions; it contains three parameters, two for scale and the third for shape. This research compared that distribution with two other distributions from the same family, the Standard Pareto Distribution of Type I and the Generalized Pareto Distribution, using the maximum likelihood estimator derived by the researchers for the Modified Weighted Pareto Distribution of Type I. The Monte Carlo method, one of the simulation techniques for generating random sample data, was then used with different sample sizes (n = 10, 30, 50) and in different
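For the Standard Pareto Type I member of the family (a sketch of the textbook result; the estimator the researchers derived for the modified weighted version is not reproduced here), the MLE has a familiar closed form:

```python
import math
import random

def pareto1_mle(data):
    """MLE for Standard Pareto Type I, f(x) = a * xm**a / x**(a+1), x >= xm:
    scale xm_hat = min(x_i), shape a_hat = n / sum(ln(x_i / xm_hat))."""
    xm = min(data)
    alpha = len(data) / sum(math.log(x / xm) for x in data)
    return xm, alpha

def sample_pareto1(xm, alpha, n, seed=0):
    """Inverse-transform sampling: x = xm * U**(-1/alpha), U in (0, 1]."""
    rng = random.Random(seed)
    return [xm * (1.0 - rng.random()) ** (-1.0 / alpha) for _ in range(n)]
```

Generating repeated samples at the paper's sizes (n = 10, 30, 50) and comparing fitted against true parameters is the Monte Carlo comparison described above.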