Data encryption translates data into another form or code so that only people with access to the secret key or a password can read it. Encrypted data are generally referred to as ciphertext, while unencrypted data are known as plaintext. Entropy can be used as a measure of the number of bits needed to code the data of an image: as the pixel values of an image are dispersed over more gray levels, the entropy increases. The aim of this research is to compare the CAST-128 method with a proposed adaptive key against the RSA encryption method for video frames, to determine which method is more accurate and yields the highest entropy. The first method is achieved by applying the "CAST-128" and
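The entropy measure described above can be sketched as follows. This is a minimal illustration of Shannon entropy for an 8-bit grayscale image, not the paper's implementation; the function name and test images are assumptions for the example.

```python
import numpy as np

def shannon_entropy(image: np.ndarray) -> float:
    """Shannon entropy (bits/pixel) of an 8-bit grayscale image."""
    hist = np.bincount(image.ravel(), minlength=256)
    p = hist / hist.sum()
    p = p[p > 0]                      # 0 * log2(0) is taken as 0
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
uniform = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
flat = np.zeros((64, 64), dtype=np.uint8)

# Pixel values spread over all 256 gray levels approach the 8-bit maximum
# of 8 bits/pixel; a constant image has entropy 0.
print(shannon_entropy(uniform))   # close to 8
print(shannon_entropy(flat))      # 0.0
```

A well-encrypted frame should look like the uniform case: its ciphertext entropy should approach 8 bits/pixel regardless of the plaintext image.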
In this paper we introduce a new class of degree of best algebraic approximation polynomials for unbounded functions in the weighted space Lp,α(X), 1 ≤ p < ∞. We shall prove direct and converse theorems for the best algebraic approximation in terms of the modulus of smoothness in weighted space
A non-polynomial spline (NPS) is an approximation method that relies on triangular and polynomial parts; the infinitely differentiable triangular part of the NPS compensates for the loss of smoothness inherited from the polynomial part. In this paper, we propose polynomial-free linear and quadratic spline types to solve fuzzy Volterra integral equations (FVIE) of the 2nd kind with a weakly singular kernel (FVIEWSK) and with Abel's type kernel. The linear-type algorithm gives four parameters to form a linear spline, while the quadratic-type algorithm gives five parameters to create a quadratic spline, which comes closer to the exact solution. These algorithms process kernel singularities with a simple technique
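One standard way to handle the weak singularity of an Abel-type kernel can be sketched numerically. This is a generic illustration, not the paper's algorithm: the substitution s = t - τ² removes the (t - s)^(-1/2) singularity, after which an ordinary quadrature rule applies. The function names are assumptions for the example.

```python
import numpy as np

def trapezoid(y, x):
    """Composite trapezoidal rule on an arbitrary grid."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def abel_integral(u, t, n=1000):
    """Approximate I(t) = ∫_0^t (t - s)^(-1/2) u(s) ds.

    Substituting s = t - tau**2 gives I(t) = 2 ∫_0^sqrt(t) u(t - tau**2) dtau,
    whose integrand is smooth, so plain quadrature converges.
    """
    tau = np.linspace(0.0, np.sqrt(t), n)
    return 2.0 * trapezoid(u(t - tau**2), tau)

# Check against closed forms: for u(s) = 1 the integral is 2*sqrt(t),
# and for u(s) = s it is (4/3) * t**1.5.
print(abel_integral(lambda s: np.ones_like(s), 4.0))   # ≈ 4.0
print(abel_integral(lambda s: s, 4.0))                 # ≈ 32/3
```

The same idea underlies product-integration and spline schemes: integrate the singular factor exactly (or transform it away) and approximate only the smooth part.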
The purpose of this research is a comparison between two types of multivariate GARCH models, BEKK and DVECH, for forecasting financial time series: the daily Iraqi dinar/US dollar exchange rate, the global daily oil price in dollars, and the global daily gold price in dollars, for the period from 01/01/2014 to 01/01/2016. Estimation, testing, and forecasting were carried out with the RATS program. The three time series were transformed into asset returns to obtain stationarity, and several tests were conducted, including Ljung-Box, multivariate Q, and multivariate ARCH, on the return series and residual series for both models, with a comparison between the estimation and for
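The preprocessing step described above (prices to returns, then a Ljung-Box check) can be sketched as follows. This is a generic illustration, not the RATS workflow the paper used; the function names are assumptions, and the Ljung-Box statistic is shown without its chi-square p-value.

```python
import numpy as np

def log_returns(prices):
    """Convert a price series to log returns, r_t = ln(p_t / p_{t-1})."""
    prices = np.asarray(prices, dtype=float)
    return np.diff(np.log(prices))

def ljung_box_q(x, h=10):
    """Ljung-Box Q statistic over the first h autocorrelations:
    Q = n(n+2) * sum_{k=1..h} rho_k**2 / (n - k).
    Under the null of no autocorrelation, Q ~ chi-square(h)."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    denom = np.sum(x**2)
    q = 0.0
    for k in range(1, h + 1):
        rho_k = np.sum(x[k:] * x[:-k]) / denom
        q += rho_k**2 / (n - k)
    return n * (n + 2) * q
```

In practice the same Q statistic is applied both to the raw returns and to the standardized residuals of each fitted GARCH model, as the abstract describes.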
Region-based association analysis has been proposed to capture the collective behavior of sets of variants by testing the association of each set, instead of individual variants, with the disease. Such an analysis typically involves a list of unphased multiple-locus genotypes with potentially sparse frequencies in cases and controls. To tackle the problem of the sparse distribution, a two-stage approach was proposed in the literature: in the first stage, haplotypes are computationally inferred from genotypes, followed by a haplotype coclassification; in the second stage, the association analysis is performed on the inferred haplotype groups. If a haplotype is unevenly distributed between the case and control samples, this haplotype is labeled
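The second-stage test on haplotype groups can be sketched with a chi-square statistic on the case/control contingency table. This is a generic illustration of that kind of test, not the specific statistic the paper evaluates; the function name and counts are hypothetical.

```python
import numpy as np

def chi_square_stat(case_counts, control_counts):
    """Pearson chi-square statistic for a 2 x K table of haplotype-group
    counts in cases (row 0) versus controls (row 1)."""
    obs = np.array([case_counts, control_counts], dtype=float)
    row = obs.sum(axis=1, keepdims=True)
    col = obs.sum(axis=0, keepdims=True)
    expected = row * col / obs.sum()
    return float(((obs - expected) ** 2 / expected).sum())

# Identical distributions give 0; a perfectly uneven split gives a large value.
print(chi_square_stat([10, 10], [10, 10]))   # 0.0
print(chi_square_stat([20, 0], [0, 20]))     # 40.0
```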
In this research, several robust non-parametric methods were used to estimate the semi-parametric regression model, and these methods were then compared using the MSE criterion under different sample sizes, variance levels, and contamination rates, across three different models. The methods are S-LLS (S-estimation with local linear smoothing), M-LLS (M-estimation with local linear smoothing), S-NW (S-estimation with Nadaraya-Watson smoothing), and M-NW (M-estimation with Nadaraya-Watson smoothing).
The results for the first model showed that the (S-LLS) method was the best for large sample sizes, while for small sample sizes the
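The Nadaraya-Watson smoother named above can be sketched in a few lines. This is the classical estimator with a Gaussian kernel, not the robust S/M-estimation variant the paper combines it with; the bandwidth value is an arbitrary assumption.

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_eval, h=0.3):
    """Nadaraya-Watson kernel regression: a locally weighted average
    m(x) = sum_i K((x - x_i)/h) y_i / sum_i K((x - x_i)/h)."""
    d = (x_eval[:, None] - x_train[None, :]) / h
    w = np.exp(-0.5 * d**2)          # Gaussian kernel weights
    return (w @ y_train) / w.sum(axis=1)

x = np.linspace(0.0, 1.0, 20)
y = np.full(20, 2.0)                 # constant signal
print(nadaraya_watson(x, y, np.array([0.3, 0.7])))
```

The robust versions replace this plain weighted average with an S- or M-type loss so that contaminated observations receive less influence.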
This study looks into the many methods used in the risk assessment procedures applied in the construction industry today. As a result of the slow adoption of novel assessment methods, professionals frequently resort to strategies that have previously been validated as successful. When it comes to risk assessment, a precise analytical tool that uses the cost of risk as a measurement and draws on the knowledge of professionals could help bridge the gap between theory and practice. This step will examine the relevant literature, sort articles according to their publication year, and identify domains and qualities. Consequently, the most significant findings have been presented in a manner
Survival analysis is one of the modern methods of analysis; it is based on the fact that the dependent variable represents the time until the event of interest in the study. There are many survival models that deal with the impact of explanatory factors on the likelihood of survival. Among the most important and most common is the model proposed by David Cox, which consists of two parts: a parametric function that does not depend on survival time, and a nonparametric function that does depend on survival times; for this reason the Cox model is classified as a semi-parametric model. The set of parametric models that depend on the parameters of the time-to-event distribution, such as
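The two-part structure described above can be written down directly: the Cox model factors the hazard as h(t | x) = h0(t) · exp(βᵀx), where the baseline hazard h0(t) is the nonparametric time-dependent part and exp(βᵀx) is the parametric part. The sketch below only evaluates the parametric factor; the coefficient values are hypothetical.

```python
import numpy as np

def relative_hazard(beta, x):
    """Parametric part of the Cox model, exp(beta @ x): the hazard of a
    subject with covariates x relative to the (nonparametric) baseline h0(t).
    It does not depend on t, which is the proportional-hazards property."""
    return float(np.exp(np.dot(beta, x)))

beta = np.array([0.5, -0.2])              # hypothetical coefficients
print(relative_hazard(beta, np.array([1.0, 0.0])))   # exp(0.5)
```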
Abstract
Travel time estimation and reliability measurement are important issues for improving the operational efficiency and safety of road traffic networks. The aim of this research is to estimate total travel time and analyze its distribution for three selected links on Palestine Arterial Street in Baghdad city. A higher buffer time index indicates worse reliability conditions. Link (2), from Bab Al Mutham intersection to Al-Sakara intersection, produced a buffer index of about 36%, compared with 26% for Link (1), from Al-Mawall intersection to Bab Al-Mutham intersection, and 24% for Link (3). These results illustrate that reliability is worst for link
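The buffer index reported above can be computed as follows, assuming the common FHWA-style definition BI = (95th-percentile travel time - mean travel time) / mean travel time. This sketch and its sample data are illustrative, not the paper's measurements.

```python
import numpy as np

def buffer_index(travel_times):
    """Buffer time index: the extra travel time (as a fraction of the mean)
    a traveler must budget to arrive on time 95% of the time."""
    tt = np.asarray(travel_times, dtype=float)
    return (np.percentile(tt, 95) - tt.mean()) / tt.mean()

# Perfectly reliable travel gives BI = 0; occasional long trips raise it.
print(buffer_index(np.full(50, 10.0)))      # 0.0
print(buffer_index([10, 10, 10, 10, 20]))   # > 0, reliability is worse
```

A link with BI = 0.36 therefore requires budgeting roughly 36% extra time over the average trip to be reliably on time.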