In this study, we briefly review the ARIMA(p, d, q), EWMA, and DLM (dynamic linear modelling) procedures in order to accommodate the autocorrelation structure of the data. We consider recursive estimation and prediction algorithms based on Bayesian and Kalman filtering (KF) techniques for correlated observations. We investigate the effect on the MSE of these procedures and compare them using generated data.
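As a rough illustration of the kind of comparison described above, the sketch below simulates AR(1) data and compares one-step-ahead prediction MSEs for an ARIMA(1,0,0) fit, an EWMA recursion, and a Kalman filter for a local-level model; the AR coefficient, smoothing constant, and noise variances are illustrative assumptions, not the study's settings.

```python
# Sketch: compare one-step-ahead prediction MSE of ARIMA, EWMA, and a
# Kalman filter (local-level model) on simulated AR(1) data. All tuning
# values below are illustrative assumptions, not the study's settings.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
n, phi = 300, 0.7
y = np.zeros(n)
for t in range(1, n):                      # simulate an AR(1) process
    y[t] = phi * y[t - 1] + rng.normal()

# ARIMA(1,0,0): in-sample one-step-ahead predictions
arima_pred = ARIMA(y, order=(1, 0, 0)).fit().predict()

# EWMA: recursive exponential smoothing; forecast = previous smoothed value
lam, s = 0.3, y[0]
ewma_pred = np.empty(n)
for t in range(n):
    ewma_pred[t] = s                       # forecast before seeing y[t]
    s = lam * y[t] + (1 - lam) * s

# Kalman filter for a local-level model (state = level, observed with noise)
q, r = 0.1, 1.0                            # assumed state/observation variances
m, p = 0.0, 1.0
kf_pred = np.empty(n)
for t in range(n):
    p += q                                 # time update
    kf_pred[t] = m                         # one-step-ahead prediction
    k = p / (p + r)                        # Kalman gain
    m += k * (y[t] - m)                    # measurement update
    p *= (1 - k)

for name, pred in [("ARIMA", arima_pred), ("EWMA", ewma_pred), ("KF", kf_pred)]:
    print(name, "MSE:", np.mean((y - pred) ** 2))
```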
Data hiding is the process of encoding extra information in an image by making small modifications to its pixels. To be practical, the hidden data must be perceptually invisible yet robust to common signal processing operations. This paper introduces a scheme for hiding a signature image that can be as large as 25% of the host image data and can therefore be used both for digital watermarking and for image/data hiding. The proposed algorithm applies an orthogonal discrete wavelet transform with two zero moments and improved time localization, called the discrete slantlet transform, to both the host and the signature image. A scaling factor in the frequency domain controls the quality of the watermarked images. Experimental results of the signature image …
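PyWavelets does not provide a slantlet transform, so the hedged sketch below uses a standard two-dimensional DWT ("db2", which also has two vanishing moments) as a stand-in to show the additive transform-domain embedding, with `alpha` playing the role of the scaling factor mentioned above.

```python
# Sketch of additive transform-domain embedding. A 2-D 'db2' DWT stands in
# for the slantlet transform, which PyWavelets does not ship; alpha is the
# frequency-domain scaling factor that controls watermark quality.
import numpy as np
import pywt

def embed(host, signature, alpha=0.1):
    ch = pywt.dwt2(host, "db2")            # transform the host image
    cs = pywt.dwt2(signature, "db2")       # transform the signature image
    ca = ch[0] + alpha * cs[0]             # add scaled approximation coefficients
    return pywt.idwt2((ca, ch[1]), "db2")  # invert to get the watermarked image

host = np.random.rand(128, 128)            # placeholder images
signature = np.random.rand(128, 128)
watermarked = embed(host, signature, alpha=0.05)
print("max distortion:", np.abs(watermarked - host).max())
```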
Data compression offers an attractive approach to reducing communication costs by using the available bandwidth effectively, so it makes sense to pursue research on algorithms that can use the available network most effectively. It is also important to consider security, since the data being transmitted are vulnerable to attacks. The basic aim of this work is to develop a module that combines compression and encryption on the same set of data, performing the two operations simultaneously. This is achieved by embedding encryption into the compression algorithm, since cryptographic ciphers and entropy coders bear a certain resemblance in the sense of secrecy. First, in the secure compression module, the given text is p…
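The abstract does not show the module itself; as a minimal stand-in, the sketch below chains zlib entropy coding with a SHA-256-based keystream XOR cipher (an illustrative cipher, not the paper's construction, which embeds encryption inside the coder) to show compression and encryption applied to the same data.

```python
# Sketch: compress-then-encrypt pipeline. The paper embeds encryption inside
# the entropy coder itself; this stand-in chains zlib with a SHA-256-based
# keystream XOR cipher (illustrative only) to show the combined operation.
import hashlib
import zlib

def keystream(key: bytes, n: int) -> bytes:
    out, counter = b"", 0
    while len(out) < n:                    # expand the key into a pseudo-random stream
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def secure_compress(text: bytes, key: bytes) -> bytes:
    packed = zlib.compress(text)           # entropy-coding step
    ks = keystream(key, len(packed))
    return bytes(a ^ b for a, b in zip(packed, ks))  # encryption step

def secure_decompress(blob: bytes, key: bytes) -> bytes:
    ks = keystream(key, len(blob))
    return zlib.decompress(bytes(a ^ b for a, b in zip(blob, ks)))

msg = b"the quick brown fox jumps over the lazy dog" * 10
blob = secure_compress(msg, b"secret-key")
assert secure_decompress(blob, b"secret-key") == msg
```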
In this paper, an integrated quantum neural network (QNN), which is a class of feedforward neural networks (FFNNs), is constructed by merging quantum computing (QC) with an artificial neural network (ANN) classifier. It is applied to data classification, with the iris flower data used as the classification signals. Independent component analysis (ICA) is used as a feature-extraction technique after normalization of these signals. The architecture of QNNs has inherently built-in fuzziness: the hidden units of these networks develop quantized representations of the sample information provided by the training data set at various graded levels of certainty. Experimental results presented here show that …
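A classical approximation of this pipeline can be sketched with scikit-learn, using an MLP as a stand-in for the QNN (whose multilevel quantum hidden units are not available there); the layer sizes are illustrative.

```python
# Sketch of the classification pipeline: normalize the iris signals, extract
# features with ICA, then classify with a neural network. A classical MLP
# stands in for the quantum neural network.
from sklearn.datasets import load_iris
from sklearn.decomposition import FastICA
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)
X = StandardScaler().fit_transform(X)                          # normalization
X = FastICA(n_components=3, random_state=0).fit_transform(X)   # ICA features

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```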
Abstract:
This research aims to compare the Bayesian method and full maximum likelihood for estimating a hierarchical Poisson regression model.
The comparison was carried out by simulation using different sample sizes (n = 30, 60, 120) and different numbers of replications (r = 1000, 5000), with the mean square error adopted to compare the estimation methods and choose the best way to estimate the model. It was concluded that the hierarchical Poisson regression model estimated by full maximum likelihood with sample size (n = 30) is the best at representing the maternal mortality data after the value of the param…
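The full hierarchical Bayesian fit is beyond a short sketch, but the shape of such a simulation study can be illustrated as follows, fitting an ordinary Poisson regression by maximum likelihood over r replications and accumulating the MSE of the coefficient estimates; the true coefficients, design, and settings are assumptions for illustration.

```python
# Sketch of the simulation design: generate Poisson regression data, fit by
# maximum likelihood, and accumulate the MSE of the coefficient estimates.
# A plain Poisson GLM stands in for the hierarchical model; beta_true, n,
# and r are illustrative, not the study's settings.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
beta_true = np.array([0.5, 0.3])
n, r = 30, 1000                            # sample size and replications

sq_err = np.zeros_like(beta_true)
for _ in range(r):
    X = sm.add_constant(rng.normal(size=n))        # intercept + one covariate
    y = rng.poisson(np.exp(X @ beta_true))         # Poisson responses
    fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
    sq_err += (fit.params - beta_true) ** 2

print("MSE per coefficient:", sq_err / r)
```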
This research aims to measure the technical efficiency of the branches of the General Company for Land Transport, which are scattered geographically across the country, using the data envelopment analysis (DEA) technique. DEA measures the efficiency of a set of asymmetric decision-making units and is a nonparametric mathematical method related to linear programming. This helps the General Company for Land Transport diagnose the performance of its branches by benchmarking them against each other and determining the performance gap. The research found that the level of efficiency varies across the company's branches.
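A minimal input-oriented CCR DEA model can be sketched with scipy.optimize.linprog; the branch input/output figures below are placeholders, not the company's data.

```python
# Sketch: input-oriented CCR DEA via linear programming. For each branch k,
# minimize theta subject to X @ lam <= theta * x_k and Y @ lam >= y_k.
# The input/output matrices below are placeholder data, not company figures.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 4.0, 5.0]])       # 1 input  x 4 branches
Y = np.array([[1.0, 2.0, 3.0, 3.0]])       # 1 output x 4 branches
m, n = X.shape
s = Y.shape[0]

for k in range(n):
    # decision vector z = (theta, lam_1..lam_n); minimize theta
    c = np.r_[1.0, np.zeros(n)]
    A_in = np.c_[-X[:, k], X]              # inputs:  X @ lam - theta * x_k <= 0
    A_out = np.c_[np.zeros(s), -Y]         # outputs: -Y @ lam <= -y_k
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[:, k]],
                  bounds=[(None, None)] + [(0, None)] * n)
    print(f"branch {k}: efficiency = {res.x[0]:.3f}")
```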
The problem of multicollinearity is one of the most common problems and concerns, to a large extent, the internal correlation between explanatory variables. It appears especially in economics and applied research. Multicollinearity has negative effects on the regression model, such as inflated variances and unstable parameter estimates when the ordinary least squares (OLS) method is used. Therefore, other methods were used to estimate the parameters of the negative binomial model, including the ridge regression estimator and the Liu-type estimator. The negative binomial regression model is a nonlinear …
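The ridge and Liu-type adjustments are easy to state; the hedged sketch below shows them in the linear-model setting for brevity (the negative binomial versions apply the same shrinkage to the iteratively reweighted design matrix), with k and d as illustrative tuning values.

```python
# Sketch: ridge and Liu-type shrinkage of the OLS estimator on collinear data.
# Shown for a linear model for brevity; k and d are illustrative tuning values.
import numpy as np

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)        # nearly collinear covariate
X = np.c_[x1, x2]
y = X @ np.array([1.0, 1.0]) + rng.normal(size=n)

G = X.T @ X
b_ols = np.linalg.solve(G, X.T @ y)                     # unstable under collinearity
k, d = 0.5, 0.5
b_ridge = np.linalg.solve(G + k * np.eye(2), X.T @ y)   # ridge estimator
b_liu = np.linalg.solve(G + np.eye(2), (G + d * np.eye(2)) @ b_ols)  # Liu estimator
print("OLS:  ", b_ols)
print("Ridge:", b_ridge)
print("Liu:  ", b_liu)
```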
Aims: This study aims to compare patients' complaints and problems with wearing complete dentures.
Methodology: The sample included 40 Iraqi patients who had been wearing complete dentures for about five years. They were selected randomly, with an age range of 55–65 years. The questions asked of the patients were listed according to the recent classification of post-insertion problems.
Result: The results showed that the percentage of patients complaining of adaptation problems (62.1%) was higher than that of looseness problems (61.3%) and discomfort problems (39.3%), in that order.
Recommendation: Dentists need a thorough knowledge of anatomy, physiology, pathology, and psychology. The assessment of the psyche and emotions …
The logistic regression model is an important statistical model that describes the relationship between a binary response variable and explanatory variables. The large number of explanatory variables usually used to explain the response gives rise to multicollinearity among them, which makes the parameter estimates of the model inaccurate.
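A common diagnose-and-stabilize sketch, with variance inflation factors to detect the collinearity and an L2-penalized logistic fit to stabilize the estimates (the simulated data are illustrative):

```python
# Sketch: detect multicollinearity among explanatory variables with VIFs,
# then stabilize the logistic regression with an L2 (ridge) penalty.
# The simulated data below are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + 0.05 * rng.normal(size=n)        # strongly correlated with x1
X = np.c_[x1, x2]
p = 1 / (1 + np.exp(-(x1 + x2)))
y = rng.binomial(1, p)

for j in range(X.shape[1]):                # a large VIF flags collinearity
    print(f"VIF x{j + 1}:", variance_inflation_factor(X, j))

clf = LogisticRegression(penalty="l2", C=1.0).fit(X, y)  # ridge-stabilized fit
print("coefficients:", clf.coef_)
```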
Two simple spectrophotometric methods are suggested for the determination of cefixime (CFX) in pure form and in pharmaceutical preparations. The first method, without cloud point extraction (CPE), is based on diazotization of the cefixime drug with sodium nitrite at 5 °C, followed by coupling with ortho-nitrophenol in basic medium to form an orange colour. The product was stabilized and measured at 400 nm. Beer's law was obeyed in the concentration range 10–160 μg·mL⁻¹, Sandell's sensitivity was 0.0888 μg·cm⁻², the detection limit was 0.07896 μg·mL⁻¹, and the limit of quantitation was 0.085389 μg·mL⁻¹. The second method was cloud point extraction (CPE) using Triton X-114 as the surfactant. Beer…
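For context, calibration against Beer's law reduces to a straight-line fit of absorbance on concentration; the sketch below uses made-up absorbance readings over the stated 10–160 μg·mL⁻¹ range, not the paper's measurements, and applies the usual 3.3σ/slope and 10σ/slope limits.

```python
# Sketch: Beer's-law calibration line over the stated 10-160 ug/mL range and
# the usual sigma-over-slope detection and quantitation limits. The absorbance
# readings are made-up numbers for illustration, not the paper's measurements.
import numpy as np

conc = np.array([10, 20, 40, 80, 120, 160], dtype=float)   # ug/mL
absorbance = 0.004 * conc + np.array([0.001, -0.002, 0.002, 0.001, -0.001, 0.0])

slope, intercept = np.polyfit(conc, absorbance, 1)         # A = slope*C + b
resid = absorbance - (slope * conc + intercept)
sigma = resid.std(ddof=2)                                  # residual std dev

print("slope:", slope, "intercept:", intercept)
print("LOD (3.3*sigma/slope):", 3.3 * sigma / slope, "ug/mL")
print("LOQ (10*sigma/slope): ", 10 * sigma / slope, "ug/mL")
```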
The distribution of the intensity of comet ISON C/2013 is studied by taking its histogram. This distribution reveals four distinct regions related to the background, tail, coma, and nucleus. A one-dimensional temperature-distribution fit is achieved using two mathematical equations related to the coordinates of the centre of the comet. The quiver plot of the gradient of the comet shows very clearly that the arrows point towards the maximum intensity of the comet.
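The analysis steps can be sketched on a synthetic comet-like image (a Gaussian blob standing in for the real frame, with illustrative thresholds): histogram the intensities, split them into the four regions, and quiver-plot the intensity gradient.

```python
# Sketch of the analysis steps on a synthetic comet-like image: histogram the
# intensity, split it into four regions by thresholds, and quiver-plot the
# intensity gradient. A Gaussian blob stands in for the real comet frame, and
# the thresholds are illustrative.
import matplotlib.pyplot as plt
import numpy as np

yy, xx = np.mgrid[0:128, 0:128]
img = np.exp(-((xx - 64) ** 2 + (yy - 64) ** 2) / 300.0)   # synthetic "comet"

hist, edges = np.histogram(img, bins=64)                   # intensity histogram
print("modal intensity bin:", edges[hist.argmax()])

labels = np.digitize(img, [0.05, 0.3, 0.8])                # bg / tail / coma / nucleus
print("pixels per region:", np.bincount(labels.ravel()))

gy, gx = np.gradient(img)                                  # intensity gradient
step = 8                                                   # thin arrows for clarity
plt.quiver(xx[::step, ::step], yy[::step, ::step],
           gx[::step, ::step], gy[::step, ::step])
plt.title("gradient field points toward the intensity maximum")
plt.show()
```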