A new approach for baud time (or baud rate) estimation of a random binary signal is presented. The approach exploits the spectrum of the signal after nonlinear processing, so that the estimation error can be reduced simply by increasing the number of processed samples rather than the sampling rate. The spectrum of the processed signal is shown to give an accurate estimate of the baud time when no a priori information or restrictive assumptions are available. The performance of the estimator for random binary square waves perturbed by white Gaussian noise and ISI is evaluated and compared with that of the conventional zero-crossing-detector estimator.
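A minimal sketch of this kind of estimator (not the paper's exact algorithm): the binary wave is differentiated and rectified as the nonlinear step, and the lowest strong spectral line of the processed signal is read off as the baud rate. The rates, noise level, and search band below are illustrative assumptions; note that the frequency resolution improves with the number of processed samples rather than with the sampling rate, in line with the abstract's claim.

```python
import numpy as np

# Illustrative parameters (not taken from the paper).
fs = 960_000                     # sampling rate [Hz]
true_baud = 9_600                # true symbol rate [symbols/s]
sps = fs // true_baud            # samples per symbol
rng = np.random.default_rng(0)

bits = rng.integers(0, 2, 2_000) * 2 - 1            # random +/-1 symbols
x = np.repeat(bits, sps).astype(float)
x += 0.3 * rng.standard_normal(x.size)              # additive white Gaussian noise

# Nonlinear processing: differentiate and rectify.  The random symbol
# transitions become a pulse train whose mean is periodic in the baud time,
# so the spectrum of y contains discrete lines at multiples of the baud rate.
y = np.abs(np.diff(x))

Y = np.abs(np.fft.rfft(y - y.mean()))
freqs = np.fft.rfftfreq(y.size, d=1.0 / fs)

band = freqs > 1_000                                 # coarse search band (assumed)
strong = band & (Y > 0.5 * Y[band].max())            # spectral lines well above the floor
baud_hat = freqs[strong].min()                       # lowest strong line = fundamental
print(f"estimated baud rate: {baud_hat:.1f} symbols/s (true {true_baud})")
```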
Cloth simulation and animation has been a topic of research in computer graphics since the mid-1980s. Enforcing incompressibility is very important in real-time simulation. Although great progress has been made in this regard, it still suffers from unnecessary time consumption in certain steps that are common in real-time applications. This research develops a real-time cloth simulator for a virtual human character (VHC) with wearable clothing. It achieves cloth simulation on the VHC by enhancing the position-based dynamics (PBD) framework, computing a series of positional constraints that enforce constant densities. Also, the self-collision and collision wit...
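For context, a minimal sketch of the constraint-projection step at the heart of PBD, using plain pairwise distance constraints rather than the density constraints described above; the time step, iteration count, and masses are illustrative assumptions.

```python
import numpy as np

def pbd_step(x, v, inv_mass, constraints, dt=1 / 60, iters=10, gravity=(0.0, -9.81, 0.0)):
    """One PBD step. x: (N,3) positions, v: (N,3) velocities,
    inv_mass: (N,), constraints: list of (i, j, rest_length)."""
    x = np.asarray(x, float)
    v = np.asarray(v, float)
    v = v + dt * np.asarray(gravity)              # external forces (explicit)
    p = x + dt * v                                # predicted positions

    for _ in range(iters):                        # Gauss-Seidel constraint projection
        for i, j, rest in constraints:
            d = p[i] - p[j]
            dist = np.linalg.norm(d)
            w = inv_mass[i] + inv_mass[j]
            if dist < 1e-9 or w == 0.0:
                continue
            corr = (dist - rest) / (dist * w) * d  # restore the rest length
            p[i] -= inv_mass[i] * corr
            p[j] += inv_mass[j] * corr

    v = (p - x) / dt                               # velocity update from corrected positions
    return p, v
```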
As the smallest functional parts of the muscle, motor units (MUs) are considered the basic building blocks of the neuromuscular system. Monitoring MU recruitment, de-recruitment, and firing rate (by either invasive or surface techniques) leads to an understanding of motor control strategies and of their pathological alterations. EMG signal decomposition is the process of identifying and classifying individual motor unit action potentials (MUAPs) in the interference pattern detected with either intramuscular or surface electrodes. Signal processing techniques are used in EMG signal decomposition to address fundamental and physiological questions. Many techniques have been developed to decompose intramuscularly detected...
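A generic sketch of the two basic decomposition stages named above, detection of candidate MUAPs followed by their classification, applied to a synthetic trace; the simulated signal, window length, detection threshold, and number of motor units are all illustrative assumptions, not the methods reviewed in the paper.

```python
import numpy as np
from scipy.signal import find_peaks
from scipy.cluster.vq import kmeans2

fs = 10_000                                    # sampling rate [Hz]
rng = np.random.default_rng(6)
t = np.arange(5 * fs)
emg = 0.05 * rng.standard_normal(t.size)       # background activity / noise
templates = [np.hanning(30) * a for a in (1.0, -0.8)]   # two fake MUAP shapes
for tpl in templates:
    for k in rng.choice(t.size - 40, 150, replace=False):
        emg[k:k + 30] += tpl                   # superimpose firings

# Stage 1: detect candidate MUAPs by amplitude thresholding.
peaks, _ = find_peaks(np.abs(emg), height=0.4, distance=20)
windows = np.array([emg[p - 15:p + 15] for p in peaks if 15 <= p < t.size - 15])

# Stage 2: classify the detected waveforms into motor-unit classes.
_, labels = kmeans2(windows, k=2, minit="++", seed=0)
print("detected spikes:", len(windows), " class sizes:", np.bincount(labels))
```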
This paper designs a fault-tolerance system for soft real-time distributed systems (FTRTDS). The system is designed to be independent of the specific mechanisms and facilities of the underlying real-time distributed system. It is distributed across all the computers in the distributed system and controlled by a central unit.
Besides gathering information about a target program spontaneously, it provides information about the target operating system and the target hardware in order to diagnose a fault before it occurs, so the situation can be handled before it arises. It also provides the distributed system with the reactive capability of reconfiguring and reinitializing itself after a failure occurs.
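A purely illustrative sketch of the reactive idea described above: a central monitor tracks heartbeats from the nodes of a distributed system and triggers a reconfiguration handler when one stops responding. The node names, timeout, and handler are hypothetical and not the paper's design; real fault diagnosis would also collect program, OS, and hardware information as the abstract states.

```python
import time

class Monitor:
    def __init__(self, nodes, timeout=2.0):
        self.timeout = timeout
        self.last_seen = {n: time.monotonic() for n in nodes}

    def heartbeat(self, node):
        self.last_seen[node] = time.monotonic()   # called by each node periodically

    def check(self):
        now = time.monotonic()
        failed = [n for n, t in self.last_seen.items() if now - t > self.timeout]
        for node in failed:
            self.reconfigure(node)

    def reconfigure(self, node):
        # Placeholder for reinitialising/redistributing the failed node's work.
        print(f"node {node} missed its heartbeat; reconfiguring around it")
        del self.last_seen[node]

mon = Monitor(["A", "B", "C"], timeout=0.5)
mon.heartbeat("A")
mon.heartbeat("B")
time.sleep(0.6)
mon.heartbeat("A")          # B and C never report back within the timeout
mon.check()
```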
In this study, we briefly review the ARIMA(p, d, q), EWMA, and DLM (dynamic linear modelling) procedures in order to accommodate the autocorrelation structure of the data. We consider recursive estimation and prediction algorithms based on Bayes and Kalman filtering (KF) techniques for correlated observations. We investigate the effect on the MSE of these procedures and compare them using generated data.
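As an illustration of the recursive KF estimation for a DLM, the following sketch filters a simple local-level model; the variances, initial values, and the simulated series are assumptions for the example and are not taken from the study.

```python
import numpy as np

# Local-level DLM:  y_t = mu_t + v_t,   mu_t = mu_{t-1} + w_t.
def kalman_local_level(y, obs_var=1.0, state_var=0.1, m0=0.0, c0=1e6):
    m, c = m0, c0                 # posterior mean / variance of the state
    means, forecasts = [], []
    for yt in y:
        a, r = m, c + state_var   # prior (prediction) for mu_t
        f, q = a, r + obs_var     # one-step-ahead forecast of y_t
        k = r / q                 # Kalman gain
        m = a + k * (yt - f)      # posterior update with the forecast error
        c = (1 - k) * r
        forecasts.append(f)
        means.append(m)
    return np.array(means), np.array(forecasts)

# Example on autocorrelated data generated from the same model.
rng = np.random.default_rng(1)
mu = np.cumsum(rng.normal(0, 0.3, 200))
y = mu + rng.normal(0, 1.0, 200)
m, f = kalman_local_level(y, obs_var=1.0, state_var=0.09)
print("one-step forecast MSE:", np.mean((y[1:] - f[1:]) ** 2))
```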
In this research we study the wavelet characteristics of the important time series known as the sunspot series, with the aim of verifying the periodogram that other researchers have obtained by spectral transforms, and of observing the variation in the period length on one hand and its shifting on the other.
A continuous wavelet analysis is performed for this series and its periodogram is identified first. For more accuracy, the series is partitioned into its approximate and detail components down to five levels, these components are filtered using a fixed threshold in one case and an independent threshold in the other, and the noise series, which represents the difference between...
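A sketch of this kind of decomposition and thresholding using PyWavelets; the wavelet family (db4), the soft-threshold rule, and the hypothetical input file name are illustrative choices, not necessarily those used in the study.

```python
import numpy as np
import pywt

x = np.loadtxt("sunspots.txt")              # hypothetical input file

coeffs = pywt.wavedec(x, "db4", level=5)    # approximation + 5 detail levels

# Fixed (universal) threshold applied to every detail level.
sigma = np.median(np.abs(coeffs[-1])) / 0.6745
thr = sigma * np.sqrt(2 * np.log(len(x)))
den_fixed = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]

# Independent thresholds, one estimated per detail level.
den_indep = [coeffs[0]]
for c in coeffs[1:]:
    s = np.median(np.abs(c)) / 0.6745
    den_indep.append(pywt.threshold(c, s * np.sqrt(2 * np.log(len(c))), mode="soft"))

smooth = pywt.waverec(den_fixed, "db4")[: len(x)]
noise = x - smooth                           # the extracted noise series
```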
This research studies dual data models with mixed random parameters, which contain two types of parameters, the first random and the other fixed. The random parameter arises from differences in the marginal slopes of the cross sections, and the fixed parameter arises from differences in the fixed intercepts, with the random errors of each cross section exhibiting heteroscedasticity as well as first-order serial correlation. The main objective of this research is to use efficient methods suited to paired data with small samples; to achieve this goal, the feasible generalized least squares...
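A much-simplified sketch of feasible generalized least squares for a single series with AR(1) errors, as a stand-in for the panel estimators discussed above; the design matrix, the assumed error structure, and the Prais-Winsten transform are illustrative choices.

```python
import numpy as np

def fgls_ar1(X, y):
    # Step 1: OLS to get residuals and an estimate of the serial correlation.
    beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ beta_ols
    rho = (e[1:] @ e[:-1]) / (e[:-1] @ e[:-1])

    # Step 2: Prais-Winsten transform, then re-estimate by OLS on the
    # transformed data (this is the feasible GLS step).
    n = len(y)
    ys = np.empty(n)
    Xs = np.empty_like(X, dtype=float)
    ys[0] = np.sqrt(1 - rho**2) * y[0]
    Xs[0] = np.sqrt(1 - rho**2) * X[0]
    ys[1:] = y[1:] - rho * y[:-1]
    Xs[1:] = X[1:] - rho * X[:-1]
    beta_fgls, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
    return beta_fgls, rho
```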
In this paper, estimates are obtained for the parameters and the reliability function of the transmuted power function (TPF) distribution using several estimation methods, including a proposed new technique alongside the White, percentile, least squares, weighted least squares, and modified moment methods. A simulation was used to generate random data following the TPF distribution in three experiments (E1, E2, E3) with different true parameter values, sample sizes (n = 10, 25, 50, and 100), N = 1000 replications, and selected reliability times t. Comparisons between the resulting estimators were made using the mean squared error (MSE). The results showed that...
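A sketch of one such simulation experiment, assuming the usual quadratic transmutation of a power-function baseline on (0, 1), i.e. F(x) = (1 + lam) * x**a - lam * x**(2a); the parameter values, sample size, and the simple empirical-reliability estimator used here are illustrative stand-ins for the paper's estimators.

```python
import numpy as np

def rtpf(n, a, lam, rng):
    """Inverse-transform sampling from the assumed TPF distribution."""
    u = rng.uniform(size=n)
    if lam == 0:
        g = u
    else:
        g = ((1 + lam) - np.sqrt((1 + lam) ** 2 - 4 * lam * u)) / (2 * lam)
    return g ** (1.0 / a)                       # invert the baseline CDF x**a

def reliability(t, a, lam):
    F = (1 + lam) * t**a - lam * t ** (2 * a)
    return 1.0 - F

# Monte Carlo MSE of a (placeholder) estimator of R(t).
a_true, lam_true, t0, n, N = 2.0, 0.5, 0.4, 50, 1000
rng = np.random.default_rng(2)
r_true = reliability(t0, a_true, lam_true)
est = np.array([np.mean(rtpf(n, a_true, lam_true, rng) > t0) for _ in range(N)])
print("empirical-reliability MSE:", np.mean((est - r_true) ** 2))
```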
This paper is concerned with estimating the unknown parameters of the generalized Rayleigh distribution based on singly Type-I censored samples. The probability density function of the generalized Rayleigh distribution is defined along with its properties. The maximum likelihood method is used to derive point estimates of all unknown parameters through an iterative procedure (the Newton-Raphson method), and confidence interval estimates are then derived based on the Fisher information matrix. Finally, whether the current model (GRD) fits a set of real data is tested, and the survival and hazard functions are computed for these real data.
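A sketch of the censored maximum-likelihood fit, assuming the common parameterization F(x) = (1 - exp(-(lam*x)**2))**alpha; the paper iterates Newton-Raphson on the score equations, whereas this sketch simply hands the censored log-likelihood to a quasi-Newton optimizer, and the simulated data and censoring time are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglik(theta, x_obs, n_cens, T):
    a, lam = np.exp(theta)                       # keep both parameters positive
    z = (lam * x_obs) ** 2
    logf = (np.log(2 * a) + 2 * np.log(lam) + np.log(x_obs) - z
            + (a - 1) * np.log1p(-np.exp(-z)))
    logS_T = np.log1p(-(1 - np.exp(-(lam * T) ** 2)) ** a)
    return -(logf.sum() + n_cens * logS_T)

rng = np.random.default_rng(3)
a0, lam0, T, n = 1.5, 2.0, 0.8, 100
u = rng.uniform(size=n)
x = np.sqrt(-np.log(1 - u ** (1 / a0))) / lam0   # inverse-CDF sampling
x_obs = x[x <= T]                                # observed (uncensored) lifetimes
res = minimize(neg_loglik, x0=np.log([1.0, 1.0]),
               args=(x_obs, n - len(x_obs), T))
a_hat, lam_hat = np.exp(res.x)
print(f"alpha_hat={a_hat:.3f}, lambda_hat={lam_hat:.3f}")
```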
In this paper, Bayes estimators for the shape and scale parameters of the Weibull distribution have been obtained using the generalized weighted loss function, based on exponential priors. Lindley's approximation has been used effectively in the Bayesian estimation. Based on a Monte Carlo simulation, these estimators are compared in terms of their mean squared errors (MSEs).
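A simplified Monte Carlo comparison in the same spirit: the shape is treated as known, the prior on the Weibull rate is exponential, and squared-error loss is used, so the Bayes estimator has a closed conjugate form and no Lindley approximation is needed; all numbers are illustrative and this is not the paper's estimator.

```python
import numpy as np

# Weibull density f(x) = shape * lam * x**(shape-1) * exp(-lam * x**shape);
# with the shape known and an Exponential(b) prior on lam, the posterior is
# Gamma(n + 1, s + b), where s = sum(x**shape).
rng = np.random.default_rng(4)
shape, lam_true, b, n, N = 1.5, 2.0, 1.0, 30, 5_000

mse_mle, mse_bayes = 0.0, 0.0
for _ in range(N):
    x = rng.weibull(shape, n) / lam_true ** (1 / shape)   # Weibull(shape, rate=lam_true)
    s = np.sum(x ** shape)                                 # sufficient statistic
    lam_mle = n / s
    lam_bayes = (n + 1) / (s + b)                          # posterior mean
    mse_mle += (lam_mle - lam_true) ** 2 / N
    mse_bayes += (lam_bayes - lam_true) ** 2 / N

print(f"MSE(MLE)={mse_mle:.4f}  MSE(Bayes)={mse_bayes:.4f}")
```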
In this study, we compare the autoregressive approximation methods (Yule-Walker equations, least squares, forward-backward least squares, and Burg's geometric and harmonic methods) to determine the optimal approximation to time series generated from a non-invertible first-order moving-average process and from a fractionally integrated noise process, with several values of d (d = 0.15, 0.25, 0.35, 0.45) and different sample sizes (small, medium, large) for the two processes. We rely on the figure-of-merit function proposed by Shibata (1980) to determine the theoretically optimal order according to the minimum...
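A sketch of the Yule-Walker branch of such a comparison, with the AR order chosen by minimizing AIC as a simple stand-in for Shibata's figure of merit; the non-invertible MA(1) series and the range of candidate orders are illustrative.

```python
import numpy as np

def yule_walker(x, p):
    """Fit AR(p) coefficients from the sample autocovariances."""
    x = x - x.mean()
    n = len(x)
    r = np.array([x[: n - k] @ x[k:] / n for k in range(p + 1)])
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
    phi = np.linalg.solve(R, r[1:])                # AR coefficients
    sigma2 = r[0] - phi @ r[1:]                    # innovation variance
    return phi, sigma2

rng = np.random.default_rng(5)
e = rng.standard_normal(501)
x = e[1:] - e[:-1]                                 # non-invertible MA(1) (theta = 1)

orders = list(range(1, 21))
aic = [len(x) * np.log(yule_walker(x, p)[1]) + 2 * p for p in orders]
p_best = orders[int(np.argmin(aic))]
print("selected AR order:", p_best)
```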