A new approach for baud time (or baud rate) estimation of a random binary signal is presented. This approach exploits the spectrum of the signal after nonlinear processing, so that the estimation error can be reduced simply by increasing the number of processed samples rather than the sampling rate. The spectrum of the processed signal is shown to give an accurate estimate of the baud time when there is no a priori information and no restricting assumptions. The performance of the estimator for random binary square waves perturbed by white Gaussian noise and intersymbol interference (ISI) is evaluated and compared with that of the conventional zero-crossing detector estimator.
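The abstract does not name the nonlinearity, so the following is only a minimal sketch of the general idea, assuming a differentiate-and-square nonlinearity (a common choice) and entirely hypothetical signal parameters: the nonlinear processing turns symbol transitions into an impulse train whose spectrum contains a line at the baud rate, and the FFT frequency resolution, hence the estimation accuracy, improves with the number of processed samples.

import numpy as np

fs = 1_000_000.0    # sampling rate in Hz (hypothetical)
baud = 9600.0       # true baud rate, used only to synthesize a test signal
n_sym = 4000
sps = fs / baud     # samples per symbol (need not be an integer)

bits = 2 * np.random.randint(0, 2, n_sym) - 1   # random NRZ symbol sequence
n = int(n_sym * sps)
x = bits[np.minimum((np.arange(n) / sps).astype(int), n_sym - 1)].astype(float)
x += 0.1 * np.random.randn(n)                   # additive white Gaussian noise

# Nonlinear processing: differentiate and square, turning symbol transitions
# into an impulse train with spectral lines at multiples of the baud rate.
# Mild smoothing makes the fundamental line dominate its harmonics.
y = np.convolve(np.diff(x) ** 2, np.ones(32), mode="same")

Y = np.abs(np.fft.rfft(y - y.mean()))
f = np.fft.rfftfreq(y.size, d=1.0 / fs)
print("estimated baud rate: %.1f Hz" % f[1 + np.argmax(Y[1:])])  # skip DC bin

Because the spectral line sits on a lattice of FFT bins spaced fs/N apart, accumulating more samples N narrows the bins and reduces the estimation error without touching the sampling rate, which is the property the abstract highlights.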
Survival analysis is widely applied to data described by the length of time until the occurrence of an event of interest, such as death or other important events. The purpose of this paper is to use dynamic methodology, which provides a flexible framework especially for the analysis of discrete survival time, to estimate the effect of covariates over time in the survival analysis of dialysis patients with kidney failure until death occurs. The estimation process is based entirely on the Bayesian approach, using two estimation methods: maximum a posteriori (MAP) estimation involving iteratively weighted Kalman filter smoothing (IWKFS), in combination with the expectation maximization (EM) algorithm; while the other …
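The model details are cut off above, but dynamic discrete-time survival models of this kind are typically written in state-space form; in standard notation (assumed here, not taken from the paper):

\lambda_{it} = P(T_i = t \mid T_i \ge t,\, x_{it}) = h\!\left(x_{it}^{\top} \beta_t\right), \qquad \beta_t = \beta_{t-1} + \varepsilon_t, \quad \varepsilon_t \sim N(0, Q),

with h(\cdot) a response function such as the logistic h(\eta) = 1/(1 + e^{-\eta}). The MAP estimate of the coefficient path \{\beta_t\} is the posterior mode, computed by iteratively weighted Kalman filtering and smoothing, while the evolution covariance Q is updated by the EM algorithm.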
Agile methodologies are adopted extensively across the software industry because they are flexible in nature and can accommodate required changes in any phase of development. Authentic estimation of software products is not an easy task, as it requires the continuous attention of the product owner. Effort and cost must be estimated properly to ensure the success of the project. In this article, we consider Scrum-based Agile projects that are developed in several Sprints. We propose an extension to an existing algorithm, based on a total of 36 success factors, that estimates the development cost and effort required to complete the project. For estimation and computation, we have taken a dataset of 12 project…
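The 36 success factors and the base algorithm are not reproduced in the truncated abstract; the sketch below only illustrates, with entirely hypothetical factor scores, weights, and rates, the general shape of a success-factor-adjusted Sprint effort and cost estimate.

def adjusted_effort(base_story_points, factor_scores, factor_weights,
                    velocity_points_per_sprint, cost_per_hour,
                    hours_per_point=6.0):
    """Scale a raw story-point estimate by weighted success-factor scores
    (all names, weights, and rates here are hypothetical placeholders)."""
    score = (sum(s * w for s, w in zip(factor_scores, factor_weights))
             / sum(factor_weights))                # normalized to (0, 1]
    points = base_story_points / max(score, 1e-6)  # weaker factors -> more effort
    hours = points * hours_per_point
    sprints = points / velocity_points_per_sprint
    return hours, sprints, hours * cost_per_hour

hours, sprints, cost = adjusted_effort(
    base_story_points=120,
    factor_scores=[0.8, 0.6, 0.9],   # stand-ins for the paper's 36 factors
    factor_weights=[1.0, 2.0, 1.5],
    velocity_points_per_sprint=30, cost_per_hour=40.0)
print("%.0f h over %.1f sprints, cost $%.0f" % (hours, sprints, cost))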
Survival analysis is one of the modern methods of analysis, based on the fact that the dependent variable represents the time until the event of interest in the study. There are many survival models that deal with the impact of explanatory factors on the likelihood of survival, among them the model proposed by David Cox, one of the most important and most common survival models. It consists of two functions: a parametric function that does not depend on survival time, and a nonparametric function that depends on the survival times, which is why the Cox model is classed as a semi-parametric model. The set of parametric models, which depend on the parameters of the time-to-event distribution such as …
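For concreteness, the Cox model described above has the standard semi-parametric form (standard notation, not taken from the truncated abstract):

h(t \mid x) = h_0(t)\, \exp(x^{\top} \beta),

where the baseline hazard h_0(t) is the nonparametric part (it depends only on time) and \exp(x^{\top}\beta) is the parametric part (it does not). The coefficients \beta are estimated from the partial likelihood

L(\beta) = \prod_{i:\, \delta_i = 1} \frac{\exp(x_i^{\top}\beta)}{\sum_{j \in R(t_i)} \exp(x_j^{\top}\beta)},

where \delta_i indicates an observed (uncensored) event and R(t_i) is the risk set at time t_i.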
Upper limb amputation exerts a significant burden on the amputee, limiting the ability to perform everyday activities and degrading quality of life. Amputees' quality of life can be improved if they have natural control over their prosthetic hands. Among the biological signals most commonly used to predict upper limb motor intentions, surface electromyography (sEMG) and axial acceleration sensor signals are essential components of shoulder-level upper limb prosthetic hand control systems. In this work, a pattern recognition system is proposed for classifying seven types of shoulder girdle motions for high-level upper limb prostheses. Thus, combining seven feature groups, …
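The seven feature groups and the classifier are cut off in the truncated abstract; the Python sketch below only illustrates a typical sEMG pattern-recognition pipeline, using classic time-domain features and an LDA classifier, with synthetic windows and placeholder labels standing in for real recordings.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def td_features(window, eps=1e-3):
    """Classic time-domain sEMG features for one analysis window."""
    d = np.diff(window)
    mav = np.mean(np.abs(window))   # mean absolute value
    wl = np.sum(np.abs(d))          # waveform length
    zc = np.sum((window[:-1] * window[1:] < 0) &
                (np.abs(d) > eps))  # zero crossings with a dead band
    ssc = np.sum((d[:-1] * d[1:] < 0) &
                 ((np.abs(d[:-1]) > eps) | (np.abs(d[1:]) > eps)))  # slope sign changes
    return np.array([mav, wl, zc, ssc])

# Placeholder data: 7 shoulder-girdle motion classes, 10 windows each
rng = np.random.default_rng(0)
X = np.vstack([td_features(rng.standard_normal(200)) for _ in range(70)])
y = np.repeat(np.arange(7), 10)
clf = LinearDiscriminantAnalysis().fit(X, y)
print(clf.score(X, y))  # meaningless on random data; real sEMG windows go here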
In this paper, a suggested formula, as well as a conventional method, for estimating the two parameters (shape and scale) of the Generalized Rayleigh distribution is proposed. A percentile estimator was used for different sample sizes (small, medium, and large) and under several assumed contrasts of the two parameters. Mean square error (MSE) was implemented as a performance indicator, and comparisons of performance between the suggested formula and the studied formula were carried out through data analysis and computer simulation according to this indicator. It was observed from the results that the suggested method, performed here for the first time (as far as we know), had a clear advantage over t…
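A sketch of percentile estimation for this distribution, using the standard Generalized Rayleigh CDF and plotting positions (standard notation; not necessarily the paper's suggested formula):

F(x; \alpha, \lambda) = \left(1 - e^{-(\lambda x)^2}\right)^{\alpha}, \quad x > 0,

so that F^{-1}(p) = \frac{1}{\lambda} \sqrt{-\ln\!\left(1 - p^{1/\alpha}\right)}, and the percentile estimators are

(\hat{\alpha}, \hat{\lambda}) = \arg\min_{\alpha, \lambda} \sum_{i=1}^{n} \left[ x_{(i)} - F^{-1}(p_i; \alpha, \lambda) \right]^2, \quad p_i = \frac{i}{n+1},

with x_{(1)} \le \dots \le x_{(n)} the ordered sample.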
The aim of this paper is to design an artificial neural network (ANN) as an alternative, accurate tool to estimate the concentration of cadmium in contaminated soils at any depth and time. First, fifty soil samples were collected from a phytoremediated contaminated site located in Qanat Aljaeesh in Baghdad, Iraq. Second, a series of measurements was performed on the soil samples. The inputs are the soil depth, the time, and the soil parameters, while the output is the concentration of Cd in the soil at depth x and time t. Third, an ANN was designed, its performance was evaluated using a test data set, and it was then applied to estimate the concentration of cadmium. The performance of the ANN technique was compared with traditional laboratory inspection…
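A minimal sketch of such an ANN estimator, using scikit-learn's MLPRegressor on synthetic stand-in data (the real inputs would be the measured depth, time, and soil parameters of the fifty samples; pH is used here only as a placeholder soil parameter, and the network size is a guess, not the paper's architecture):

import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Columns: depth (cm), time (days), pH (placeholder soil parameter);
# target: Cd concentration. Real values would come from the 50 field samples.
rng = np.random.default_rng(1)
X = rng.uniform([0, 0, 5.5], [100, 365, 8.5], size=(50, 3))
y = 10 * np.exp(-X[:, 1] / 200) + 0.05 * X[:, 0] + rng.normal(0, 0.5, 50)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000,
                                   random_state=0))
model.fit(X, y)
print(model.predict([[40.0, 120.0, 7.0]]))  # Cd estimate at depth x, time t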
In this paper, a suggested method, as well as the conventional probability plot (P.P.) method, for estimating the two parameters (shape and scale) of the Weibull distribution is proposed, and the estimators were implemented for different sample sizes (small, medium, and large: 20, 50, and 100, respectively) using a simulation technique. Comparisons were carried out between the different methods and sample sizes. It was observed from the results, using the MSE indicator, that the suggested method, performed here for the first time (as far as we know), and the studied method give extremely close asymptotic (MSE) results when random errors are generated.
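The suggested method is not spelled out in the abstract, but the conventional probability-plot estimator it is compared against can be sketched: linearize the Weibull CDF as ln(-ln(1 - F(x))) = \beta \ln x - \beta \ln \eta, regress against plotting positions, and score the estimator by Monte-Carlo MSE at the three sample sizes. A Python sketch with hypothetical true parameter values:

import numpy as np

def weibull_pp_estimate(sample):
    """Probability-plot (least-squares) estimate of Weibull shape and scale."""
    x = np.sort(sample)
    n = x.size
    p = (np.arange(1, n + 1) - 0.5) / n        # plotting positions
    yv = np.log(-np.log(1.0 - p))              # linearized CDF
    shape, intercept = np.polyfit(np.log(x), yv, 1)
    scale = np.exp(-intercept / shape)         # intercept = -shape * ln(scale)
    return shape, scale

# Monte-Carlo MSE for the three sample sizes studied in the paper
rng = np.random.default_rng(0)
true_shape, true_scale = 2.0, 1.5              # hypothetical true values
for n in (20, 50, 100):
    est = np.array([weibull_pp_estimate(true_scale * rng.weibull(true_shape, n))
                    for _ in range(2000)])
    mse = np.mean((est - [true_shape, true_scale]) ** 2, axis=0)
    print("n=%3d  MSE(shape)=%.4f  MSE(scale)=%.4f" % (n, mse[0], mse[1]))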
Image compression has become one of the most important applications of the image processing field because of the rapid growth in computer power, the corresponding growth in the multimedia market, and the advent of the World Wide Web, which makes the internet easily accessible to everyone. Since the early 1980s, digital image sequence processing has been an attractive research area, because an image sequence, as a collection of images, may allow much more compression than a single image frame, while the increased computational complexity and memory space required for image sequence processing have in fact become more attainable. This research uses the Absolute Moment Block Truncation compression technique, which depends on adopting the good points of oth…
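The combined technique the paper proposes is cut off above, but plain Absolute Moment Block Truncation Coding (AMBTC) itself is standard: each block is reduced to a bitmap plus the mean of the pixels above the block mean and the mean of those below it. A minimal Python sketch of encoding and decoding one block:

import numpy as np

def ambtc_encode(block):
    """AMBTC: encode a block as (low mean, high mean, bitmap)."""
    m = block.mean()
    bitmap = block >= m
    hi = block[bitmap].mean() if bitmap.any() else m
    lo = block[~bitmap].mean() if (~bitmap).any() else m
    return lo, hi, bitmap

def ambtc_decode(lo, hi, bitmap):
    return np.where(bitmap, hi, lo)

# Example: compress one 4x4 block of an 8-bit image
rng = np.random.default_rng(0)
block = rng.integers(0, 256, (4, 4)).astype(float)
lo, hi, bm = ambtc_encode(block)
print(np.abs(block - ambtc_decode(lo, hi, bm)).mean())  # mean absolute error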
Each phenomenon involves several variables. By studying these variables, we find a mathematical formula for the joint distribution, and the copula is a useful tool for finding the amount of correlation; here the survival function was used to measure the relationship of age to the level of creatinine remaining in a person's blood. The SPSS program was also used to extract the influential variables from a group of variables using factor analysis, and then the Clayton copula function, which is used to construct shared bivariate distributions from multivariate distributions, was applied: the bivariate distribution was calculated, and then the survival function value was calculated for a sample of size 50 drawn from Yarmouk Ho…
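The Clayton copula used here has a standard closed form (standard notation, not taken from the truncated abstract):

C_{\theta}(u, v) = \left(u^{-\theta} + v^{-\theta} - 1\right)^{-1/\theta}, \quad \theta > 0,

so that a bivariate survival function can be built from the marginal survival functions as S(t_1, t_2) = C_{\theta}\!\left(S_1(t_1), S_2(t_2)\right), with the strength of dependence summarized by Kendall's \tau = \theta / (\theta + 2).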
In this research, a nonparametric technique is presented to estimate the time-varying coefficient functions for balanced longitudinal data, which are characterized by observations obtained from (n) independent subjects, each of which is measured repeatedly at a set of (m) specific time points. Although the measurements are independent across different subjects, they are mostly correlated within each subject; the applied technique is the local linear kernel (LLPK) technique. To avoid the problems of dimensionality and heavy computation, a two-step method has been used to estimate the coefficient functions using the former technique. Since the two-…
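In standard notation for varying-coefficient longitudinal models (assumed here, not taken from the truncated abstract), the LLPK estimator solves a kernel-weighted local least-squares problem at each time point t:

y_{ij} = x_{ij}^{\top} \beta(t_{ij}) + \varepsilon_{ij}, \quad i = 1, \dots, n, \; j = 1, \dots, m,

(\hat{a}, \hat{b}) = \arg\min_{a, b} \sum_{i=1}^{n} \sum_{j=1}^{m} \left[ y_{ij} - x_{ij}^{\top}\left(a + b\,(t_{ij} - t)\right) \right]^2 K_h(t_{ij} - t), \qquad \hat{\beta}(t) = \hat{a},

where K_h(u) = K(u/h)/h is a kernel with bandwidth h; fitting a local line rather than a local constant reduces boundary bias at the ends of the observation period.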