Many dynamic processes across the sciences are described by differential-equation models, which explain how the behavior of the process under study changes over time by linking the process to its derivatives. These models often contain constant and time-varying parameters that vary according to the nature of the process under study. We estimate the constant and time-varying parameters sequentially, in several stages. In the first stage, the state variables and their derivatives are estimated by penalized splines (P-splines). In the second stage, pseudo least squares is used to estimate the constant parameters. In the third stage, the remaining constant parameters and the time-varying parameters are estimated using a semi-parametric regression model. This method is then compared with methods based on numerical discretization, which consist of two stages: in the first stage, the state variables and their derivatives are estimated by P-splines; in the second stage, numerical discretization methods (the Euler discretization method and the trapezoidal discretization method) are applied. The comparison was carried out by simulation, and the results favored the trapezoidal discretization method, which gave the best estimates and the best balance between estimation accuracy and computational cost.
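The gap between the two discretization schemes can be seen on a toy problem that is purely illustrative and not the estimator used in the study: recovering a constant rate parameter theta in dx/dt = -theta*x by least squares from sampled data. The model, sample times, and parameter value below are hypothetical.

```python
import math

# Hypothetical example: estimate the constant rate theta in dx/dt = -theta*x
# from exact samples, using the Euler and trapezoidal discretizations.
theta_true = 0.5
t = [0.1 * i for i in range(51)]              # observation times
x = [math.exp(-theta_true * ti) for ti in t]  # exact trajectory

def estimate_euler(t, x):
    # Euler: (x[i+1] - x[i]) / h ~ -theta * x[i]  ->  least-squares slope
    num = den = 0.0
    for i in range(len(t) - 1):
        h = t[i + 1] - t[i]
        dx = (x[i + 1] - x[i]) / h
        num += -dx * x[i]
        den += x[i] ** 2
    return num / den

def estimate_trapezoid(t, x):
    # Trapezoidal: (x[i+1] - x[i]) / h ~ -theta * (x[i] + x[i+1]) / 2
    num = den = 0.0
    for i in range(len(t) - 1):
        h = t[i + 1] - t[i]
        dx = (x[i + 1] - x[i]) / h
        xm = 0.5 * (x[i] + x[i + 1])
        num += -dx * xm
        den += xm ** 2
    return num / den

e_euler = abs(estimate_euler(t, x) - theta_true)
e_trap = abs(estimate_trapezoid(t, x) - theta_true)
print(e_trap < e_euler)  # the trapezoidal scheme is second-order accurate
```

On exact data the Euler estimate carries an O(h) bias while the trapezoidal estimate carries only an O(h^2) bias, which mirrors the accuracy/cost trade-off reported in the abstract.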
Big data analysis has important applications in many areas, such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization of the data for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms such a
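As a minimal, hypothetical sketch of the entropy-discretization idea (not the multi-resolution algorithm developed in this work), the following chooses a single cut point that minimizes the weighted class entropy of the two resulting intervals; the data are invented for illustration:

```python
import math

def entropy(labels):
    # Shannon entropy (in bits) of a list of class labels.
    n = len(labels)
    ent = 0.0
    for c in set(labels):
        p = labels.count(c) / n
        ent -= p * math.log2(p)
    return ent

def best_entropy_cut(values, labels):
    # Try every boundary between sorted values and keep the cut
    # with the lowest weighted entropy of the two sides.
    pairs = sorted(zip(values, labels))
    best_cut, best_score = None, float("inf")
    for i in range(1, len(pairs)):
        left = [lab for _, lab in pairs[:i]]
        right = [lab for _, lab in pairs[i:]]
        score = (len(left) * entropy(left) + len(right) * entropy(right)) / len(pairs)
        if score < best_score:
            best_cut = (pairs[i - 1][0] + pairs[i][0]) / 2
            best_score = score
    return best_cut

values = [1.0, 1.2, 1.4, 5.0, 5.2, 5.4]
labels = ["a", "a", "a", "b", "b", "b"]
print(best_entropy_cut(values, labels))  # cut near 3.2 separates the two classes
```

This exhaustive scan is O(n^2); summarization structures such as the one in this research exist precisely to avoid rescanning the raw data at every candidate cut.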
Planning for the formulation of administrative policies, together with guidance through leadership, is important for managing administrative processes and sporting activities, as both contribute to the stability and development of administrative conditions in the sport federations, whether those federations are concerned with team or individual Olympic games. The two researchers observed variation in correct application, particularly in the formulation of administrative policies and leadership as modern management standards for both team and individual Olympic games in the Iraqi National Olympic Committee. This has led to misconception and a lack of clarity among some administrators of those uni
In this study, we briefly review the ARIMA(p, d, q), EWMA, and DLM (dynamic linear modelling) procedures in order to accommodate the autocorrelation structure of the data. We consider recursive estimation and prediction algorithms based on Bayes and Kalman filtering (KF) techniques for correlated observations. We investigate the effect on the MSE of these procedures and compare them using generated data.
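The recursive KF update for the simplest DLM, the local-level model, can be sketched as follows; this is an illustrative scalar filter only, and the noise variances q and r and the observations below are hypothetical:

```python
# Scalar Kalman filter for a local-level DLM:
#   state:       theta_t = theta_{t-1} + w_t,  w_t ~ N(0, q)
#   observation: y_t     = theta_t + v_t,      v_t ~ N(0, r)
def kalman_filter(ys, q, r, m0=0.0, c0=1.0):
    m, c = m0, c0          # prior mean and variance of the state
    filtered = []
    for y in ys:
        c_pred = c + q             # predict: state variance grows by q
        k = c_pred / (c_pred + r)  # Kalman gain
        m = m + k * (y - m)        # update mean toward the observation
        c = (1 - k) * c_pred       # update variance
        filtered.append(m)
    return filtered

ys = [1.1, 0.9, 1.05, 0.95, 1.0]
print(kalman_filter(ys, q=0.01, r=0.1)[-1])  # settles close to 1.0
```

The same predict/update recursion, written in matrix form, underlies the DLM estimation and prediction algorithms compared in the study.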
Time series analysis is an important statistical method adopted in the analysis of phenomena, practices, and events in all areas over specific time periods, and forecasting future values contributes a rough estimate of the status of the system under study. The study therefore adopted ARIMA models to forecast the volume of cargo handled in four ports (Umm Qasr Port, Khor Al Zubair Port, Abu Flus Port, and Maqal Port). Monthly data on the volume of cargo handled for the years 2006-2018 were collected (156 observations). The study found that the most efficient model is ARIMA(1,1,1).
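Once an ARIMA(1,1,1) model has been fitted, forecasting reduces to a simple recursion. The sketch below is generic and illustrative; the coefficients and series are hypothetical, not the fitted port-cargo model:

```python
def arima111_forecast(y, phi, theta, horizon):
    # Forecast recursion for a fitted ARIMA(1,1,1):
    #   y_t = y_{t-1} + phi*(y_{t-1} - y_{t-2}) + e_t + theta*e_{t-1}
    # phi (AR) and theta (MA) are assumed already estimated.
    e_prev = 0.0
    for t in range(2, len(y)):        # in-sample one-step-ahead residuals
        pred = y[t - 1] + phi * (y[t - 1] - y[t - 2]) + theta * e_prev
        e_prev = y[t] - pred
    hist = list(y)
    forecasts = []
    for _ in range(horizon):          # future shocks are zero in expectation
        pred = hist[-1] + phi * (hist[-1] - hist[-2]) + theta * e_prev
        e_prev = 0.0                  # only the first step uses the last residual
        hist.append(pred)
        forecasts.append(pred)
    return forecasts

# With phi = theta = 0 the model reduces to a random walk,
# so every forecast equals the last observed value.
print(arima111_forecast([10, 12, 11, 13], 0.0, 0.0, 2))  # [13.0, 13.0]
```

In practice the coefficients would come from a fitting routine (e.g. maximum likelihood in a statistics package), with the differencing order d = 1 handled as above.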
The volume of go
This research studies dimension-reduction methods that overcome the curse of dimensionality when traditional methods fail to provide good parameter estimates, a problem that must be dealt with directly. Two approaches were used to handle high-dimensional data: the non-classical sliced inverse regression (SIR) method, together with the proposed weighted standard SIR (WSIR) method, and principal component analysis (PCA), the general method used for dimension reduction. Both SIR and PCA are based on forming linear combinations of a subset of the original explanatory variables, which may suffer from the problem of heterogeneity and the problem of linear
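The common idea behind these methods, projecting onto informative linear combinations, can be sketched in the simplest possible setting: PCA on two-dimensional data, where the leading principal component has a closed form from the 2x2 covariance matrix. This toy example is illustrative only; the research works in high dimensions and with SIR/WSIR as well.

```python
import math

def pca_first_component(X):
    # Leading principal component of 2-D points via the closed-form
    # eigendecomposition of the 2x2 sample covariance matrix.
    n = len(X)
    mx = sum(p[0] for p in X) / n
    my = sum(p[1] for p in X) / n
    sxx = sum((p[0] - mx) ** 2 for p in X) / n
    syy = sum((p[1] - my) ** 2 for p in X) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in X) / n
    # largest eigenvalue of [[sxx, sxy], [sxy, syy]]
    lam = 0.5 * (sxx + syy) + 0.5 * math.sqrt((sxx - syy) ** 2 + 4 * sxy ** 2)
    if abs(sxy) > 1e-12:
        v = (lam - syy, sxy)          # corresponding eigenvector
    else:
        v = (1.0, 0.0) if sxx >= syy else (0.0, 1.0)
    norm = math.hypot(*v)
    return (v[0] / norm, v[1] / norm)

# Points lying nearly on the line y = 2x: the first component
# recovers that direction.
X = [(i, 2 * i + 0.1 * ((-1) ** i)) for i in range(10)]
v = pca_first_component(X)
print(v[1] / v[0])  # close to 2
```

SIR replaces the covariance of X with the covariance of slice means of X given the response, so the recovered directions are informative about the regression rather than about the predictors alone.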
This paper presents an experimental and numerical study comparing the influence of openings of different configurations on the flexural behavior of reinforced concrete gable roof beams. The experimental program consisted of testing six simply supported gable beams subjected to a mid-point concentrated load. The variable investigated in this work was the opening configuration (quadrilateral or circular) with the same upper and lower chord depths. The results indicate an improvement in the beams' flexural behavior when circular openings were used compared with quadrilateral openings, represented by an increase in ultimate load capacity and a decrease in deflection at the service limit. Also, there was an
Radiation therapy plays an important role in improving breast cancer cases. To obtain an appropriate estimate of the number of radiation doses given to the patient after tumor removal, some methods of nonparametric regression were compared. The kernel method with the Nadaraya-Watson estimator was used to find the estimated regression function for smoothing the data, based on the smoothing parameter h chosen according to the normal scale method (NSM), the least squares cross-validation method (LSCV), and the golden rate method (GRM). These methods were compared by simulation for samples of three sizes; the NSM method proved to be the best according to the average mean squared error criterion, and the LSCV method proved to be the best according to the average mean absolu
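The Nadaraya-Watson estimator itself is a kernel-weighted average of the responses. A minimal sketch with a Gaussian kernel follows; the data and the bandwidth value are hypothetical, since choosing h (by NSM, LSCV, or GRM) is precisely what the study compares:

```python
import math

def nadaraya_watson(x0, xs, ys, h):
    # Nadaraya-Watson estimator with a Gaussian kernel and bandwidth h:
    #   m_hat(x0) = sum_i K((x0 - x_i)/h) * y_i / sum_i K((x0 - x_i)/h)
    weights = [math.exp(-0.5 * ((x0 - x) / h) ** 2) for x in xs]
    return sum(w * y for w, y in zip(weights, ys)) / sum(weights)

xs = [i / 10 for i in range(11)]     # design points on [0, 1]
ys = [3 * x + 1 for x in xs]         # noiseless linear signal
print(nadaraya_watson(0.5, xs, ys, h=0.1))  # ~ 2.5, the true value at 0.5
```

A small h tracks the data closely but is noisy; a large h oversmooths, which is why data-driven bandwidth selectors such as LSCV matter.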
In this paper we present a method to analyze five types, with fifteen wavelet families, for eighteen different EMG signals. A comparison study is also given to show the performance of the various families after refining the results with a back-propagation neural network. This will actually help researchers with the first step of EMG analysis. Large sets of results (more than 100 sets) are produced and then classified, discussed, and used to reach the final conclusions.
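As a hypothetical illustration of the first step of such an analysis (the paper evaluates fifteen wavelet families; Haar is the simplest of them), one level of the Haar wavelet transform splits a signal into approximation and detail coefficients:

```python
def haar_dwt(signal):
    # One level of the orthonormal Haar wavelet transform:
    # approximation = scaled pairwise sums, detail = scaled pairwise differences.
    # Assumes an even-length input.
    s = 2 ** 0.5
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
    return approx, detail

emg = [1.0, 1.0, 4.0, 2.0, 0.0, 0.0, 3.0, 5.0]  # invented sample values
a, d = haar_dwt(emg)
print(a, d)
```

Because the transform is orthonormal, the signal energy is preserved across the approximation and detail bands, which is what makes wavelet coefficients usable as features for a back-propagation classifier.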