Many dynamic processes in the sciences are described by differential-equation models, which explain the change in the behavior of the process under study over time by linking that behavior to its derivatives. These models often contain constant and time-varying parameters that vary according to the nature of the process. In this work we estimate the constant and time-varying parameters by a sequential method in several stages. In the first stage, the state variables and their derivatives are estimated by penalized splines (P-splines). In the second stage, pseudo least squares is used to estimate some of the constant parameters. In the third stage, the remaining constant parameters and the time-varying parameters are estimated with a semiparametric regression model. This method is then compared with methods based on numerical discretization, which consist of two stages: in the first stage the state variables and their derivatives are estimated by P-splines, and in the second stage numerical discretization methods (the Euler and trapezoidal discretization methods) are applied. The comparison was carried out by simulation, and the results showed the superiority of the trapezoidal discretization method, which gave the best estimates in balancing estimation accuracy against computational cost.
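The first-stage smoothing step can be sketched as follows. This is a minimal illustration, not the paper's implementation: a Gaussian radial basis with a second-difference penalty stands in for a full B-spline P-spline basis, and all names, data, and tuning values are hypothetical.

```python
import numpy as np

# Stage-1 sketch: smooth noisy observations of a state variable and recover
# its derivative. A true P-spline uses a B-spline basis with a difference
# penalty on the coefficients; a Gaussian radial basis with the same
# second-difference penalty stands in here to keep the example self-contained.

def penalized_smooth(t, y, n_basis=20, lam=1e-2, width=0.15):
    centers = np.linspace(t.min(), t.max(), n_basis)
    B = np.exp(-((t[:, None] - centers[None, :]) / width) ** 2)  # basis matrix
    D = np.diff(np.eye(n_basis), n=2, axis=0)                    # 2nd-difference penalty
    coef = np.linalg.solve(B.T @ B + lam * D.T @ D, B.T @ y)
    # derivative of each Gaussian basis function, evaluated analytically
    dB = B * (-2.0 * (t[:, None] - centers[None, :]) / width ** 2)
    return B @ coef, dB @ coef                                   # fitted x(t), x'(t)

# toy data: x(t) = sin(2*pi*t) with noise, so x'(t) = 2*pi*cos(2*pi*t)
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * t) + rng.normal(0.0, 0.05, t.size)
xhat, dxhat = penalized_smooth(t, y)
```

The estimated derivatives `dxhat` are what the later stages would plug into the differential-equation model in place of the unobserved true derivatives.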
This research reviews the least absolute deviations (LAD) method, based on linear programming, for estimating the parameters of the simple linear regression model, and gives an overview of this model. We model the least absolute deviations method using a proposed measure of dispersion and construct a simple linear regression model based on that measure. The aim of the work is to obtain estimates that are not affected by abnormal (outlying) values, using a numerical method with as few iterations as possible.
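The linear-programming formulation of the LAD fit can be sketched as follows, assuming SciPy's `linprog` is available. Each residual is split into nonnegative parts u_i and v_i and their sum is minimized; the data and function names are illustrative, not from the paper.

```python
import numpy as np
from scipy.optimize import linprog

# Least-absolute-deviations (LAD) fit of y = a + b*x as a linear program:
# minimize sum(u_i + v_i) subject to a + b*x_i + u_i - v_i = y_i, u, v >= 0,
# where u_i, v_i split each residual into positive and negative parts.

def lad_fit(x, y):
    n = len(x)
    # variable order: [a, b, u_1..u_n, v_1..v_n]
    c = np.concatenate([[0.0, 0.0], np.ones(n), np.ones(n)])
    A_eq = np.hstack([np.ones((n, 1)), np.asarray(x, float).reshape(-1, 1),
                      np.eye(n), -np.eye(n)])
    bounds = [(None, None), (None, None)] + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[0], res.x[1]   # intercept a, slope b

x = [0, 1, 2, 3, 4, 5, 6]
y = [2 + 3 * xi for xi in x]
y[3] = 100.0                    # one gross outlier
a, b = lad_fit(x, y)            # LAD recovers a ≈ 2, b ≈ 3 despite the outlier
```

Because the other six points lie exactly on one line, the LAD optimum passes through them and the outlier contributes only its own absolute deviation, illustrating the robustness the abstract refers to.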
Big data analysis has important applications in many areas such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain summarization of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms such a
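The core step of entropy discretization can be sketched as follows: choosing the cut point on a numeric attribute that maximizes information gain with respect to a class label. This is a generic building block, not the paper's multi-resolution algorithm; function names and data are illustrative.

```python
import math
from collections import Counter

# Entropy-based discretization sketch: find the single binary cut point that
# maximizes information gain, the basic operation that recursive entropy
# (MDL-style) discretization repeats on each resulting interval.

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_cut(values, labels):
    pairs = sorted(zip(values, labels))
    n = len(pairs)
    base = entropy([l for _, l in pairs])
    best = (None, -1.0)
    for i in range(1, n):
        if pairs[i - 1][0] == pairs[i][0]:
            continue                      # no cut between equal values
        left = [l for _, l in pairs[:i]]
        right = [l for _, l in pairs[i:]]
        gain = base - (len(left) / n) * entropy(left) - (len(right) / n) * entropy(right)
        if gain > best[1]:
            best = ((pairs[i - 1][0] + pairs[i][0]) / 2.0, gain)
    return best                           # (cut point, information gain)

values = [1.0, 1.2, 1.5, 4.8, 5.1, 5.5]
labels = ["a", "a", "a", "b", "b", "b"]
cut, gain = best_cut(values, labels)      # perfect split: gain = 1.0 bit
```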
The finite element method has been used in this paper to investigate the behavior of precast reinforced concrete dapped-end beams (DEBs) numerically. A parametric investigation was performed on an experimental specimen tested by a previous researcher to show the effect of numerous parameters on the strength and behavior of RC dapped-end beams. Reinforcement details and steel arrangement, the influence of concrete compressive strength, the effect of inclined load, and the effect of support settlement on the strength of dapped-end beams are examples of such parameters. The results revealed that the dapped-end reinforcement arrangement greatly affects the behavior of the dapped-end beam. The failure load decreases by 25% when
The Weibull distribution is considered one of the Type-I Generalized Extreme Value (GEV) distributions, and it plays a crucial role in modeling extreme events in various fields such as hydrology, finance, and the environmental sciences. Bayesian methods play a strong, decisive role in estimating the parameters of the GEV distribution because of their ability to incorporate prior knowledge and to handle small sample sizes effectively. In this research, we compare several shrinkage Bayesian estimation methods based on the squared error and linear exponential (LINEX) loss functions. They were adopted and compared by the Monte Carlo simulation method. The performance of these methods is assessed based on their accuracy and computational efficiency in estimati
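Given posterior draws, the Bayes estimators under the two loss functions mentioned above have standard closed forms: the posterior mean under squared error loss, and -(1/a)·log E[exp(-a·theta)] under the LINEX loss with shape a. The sketch below uses hypothetical normal posterior draws, not the paper's actual GEV posterior.

```python
import math
import random

# Bayes estimators computed from posterior draws: squared-error loss gives
# the posterior mean; LINEX loss with shape a gives -(1/a)*log E[exp(-a*theta)].

def self_estimator(draws):
    return sum(draws) / len(draws)

def linex_estimator(draws, a):
    m = sum(math.exp(-a * th) for th in draws) / len(draws)
    return -math.log(m) / a

random.seed(1)
# hypothetical posterior: N(2.0, 0.5^2); for a > 0 the LINEX estimate
# is shrunk below the posterior mean (here by a*sigma^2/2 = 0.125)
draws = [random.gauss(2.0, 0.5) for _ in range(20000)]
theta_self = self_estimator(draws)
theta_linex = linex_estimator(draws, a=1.0)
```

The asymmetry of the LINEX loss is what produces the shrinkage: overestimation is penalized more heavily than underestimation when a > 0.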
Time series analysis is an important statistical method adopted in analyzing phenomena, practices, and events in all areas over specific time periods, and predicting future values contributes a rough estimate of the status of the phenomenon under study. The study therefore aimed to adopt ARIMA models to forecast the volume of cargo handled in four ports (Umm Qasr Port, Khor Al Zubair Port, Abu Flus Port, and Maqal Port). Monthly data on the volume of cargo handled for the years 2006-2018 were collected (156 observations). The study found that the most efficient model is ARIMA (1,1,1).
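A one-step ARIMA(1,1,1) forecast can be sketched by hand as follows. The coefficients `phi` (AR term) and `theta` (MA term) are fixed hypothetical values here; a real application would fit them by maximum likelihood in a statistics package, and the data are toy values, not the ports' cargo figures.

```python
# One-step ARIMA(1,1,1) forecast sketch: difference the series, run the
# ARMA(1,1) innovation recursion on the differences with e_1 := 0, forecast
# the next difference, then undo the differencing.

def arima_111_forecast(y, phi, theta):
    d = [y[i] - y[i - 1] for i in range(1, len(y))]   # first difference
    e = [0.0]                                         # innovations
    for t in range(1, len(d)):
        pred = phi * d[t - 1] + theta * e[t - 1]
        e.append(d[t] - pred)
    d_next = phi * d[-1] + theta * e[-1]              # forecast of next difference
    return y[-1] + d_next                             # undo the differencing

y = [100, 104, 109, 112, 118, 121, 127]
f = arima_111_forecast(y, phi=0.4, theta=0.3)
```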
The volume of go
Planning for the formation of administrative policies and guidance through leadership are important for managing administrative processes and sporting activities, as both contribute to the stability and development of administrative conditions in the sport federations, for both team and individual Olympic games. The two researchers observed a variation in the correct manner of application, particularly in the formulation of administrative policies and leadership as modern management standards for team and individual Olympic games in the Iraqi National Olympic Committee, which has led to a misconception and a lack of clarity for some administrators of those uni
This paper aims at presenting a comparison between objective and subjective tests. It attempts to shed light on these two types of tests and to draw a comparison between them using suitable techniques for objective and subjective testing.
The paper compares the techniques used by objective and subjective tests respectively, the time and effort required by each type, the extent to which each type can be reliable, and the skills each type is suited to measure.
The paper shows that objective tests, contrary to subjective ones, encourage guessing. Objective tests are used to test specific areas of langua
A simplified theoretical comparison of the hydrogen chloride (HCl) and hydrogen fluoride (HF) chemical lasers is presented using a computer program. The program is able to predict quantitative variations of the laser characteristics as a function of the rotational and vibrational quantum numbers. Lasing is assumed to occur in a Fabry-Perot cavity on vibration-rotation transitions between two vibrational levels of a hypothetical diatomic molecule. This study includes a comprehensive parametric analysis indicating that the large rotational constant of the HF laser, in comparison with the HCl laser, makes it relatively easy to satisfy the partial-inversion criterion. The results of this computer program proved their credibility when compared with th
In this paper, the problem of point estimation for the two parameters of the logistic distribution is investigated using simulation. The ranked set sampling estimator method, a non-Bayesian procedure, and the Lindley approximation estimator method, a Bayesian method, were used to estimate the parameters of the logistic distribution. The two methods are compared using the mean squared error and the mean absolute percentage error measures. Finally, simulation was used to generate samples of many sizes to compare the methods.
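The shape of such a simulation study can be sketched as follows. The estimators here are generic stand-ins (sample mean versus sample median for the location parameter of a logistic distribution), not the ranked-set-sampling or Lindley estimators of the paper, and the sample sizes and replication counts are illustrative.

```python
import math
import random

# Monte Carlo MSE comparison sketch: simulate logistic(mu, s) samples via the
# inverse CDF, apply each estimator of mu, and average the squared errors.

def rlogistic(mu, s, n, rng):
    return [mu + s * math.log(u / (1.0 - u)) for u in (rng.random() for _ in range(n))]

def mse(estimator, mu, s, n, reps, rng):
    errs = [(estimator(rlogistic(mu, s, n, rng)) - mu) ** 2 for _ in range(reps)]
    return sum(errs) / reps

mean_ = lambda x: sum(x) / len(x)
median = lambda x: sorted(x)[len(x) // 2]

rng = random.Random(42)
mse_mean = mse(mean_, mu=0.0, s=1.0, n=25, reps=2000, rng=rng)
mse_median = mse(median, mu=0.0, s=1.0, n=25, reps=2000, rng=rng)
```

For the logistic distribution the sample mean is somewhat more efficient than the median (asymptotic relative efficiency about 0.82 for the median), which a run of this sketch reproduces.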
This research aims to study dimension-reduction methods that overcome the curse of dimensionality when traditional methods fail to provide good parameter estimates, so this problem must be dealt with directly. Two approaches were used to handle high-dimensional data: the non-classical sliced inverse regression (SIR) method together with the proposed weighted standard SIR (WSIR) method, and principal component analysis (PCA), the standard method for reducing dimensions. SIR and PCA are based on forming linear combinations of a subset of the original explanatory variables, which may suffer from the problem of heterogeneity and the problem of linear
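The PCA baseline can be sketched as follows: project centered data onto the top-k eigenvectors of the sample covariance matrix. This shows only the unsupervised step the paper compares SIR against; SIR itself would additionally slice on the response. Data and names are illustrative.

```python
import numpy as np

# PCA sketch: reduce p-dimensional data to k components via the
# eigendecomposition of the sample covariance matrix.

def pca(X, k):
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)              # eigenvalues in ascending order
    order = np.argsort(vals)[::-1][:k]            # indices of the top-k directions
    return Xc @ vecs[:, order], vals[order]       # component scores, variances

rng = np.random.default_rng(7)
# 3 correlated features: the third is nearly a linear combination of the first two
Z = rng.normal(size=(300, 2))
X = np.column_stack([Z[:, 0], Z[:, 1],
                     Z[:, 0] + Z[:, 1] + 0.01 * rng.normal(size=300)])
scores, ev = pca(X, k=2)          # 2 components capture almost all of the variance
```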