In this research, a comparison is made between robust M-estimators for the cubic smoothing splines technique, used to avoid the problems of non-normality of the data or contamination of the errors, and the traditional estimation method for cubic smoothing splines, using two comparison criteria (MADE and WASE) for different sample sizes and levels of dispersion. The aim is to estimate the time-varying coefficient functions for balanced longitudinal data, which are characterized by observations obtained from (n) independent subjects, each of whom is measured repeatedly at a set of (m) specific time points; the repeated measurements within a subject are usually correlated, while measurements from different subjects are independent.
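As an illustration of the robust fit described above, the following is a minimal sketch (not the authors' code) of an M-type cubic smoothing spline obtained by iteratively reweighted least squares with Huber weights, evaluated with the MADE criterion against a known test curve. The simulated data, the tuning constant c, and the smoothing factor are illustrative assumptions.

import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 100)                          # m time points
y = np.sin(2 * np.pi * t) + rng.normal(0, 0.2, t.size)
y[::15] += 3.0                                      # inject contamination (outliers)

def huber_weights(r, c=1.345):
    # Huber psi(r)/r weights; residuals are scaled by a robust (MAD-based) sigma.
    sigma = np.median(np.abs(r - np.median(r))) / 0.6745 + 1e-12
    u = np.abs(r / sigma)
    return np.where(u <= c, 1.0, c / u)

w = np.ones_like(y)
for _ in range(20):                                 # IRLS iterations
    spl = UnivariateSpline(t, y, w=w, k=3, s=len(t) * 0.04)
    w_new = huber_weights(y - spl(t))
    if np.max(np.abs(w_new - w)) < 1e-6:
        break
    w = w_new

f_true = np.sin(2 * np.pi * t)
made = np.mean(np.abs(spl(t) - f_true))             # mean absolute deviation error
print(f"MADE of the robust fit: {made:.4f}")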
The Weibull distribution belongs to the generalized extreme value (GEV) family of distributions and plays a crucial role in modeling extreme events in various fields, such as hydrology, finance, and environmental sciences. Bayesian methods are well suited to estimating the parameters of the GEV distribution because they can incorporate prior knowledge and handle small sample sizes effectively. In this research, we compare several shrinkage Bayesian estimation methods based on the squared error and the linear exponential (LINEX) loss functions; they are compared by the Monte Carlo simulation method. The performance of these methods is assessed in terms of their accuracy and computational efficiency in estimating the distribution's parameters.
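As a hedged illustration of the comparison described above (not the paper's own procedure), the sketch below contrasts Bayes estimators under squared error and LINEX loss for the Weibull scale parameter by Monte Carlo simulation, assuming the shape k is known and a conjugate inverse-gamma prior on theta = lambda**k. The prior hyperparameters, the LINEX asymmetry c, and the sample size are illustrative assumptions.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
k, theta_true = 2.0, 4.0            # known shape, true scale (theta = lambda**k)
a0, b0 = 3.0, 8.0                   # inverse-gamma prior hyperparameters (assumed)
c = 0.5                             # LINEX asymmetry parameter (assumed)
n, reps, draws = 20, 2000, 5000

se_err, linex_err = [], []
for _ in range(reps):
    x = (theta_true * rng.exponential(size=n)) ** (1.0 / k)   # Weibull sample
    a_post, b_post = a0 + n, b0 + np.sum(x ** k)              # conjugate update
    theta_se = b_post / (a_post - 1)                          # posterior mean (squared error loss)
    post = stats.invgamma.rvs(a_post, scale=b_post, size=draws, random_state=rng)
    theta_linex = -np.log(np.mean(np.exp(-c * post))) / c     # LINEX Bayes estimator
    se_err.append((theta_se - theta_true) ** 2)
    linex_err.append((theta_linex - theta_true) ** 2)

print("MSE, squared-error Bayes:", np.mean(se_err))
print("MSE, LINEX Bayes:        ", np.mean(linex_err))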
Today, problems of spatial data integration have been further complicated by the rapid development of communication technologies and the increasing number of data sources available on the World Wide Web. Web-based geospatial data sources can be managed by different communities, and the data themselves can vary with respect to quality, coverage, and purpose. Integrating such multiple geospatial datasets remains a challenge for geospatial data consumers. This paper concentrates on the integration of geometric and classification schemes for official data, such as Ordnance Survey (OS) national mapping data, with volunteered geographic information (VGI), such as the data derived from the OpenStreetMap (OSM) project. Useful descriptions o
This paper deals with constructing a fuzzy linear programming model with an application to the fuel products of the Dura refinery, which consist of seven products that have a direct effect on daily consumption. After building the model, which consists of an objective function representing the selling prices of the products, fuzzy production constraints, fuzzy demand constraints, and production-requirements constraints, we used the WIN QSB program to find the optimal solution.
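The sketch below is a minimal, defuzzified stand-in for the kind of model described above, solved with SciPy instead of WIN QSB: the fuzzy demand and production constraints are reduced to crisp bounds by taking the midpoints of assumed tolerance intervals. All coefficients (prices, bounds, and the production-requirement row for three illustrative products) are invented placeholders, not the Dura refinery data.

import numpy as np
from scipy.optimize import linprog

prices = np.array([420.0, 510.0, 300.0])            # selling price per unit (assumed)
# Fuzzy production capacities and demands as (lower, upper) tolerance intervals,
# defuzzified here by taking the interval midpoints.
capacity = np.array([(900, 1100), (400, 600), (700, 900)], dtype=float).mean(axis=1)
demand = np.array([(500, 700), (200, 300), (350, 450)], dtype=float).mean(axis=1)
crude_use = np.array([1.0, 1.3, 0.8])               # production-requirement coefficients
crude_available = 2000.0

res = linprog(
    c=-prices,                                      # linprog minimizes, so negate to maximize revenue
    A_ub=crude_use[None, :], b_ub=[crude_available],
    bounds=list(zip(demand, capacity)),             # at least the demand, at most the capacity
    method="highs",
)
print("optimal production plan:", res.x, "revenue:", -res.fun)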
Different MLP ANN architectures have been trained by BP and used to analyze Landsat TM images. Two different approaches have been applied for training: an ordinary approach (one hidden layer, M-H1-L, and two hidden layers, M-H1-H2-L) and a one-against-all strategy (one hidden layer, (M-H1-1)xL, and two hidden layers, (M-H1-H2-1)xL). Classification accuracy of up to 90% has been achieved using the one-against-all strategy with the two-hidden-layer architecture. The performance of the one-against-all approach is slightly better than that of the ordinary approach.
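To make the two training strategies concrete, the following minimal sketch compares an ordinary multi-class MLP with a one-against-all ensemble using scikit-learn rather than the authors' own networks. The synthetic features stand in for the M Landsat TM band values, and the hidden-layer sizes are illustrative assumptions, so the 90% figure above is not expected to be reproduced.

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsRestClassifier
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=3000, n_features=6, n_informative=5,
                           n_redundant=1, n_classes=5, n_clusters_per_class=1,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Ordinary approach: one network with L output classes (architecture M-H1-H2-L).
ordinary = MLPClassifier(hidden_layer_sizes=(20, 10), max_iter=1000, random_state=0)
ordinary.fit(X_tr, y_tr)

# One-against-all strategy: L binary networks, each of architecture M-H1-H2-1.
ova = OneVsRestClassifier(MLPClassifier(hidden_layer_sizes=(20, 10),
                                        max_iter=1000, random_state=0))
ova.fit(X_tr, y_tr)

print("ordinary accuracy:       ", ordinary.score(X_te, y_te))
print("one-against-all accuracy:", ova.score(X_te, y_te))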
The design of sampling plans was and still is one of the most important subjects, because it gives the lowest cost compared with other approaches; the statistical distribution of lifetimes should be known in order to obtain the best estimators of the sampling-plan parameters and hence the best sampling plan.
This research deals with the design of a sampling plan when the lifetime distribution follows the logistic distribution with () as location and shape parameters; this information helps in obtaining the number of groups and the sample size associated with rejecting or accepting the lot.
Experimental results for simulated data show the least number of groups and the smallest sample size needed to reject or accept the lot with a certain probability of
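As a hedged illustration (not the paper's algorithm), the sketch below evaluates a simple group acceptance sampling plan under a logistic lifetime model: the lot is accepted if the total number of failures by a truncation time t0 among n = g*r tested items does not exceed an acceptance number c. The location and scale values, t0, r, and c are illustrative assumptions, and the standard two-parameter logistic is used as a simplification of the paper's location/shape parameterization.

from scipy import stats

mu, s = 10.0, 2.0             # assumed location and scale of the logistic lifetime
t0 = 6.0                      # truncation time of the life test (assumed)
r, c = 5, 2                   # items per group and acceptance number (assumed)
p_fail = stats.logistic.cdf(t0, loc=mu, scale=s)    # P(an item fails by t0)

for g in range(1, 11):        # search over the number of groups
    n = g * r
    p_accept = stats.binom.cdf(c, n, p_fail)        # P(at most c failures among n items)
    print(f"groups={g:2d}  sample size={n:3d}  P(accept lot)={p_accept:.4f}")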
This paper is concerned with finding solutions to free-boundary inverse coefficient problems. Mathematically, we handle a one-dimensional non-homogeneous heat equation subject to initial and boundary conditions as well as non-localized integral observations of zeroth and first-order heat momentum. The direct problem is solved for the temperature distribution and the non-localized integral measurements using the Crank–Nicolson finite difference method. The inverse problem is solved by simultaneously finding the temperature distribution, the time-dependent free-boundary function indicating the location of the moving interface, and the time-wise thermal diffusivity or advection velocities. We reformulate the inverse problem as a non-
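For readers unfamiliar with the scheme, the following is a minimal sketch of a Crank–Nicolson solver for the one-dimensional heat equation u_t = a*u_xx on a fixed domain, i.e. only the core of the direct problem; the free boundary, the advection term, the non-homogeneous source, and the integral observations of the paper are omitted. The grid sizes, the diffusivity a, and the initial/boundary data are illustrative assumptions.

import numpy as np

a = 0.5                                    # thermal diffusivity (assumed)
nx, nt, L, T = 50, 200, 1.0, 0.5
dx, dt = L / nx, T / nt
x = np.linspace(0.0, L, nx + 1)
u = np.sin(np.pi * x)                      # initial condition; u = 0 at both ends
r = a * dt / (2 * dx ** 2)

# Crank-Nicolson: (I - r*D2) u^{n+1} = (I + r*D2) u^{n} on the interior nodes.
m = nx - 1
A = (np.diag((1 + 2 * r) * np.ones(m))
     + np.diag(-r * np.ones(m - 1), 1) + np.diag(-r * np.ones(m - 1), -1))
B = (np.diag((1 - 2 * r) * np.ones(m))
     + np.diag(r * np.ones(m - 1), 1) + np.diag(r * np.ones(m - 1), -1))

for _ in range(nt):
    u[1:-1] = np.linalg.solve(A, B @ u[1:-1])   # homogeneous Dirichlet boundaries

print("temperature at the final time (every 10th node):", u[::10])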
Hypercholesterolemia is a predominant risk factor for atherosclerosis and cardiovascular disease (CVD). The World Health Organization (WHO) recommends reducing the intake of cholesterol and saturated fats. On the other hand, limited evidence is available on the benefits of vegetables in the diet to reduce these risk factors, so this research was conducted to compare the hypolipidemic effects of the extracts of two different types of Iraqi peppers, the fruit of the genus Capsicum, traditionally known as red pepper extract (RPE), and Piper nigrum, as black pepper extract (BPE), on different parameters and on the liver histology of the experimental animals. The red pepper was extracted with ethyl acetate, while the black pepp
Structural buildings consist of concrete and steel, and these buildings have confronted many challenges from various aggressive environments acting on the materials they are built from, such as high water levels and conditions in which the concrete cover may be damaged, leading to the deterioration and corrosion of the steel. It was therefore important to have an alternative to steel, such as glass fiber reinforced polymer (GFRP), which is distinguished by its great effectiveness in resisting corrosion as well as its high tensile strength; still, one of its drawbacks is its low modulus of elasticity. This research article aims to conduct a numerical study using the nonlinear fi
Lowpass spatial filters are adopted to match the noise statistics of the degradation, seeking good-quality smoothed images. This study employs smoothing windows of different sizes and shapes. The study shows that using a square-frame window shape gives good-quality smoothing while at the same time preserving a certain level of high-frequency components, in comparison with standard smoothing filters.
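The square-frame window mentioned above can be illustrated with a small sketch: a kernel whose weights lie only on the border of the square (the centre entries are zero), compared against a standard full box filter. The 5x5 size, the test image, and the noise level are illustrative assumptions, not the study's data.

import numpy as np
from scipy.ndimage import convolve

rng = np.random.default_rng(0)
image = np.tile(np.kron(np.eye(4), np.ones((16, 16))), (2, 2)) * 255   # simple test pattern
noisy = image + rng.normal(0, 20, image.shape)

k = 5
box = np.ones((k, k)) / k**2              # standard box (mean) smoothing filter
frame = np.ones((k, k))
frame[1:-1, 1:-1] = 0.0                   # keep only the square frame of the window
frame /= frame.sum()

smoothed_box = convolve(noisy, box, mode="reflect")
smoothed_frame = convolve(noisy, frame, mode="reflect")
print("RMSE, box filter:   ", np.sqrt(np.mean((smoothed_box - image) ** 2)))
print("RMSE, frame filter: ", np.sqrt(np.mean((smoothed_frame - image) ** 2)))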
Many dynamic processes in different sciences are described by differential equation models. These models explain the change in the behavior of the studied process over time by linking the behavior of the process with its derivatives. They often contain constant and time-varying parameters that vary according to the nature of the process under study. In this work we estimate the constant and time-varying parameters sequentially, in several stages. In the first stage, the state variables and their derivatives are estimated by the method of penalized splines (P-splines). In the second stage, we use pseudo least squares to estimate the constant parameters. For the third stage, the rem
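To make the first two stages concrete, the sketch below estimates a state and its derivative with a penalized-spline (P-spline) smoother and then recovers a constant parameter of a simple model dx/dt = -theta*x by a gradient-matching least-squares step standing in for the pseudo least squares stage. The test model, the B-spline basis size, and the penalty weight are illustrative assumptions, not the paper's setting.

import numpy as np
from scipy.interpolate import BSpline

rng = np.random.default_rng(2)
theta_true = 0.7
t = np.linspace(0, 5, 80)
x_obs = 3.0 * np.exp(-theta_true * t) + rng.normal(0, 0.05, t.size)   # noisy state observations

# Stage 1: P-spline smoothing -- cubic B-spline basis with a second-order
# difference penalty on the coefficients (penalty weight chosen by hand here).
deg, n_knots = 3, 20
knots = np.r_[[t[0]] * deg, np.linspace(t[0], t[-1], n_knots), [t[-1]] * deg]
n_coef = len(knots) - deg - 1
B = BSpline(knots, np.eye(n_coef), deg)(t)          # design matrix of basis functions
D = np.diff(np.eye(n_coef), n=2, axis=0)            # second-order difference matrix
lam = 1.0
coef = np.linalg.solve(B.T @ B + lam * D.T @ D, B.T @ x_obs)
spline = BSpline(knots, coef, deg)
x_hat, dx_hat = spline(t), spline.derivative()(t)

# Stage 2: estimate the constant parameter of dx/dt = -theta * x by regressing
# the estimated derivative on the estimated state (gradient matching).
theta_hat = -np.sum(dx_hat * x_hat) / np.sum(x_hat ** 2)
print(f"true theta = {theta_true}, estimated theta = {theta_hat:.3f}")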