In this study, we compared the LASSO and SCAD methods, two penalization methods for handling models in partial quantile regression. The Nadaraya-Watson kernel was used to estimate the nonparametric part, and the rule-of-thumb method was used to select the smoothing bandwidth (h). The penalty methods proved efficient in estimating the regression coefficients, but after the missing data were estimated with the mean imputation method, the SCAD method was the best according to the mean squared error (MSE) criterion.
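As a hedged illustration of this kind of pipeline (not the study's own code), the sketch below combines mean imputation, an L1-penalized (LASSO-type) median regression for the parametric part, and a Nadaraya-Watson estimate with a rule-of-thumb bandwidth for the nonparametric part; the simulated data, variable names, and tuning values are assumptions, and SCAD is omitted because it requires a specialized solver.

```python
# Minimal sketch: mean imputation, LASSO-type median regression for the linear
# part, and a Nadaraya-Watson fit of the remaining nonparametric part.
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.linear_model import QuantileRegressor

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 5))             # linear covariates (illustrative)
t = rng.uniform(0, 1, size=n)           # covariate entering nonparametrically
y = X[:, 0] - 2 * X[:, 1] + np.sin(2 * np.pi * t) + rng.normal(scale=0.5, size=n)
X[rng.random((n, 5)) < 0.1] = np.nan    # introduce missing values

X_imp = SimpleImputer(strategy="mean").fit_transform(X)   # mean imputation

# L1-penalized quantile (median) regression for the parametric part
qr = QuantileRegressor(quantile=0.5, alpha=0.1, solver="highs").fit(X_imp, y)
residuals = y - qr.predict(X_imp)

# Nadaraya-Watson estimate of the nonparametric part from the residuals,
# with the rule-of-thumb bandwidth h = 1.06 * sd(t) * n^(-1/5)
h = 1.06 * t.std(ddof=1) * n ** (-1 / 5)

def nadaraya_watson(t0):
    w = np.exp(-0.5 * ((t0 - t) / h) ** 2)      # Gaussian kernel weights
    return np.sum(w * residuals) / np.sum(w)

print(qr.coef_, nadaraya_watson(0.25))
```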
This study contributes to the understanding of a delayed prey-predator system involving cannibalism. The system is assumed to use the Holling type II functional response to describe the consumption process and incorporates a predator refuge against cannibalism. The characteristics of the solution are discussed, all potential equilibrium points are identified, and the local stability of every equilibrium point is investigated for all values of the time delay. It is further demonstrated that the system exhibits a Hopf bifurcation at the coexistence equilibrium. The center manifold and normal form theorems for functional differential equations are then used to establish the direction of the Hopf bifurcation and the stability of the bifurcating periodic solutions.
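For reference, a commonly used generic form of the Holling type II functional response with a refuge fraction is sketched below; the symbols are illustrative and not necessarily those used in the paper.

```latex
% Generic Holling type II functional response with refuge fraction m:
% only the unprotected portion (1 - m) of the resource is available to the consumer.
\[
  f(x) \;=\; \frac{a\,(1-m)\,x}{1 + a\,h\,(1-m)\,x}, \qquad 0 \le m < 1,
\]
% where a is the attack rate, h the handling time, and x the resource density.
```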
With increased climate change pressures likely to influence harmful algal blooms, exposure to microcystin, a known hepatotoxin and byproduct of cyanobacterial blooms, can be a risk factor for NAFLD-associated comorbidities. Using both
This research aims to estimate the reliability function of the two-parameter Weibull distribution using the parametric methods (NWLSM, RRXM, RRYM, MOM, MLM), as well as to estimate the reliability function using the nonparametric methods (EM, PLEM, EKMEM, WEKM, MKMM, WMR, MMO, MMT). A simulation approach was used for the comparison, with different sample sizes (20, 40, 60, 80, 100), to identify the best estimation methods based on the integrated mean squared error (IMSE) statistical criterion. The research concluded that
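A minimal sketch of this kind of comparison, assuming complete (uncensored) data: the Weibull reliability function is estimated parametrically by maximum likelihood and nonparametrically by the empirical survival function, and an approximate IMSE is computed on a grid. The parameter values and grid are assumptions; the other listed estimators are not reproduced.

```python
# Sketch: parametric (MLE) vs nonparametric (empirical survival) estimation of
# the two-parameter Weibull reliability function, compared via approximate IMSE.
import numpy as np
from scipy import stats
from scipy.integrate import trapezoid

shape_true, scale_true = 1.5, 2.0
grid = np.linspace(0.1, 6.0, 200)                     # evaluation grid for R(t)
R_true = np.exp(-(grid / scale_true) ** shape_true)   # true reliability function

rng = np.random.default_rng(1)
for n in (20, 40, 60, 80, 100):
    x = scale_true * rng.weibull(shape_true, size=n)

    # Parametric estimate: Weibull maximum likelihood (location fixed at 0)
    c_hat, _, scale_hat = stats.weibull_min.fit(x, floc=0)
    R_mle = np.exp(-(grid / scale_hat) ** c_hat)

    # Nonparametric estimate: empirical survival function (no censoring)
    R_emp = np.array([(x > t).mean() for t in grid])

    imse_mle = trapezoid((R_mle - R_true) ** 2, grid)
    imse_emp = trapezoid((R_emp - R_true) ** 2, grid)
    print(n, round(imse_mle, 4), round(imse_emp, 4))
```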
The repeated measurement design is called a randomized complete block design for repeated measurements when each subject receives all of the different treatments; in this case the subject is considered a block. Several nonparametric methods are considered, such as the Friedman test (1937), the Koch test (1969), and the Kepner and Robinson test (1988), for when the assumption of normally distributed data is not satisfied, as well as the F test for when the assumptions of the analysis of variance are satisfied, where the observations within blocks are assumed to be equally correlated. The purpose of this paper is to summarize the results of a simulation study comparing these methods and to present the suggested method.
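As a small illustration of this setting (not the paper's simulation design), the sketch below generates equally correlated within-block observations by adding a shared block effect and applies the Friedman test from scipy; the dimensions and effect sizes are assumptions.

```python
# Sketch: Friedman test on a randomized complete block design for repeated
# measurements (each subject/block receives every treatment).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_blocks, n_treatments = 30, 4
block_effect = rng.normal(scale=1.0, size=(n_blocks, 1))   # shared within block
treatment_effect = np.array([0.0, 0.0, 0.5, 1.0])          # illustrative effects
data = block_effect + treatment_effect + rng.normal(size=(n_blocks, n_treatments))

# The Friedman test takes one sample (column) per treatment, matched by block
stat, p_value = stats.friedmanchisquare(*data.T)
print(f"Friedman chi-square = {stat:.2f}, p = {p_value:.4f}")
```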
In this research, the semiparametric Bayesian method is compared with the classical method for estimating the reliability function of three systems: a k-out-of-n system, a series system, and a parallel system. Each system consists of three components: the first is the parametric component, whose failure times are exponentially distributed, whereas the second and third are nonparametric components whose reliability estimates depend on the kernel method, with two approaches used to estimate the bandwidth parameter h, and on the Kaplan-Meier method. To indicate the better method for estimating the system reliability function, it has been
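For context, once the individual component reliabilities are available, the structural reliability of the three system types follows from standard combination rules, sketched below; the component values are illustrative, and the estimation of each component reliability (exponential, kernel-based, or Kaplan-Meier) is not reproduced here.

```python
# Sketch: reliability of series, parallel, and k-out-of-n systems from
# (possibly different) component reliabilities at a fixed mission time.
from itertools import combinations
from math import prod

def series(R):                  # all components must work
    return prod(R)

def parallel(R):                # at least one component must work
    return 1 - prod(1 - r for r in R)

def k_out_of_n(R, k):           # at least k of the n components must work
    n = len(R)
    total = 0.0
    for m in range(k, n + 1):
        for working in combinations(range(n), m):
            total += prod(R[i] if i in working else 1 - R[i] for i in range(n))
    return total

R = [0.95, 0.90, 0.85]          # illustrative component reliabilities R_i(t)
print(series(R), parallel(R), k_out_of_n(R, 2))
```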
OpenStreetMap (OSM) is the most common example of an online volunteered mapping application. Most of these platforms provide open-source spatial data collected by non-expert volunteers using different data collection methods. The OSM project aims to provide a free digital map of the whole world. The heterogeneity of the data collection methods makes the accuracy of OSM databases unreliable, so they must be treated with caution in any engineering application. This study aims to assess the horizontal positional accuracy of three spatial data sources, the OSM road network database, a high-resolution Satellite Image (SI), and a high-resolution Aerial Photo (AP) of Baghdad city, with respect to an analogue formal road network dataset obtained
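A common way to quantify horizontal positional accuracy is the root mean square error of the coordinate differences between test points and their reference counterparts; the sketch below shows this computation on illustrative coordinate pairs (the actual checkpoints and reference dataset are not reproduced here).

```python
# Sketch: horizontal RMSE between test coordinates (e.g., OSM/SI/AP points)
# and their counterparts in a reference (formal) road network dataset.
import numpy as np

# columns: easting, northing in metres; values are illustrative only
test = np.array([[445010.2, 3685020.5], [445120.8, 3685110.1], [445230.4, 3685205.9]])
reference = np.array([[445012.0, 3685018.9], [445118.5, 3685112.0], [445233.1, 3685204.0]])

dx = test[:, 0] - reference[:, 0]
dy = test[:, 1] - reference[:, 1]
rmse_x = np.sqrt(np.mean(dx ** 2))
rmse_y = np.sqrt(np.mean(dy ** 2))
rmse_r = np.sqrt(rmse_x ** 2 + rmse_y ** 2)       # horizontal (radial) RMSE
print(rmse_x, rmse_y, rmse_r)
```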
The Dagum Regression Model, introduced to address limitations in traditional econometric models, provides enhanced flexibility for analyzing data characterized by heavy tails and asymmetry, as is common in income and wealth distributions. This paper develops and applies the Dagum model, demonstrating its advantages over other distributions such as the Log-Normal and Gamma distributions. The model's parameters are estimated using Maximum Likelihood Estimation (MLE) and the Method of Moments (MoM). A simulation study evaluates both methods' performance across various sample sizes, showing that MoM tends to offer more robust and precise estimates, particularly in small samples. These findings provide valuable insights into the analysis
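As an illustrative sketch (not the paper's regression formulation with covariates), the three-parameter Dagum density can be fitted by numerically minimizing the negative log-likelihood; the parameter values, sample size, and use of scipy's Burr III generator are assumptions.

```python
# Sketch: maximum likelihood estimation of the Dagum(a, b, p) parameters by
# direct minimization of the negative log-likelihood.
import numpy as np
from scipy import optimize, stats

def neg_loglik(theta, x):
    a, b, p = np.exp(theta)                 # enforce positivity via log-parameters
    z = x / b
    # log f(x) = log(a p) - log(x) + a p log(z) - (p + 1) log(1 + z**a)
    return -np.sum(np.log(a * p) - np.log(x) + a * p * np.log(z)
                   - (p + 1) * np.log1p(z ** a))

rng = np.random.default_rng(3)
# Dagum(a, b, p) is Burr type III with a scale, so scipy's burr can simulate it
a_true, b_true, p_true = 3.0, 2.0, 0.7
x = stats.burr.rvs(c=a_true, d=p_true, scale=b_true, size=500, random_state=rng)

res = optimize.minimize(neg_loglik, x0=np.log([1.0, np.median(x), 1.0]), args=(x,))
print("MLE (a, b, p):", np.exp(res.x))
# MoM would instead match sample moments to E[X^k] = b^k Γ(p + k/a)Γ(1 - k/a)/Γ(p), k < a.
```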
In the current study, haemoglobin analytes dissolved in a special buffer (KH2PO4 (1 M), K2HPO4 (1 M)) at pH 7.4 were used to record absorption spectra at concentrations from 10⁻⁸ to 10⁻⁹ M, with an absorption peak at 440 nm, using Broadband Cavity Enhanced Absorption Spectroscopy (BBCEAS), which is considered a simple, low-cost, and robust setup. The principle of this technique relies on multiple reflections between the light source, a 3 W Light Emitting Diode, and the detector, an Avantes spectrophotometer. The optical cavity includes two high-reflectivity (≥99%) dielectric mirrors (diameter
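For context, the absorption coefficient in this kind of cavity-enhanced measurement is commonly retrieved from the transmitted intensities recorded with and without the absorber using the standard incoherent BBCEAS relation sketched below; whether the study uses exactly this form is not stated in the excerpt.

```latex
% Commonly used (I)BBCEAS relation, assuming mirror reflectivity R(lambda)
% and cavity (sample) length d:
\[
  \alpha(\lambda) \;=\; \frac{1}{d}\left(\frac{I_0(\lambda)}{I(\lambda)} - 1\right)\bigl(1 - R(\lambda)\bigr),
\]
% where I_0 and I are the spectra recorded without and with the absorber.
```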
Today, large amounts of geospatial data are available on the web, such as Google Maps (GM), OpenStreetMap (OSM), the Flickr service, Wikimapia, and others. All of these services are called open-source geospatial data. Geospatial data from different sources often has variable accuracy due to different data collection methods; therefore, data accuracy may not meet the user requirements of different organizations. This paper aims to develop a tool to assess the quality of GM data by comparing it with formal data, such as spatial data from the Mayoralty of Baghdad (MB). The tool was developed in the Visual Basic language and validated on two different study areas in Baghdad, Iraq (Al-Karada and Al-Kadhumiyah). The positional accuracy was assessed
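Alongside RMSE-style measures, a simple complementary check is the share of test points falling within a chosen tolerance of their reference counterparts; the sketch below illustrates this on made-up coordinates (the tolerance and point values are assumptions, and the paper's actual tool is written in Visual Basic, not Python).

```python
# Sketch: share of test (GM) points lying within a tolerance of their
# reference (MB) counterparts, a buffer-style accuracy check.
import numpy as np

tolerance_m = 5.0                                   # illustrative tolerance in metres
gm = np.array([[444800.3, 3684910.7], [444955.1, 3685002.4], [445101.6, 3685098.2]])
mb = np.array([[444803.0, 3684909.1], [444951.9, 3685005.0], [445104.2, 3685096.5]])

dist = np.hypot(gm[:, 0] - mb[:, 0], gm[:, 1] - mb[:, 1])
print(f"{(dist <= tolerance_m).mean():.0%} of points within {tolerance_m} m")
```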
This paper is devoted to comparing the performance of non-Bayesian estimators, represented by the maximum likelihood estimator of the scale parameter and the reliability function of the inverse Rayleigh distribution, with Bayesian estimators obtained under two types of loss function, namely the linear-exponential (LINEX) loss function and the entropy loss function, taking into consideration informative and non-informative priors. The performance of these estimators is assessed on the basis of the mean square error (MSE) criterion. Monte Carlo simulation experiments are conducted to obtain the required results.
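A minimal sketch of the Monte Carlo side of such a comparison, assuming the common parameterization F(x; θ) = exp(−θ/x²) for the inverse Rayleigh distribution: it estimates the MSE of the maximum likelihood estimator of the scale parameter; the Bayesian estimators under LINEX and entropy loss are not reproduced here.

```python
# Sketch: Monte Carlo MSE of the maximum likelihood estimator of the inverse
# Rayleigh scale parameter, using the parameterization F(x) = exp(-theta / x**2).
import numpy as np

rng = np.random.default_rng(4)
theta_true, n, replications = 2.0, 50, 5000

mle = np.empty(replications)
for r in range(replications):
    u = rng.uniform(size=n)
    x = np.sqrt(theta_true / -np.log(u))      # inverse-transform sampling
    mle[r] = n / np.sum(1.0 / x ** 2)         # MLE: theta_hat = n / sum(1/x_i^2)

mse = np.mean((mle - theta_true) ** 2)
print(f"MLE MSE at n={n}: {mse:.4f}")
# Bayesian estimators under LINEX or entropy loss would replace the MLE line
# with the corresponding posterior functional.
```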