In this article, we develop a new loss function as a simplification of the linear exponential (LINEX) loss function, obtained by weighting the LINEX function. We derive estimators of the scale parameter, reliability function, and hazard function based on upper record values from the Lomax distribution (LD). To study the small-sample performance of the proposed loss function, a Monte Carlo simulation is used to compare the maximum likelihood estimator, the Bayesian estimator under the LINEX loss function, and the Bayesian estimator under the squared error (SE) loss function. The results show that the modified method is the best for estimating the scale parameter, reliability function, and hazard function.
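For reference, the standard (unweighted) LINEX loss function, which is the starting point of the modification described above, can be written as

$$L(\Delta) = b\left(e^{a\Delta} - a\Delta - 1\right), \qquad \Delta = \hat{\theta} - \theta, \quad a \neq 0, \; b > 0,$$

where $a$ controls the direction and degree of asymmetry (overestimation is penalized more heavily than underestimation when $a > 0$) and $b$ is a scale factor; the specific weighting scheme introduced in the article is not reproduced here.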
Circular data (circular observations) are periodic data measured on the unit circle in radians or degrees. Because of their cyclical nature, they are fundamentally different from linear data, which are compatible with the mathematical representation of the usual linear regression model. Circular data arise in a wide variety of scientific, medical, economic, and social fields. Circular regression is one of the most important statistical methods for representing such data, and there are several parametric and nonparametric approaches to estimating it. The thesis therefore used three circular regression models, two parametric and one nonparametric: (DM), (MLE), and a circular shrinkage model…
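As a minimal illustration of why linear summaries fail for circular data (a hypothetical example, not taken from the thesis), compare the arithmetic mean of a set of angles clustered around 0/2π with their circular mean:

```python
import numpy as np

# Hypothetical angles (radians) clustered just above 0 and just below 2*pi.
angles = np.array([0.10, 6.20, 0.05, 6.25])

# Naive arithmetic mean ignores the wrap-around and lands near pi.
arithmetic_mean = angles.mean()                      # ~3.15 rad

# Circular mean: average the unit vectors, then take the resulting angle.
circular_mean = np.arctan2(np.sin(angles).mean(),
                           np.cos(angles).mean()) % (2 * np.pi)

print(arithmetic_mean)  # misleading: points roughly opposite to the data
print(circular_mean)    # ~0.01 rad: correctly near the cluster
```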
Optimization is essentially the art, science, and mathematics of choosing the best among a given set of finite or infinite alternatives. Although optimization is currently an interdisciplinary subject cutting across the boundaries of mathematics, economics, engineering, the natural sciences, and many other fields of human endeavor, it has its roots in antiquity. One of the oldest such problems, stated in modern mathematical language, is the following: among all closed curves of a given length, find the one that encloses the maximum area. This is called the isoperimetric problem, and it is now routinely mentioned in any course on the calculus of variations. However, most problems of antiquity came from geometry, and since there were no general methods to solve such…
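For context (a standard fact, not part of the original abstract), the solution of the isoperimetric problem is expressed by the isoperimetric inequality in the plane:

$$L^2 \ge 4\pi A,$$

where $L$ is the length of the closed curve and $A$ the area it encloses, with equality if and only if the curve is a circle.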
The research took up the spatial autoregressive model (SAR) and the spatial error model (SEM) in an attempt to provide practical evidence of the importance of spatial analysis, with a particular focus on the importance of using spatial regression models, both of which incorporate spatial dependence, whose presence or absence can be tested with the Moran test. Ignoring this dependence may lead to the loss of important information about the phenomenon under study, which is ultimately reflected in the power of the statistical estimation, as these models form the link between the usual regression models and time-series models. Spatial analysis had…
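For reference, the two models in their standard textbook form (not reproduced from the research itself) are

$$\text{SAR:}\quad y = \rho W y + X\beta + \varepsilon, \qquad \text{SEM:}\quad y = X\beta + u, \quad u = \lambda W u + \varepsilon,$$

where $W$ is the spatial weights matrix and $\rho$, $\lambda$ are the spatial dependence parameters, while the Moran test is based on the statistic

$$I = \frac{n}{S_0}\,\frac{\sum_{i}\sum_{j} w_{ij}(y_i - \bar{y})(y_j - \bar{y})}{\sum_{i}(y_i - \bar{y})^2}, \qquad S_0 = \sum_{i}\sum_{j} w_{ij}.$$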
Here, we obtain an estimate of the best approximation, by convex polynomials, of unbounded functions satisfying a weighted Lipschitz condition, in terms of the weighted Totik-Ditzian modulus of continuity.
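For orientation (the unweighted, standard form; the weighted variant used in the paper is not reproduced here), the Ditzian-Totik modulus of smoothness of order $k$ of $f$ on $[-1,1]$ is

$$\omega_{\varphi}^{k}(f,\delta)_{p} = \sup_{0 < h \le \delta}\bigl\|\Delta_{h\varphi}^{k} f\bigr\|_{p}, \qquad \varphi(x) = \sqrt{1 - x^{2}},$$

where $\Delta_{h\varphi(x)}^{k} f(x) = \sum_{i=0}^{k} (-1)^{i}\binom{k}{i} f\!\left(x + \bigl(\tfrac{k}{2} - i\bigr)h\varphi(x)\right)$ whenever all arguments lie in $[-1,1]$, and $0$ otherwise.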
The estimation of the regular regression model requires several assumptions to be satisfied, such as linearity. One problem occurs when the regression curve is partitioned into two (or more) parts that are then joined at threshold point(s); this situation is regarded as a violation of the linearity of regression. Therefore, the multiphase regression model has received increasing attention as an alternative approach that describes the changing behavior of the phenomenon through threshold point estimation. The maximum likelihood estimator (MLE) has been used for both the model and the threshold point estimation. However, the MLE is not resistant to violations such as the existence of outliers or heavy-tailed error distributions. The main goal of this…
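As a minimal sketch of threshold-point estimation in a two-phase regression (a least-squares profile over a grid of candidate thresholds on hypothetical data; the paper's MLE-based and robust procedures are not reproduced here):

```python
import numpy as np

# Hypothetical two-phase data: the slope changes at the true threshold tau = 6.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 10, 200))
y = 1.0 + 0.5 * x + 2.0 * np.clip(x - 6.0, 0, None) + rng.normal(0, 0.3, x.size)

def fit_two_phase(x, y, grid):
    """Profile the threshold over a grid; fit the slopes by least squares."""
    best = None
    for tau in grid:
        # Design matrix: intercept, x, and the hinge term (x - tau)_+.
        X = np.column_stack([np.ones_like(x), x, np.clip(x - tau, 0, None)])
        beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = res[0] if res.size else np.sum((y - X @ beta) ** 2)
        if best is None or rss < best[2]:
            best = (tau, beta, rss)
    return best

tau_hat, beta_hat, _ = fit_two_phase(x, y, np.linspace(1, 9, 161))
print(f"estimated threshold: {tau_hat:.2f}")  # should be close to 6
```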
Purpose: This study aimed to assess the thickness of the alveolar bone of the maxillary and mandibular incisors from an orthodontic perspective. Materials and Methods: A total of 73 cone beam computed tomography scans of Iraqi patients (47 females and 26 males) were included in this study. The selected images were captured and imported into AutoCAD database software to perform the measurements. To measure alveolar bone thickness, a reference line was drawn through the long axis of each incisor, from the incisal edge to the root apex. Then, labial and lingual/palatal lines were drawn perpendicular to the reference line at 3, 6, and 9 mm apically from the cemento-enamel junction (CEJ). Results: The buccal bone is generally thinner than the lingual/palatal…
This paper presents a novel idea: it investigates, for the first time, the rescue effect on the prey together with a fluctuation effect, proposing a modified predator-prey model that is non-autonomous. An approximation method is then utilized to convert the non-autonomous model into an autonomous one, simplifying the mathematical analysis and the study of the dynamical behavior. Some theoretical properties of the proposed autonomous model, such as boundedness, stability, and the Kolmogorov conditions, are studied. The paper's analytical results demonstrate that the dynamic behavior is globally stable and that the rescue effect improves the likelihood of coexistence compared with the case of no rescue effect. Furthermore, numerical simulations…
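As an illustrative sketch only (a generic autonomous predator-prey system with a constant rescue-type immigration term for the prey; the paper's actual model, functional response, and parameter values are not reproduced here):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical parameters: growth r, capacity K, attack rate a,
# conversion efficiency c, predator mortality d, and rescue (immigration) rate r0.
r, K, a, c, d, r0 = 1.0, 10.0, 0.6, 0.4, 0.5, 0.2

def rhs(t, z):
    x, y = z                                    # prey, predator densities
    dx = r * x * (1 - x / K) - a * x * y + r0   # logistic growth - predation + rescue
    dy = c * a * x * y - d * y                  # conversion - predator mortality
    return [dx, dy]

sol = solve_ivp(rhs, (0.0, 100.0), [2.0, 1.0])
print(sol.y[:, -1])  # long-run densities; both remain positive (coexistence)
```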
The bi-level programming problem is to minimize or maximize an objective function while another objective function appears within the constraints. This problem has received a great deal of attention in the mathematical programming community due to the proliferation of applications and the use of evolutionary algorithms in addressing this kind of problem. Two non-linear bi-level programming methods are used in this paper. The goal is to reach the optimal solution through simulation with the Monte Carlo method, using different small and large sample sizes. The research concluded that the branch-and-bound algorithm was preferable for solving the non-linear bi-level programming problem, because it produced better results.
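For reference, a generic non-linear bi-level program in its standard form (not the specific formulation used in the paper) is

$$\min_{x \in X} \; F(x, y^{*}) \quad \text{s.t.} \quad G(x, y^{*}) \le 0, \qquad y^{*} \in \arg\min_{y \in Y}\bigl\{\, f(x, y) \;:\; g(x, y) \le 0 \,\bigr\},$$

where $F$, $G$ are the upper-level (leader) objective and constraints and $f$, $g$ are the lower-level (follower) objective and constraints.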