The Dagum Regression Model, introduced to address limitations in traditional econometric models, provides enhanced flexibility for analyzing data characterized by heavy tails and asymmetry, which is common in income and wealth distributions. This paper develops and applies the Dagum model, demonstrating its advantages over other distributions such as the Log-Normal and Gamma distributions. The model's parameters are estimated using Maximum Likelihood Estimation (MLE) and the Method of Moments (MoM). A simulation study evaluates both methods' performance across various sample sizes, showing that MoM tends to offer more robust and precise estimates, particularly in small samples. These findings provide valuable insights into the analysis of income inequality and wealth distribution using the Dagum model.
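As a rough illustration of the estimation setup described above, the sketch below draws Dagum samples by inverse-CDF and recovers the three parameters by MLE; the true parameter values, the log-parameterization, and the Nelder–Mead optimizer are illustrative choices, not the paper's (the MoM comparison is omitted here):

```python
import numpy as np
from scipy.optimize import minimize

def dagum_sample(a, b, p, size, rng):
    """Inverse-CDF sampling from F(x) = (1 + (x/b)**(-a))**(-p)."""
    u = rng.uniform(size=size)
    return b * (u ** (-1.0 / p) - 1.0) ** (-1.0 / a)

def dagum_nll(log_params, x):
    """Negative log-likelihood; parameters on the log scale to keep them positive."""
    a, b, p = np.exp(log_params)
    z = (x / b) ** a
    ll = (np.log(a) + np.log(p) - np.log(x)
          + a * p * (np.log(x) - np.log(b))
          - (p + 1.0) * np.log1p(z))
    return -np.sum(ll)

rng = np.random.default_rng(0)
x = dagum_sample(3.0, 10.0, 1.5, size=20_000, rng=rng)   # hypothetical true values
res = minimize(dagum_nll, x0=np.log([1.0, np.median(x), 1.0]),
               args=(x,), method="Nelder-Mead", options={"maxiter": 2000})
a_hat, b_hat, p_hat = np.exp(res.x)
```

With a sample this large the MLE should land close to the generating values; the paper's small-sample comparison against MoM would repeat this over many replications at much smaller sizes.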
The advancement of digital technology has increased the deployment of wireless sensor networks (WSNs) in our daily life. However, locating sensor nodes is a challenging task in WSNs. Sensing data without an accurate location is worthless, especially in critical applications. The pioneering technique in range-free localization schemes is a sequential Monte Carlo (SMC) method, which utilizes network connectivity to estimate sensor location without additional hardware. This study presents a comprehensive survey of state-of-the-art SMC localization schemes. We present the schemes as a thematic taxonomy of localization operation in SMC. Moreover, the critical characteristics of each existing scheme are analyzed to identify its advantages
Interest in belowground plant growth is increasing, especially in relation to arguments that shallow-rooted cultivars are efficient at exploiting soil phosphorus while deep-rooted ones will access water at depth. However, methods for assessing roots in large numbers of plants are diverse, and direct comparisons of methods are rare. Three methods for measuring root growth traits were evaluated for their utility in discriminating rice cultivars: soil-filled rhizotrons, hydroponics and soil-filled pots whose bottoms were sealed with a non-woven fabric (a potential method for assessing root penetration ability). A set of 38 rice genotypes including the Oryza
Colloidal crystals (opals) made of close-packed polymethylmethacrylate (PMMA) spheres were fabricated by template-directed methods to obtain porous materials with well-ordered periodicity and interconnected pore systems for manufacturing photonic crystals. Opals were made from aqueous suspensions of monodisperse PMMA spheres with diameters between 280 and 415 nm. SEM confirmed that the PMMA spheres crystallized uniformly in a face-centered cubic (FCC) array. Optical properties of the synthesized porous PMMA were characterized by UV–Visible spectroscopy, which shows that the colloidal crystals possess pseudo photonic band gaps in the visible region. A combination of Bragg's law of diffraction and Snell's law of refraction was used to calculate t
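The Bragg–Snell estimate of the stop-band position can be sketched as below; the PMMA refractive index (≈1.49), the FCC filling fraction (0.74) and the (111) interplanar spacing are standard textbook values, not figures taken from the paper:

```python
import math

def bragg_snell_peak(sphere_diam_nm, n_sphere=1.49, n_medium=1.0,
                     fill=0.74, theta_deg=0.0):
    """Peak reflected wavelength from FCC (111) planes (modified Bragg law):
    lambda = 2 * d111 * sqrt(n_eff**2 - sin(theta)**2), with d111 = sqrt(2/3) * D
    and n_eff**2 the volume-weighted average of the squared indices."""
    d111 = math.sqrt(2.0 / 3.0) * sphere_diam_nm
    n_eff_sq = fill * n_sphere ** 2 + (1.0 - fill) * n_medium ** 2
    return 2.0 * d111 * math.sqrt(n_eff_sq - math.sin(math.radians(theta_deg)) ** 2)
```

At normal incidence this places the stop band of 280 nm spheres near 630 nm, i.e. within the visible range, consistent with the UV–Vis observation; larger spheres shift the band to longer wavelengths.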
For modeling a photovoltaic module, it is necessary to calculate the basic parameters that control the current–voltage characteristic curves, which are not provided by the manufacturer. For a monocrystalline silicon module, the shunt resistance is generally high, and it is neglected in this model. In this study, three methods are presented for the four-parameter model: an explicit simplified method based on an analytical solution, a slope method based on manufacturer data, and an iterative method based on a numerical resolution. The results obtained with these methods were compared with experimentally measured data. The iterative method was more accurate than the other two methods but more complex. The average deviation of
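The iterative approach can be sketched as a numerical solve of the implicit four-parameter equation I = I_ph − I_0·(exp((V + I·R_s)/(n·N_s·V_t)) − 1), with the shunt resistance neglected as above; all parameter values below are hypothetical, not the measured module's:

```python
import numpy as np
from scipy.optimize import brentq

Q, K = 1.602e-19, 1.381e-23   # electron charge (C), Boltzmann constant (J/K)

def pv_current(V, I_ph=8.0, I_0=1e-9, n=1.2, R_s=0.005, N_s=36, T=298.15):
    """Solve the implicit single-diode equation for the module current I at voltage V."""
    a = n * N_s * K * T / Q                     # modified ideality factor (volts)
    f = lambda I: I_ph - I_0 * (np.exp((V + I * R_s) / a) - 1.0) - I
    return brentq(f, -100.0, I_ph + 1.0)        # root-find within a wide bracket

# trace part of the I-V curve
voltages = np.linspace(0.0, 25.0, 6)
currents = [pv_current(v) for v in voltages]
```

Near short circuit the current stays close to I_ph, then drops sharply as V approaches the open-circuit voltage, which is the characteristic shape the three methods are fitted against.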
Optimizing system performance in dynamic and heterogeneous environments and managing computational tasks efficiently are crucial. This paper therefore examines task scheduling and resource allocation algorithms in depth. The work evaluates five algorithms: Genetic Algorithms (GA), Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), the Firefly Algorithm (FA) and Simulated Annealing (SA), across various workloads obtained by varying the task-to-node ratio. The paper identifies Finish Time and Deadline as two key performance metrics for gauging the efficacy of an algorithm, and a comprehensive investigation of the behavior of these algorithms across different workloads was carried out. Results from the experiment
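As a minimal illustration of one of the evaluated metaheuristics, the sketch below uses simulated annealing to assign tasks to nodes while minimizing finish time (makespan); the workload, node speeds and cooling schedule are toy choices, not the paper's experimental setup:

```python
import math
import random

def makespan(assign, task_cost, node_speed):
    """Finish time: the maximum over nodes of total assigned work / node speed."""
    load = [0.0] * len(node_speed)
    for t, node in enumerate(assign):
        load[node] += task_cost[t] / node_speed[node]
    return max(load)

def sa_schedule(task_cost, node_speed, iters=5000, T0=10.0, seed=0):
    """Simulated annealing: move one task at a time, accept worse moves with
    probability exp(-delta/T) under a linearly decreasing temperature."""
    rng = random.Random(seed)
    assign = [rng.randrange(len(node_speed)) for _ in task_cost]
    cur_cost = makespan(assign, task_cost, node_speed)
    best, best_cost = assign[:], cur_cost
    for i in range(iters):
        T = T0 * (1.0 - i / iters) + 1e-9
        t = rng.randrange(len(task_cost))
        old = assign[t]
        assign[t] = rng.randrange(len(node_speed))
        new_cost = makespan(assign, task_cost, node_speed)
        if new_cost <= cur_cost or rng.random() < math.exp((cur_cost - new_cost) / T):
            cur_cost = new_cost
            if new_cost < best_cost:
                best, best_cost = assign[:], new_cost
        else:
            assign[t] = old                      # reject the move
    return best, best_cost

best, best_cost = sa_schedule([5, 4, 3, 3, 2, 2, 1], node_speed=[1.0, 1.0])
```

For this toy workload the total work is 20 on two equal nodes, so the best achievable finish time is 10; the other four algorithms would be benchmarked on the same makespan objective.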
Parametric programming is considered a type of sensitivity analysis. This research studies the effect of variations in a linear programming model (objective function coefficients and right-hand side) on the optimal solution, for parameter values θ in the range −5 ≤ θ ≤ 5. The results show that at θ = −5 the objective function equals zero and the decision variables are non-basic, while at θ = 5 the objective function value increases and the decision variables are basic, with the exception of X24 and X34. Whenever the parameter value increases, the objectiv
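The parametric analysis can be illustrated by sweeping θ through the right-hand side of a toy LP; the model below is hypothetical (the paper's actual model and variables such as X24 and X34 are not reproduced here), but it shows the same qualitative behavior of a zero objective at θ = −5 growing as θ increases:

```python
from scipy.optimize import linprog

def solve(theta):
    """Maximize 3*x1 + 2*x2 subject to x1 + 2*x2 <= 5 + theta, x >= 0.
    linprog minimizes, so the objective coefficients are negated."""
    res = linprog(c=[-3.0, -2.0],
                  A_ub=[[1.0, 2.0]], b_ub=[5.0 + theta],
                  bounds=[(0, None), (0, None)], method="highs")
    return res.x, -res.fun

for theta in (-5, 0, 5):
    x, z = solve(theta)
    print(f"theta={theta:+d}  x={x}  objective={z:.1f}")
```

At θ = −5 the feasible region collapses to the origin (objective zero, variables non-basic); as θ grows the right-hand side relaxes and the optimal value increases linearly.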
Increasingly, public organizations in a federal state are required to work together, as well as with others, to achieve their objectives. In Iraq there are two levels of organizations, federal and local, and these organizations have been forced to work for many years in an environment in which the responsibility for service delivery is shared between policy makers and service providers, and between local governments and the federal government. The relationship between these organizations (federal and local) is sometimes difficult to manage and does not always produce the best possible outcome. This paper reviews how to manage the relationship between local administrations and
In this article, we develop a new loss function as an extension of the linear exponential (LINEX) loss function, obtained by weighting the LINEX function. We derive estimators of the scale parameter, the reliability function and the hazard function based on upper record values of the Lomax distribution (LD). To study the small-sample performance of the proposed loss function using a Monte Carlo simulation, we compare the maximum likelihood estimator, the Bayesian estimator under the LINEX loss function and the Bayesian estimator under the squared error (SE) loss function. The results show that the modified method is the best for estimating the scale parameter, reliability and hazard functions.
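The contrast between the SE and LINEX Bayes estimators can be sketched on posterior samples: under LINEX loss with parameter c, the Bayes estimator is −(1/c)·ln E[e^{−cθ} | data], whereas under SE loss it is the posterior mean. The Gamma "posterior" below is a stand-in for illustration, not one derived from Lomax record values:

```python
import numpy as np

def linex_loss(est, theta, c=1.0):
    """LINEX loss exp(c*d) - c*d - 1 with d = est - theta; asymmetric in the sign of d."""
    d = est - theta
    return np.exp(c * d) - c * d - 1.0

def bayes_linex(post, c=1.0):
    """Bayes estimator under LINEX: -(1/c) * log E[exp(-c*theta) | data]."""
    return -np.log(np.mean(np.exp(-c * post))) / c

rng = np.random.default_rng(1)
post = rng.gamma(shape=5.0, scale=0.4, size=100_000)  # stand-in posterior samples
est_se = post.mean()                   # Bayes estimator under squared error loss
est_linex = bayes_linex(post, c=1.0)   # shrinks downward for c > 0

# posterior expected LINEX risk of each estimator
risk_se = linex_loss(est_se, post).mean()
risk_linex = linex_loss(est_linex, post).mean()
```

For c > 0 LINEX penalizes overestimation more heavily, so the LINEX estimator sits below the posterior mean and attains the lower posterior expected LINEX risk, which is the kind of comparison the simulation study runs across estimators.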