In this study, the land surface in the Razzaza Lake area was studied over a 25-year period using remote sensing methods. Images from the Landsat 5 (TM) and Landsat 8 (OLI) satellites were used to identify and map the components of the land cover. The study covered the years 1995-2021 at five-year intervals, since the region is uninhabited and the land cover therefore changes slowly. The land cover was divided into three main classes and seven subclasses and classified with the maximum likelihood classifier, using training sets collected to represent the classes making up the land cover. The detected changes in the land cover were analysed with 1995 as the reference year. It was found that there was a significant reduction in the water mass...
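In the usual remote-sensing formulation, the maximum likelihood classifier assigns each pixel to the class whose training-set Gaussian (a mean vector and covariance matrix over the spectral bands) gives the highest likelihood. The sketch below illustrates that idea only; the band count, class codes and random arrays are illustrative assumptions, not the study's Landsat data.

```python
import numpy as np
from scipy.stats import multivariate_normal

def train_ml_classifier(train_pixels, train_labels):
    """Estimate a mean vector and covariance matrix per land-cover class.

    train_pixels : (n_samples, n_bands) array of training spectra
    train_labels : (n_samples,) array of integer class codes
    """
    stats = {}
    for c in np.unique(train_labels):
        x = train_pixels[train_labels == c]
        stats[c] = (x.mean(axis=0), np.cov(x, rowvar=False))
    return stats

def classify(pixels, stats):
    """Assign each pixel to the class with the highest Gaussian log-likelihood."""
    classes = sorted(stats)
    scores = np.column_stack([
        multivariate_normal.logpdf(pixels, mean=m, cov=S, allow_singular=True)
        for m, S in (stats[c] for c in classes)
    ])
    return np.array(classes)[scores.argmax(axis=1)]

# Illustrative use with random data standing in for Landsat bands (assumption):
rng = np.random.default_rng(0)
train_x = rng.normal(size=(300, 6))        # 6 spectral bands
train_y = rng.integers(0, 3, size=300)     # 3 hypothetical classes, e.g. water / vegetation / bare soil
model = train_ml_classifier(train_x, train_y)
print(classify(rng.normal(size=(5, 6)), model))
```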
Obtaining computational models of brain function gives us a chance to understand brain functionality thoroughly. This would help the development of better treatments for neurological illnesses and disorders. We created a cortical model in Python using the Brian simulator, which specializes in simulating neuronal connections and synaptic interconnections. The dynamic connection model has multiple parameters in order to ensure an accurate simulation (Bowman, 2016). We concentrated on the connection weights and studied their effect on the interactivity and connectivity of the cortical neurons within the same cortical layer and across multiple layers. As synchronization helps us to mea...
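The Brian simulator referred to here is the open-source brian2 Python package. The fragment below is only a minimal sketch of the kind of experiment described: one population of leaky integrate-and-fire neurons whose synaptic connection weight w can be varied to observe its effect on spiking activity. The equations, population size and numerical values are illustrative assumptions, not the paper's cortical model.

```python
from brian2 import NeuronGroup, Synapses, SpikeMonitor, run, ms

tau = 10 * ms

# Leaky integrate-and-fire population with a constant drive I (dimensionless units).
eqs = '''
dv/dt = (I - v) / tau : 1
I : 1 (constant)
'''
layer = NeuronGroup(100, eqs, threshold='v > 1', reset='v = 0', method='exact')
layer.I = 1.5          # supra-threshold drive so the neurons fire

# Recurrent synapses within the "layer"; w is the connection weight under study.
syn = Synapses(layer, layer, 'w : 1', on_pre='v_post += w')
syn.connect(p=0.1)     # 10% random connectivity
syn.w = 0.05           # vary this value to see its effect on network activity

spikes = SpikeMonitor(layer)
run(500 * ms)
print(f'{spikes.num_spikes} spikes in 500 ms with w = 0.05')
```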
In this article, we developed a new loss function as a simplification of the linear exponential (LINEX) loss function, obtained by weighting the LINEX function. We derive estimators of the scale parameter, the reliability function and the hazard function based on upper record values of the Lomax distribution (LD). To study the small-sample performance of the proposed loss function, we use a Monte Carlo simulation to compare the maximum likelihood estimator, the Bayesian estimator under the LINEX loss function and the Bayesian estimator under the squared error (SE) loss function. The results show that the modified method is the best for estimating the scale parameter, reliability and hazard functions.
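For reference, the standard (unweighted) LINEX loss that the abstract builds on, and the corresponding Bayes estimator of a parameter θ, are usually written as below; the weighted variant proposed in the article modifies this baseline.

```latex
% Standard LINEX loss for the estimation error \Delta = \hat{\theta} - \theta,
% with shape parameter a \neq 0:
L(\Delta) = e^{a\Delta} - a\Delta - 1 .

% Under this loss, the Bayes estimator of \theta is
\hat{\theta}_{\mathrm{LINEX}} = -\frac{1}{a}\,
  \ln \mathbb{E}\!\left[ e^{-a\theta} \mid \text{data} \right],
% which tends to the posterior mean (the squared-error Bayes estimator)
% as a \to 0.
```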
The theory of probabilistic programming may be conceived in several different ways. As a method of programming, it analyses the implications of probabilistic variations in the parameter space of a linear or nonlinear programming model. The mechanism generating such probabilistic variations in economic models may be incomplete information about changes in demand, production and technology, specification errors in the econometric relations presumed for different economic agents, uncertainty of various sorts, and the consequences of imperfect aggregation or disaggregation of economic variables. In this research we discuss the probabilistic programming problem when the coefficient b_i is a random variable.
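As a concrete illustration of the case treated here, a random right-hand-side coefficient b_i, the usual chance-constrained formulation and its deterministic equivalent can be sketched as follows. The normality assumption is illustrative only, not a claim about the paper's model.

```latex
% Chance-constrained form of the i-th constraint, required to hold with
% probability at least p_i:
\Pr\!\left( \sum_{j} a_{ij} x_j \le b_i \right) \ge p_i .

% If b_i \sim N(\mu_i, \sigma_i^2), this is equivalent to the deterministic
% constraint
\sum_{j} a_{ij} x_j \;\le\; \mu_i + \sigma_i \, \Phi^{-1}(1 - p_i),
% where \Phi^{-1} is the standard normal quantile function.
```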
Abstract
Due to the heavy traffic congestion in the streets of the city of Baghdad, caused by the large number of checkpoints, the researcher conducted a field visit to identify the main reasons behind this congestion and to find practical solutions that reduce the time citizens lose in reaching their destinations, so that they arrive in the least possible time.
This research aims to overcome the difficulties citizens experience in reaching their places of work, to reduce wasted service time and waiting time, and to reduce the cost of waiting.
The study produced a set of conclusions, including the use of the (G/G/c) queueing model and the mome...
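The abstract names the (G/G/c) queueing model, but the visible text stops before the details. As a rough illustration of how such a model quantifies waiting at a service point, the sketch below combines the standard Erlang-C formula for M/M/c with the Allen-Cunneen approximation for G/G/c; the arrival rate, service rate and checkpoint figures are purely hypothetical, not the study's data.

```python
from math import factorial

def erlang_c(lmbda, mu, c):
    """Probability that an arrival must wait in an M/M/c queue (Erlang C)."""
    a = lmbda / mu                # offered load
    rho = a / c                   # server utilisation (must be < 1)
    num = a**c / factorial(c) / (1 - rho)
    den = sum(a**k / factorial(k) for k in range(c)) + num
    return num / den

def wq_ggc(lmbda, mu, c, ca2, cs2):
    """Allen-Cunneen approximation of the mean wait in a G/G/c queue.

    ca2, cs2 : squared coefficients of variation of the inter-arrival and
               service times (setting both to 1 recovers M/M/c).
    """
    wq_mmc = erlang_c(lmbda, mu, c) / (c * mu - lmbda)
    return (ca2 + cs2) / 2 * wq_mmc

# Hypothetical checkpoint: 110 cars/hour arriving, each lane clears 30 cars/hour, 4 lanes.
print(wq_ggc(lmbda=110, mu=30, c=4, ca2=1.3, cs2=0.8) * 60, "minutes of average wait")
```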
This paper is concerned with a Double Stage Shrinkage Bayesian (DSSB) estimator for lowering the mean squared error of the classical estimator of the scale parameter (θ) of an exponential distribution in a region (R) around available prior knowledge (θ₀) about the actual value of θ, taken as an initial estimate, as well as for reducing the cost of experimentation. In situations where the experiments are time consuming or very costly, a double stage procedure can be used to reduce the expected sample size needed to obtain the estimator. This estimator is shown to have smaller mean squared error for certain choices of the shrinkage weight factor ψ(·) and of the acceptance region R. Expressions for...
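The general shape of such a two-stage shrinkage procedure, in the notation of the abstract (prior guess θ₀, shrinkage weight ψ(·), acceptance region R), can be sketched as follows; the exact weights, region and second-stage estimator used in the paper may differ.

```latex
% Stage 1: draw a preliminary sample of size n_1 and compute \hat{\theta}_1.
% If \hat{\theta}_1 \in R, stop and use the shrunken estimator
\tilde{\theta} = \psi(\hat{\theta}_1)\,\hat{\theta}_1
               + \bigl(1 - \psi(\hat{\theta}_1)\bigr)\,\theta_0 ,
\qquad 0 \le \psi(\cdot) \le 1 .

% Stage 2: otherwise draw a second sample of size n_2 and estimate \theta
% from the combined sample of size n_1 + n_2, e.g.
\tilde{\theta} = \hat{\theta}_{\,n_1 + n_2} .
```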