Flow measurement has gained importance in recent decades because climate-change-driven shortages of water resources demand tight control of the water available for different uses. The classical integrating-float method of open-channel flow measurement is needed where modern devices are unavailable for various reasons, such as their cost, so classical techniques remain a practical solution. The present study examines the integrating-float method and, using experimental measurements, identifies the parameters affecting the acceleration of floating spheres in flowing water. The method was investigated theoretically, and many experimental tests were conducted in a fixed-floor laboratory flume. Solid plastic spheres of different sizes and weights were used as floats to measure velocities and then compute discharge. The results indicate that the integrating-float technique is feasible and accurate for measuring low flow velocities in open channels; it was desirable to use small floats with a specific gravity close to unity to obtain more accurate results. The measured velocities and estimated discharges were compared with those obtained using other common laboratory measuring techniques, and good agreement was found between the integrating-float results and the velocities from the other techniques, with an error of less than 2.5%.
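The velocity and discharge computation described in this abstract can be sketched as follows. This is a hedged, illustrative outline, not the paper's own procedure: the function names, the rectangular cross-section, and all numeric values are assumptions made for the example.

```python
# Hypothetical sketch of the integrating-float computation (illustrative values,
# not the paper's data). A float released at the bed rises through the full
# depth; the horizontal distance it drifts while rising integrates the velocity
# profile, so the depth-averaged velocity is drift / rise_time.

def mean_velocity(drift_m, depth_m, rise_velocity_mps):
    """Depth-averaged flow velocity from an integrating float."""
    rise_time = depth_m / rise_velocity_mps      # seconds to reach the surface
    return drift_m / rise_time                   # m/s

def discharge(mean_v_mps, width_m, depth_m):
    """Discharge for an assumed rectangular flume section: Q = V * A."""
    return mean_v_mps * width_m * depth_m        # m^3/s

v = mean_velocity(drift_m=0.45, depth_m=0.30, rise_velocity_mps=0.10)
q = discharge(v, width_m=0.50, depth_m=0.30)
```

With these illustrative numbers the float takes 3 s to surface, giving a mean velocity of 0.15 m/s and a discharge of 0.0225 m³/s.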
Reserve estimation is a continuous process throughout the life of a field, because risk and inaccuracy are endemic problems that must be studied; the true, properly defined hydrocarbon content can be established only at field depletion. Consequently, the reserve-estimation challenge is a function of time and of the available data. Reserve-estimation methods can be divided into five types: analogy, volumetric, decline-curve analysis, material balance, and reservoir simulation, each differing from the others in the kind of data required. The choice of the suitable and appropriate method depends on reservoir maturity, reservoir heterogeneity, and the data acquisition required. In this research, three types of rese…
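Of the five method types listed above, the volumetric method has a standard closed form that can be sketched directly. The formula below is the widely used field-unit expression for original oil in place; the input values are illustrative and not from this study.

```python
# Hedged sketch of the volumetric method named above (standard field units).
#   OOIP (STB) = 7758 * A * h * phi * (1 - Sw) / Bo
# where 7758 converts acre-feet to barrels, A is area (acres), h is net pay
# thickness (ft), phi is porosity, Sw is water saturation, and Bo is the oil
# formation volume factor (rb/STB). All values below are made up.

def ooip_stb(area_acres, thickness_ft, porosity, sw, bo):
    """Original oil in place by the volumetric method (stock-tank barrels)."""
    return 7758.0 * area_acres * thickness_ft * porosity * (1.0 - sw) / bo

n = ooip_stb(area_acres=1000.0, thickness_ft=50.0, porosity=0.20, sw=0.30, bo=1.25)
```

For these illustrative inputs the estimate is about 43.4 million STB; analogy, decline-curve, material-balance, and simulation methods require production or pressure histories and have no comparably compact form.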
This research deals with building a probabilistic linear programming model representing the production operation of the Middle Refinery Company (Dura, Semawa, Najaif), in which the demand for each product (gasoline, kerosene, gas oil, fuel oil) is a random variable following a certain probability distribution. The distributions were tested using the statistical program EasyFit and were found to be the Cauchy, Erlang, Pareto, normal, and generalized extreme value distributions.
The need to create an optimal water-quality management process has motivated researchers to pursue the development of prediction models. One widely used forecasting model is the seasonal autoregressive integrated moving average (SARIMA) model. In the present study, a SARIMA model was developed in R to fit a time series of monthly fluoride content collected from six stations on the Tigris River for the period from 2004 to 2014. The adequate SARIMA model, having the least Akaike information criterion (AIC) and mean squared error (MSE), was found to be SARIMA (2, 0, 0)(0, 1, 1). The model parameters were identified and diagnosed to derive the forecasting equations at each selected location. The correlat…
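The "least AIC" selection step described above can be sketched generically. The study itself worked in R; the Python fragment below only illustrates the ranking idea, using a common least-squares form of AIC, and the residuals and parameter counts are synthetic.

```python
import math

# Illustrative sketch of ranking candidate SARIMA fits by AIC, as in the model
# selection described above. For a least-squares fit a common form is
#   AIC = n * ln(SSE / n) + 2k,   k = number of estimated parameters.
# The residual series and k values below are made up for illustration only.

def aic(residuals, k):
    n = len(residuals)
    sse = sum(r * r for r in residuals)
    return n * math.log(sse / n) + 2 * k

candidates = {
    "SARIMA(1,0,0)(0,1,1)": ([0.40, -0.30, 0.50, -0.20, 0.35, -0.45], 2),
    "SARIMA(2,0,0)(0,1,1)": ([0.10, -0.10, 0.15, -0.05, 0.10, -0.12], 3),
}
scores = {name: aic(res, k) for name, (res, k) in candidates.items()}
best = min(scores, key=scores.get)   # candidate with the least AIC wins
```

The smaller-residual candidate wins despite its extra parameter, which is exactly the trade-off AIC is designed to arbitrate.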
Forest fires continue to increase during the dry season, and they are difficult to stop. High dry-season temperatures can raise the drought index to levels at which the forest could burn at any time, so the government should conduct surveillance throughout the dry season. Continuous surveillance without focusing on particular times is ineffective and inefficient, because preventive measures are carried out without knowledge of the potential fire risk. In the Keetch-Byram Drought Index (KBDI), the drought-factor formulation calculates only today's drought, based on current weather conditions and yesterday's drought index. However, to find out the factors of drought a day after, the data…
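The daily KBDI update the abstract refers to can be sketched as follows. This uses the standard US-unit form of the Keetch-Byram drought factor; the constants are the commonly cited ones, the function name and all inputs are illustrative, and none of it is taken from this paper's own formulation.

```python
import math

# Hedged sketch of the daily Keetch-Byram update mentioned above (standard
# US-unit constants; illustrative inputs). Index runs 0 (saturated) to 800.
#   q_yesterday    : yesterday's drought index
#   t_max_f        : today's maximum temperature, degrees F
#   net_rain_in    : today's net rainfall in inches (after the 0.2-in threshold)
#   annual_rain_in : mean annual rainfall, inches

def kbdi_today(q_yesterday, t_max_f, net_rain_in, annual_rain_in):
    q = max(0.0, q_yesterday - 100.0 * net_rain_in)   # rain reduces the index
    drought_factor = ((800.0 - q)
                      * (0.968 * math.exp(0.0486 * t_max_f) - 8.30) * 1e-3
                      / (1.0 + 10.88 * math.exp(-0.0441 * annual_rain_in)))
    return min(800.0, max(0.0, q + drought_factor))

q_new = kbdi_today(q_yesterday=400.0, t_max_f=95.0, net_rain_in=0.0,
                   annual_rain_in=60.0)
```

On a hot, rainless day the index rises, which is the behavior the surveillance scheduling problem above depends on: tomorrow's risk is a function of today's index plus today's weather.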
This paper introduces the use of a neural network as a type of associative memory for the problem of mobile position estimation, in which a mobile station estimates its location from the signal strengths reaching it from several surrounding base stations; the neural network can be implemented inside the mobile. The traditional time-of-arrival (TOA) and received-signal-strength (RSS) methods are used and compared with two analytical methods, the optimal positioning method and the average positioning method. The training data are ideal, since they can be obtained from the geometry of the CDMA cell topology. The tests of the TOA and RSS methods cover many cases along a nonlinear path that the MS can move through tha…
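The RSS-based positioning idea can be sketched in a minimal form. This is an assumption-laden illustration, not the paper's method: it inverts a generic log-distance path-loss model to get per-station distance estimates, then takes a distance-weighted average of the known base-station coordinates (one plausible reading of an "average positioning method"). All constants and coordinates are made up.

```python
# Illustrative sketch of RSS-based positioning (model, constants, and names
# assumed, not from the paper). A log-distance path-loss model,
#   RSS = Ptx - 10 * n * log10(d),
# gives a distance estimate per base station; the mobile position is then
# approximated by a 1/d-weighted average of the base-station coordinates.

def rss_to_distance(rss_dbm, tx_power_dbm=0.0, path_loss_exp=3.0):
    """Invert the log-distance model to estimate distance (arbitrary units)."""
    return 10.0 ** ((tx_power_dbm - rss_dbm) / (10.0 * path_loss_exp))

def average_position(stations, rss_values):
    """Weight each base station by 1/d so nearer stations dominate."""
    dists = [rss_to_distance(r) for r in rss_values]
    weights = [1.0 / d for d in dists]
    total = sum(weights)
    x = sum(w * sx for w, (sx, _) in zip(weights, stations)) / total
    y = sum(w * sy for w, (_, sy) in zip(weights, stations)) / total
    return x, y

stations = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0)]
pos = average_position(stations, rss_values=[-20.0, -40.0, -40.0])
```

With the strongest signal coming from the station at the origin, the estimate lands near (0, 0), as expected; a neural network trained on ideal geometry-derived data, as in the paper, would learn this RSS-to-position mapping implicitly.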
Doppler broadening of the 511 keV positron annihilation line was used to estimate the concentration of defects at different deformation levels in pure aluminum samples. These samples were compressed at room temperature to 15, 22, 28, 38, 40, and 75% thickness reduction. The two-state positron-trapping model was employed. The S and W lineshape parameters were measured using a high-resolution gamma spectrometer with a high-purity germanium detector of 2.1 keV resolution at the 1.33 MeV line of 60Co. The change of defect concentration (Cd) with the deformation level (ε) is found to obey an empirical formula of the form Cd = A ε^B, where A and B are positive constants that depend mainly on the deformation procedure and the temperature at which the def…
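Fitting an empirical power law of the form Cd = A ε^B, as reported above, is conveniently done by linear least squares in log-log space. The sketch below illustrates that procedure only; the data points are synthetic (generated from A = 2.0, B = 1.5), not the paper's measurements.

```python
import math

# Hedged sketch of fitting the empirical power law Cd = A * eps**B by ordinary
# least squares on log(Cd) = log(A) + B * log(eps). The Cd values below are
# synthetic, generated from A = 2.0, B = 1.5 purely for illustration.

def fit_power_law(eps, cd):
    """Return (A, B) from a log-log linear least-squares fit."""
    xs = [math.log(e) for e in eps]
    ys = [math.log(c) for c in cd]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)
    return a, b

eps = [0.15, 0.22, 0.28, 0.38, 0.40, 0.75]      # deformation levels (fractions)
cd = [2.0 * e ** 1.5 for e in eps]              # synthetic defect concentrations
A, B = fit_power_law(eps, cd)
```

On noise-free synthetic data the fit recovers A and B exactly; with measured S- and W-parameter data the same fit would give the constants the abstract says depend on the deformation procedure and temperature.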
This research deals with the use of a number of statistical methods, such as the kernel method, watershed, histogram, and cubic spline, to improve the contrast of digital images. The results, evaluated using the RMSE and NCC metrics, show that the spline method is the most accurate compared with the other statistical methods.
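The two comparison metrics named above have standard definitions that can be sketched directly. The toy pixel lists below are illustrative; in practice each enhanced image would be compared against a reference image.

```python
import math

# Hedged sketch of the two metrics named above, computed on flattened pixel
# lists (toy data, not the paper's images).

def rmse(a, b):
    """Root-mean-square error between two equal-length pixel sequences."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

def ncc(a, b):
    """Normalized cross-correlation: 1.0 means a perfect linear match."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a)
                    * sum((y - mb) ** 2 for y in b))
    return num / den

ref = [10.0, 20.0, 30.0, 40.0]
out = [12.0, 19.0, 31.0, 41.0]
err = rmse(ref, out)     # lower is better
corr = ncc(ref, out)     # closer to 1.0 is better
```

Ranking methods by lower RMSE and higher NCC against a reference is exactly the comparison the abstract describes.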