In this study, water quality was expressed in terms of a Water Quality Index (WQI) determined by summarizing multiple parameters from water test results. Such an index offers a useful representation of the overall quality of water for public or any intended use, and supports pollution assessment, water quality management, and decision making. The WQI was applied with sixteen physicochemical water quality parameters to evaluate the suitability of Tigris River water for drinking. Water samples collected from eight stations in Baghdad city during the period 2004-2010 were subjected to comprehensive physicochemical analysis. The sixteen parameters were: Turbidity, Alkalinity (TA), Total Hardness (TH), Calcium (Ca), Magnesium (Mg), Iron (Fe), pH, Electrical Conductivity (EC), Sulphate (SO4-2), Chloride (Cl-), Total Solids (TS), Total Suspended Solids (TSS), Nitrite (NO2-), Nitrate (NO3-), Ammonia (NH3), and Orthophosphate (PO4-3). The average annual overall WQI over the study period was 224.32. This high value results from high concentrations of turbidity, total hardness, electrical conductivity, and total solids, which can be attributed to the various human activities taking place on the river banks. On this basis the Tigris River water is classified as of "very poor quality", ranging from poor water upstream near Al-Karhk WTP to water unsuitable for drinking downstream near Al-Wahda WTP, and would need further treatment. The present study demonstrates the application of the WQI in estimating and understanding the water quality of the Tigris River. The WQI appears promising for water quality management and a valuable tool for categorizing pollution sources in surface waters.
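The weighted arithmetic WQI commonly used in studies of this kind can be sketched as follows; the parameter standards, readings, and ideal values below are illustrative placeholders, not the values or method details of this study.

```python
# Weighted arithmetic Water Quality Index (WQI) -- illustrative sketch.
# Standards (S_i) and measured concentrations (C_i) below are hypothetical.

def wqi(measured, standards, ideal=None):
    """Compute a weighted arithmetic WQI.

    measured, standards: dicts mapping parameter name -> value.
    Weight w_i is taken inversely proportional to the standard S_i.
    Quality rating q_i = 100 * (C_i - V_i) / (S_i - V_i), with ideal
    value V_i = 0 for most parameters (7.0 is typical for pH).
    """
    ideal = ideal or {}
    k = 1.0 / sum(1.0 / s for s in standards.values())  # proportionality constant
    num = den = 0.0
    for p, s in standards.items():
        w = k / s
        v = ideal.get(p, 0.0)
        q = 100.0 * (measured[p] - v) / (s - v)
        num += w * q
        den += w
    return num / den

# Hypothetical single-sample readings against drinking-water-style limits.
standards = {"turbidity": 5.0, "TH": 300.0, "EC": 1000.0, "TS": 500.0, "pH": 8.5}
measured  = {"turbidity": 22.0, "TH": 420.0, "EC": 1350.0, "TS": 780.0, "pH": 7.9}
print(round(wqi(measured, standards, ideal={"pH": 7.0}), 2))
```

Under the usual grading for this method, a WQI above 100 flags water that is unsuitable for drinking without further treatment, which is how an average of 224.32 maps to the "very poor" class.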
Cover crops (CC) improve soil quality, including soil microbial enzymatic activities and soil chemical parameters. Scientific studies conducted at research centers have shown positive effects of CC on soil enzymatic activities; however, studies conducted in farmers' fields are lacking in the literature. The objective of this study was to quantify CC effects on soil microbial enzymatic activities (β-glucosidase, β-glucosaminidase, fluorescein diacetate hydrolase, and dehydrogenase) under a corn (Zea mays L.)–soybean (Glycine max (L.) Merr.) rotation. The study was conducted in 2016 and 2018 in Chariton County, Missouri, where CC were first established in 2012. All tested soil enzyme levels were significantly different between 2016 and 2018
Progression in computer networks and the emergence of new technologies in this field help produce new protocols and frameworks that provide new network-based services. E-government services, a modernized version of conventional government, are created through the steady evolution of technology and the growing need of societies for numerous services. Government services are deeply related to citizens' daily lives; therefore, it is important to evolve with technological developments and to move from traditional methods of managing government work to cutting-edge technical approaches that improve the effectiveness of government systems in providing services to citizens. Blockchain technology is amon
Semi-parametric model analysis is one of the most interesting subjects in recent studies because it gives efficient model estimation. The problem considered arises when the response variable takes one of two values, either 0 (no response) or 1 (response); the resulting model is the logistic regression model.
We compare two estimation methods, the Bayesian method and . The results were then compared using the MSE criterion. A simulation study was used to examine the empirical behavior of the logistic model with different sample sizes and variances. The results indicate that the Bayesian method is better than the at small sample sizes.
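The simulation setup described above can be sketched as follows; the true coefficients, sample size, and the gradient-ascent maximum likelihood fit used here are illustrative stand-ins for one of the compared estimators, not the study's exact procedure.

```python
import math, random

random.seed(1)

def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

# Simulate a simple logistic model: P(y=1 | x) = sigmoid(b0 + b1*x).
b0_true, b1_true = -0.5, 1.5
xs = [random.uniform(-2, 2) for _ in range(400)]
ys = [1 if random.random() < sigmoid(b0_true + b1_true * x) else 0 for x in xs]

# Classical fit: maximize the log-likelihood by simple gradient ascent.
b0, b1 = 0.0, 0.0
lr = 0.05
for _ in range(2000):
    g0 = sum(y - sigmoid(b0 + b1 * x) for x, y in zip(xs, ys))
    g1 = sum((y - sigmoid(b0 + b1 * x)) * x for x, y in zip(xs, ys))
    b0 += lr * g0 / len(xs)
    b1 += lr * g1 / len(xs)

# MSE criterion: mean squared error of the estimates against the true values.
mse = ((b0 - b0_true) ** 2 + (b1 - b1_true) ** 2) / 2
print(b0, b1, mse)
```

Repeating this over many replications and sample sizes, and computing the same MSE for each estimator, is how the methods would be ranked.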
Abstract
This research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, initial parameter values, and levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both natural and contaminated data.
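A minimal pure-Python Downhill Simplex (Nelder-Mead) sketch is given below; for brevity it fits a plain two-parameter Weibull by minimizing the negative log-likelihood rather than the study's four-parameter compound distribution, and all data and starting values are synthetic.

```python
import math, random

random.seed(7)

# Toy data: Weibull(shape=2, scale=1) samples via inverse transform.
# (The compound exponential Weibull-Poisson likelihood would slot into
# neg_loglik the same way; this two-parameter case keeps the sketch short.)
data = [(-math.log(random.random())) ** (1 / 2.0) for _ in range(500)]

def neg_loglik(p):
    k, lam = p
    if k <= 0 or lam <= 0:
        return float("inf")          # keep the simplex in the valid region
    n = len(data)
    return -(n * math.log(k) - n * k * math.log(lam)
             + (k - 1) * sum(math.log(x) for x in data)
             - sum((x / lam) ** k for x in data))

def nelder_mead(f, start, step=0.5, iters=300):
    """Minimal 2-D downhill simplex: reflect/expand/contract/shrink."""
    pts = [start, (start[0] + step, start[1]), (start[0], start[1] + step)]
    for _ in range(iters):
        pts.sort(key=f)
        best, mid, worst = pts
        c = ((best[0] + mid[0]) / 2, (best[1] + mid[1]) / 2)      # centroid
        xr = (2 * c[0] - worst[0], 2 * c[1] - worst[1])           # reflection
        if f(xr) < f(best):
            xe = (3 * c[0] - 2 * worst[0], 3 * c[1] - 2 * worst[1])  # expansion
            pts[2] = xe if f(xe) < f(xr) else xr
        elif f(xr) < f(mid):
            pts[2] = xr
        else:
            xc = ((c[0] + worst[0]) / 2, (c[1] + worst[1]) / 2)   # contraction
            if f(xc) < f(worst):
                pts[2] = xc
            else:                                                 # shrink
                pts = [best,
                       ((best[0] + mid[0]) / 2, (best[1] + mid[1]) / 2),
                       ((best[0] + worst[0]) / 2, (best[1] + worst[1]) / 2)]
    return min(pts, key=f)

k_hat, lam_hat = nelder_mead(neg_loglik, (1.0, 1.0))
print(k_hat, lam_hat)
```

Because the method uses only function values, never derivatives, it is comparatively robust when contamination distorts the likelihood surface, which is consistent with the ranking reported above.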
Wireless sensor networks (WSNs) are emerging in various applications such as military, area monitoring, health monitoring, and industrial monitoring. A key challenge for successful WSN applications is energy consumption, since the small, portable batteries integrated into the sensor chips cannot be recharged easily from an economical point of view. This work focuses on prolonging the network lifetime of WSNs by reducing and balancing energy consumption during the routing process from the standpoint of hop count. In this paper, a performance simulation was carried out between two protocols, LEACH, which uses a single-hop path, and MODLEACH, which uses a multi-hop path, on an Intel Core i3 CPU (2.13 GHz) laptop with MATLAB (R2014a). Th
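The single-hop versus multi-hop energy trade-off can be sketched with the first-order radio model conventionally used in LEACH-style simulations; the constants are the usual textbook values, and the packet size and distances are hypothetical, not taken from this paper.

```python
import math

# First-order radio model constants (standard values in LEACH literature).
E_ELEC = 50e-9       # J/bit, transceiver electronics energy
EPS_FS = 10e-12      # J/bit/m^2, free-space amplifier (d < D0)
EPS_MP = 0.0013e-12  # J/bit/m^4, multipath amplifier (d >= D0)
D0 = math.sqrt(EPS_FS / EPS_MP)  # ~87.7 m crossover distance

def tx_energy(bits, d):
    """Energy to transmit `bits` over distance d metres."""
    amp = EPS_FS * d ** 2 if d < D0 else EPS_MP * d ** 4
    return E_ELEC * bits + amp * bits

def rx_energy(bits):
    """Energy to receive `bits` (electronics only)."""
    return E_ELEC * bits

bits = 4000                      # one data packet (hypothetical size)
# Hypothetical geometry: source 200 m from the sink.
single_hop = tx_energy(bits, 200)
# Same distance covered in two 100 m hops; the relay must also receive.
multi_hop = tx_energy(bits, 100) + rx_energy(bits) + tx_energy(bits, 100)
print(single_hop, multi_hop)
```

Because amplifier energy grows with d^2 or d^4, splitting a long link into shorter hops can cost far less overall even after paying the relay's receive energy, which is the rationale for MODLEACH's multi-hop routing.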
The evolution of the Internet of Things (IoT) has connected billions of heterogeneous physical devices to improve the quality of human life by collecting data from their environments. However, this creates a need to store huge amounts of data with large storage and high computational capability. Cloud computing can be used to store such big data. The data of IoT devices are transferred using two types of protocols: Message Queuing Telemetry Transport (MQTT) and Hypertext Transfer Protocol (HTTP). This paper aims to build a high-performance and more reliable system through efficient use of resources. Thus, load balancing in cloud computing is used to dynamically distribute the workload across nodes to avoid overloading any individual r
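Dynamic workload distribution of this kind can be sketched with a least-loaded dispatch policy; the node names, task costs, and the specific policy below are illustrative, not the scheme evaluated in the paper.

```python
import heapq

# Minimal dynamic load balancer: dispatch each incoming task to the node
# currently carrying the least load (a common cloud balancing policy).
# Node names and task costs are hypothetical.

def balance(tasks, nodes):
    heap = [(0.0, name) for name in nodes]   # (current load, node)
    heapq.heapify(heap)
    assignment = {name: [] for name in nodes}
    for task, cost in tasks:
        load, name = heapq.heappop(heap)     # least-loaded node first
        assignment[name].append(task)
        heapq.heappush(heap, (load + cost, name))
    return assignment

tasks = [("t1", 5), ("t2", 1), ("t3", 4), ("t4", 2), ("t5", 2)]
print(balance(tasks, ["node-a", "node-b", "node-c"]))
```

Each task lands on whichever node is cheapest at that moment, so no single node accumulates load while others sit idle, which is the overload-avoidance goal stated above.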
Feature selection (FS) constitutes a series of processes used to decide which relevant features/attributes to include and which irrelevant features to exclude for predictive modeling. It is a crucial task that helps machine learning classifiers reduce error rates, computation time, and overfitting, and improve classification accuracy. It has demonstrated its efficacy in a myriad of domains, ranging from text classification (TC) and text mining to image recognition. While there are many traditional FS methods, recent research efforts have been devoted to applying metaheuristic algorithms as FS techniques for the TC task. However, there are few literature reviews concerning TC. Therefore, a comprehensive overview was systematicall
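The wrapper-style FS idea behind many of these metaheuristics can be sketched with a simple bit-flip hill climb over feature masks; the synthetic data, the nearest-centroid scorer, and the hill-climbing search are all illustrative stand-ins for the metaheuristics the review surveys.

```python
import random

random.seed(3)

# Toy wrapper feature selection: search over binary feature masks, scoring
# each mask by the training accuracy of a tiny nearest-centroid classifier.
# Synthetic data: features 0 and 1 carry the class signal, 2-5 are noise.

def make_data(n=200, n_feat=6):
    data = []
    for _ in range(n):
        label = random.randint(0, 1)
        x = [random.gauss(label * 2.0, 1.0) if j < 2 else random.gauss(0.0, 1.0)
             for j in range(n_feat)]
        data.append((x, label))
    return data

def accuracy(mask, data):
    feats = [j for j, keep in enumerate(mask) if keep]
    if not feats:
        return 0.0
    cent = {c: [sum(x[j] for x, y in data if y == c)
                / max(1, sum(1 for _, y in data if y == c))
                for j in feats]
            for c in (0, 1)}
    correct = 0
    for x, y in data:
        d = {c: sum((x[j] - cent[c][i]) ** 2 for i, j in enumerate(feats))
             for c in (0, 1)}
        correct += (min(d, key=d.get) == y)
    return correct / len(data)

data = make_data()
mask = [random.randint(0, 1) for _ in range(6)]
best = accuracy(mask, data)
for _ in range(200):                 # single-bit-flip hill climbing
    j = random.randrange(6)
    cand = mask[:]
    cand[j] = 1 - cand[j]
    score = accuracy(cand, data)
    if score >= best:
        mask, best = cand, score
print(mask, best)
```

Metaheuristics such as genetic algorithms or particle swarm optimization replace the naive bit-flip loop with a population-based search over the same mask space and the same wrapper fitness.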
In this paper, a fixed point theorem for nonexpansive mappings is established to study the existence of solutions and sufficient conditions for the controllability of nonlinear fractional control systems in reflexive Banach spaces. The result so obtained has been modified and developed for arbitrary spaces having Opial's condition, using a fixed point theorem for nonexpansive mappings defined on a set with normal structure. An application is provided to show the effectiveness of the obtained result.
Steganography is the science of hiding information in the cover media, a force in the context of information security.
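The classic instance of this idea is least-significant-bit (LSB) embedding; the sketch below hides a text message in the low bits of a byte sequence standing in for image pixel data, and is a generic illustration rather than any particular published scheme.

```python
# Minimal least-significant-bit (LSB) steganography sketch: hide a text
# message in the low bits of a byte sequence standing in for image pixels.

def embed(cover: bytes, message: bytes) -> bytes:
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    if len(bits) > len(cover):
        raise ValueError("cover too small for message")
    out = bytearray(cover)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit   # overwrite only the lowest bit
    return bytes(out)

def extract(stego: bytes, n_bytes: int) -> bytes:
    bits = [b & 1 for b in stego[: n_bytes * 8]]
    return bytes(sum(bit << (7 - i) for i, bit in enumerate(bits[j:j + 8]))
                 for j in range(0, len(bits), 8))

cover = bytes(range(256)) * 2            # stand-in "pixel" data
stego = embed(cover, b"hi")
print(extract(stego, 2))                 # -> b'hi'
```

Because only the lowest bit of each carrier byte changes, the stego data is visually indistinguishable from the cover in typical image use, which is what makes the technique a force in information security.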