Improving the quality of effluent discharged from wastewater treatment plants is essential for maintaining a healthy environment and healthy water resources. This study evaluated intermittent slow sand filtration as a promising tertiary treatment for sequencing batch reactor (SBR) effluent. A laboratory-scale slow sand filter (SSF) with a uniformity coefficient of 1.5 and a filtration rate of 0.1 m/h was used to study process performance. The SSF proved very efficient at oxidizing organic matter, with COD removal efficiency of up to 95%; it also removed considerable amounts of phosphate (76% removal efficiency) and turbidity (87%). The filter sharply reduced the mass of suspended and dissolved material, achieving TSS and conductivity removal efficiencies of about 99% each. Slow sand filtration therefore appears to be a promising and economically achievable tertiary treatment for SBR effluent, a means of upgrading wastewater effluents to meet more stringent water quality standards so that the treated effluent can be reused for recreational purposes such as gardening and irrigation, as well as safely discharged.
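The removal efficiencies quoted above follow the standard definition, the fractional drop in a constituent's concentration across the filter. A minimal sketch (the influent and effluent concentrations below are illustrative, not values from the study):

```python
def removal_efficiency(c_in: float, c_out: float) -> float:
    """Percent removal of a constituent across a treatment stage."""
    return 100.0 * (c_in - c_out) / c_in

# Illustrative: influent COD of 200 mg/L reduced to 10 mg/L
print(removal_efficiency(200.0, 10.0))  # 95.0
```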
The exponential growth in the number of papers readily available on the web has made efficient methods for finding the document most relevant to a given search query crucial. The vector space model (VSM), a widely used model in information retrieval, represents words as vectors in space and assigns them weights via a popular weighting scheme known as term frequency–inverse document frequency (TF-IDF). This work proposes retrieving the most relevant documents by representing documents and queries as vectors of average term frequency–inverse sentence frequency (TF-ISF) weights instead of representing them as v…
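For orientation, the classic TF-IDF baseline that the proposed TF-ISF variant modifies (replacing document-level frequencies with sentence-level ones) can be sketched as follows; the two toy documents are purely illustrative:

```python
import math
from collections import Counter

def tf_idf(docs: list[list[str]]) -> list[dict[str, float]]:
    """Per-document weights: tf(t, d) * log(N / df(t))."""
    n = len(docs)
    df = Counter(term for doc in docs for term in set(doc))  # document frequency
    weights = []
    for doc in docs:
        tf = Counter(doc)
        weights.append({t: (c / len(doc)) * math.log(n / df[t])
                        for t, c in tf.items()})
    return weights

docs = [["sand", "filter", "water"], ["water", "query", "search"]]
w = tf_idf(docs)
# "water" occurs in every document, so its idf (and weight) is zero
```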
ECG is an important tool for the primary diagnosis of heart diseases, as it shows the electrophysiology of the heart. In our method, a single maternal abdominal ECG signal is taken as the input signal, and the maternal P-QRS-T complexes of the original signal are averaged, repeated, and taken as a reference signal. The LMS and RLS adaptive filter algorithms are then applied. The results show that the fetal ECGs were successfully detected. The accuracy on the Daisy database was up to 84% for LMS and 88% for RLS, while on PhysioNet it was up to 98% and 96% for LMS and RLS, respectively.
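The LMS stage described above subtracts an adaptively filtered version of the maternal reference from the abdominal signal, leaving the fetal component in the residual. A minimal sketch of that idea, with the ECGs replaced by simple illustrative sinusoids (the filter order and step size are assumptions, not the paper's settings):

```python
import numpy as np

def lms_cancel(d: np.ndarray, x: np.ndarray, order: int = 8, mu: float = 0.01) -> np.ndarray:
    """LMS adaptive noise cancellation.
    d: primary signal (maternal + fetal), x: maternal reference.
    Returns the error signal e, which approximates the fetal component."""
    w = np.zeros(order)
    e = np.zeros(len(d))
    for n in range(order - 1, len(d)):
        u = x[n - order + 1:n + 1][::-1]   # most recent reference samples
        y = w @ u                          # estimate of the maternal part
        e[n] = d[n] - y                    # residual ~ fetal component
        w += 2 * mu * e[n] * u             # LMS weight update
    return e

# Toy demo: large "maternal" sinusoid plus a small "fetal" one
t = np.arange(2000)
maternal = np.sin(2 * np.pi * t / 50)
fetal = 0.1 * np.sin(2 * np.pi * t / 17)
e = lms_cancel(maternal + fetal, maternal)
# After convergence, e tracks the small fetal component
```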
In this paper, we describe the cases of marriage and divorce in the city of Baghdad on its two sides, Rusafa and Karkh. The data were collected from the Supreme Judicial Council. The cubic spline interpolation method was used to estimate the function passing through the given points, and the extrapolation method was applied to estimate the marriage and divorce cases for the next year; Rusafa and Karkh were then compared using MATLAB.
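The same interpolate-then-extrapolate step can be sketched briefly. The paper uses MATLAB; the sketch below uses SciPy's `CubicSpline` instead, and the yearly counts are invented for illustration, not the paper's data:

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical yearly case counts (illustrative only)
years = np.array([2015, 2016, 2017, 2018, 2019])
cases = np.array([1200.0, 1150.0, 1300.0, 1280.0, 1350.0])

spline = CubicSpline(years, cases)   # passes exactly through every data point
mid_year = spline(2017.5)            # interpolation between observed years
forecast = spline(2020)              # extrapolation: estimate for the next year
```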
Computer vision seeks to mimic the human visual system and plays an essential role in artificial intelligence. It is based on various signal preprocessing techniques; therefore, developing efficient techniques is essential to achieving fast and reliable processing. Various signal preprocessing operations have been used in computer vision, including smoothing, signal analysis, resizing, sharpening, and enhancement, to reduce unwanted distortions and to improve segmentation and image features. For example, to reduce the noise in a disturbed signal, smoothing kernels can be used effectively; this is achieved by convolving the disturbed signal with a smoothing kernel. In addition, orthogonal moments (OMs) are a cruc…
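The smoothing-by-convolution step mentioned above can be sketched in a few lines; the signal and the 9-point moving-average kernel are illustrative choices, not the paper's:

```python
import numpy as np

def smooth(signal: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Denoise a 1-D signal by convolving it with a normalized smoothing kernel."""
    kernel = kernel / kernel.sum()            # normalize to preserve amplitude
    return np.convolve(signal, kernel, mode="same")

rng = np.random.default_rng(1)
clean = np.sin(np.linspace(0, 4 * np.pi, 400))
noisy = clean + 0.3 * rng.standard_normal(400)
smoothed = smooth(noisy, np.ones(9))          # 9-point moving-average kernel
```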
Cloud storage provides scalable, low-cost resources with economies of scale based on a cross-user architecture, and data storage is the most important cloud service. As the amount of outsourced data grows explosively, data deduplication, a technique that eliminates data redundancy, becomes essential. To protect the data owner's privacy, data are stored in the cloud in encrypted form. However, encrypted data introduce new challenges for cloud data deduplication, which is crucial for data storage: traditional deduplication schemes cannot work on encrypted data, and existing solutions for encrypted-data deduplication suffer from security weaknesses. This paper proposes combining compressive sensing and video deduplication to maximize…
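For context, the "traditional" deduplication that fails on ciphertext is content-addressed storage: identical chunks hash to the same key and are stored once. A minimal sketch of that baseline (not the paper's encrypted-data scheme; the chunk contents are illustrative):

```python
import hashlib

def dedup_store(store: dict[str, bytes], chunk: bytes) -> str:
    """Content-addressed store: an identical chunk is kept only once."""
    digest = hashlib.sha256(chunk).hexdigest()
    store.setdefault(digest, chunk)   # no-op if this content already exists
    return digest

store: dict[str, bytes] = {}
first = dedup_store(store, b"frame-0001")
second = dedup_store(store, b"frame-0001")  # duplicate upload, deduplicated
```

With semantically secure encryption, identical plaintext chunks encrypt to different ciphertexts, so their hashes no longer collide, which is exactly why plain content hashing breaks down.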
A simulation of a direct-current (DC) discharge plasma in COMSOL Multiphysics was used to study the uniformity of deposition on the anode from DC discharge sputtering with ring and disc cathodes; the setup was then realized experimentally to compare the measured film thickness distribution with the simulation results. Both the simulation and the experimental results show that deposition using a copper ring cathode is more uniform than with a disc cathode.
Researchers take a special interest in Markov chains as probabilistic models with many applications in different fields. This study deals with the changes that occur in budget expenditures using statistical methods, and Markov chains express such changes well, as they are reliable models for prediction. A transition matrix is built for three expenditure states (increase, decrease, stability) of one budget expenditure item (base salary) across three directorates (Baghdad, Nineveh, Diyala) of one of the ministries. The results are analyzed by applying maximum likelihood estimation and ordinary least squares methods, resulting…
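The maximum likelihood estimate of a transition matrix from an observed state sequence is simply the normalized transition counts. A minimal sketch with an invented sequence over the three states named above (the data are illustrative; the sketch assumes every state is observed at least once as a source):

```python
import numpy as np

def estimate_transition_matrix(states: list[int], n_states: int) -> np.ndarray:
    """MLE of a Markov transition matrix: P[i, j] = count(i -> j) / count(i -> any)."""
    counts = np.zeros((n_states, n_states))
    for i, j in zip(states, states[1:]):
        counts[i, j] += 1
    return counts / counts.sum(axis=1, keepdims=True)

# States: 0 = increase, 1 = decrease, 2 = stability (illustrative sequence)
seq = [0, 0, 1, 2, 0, 1, 1, 2, 2, 0]
P = estimate_transition_matrix(seq, 3)
# Each row of P is a probability distribution over the next state
```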
In modern technology, ownership of electronic data is the key to securing one's privacy and identity against tracking or interference. Accordingly, a class of identity management systems known as Digital Identity Management, deployed over recent years, acts as a holder of identity data so as to maintain the holder's privacy and prevent identity theft. Nevertheless, an overwhelming number of users face two major problems: users who own their data must let third-party applications handle it, and other users have no ownership of their data at all. Maintaining these identities is a challenge today. This paper proposes a system that solves the problem using blockchain technology for Digital Identity Management systems. Blockchain is a powerful techniqu…
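The property that makes a blockchain useful here is that each block's hash covers both its payload and the previous block's hash, so identity records cannot be silently altered. A minimal hash-chain sketch (the record fields are hypothetical, not the paper's schema):

```python
import hashlib
import json

def make_block(data: dict, prev_hash: str) -> dict:
    """Append-only block: any change to data or history changes the hash."""
    body = {"data": data, "prev_hash": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    body["hash"] = digest
    return body

genesis = make_block({"id": "user-001", "claim": "registered"}, "0" * 64)
nxt = make_block({"id": "user-001", "claim": "verified"}, genesis["hash"])
# nxt is cryptographically linked to genesis via prev_hash
```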
In this paper, the transfer function model in time series was estimated using different methods: a parametric method represented by the conditional likelihood function, and two nonparametric methods, local linear regression and the cubic smoothing spline. This research aims to compare these approaches on the nonlinear transfer function model through simulation, studying two models for the output variable and one model for the input variable, in addition t…
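Of the nonparametric methods named above, the cubic smoothing spline trades goodness of fit against roughness via a smoothing parameter. A brief sketch using SciPy's `UnivariateSpline` on invented noisy data (the signal and smoothing factor are illustrative assumptions):

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(2)
x = np.linspace(0, 10, 200)
y = np.sin(x) + 0.2 * rng.standard_normal(200)   # noisy observations

# s controls the trade-off: s = 0 interpolates exactly, larger s smooths more.
# Here s is set near n * noise_variance, a common rule of thumb.
spline = UnivariateSpline(x, y, k=3, s=len(x) * 0.04)
fitted = spline(x)
```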