Hygienic engineering has devoted considerable time and energy to studying water filtration because of its importance to human health. As the properties of raw water continue to change, thorough familiarity with the filtration process is essential for the design engineer to keep up with, and profit from, advances in filtering technology and equipment. Filtration is an integral part of water purification because it removes sediment, chemicals, odors, and microbes. Rapid sand filtration, achieved with either gravity or pressure sand filters, is considered the most popular technique for treating surface water for municipal supply. Predicting the performance of treatment-plant units is a basic design principle. This research therefore compared gravity and pressure sand filters in terms of construction, use, efficiency, filtration rate, cost, benefits, and drawbacks, in order to predict the performance of these units under different conditions and from an economic standpoint. It also presented and reviewed previous studies on the evaluation and development of pressure and gravity filters. The paper gives a brief overview of filtration theory, the types and properties of filter media, filter backwashing, and operational problems that can be avoided in the filtration process.
The gravity method measures relatively small variations in the Earth's gravitational field caused by lateral variations in rock density. In the current research, a new technique is applied to the Bouguer map compiled from gravity surveys conducted during 1940–1950, over selected areas of the south-western desert of Iraq within the administrative boundaries of the Najaf and Anbar provinces. The technique relies on the theory of gravity inversion, in which gravity values can be related to density-contrast variations with depth; the inverted gravity data are then used to compute density and velocity models at four selected depth slices: 9.63 km, 1.1 km, 0.682 km, and 0.407 km.
The quality of fresh water resources is a crucial issue worldwide. In Egypt, the Nile River is the main source of fresh water, and monitoring its water quality is a major task at both government and research levels. In the present case study, the physical, chemical, and algal distribution in the Nile River was monitored over two seasons (winter and summer) in 2019. The aims of the study were to examine the seasonal variation among the different water parameters and to check the correlations between those parameters. Water samples were collected from the Nile in the Cairo governorate in Egypt, and their physiochemical and microbiological properties were assessed. The studied parameters included: te
Throughout the ages, the methods of human production, exchange, and communication changed little, and lifestyles saw no rapid and comprehensive changes until the advent of advanced, modern information and communication technologies, which led to the emergence of new kinds of intellectual works and innovations that are created and circulated through the virtual medium. These innovations give rise to many legal disputes, and domain names are among the most important foundations of information networks, as they are the key to entering the virtual world and to distinguishing websites. Because of the novelty of domain names, many attacks have occurred on them, which are closely related to intellectual property rights. And
Analyzing economic, financial, and other phenomena requires building an appropriate model that represents the causal relations between factors. Building such a model depends on capturing the surrounding conditions and factors in a mathematical formula, and researchers aim to construct that formula appropriately. Classical linear regression models are an important statistical tool, but their use is limited, since the relationship between the explanatory variables and the response variable is assumed to be known. To broaden the representation of relationships between the variables describing the phenomenon under study, we used varying coefficient models.
The study aimed to determine the coordinates of points measured in different ways and with different instruments. The most precise method, the differential global positioning system (DGPS), served as the reference measurement for comparison with less precise methods: a navigator GPS, Google Earth (Pro), and GPS applications on mobile phones (Samsung and iPhone). Eight points at various locations were chosen for this research. Comparing the different observations gives an idea of how much the accuracy of the observations differs among the devices used, and identifies the best device and the best way to measure coordinates accurately t
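Comparisons of this kind reduce to computing the horizontal distance between each device's fix and the DGPS reference for the same point. A minimal sketch using the haversine great-circle formula is shown below; the coordinates and the function name are illustrative assumptions, not data or code from the paper.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two latitude/longitude fixes."""
    R = 6371000.0  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

# Hypothetical example: a DGPS reference fix and the same point as
# reported by a handheld navigator, a few metres apart.
dgps = (33.312800, 44.361500)
nav = (33.312835, 44.361540)
error_m = haversine_m(dgps[0], dgps[1], nav[0], nav[1])
print(round(error_m, 1))  # horizontal error of the navigator, in metres
```

Repeating this for each of the eight points and each device yields the per-device error statistics such a comparison is based on.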
This narrative review focused on research investigating the impact of loneliness on the prevalence of dementia and its relationship with other risk factors. A comprehensive and rigorous search was conducted across a variety of scientific databases, using specific keywords, to identify all prior studies that examined the correlation between dementia and loneliness. The inquiry was confined to articles published in English from January 2017 to March 2024. The review identified a consensus on the role of loneliness in increasing the risk of all-cause dementia, with particular emphasis on the subjective perception of loneliness. This phenomenon may be caused by the sensations of exclusion, discrimination, and alienation that are
Abstract
The non-homogeneous Poisson process (NHPP) is a statistical subject of importance in other sciences, with wide application in areas such as queueing systems, repairable systems, computer and communication systems, and reliability theory, among others. It is also used to model phenomena that occur at a non-constant rate over time (events whose intensity changes with time).
This research deals with some of the basic concepts related to the non-homogeneous Poisson process and applies two NHPP models, the power-law model and the Musa–Okumoto model, to estimate th
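The power-law NHPP mentioned above is commonly parameterized with intensity λ(t) = (β/θ)(t/θ)^(β−1) and mean function m(t) = (t/θ)^β. A minimal simulation sketch under that standard parameterization (not the paper's own code; the function and parameter names are illustrative) uses the time-transformation of a unit-rate Poisson process:

```python
import numpy as np

def simulate_power_law_nhpp(beta, theta, t_max, rng):
    """Simulate event times of a power-law NHPP on [0, t_max].

    If s_1 < s_2 < ... are arrivals of a unit-rate homogeneous Poisson
    process, then t_i = m^{-1}(s_i) with m(t) = (t/theta)**beta are
    arrivals of the power-law NHPP.
    """
    times = []
    s = 0.0
    while True:
        s += rng.exponential(1.0)        # next unit-rate arrival
        t = theta * s ** (1.0 / beta)    # invert m(t) = (t/theta)**beta
        if t > t_max:
            break
        times.append(t)
    return np.array(times)

rng = np.random.default_rng(0)
events = simulate_power_law_nhpp(beta=1.5, theta=2.0, t_max=100.0, rng=rng)
# The expected event count on [0, 100] is m(100) = (100/2)**1.5, about 354.
print(len(events))
```

With β > 1 the event rate grows over time, which is what makes the model non-homogeneous; β = 1 recovers the ordinary homogeneous Poisson process.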
Encryption translates data into another form or symbol so that only people with access to the secret key or password can read it. Encrypted data are generally referred to as ciphertext, while unencrypted data are known as plaintext. Entropy can be used as a measure of the number of bits needed to code the data of an image: as the pixel values of an image are spread over more gray levels, the entropy increases. The aim of this research is to compare the CAST-128 method with a proposed adaptive key against the RSA encryption method on video frames, to determine which method is more accurate and yields the highest entropy. The first method is achieved by applying the "CAST-128" and
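The entropy measure described above is the Shannon entropy of the image's gray-level histogram. A minimal sketch for an 8-bit grayscale frame (the function name is our own, not from the paper):

```python
import numpy as np

def image_entropy(img):
    """Shannon entropy, in bits per pixel, of an 8-bit grayscale image."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()     # probability of each gray level
    p = p[p > 0]              # empty gray levels contribute nothing
    return -np.sum(p * np.log2(p))

# A constant image needs 0 bits/pixel; uniform noise approaches the
# maximum of 8 bits/pixel for 256 gray levels.
flat = np.zeros((64, 64), dtype=np.uint8)
noisy = np.random.default_rng(0).integers(0, 256, (64, 64), dtype=np.uint8)
print(image_entropy(flat), image_entropy(noisy))
```

A well-encrypted frame should look like the noise case, with entropy close to 8 bits per pixel, which is why higher entropy is taken as the better result in this comparison.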
In this paper, the double Sumudu and double Elzaki transform methods are used to compute numerical solutions for some types of fractional-order partial differential equations with constant coefficients. The efficiency of the methods is demonstrated through illustrative numerical examples computed in Mathcad 15, with graphics produced in Matlab R2015a.
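For reference, the two transforms are commonly defined in the literature as follows, with $u$ and $v$ the transform variables (this is our summary of the standard definitions, not a formula taken from the paper):

```latex
S_2\{f(x,t)\}(u,v) = \frac{1}{uv}\int_0^{\infty}\!\!\int_0^{\infty}
  e^{-\left(\frac{x}{u}+\frac{t}{v}\right)} f(x,t)\,dx\,dt,
\qquad
E_2\{f(x,t)\}(u,v) = uv\int_0^{\infty}\!\!\int_0^{\infty}
  e^{-\left(\frac{x}{u}+\frac{t}{v}\right)} f(x,t)\,dx\,dt.
```

The two transforms share the same kernel and differ only in the scaling prefactor, which is why both lend themselves to the same class of constant-coefficient fractional PDEs.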