The goal of this research is to develop a numerical model that simulates the sedimentation process under two scenarios: first, with the flocculation unit in service, and second, with the flocculation unit out of commission. The general equations of flow and sediment transport were solved using the finite difference method and coded in Matlab. The difference in removal efficiency between the coded model and the operational model for each particle-size dataset was small, at +3.01%, indicating that the model can be used to predict the removal efficiency of a rectangular sedimentation basin. The study also revealed that the critical particle size was 0.01 mm: most particles with diameters larger than 0.01 mm settled under physical (gravitational) force, while most particles with diameters smaller than 0.01 mm settled through the flocculation process. At 10 m from the inlet zone, removal already exceeded 60% of the total removal rate, indicating that increasing basin length is not a cost-effective way to improve removal efficiency. The influence of the flocculation process appears at particle sizes smaller than 0.01 mm, which account for a small percentage (10%) of the sieve analysis test. When this percentage reaches 20%, the difference in accumulative removal efficiency rises from +3.57% to 11.1% at the AL-Muthana sedimentation unit.
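The discrete-settling half of such a model rests on a settling-velocity calculation. The following is a minimal sketch, assuming Stokes' law for the small particles and an ideal rectangular basin; the parameter values and function names are illustrative assumptions, not taken from the study itself.

```python
# Minimal sketch: Stokes settling velocity and ideal-basin removal fraction.
# All constants below are illustrative assumptions (quartz-like particle in
# water at 20 C), not values from the study.
G = 9.81          # gravitational acceleration, m/s^2
RHO_P = 2650.0    # particle density, kg/m^3
RHO_W = 998.0     # water density, kg/m^3
MU = 1.002e-3     # dynamic viscosity of water, Pa*s

def stokes_velocity(d_m):
    """Settling velocity (m/s) of a small sphere of diameter d_m (metres)."""
    return G * (RHO_P - RHO_W) * d_m ** 2 / (18.0 * MU)

def ideal_removal(d_m, overflow_rate):
    """Removal fraction in an ideal rectangular basin: min(1, v_s / v_o),
    where v_o = Q/A is the surface overflow rate (m/s)."""
    return min(1.0, stokes_velocity(d_m) / overflow_rate)
```

Under these assumptions a 0.01 mm particle settles at roughly 9e-5 m/s, which is why particles below this size depend on flocculation rather than gravity alone.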
The aim of this thesis is to estimate hidden and hard-to-reach population groups through a field study estimating the number of drug users among males aged 15-60 in the Baghdad governorate.
Data approved by government institutions are absent, and it is difficult to estimate the numbers of these people through a traditional survey, in which respondents report about themselves or, in some cases, their family members. Given these challenges, the Network Scale-Up Method (NSUM) is used, which is based mainly on asking respondents how many drug users they know within their personal networks.
Based on this principle, a statistical questionnaire was designed to
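The core NSUM ratio estimator can be sketched as follows; the function name and the numbers in the usage example are illustrative only and are not taken from the thesis.

```python
def nsum_estimate(known_hidden, network_sizes, total_population):
    """Basic Network Scale-Up (Killworth-style ratio) estimate of a hidden
    population:
        N_hidden ~= N_total * (sum of hidden-group contacts) / (sum of network sizes)
    known_hidden[i]  : how many members of the hidden group respondent i reports knowing
    network_sizes[i] : estimated total personal network size of respondent i
    """
    return total_population * sum(known_hidden) / sum(network_sizes)

# Illustrative numbers only: 3 respondents drawn from a population of 1,000.
estimate = nsum_estimate([2, 1, 3], [100, 200, 100], 1000)
# (2 + 1 + 3) / 400 * 1000 = 15 estimated hidden-group members
```

In practice the network sizes themselves are estimated indirectly, e.g. by asking about groups of known size, which is where most of the survey-design effort goes.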
In this paper, the computational complexity is reduced using a revised version of the selected mapping (SLM) algorithm: a partial SLM is performed, cutting the mathematical operations by around 50%. Although the peak-to-average power ratio (PAPR) reduction gain is slightly degraded, the dramatic reduction in computational complexity is an outstanding achievement. Matlab simulation is used to evaluate the results, and the PAPR results demonstrate the capability of the proposed method.
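For readers unfamiliar with the baseline, here is a sketch of PAPR and conventional SLM (not the paper's reduced-complexity partial variant); the candidate count and phase generation are illustrative assumptions.

```python
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a time-domain signal, in dB."""
    p = np.abs(x) ** 2
    return 10.0 * np.log10(p.max() / p.mean())

def slm(X, num_candidates=4, seed=0):
    """Conventional SLM: multiply the frequency-domain symbol X by random
    phase sequences, take the IFFT of each candidate, and keep the one with
    the lowest PAPR. The plain IFFT (all-ones phase sequence) is the first
    candidate, so the result is never worse than no SLM at all."""
    rng = np.random.default_rng(seed)
    best = np.fft.ifft(X)
    for _ in range(num_candidates - 1):
        phases = np.exp(2j * np.pi * rng.random(len(X)))
        cand = np.fft.ifft(X * phases)
        if papr_db(cand) < papr_db(best):
            best = cand
    return best
```

The complexity driver is the one IFFT per candidate; a partial SLM in the spirit of the paper would reuse intermediate results so that fewer full IFFTs are needed.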
The basic solution to the difficulties posed by the huge size of digital images is to employ image compression techniques that reduce image size for efficient storage and fast transmission. In this paper, a new pixel-based scheme is proposed for grayscale image compression that combines a spatial-modelling technique (minimum residual) with the transform technique of the Discrete Wavelet Transform (DWT), mixing lossless and lossy techniques to ensure high performance in terms of both compression ratio and quality. The proposed technique has been applied to a set of standard test images, and the results obtained are significantly encouraging compared with Joint P
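To illustrate the lossless/lossy split that such hybrid schemes exploit, here is a one-level 1-D Haar wavelet transform sketch; the Haar basis is an assumption for illustration, not necessarily the wavelet used in the paper.

```python
def haar_1d(signal):
    """One level of the 1-D Haar wavelet transform: pairwise averages
    (approximation band) and differences (detail band). Quantizing or
    dropping the detail band is the usual lossy step; keeping it exactly
    makes the transform losslessly invertible."""
    avg = [(a + b) / 2.0 for a, b in zip(signal[0::2], signal[1::2])]
    det = [(a - b) / 2.0 for a, b in zip(signal[0::2], signal[1::2])]
    return avg, det

def haar_1d_inverse(avg, det):
    """Exact inverse of haar_1d."""
    out = []
    for s, d in zip(avg, det):
        out.extend([s + d, s - d])
    return out
```

A 2-D image transform applies the same step along rows and then columns; smooth image regions produce near-zero detail coefficients that compress well.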
Merging biometrics with cryptography has become more familiar, and a rich scientific field has emerged for researchers. Biometrics adds a distinctive property to security systems, since biometric features are unique to each individual. In this study, a new method is presented for ciphering data based on fingerprint features. The plaintext message is addressed, based on the positions of minutiae extracted from a fingerprint, into a generated random text file, regardless of the size of the data. The proposed method can be explained in three scenarios. In the first scenario, the message is placed directly inside the random text at the minutiae positions; in the second scenario, the message is encrypted with a chosen word before ciphering
WA Shukur, Journal of the College of Basic Education, 2011. The aim of this research is designing and implementing a proposed steganographic method. The proposed method does not use one specific type of digital media as a cover; it can use all types of digital media, such as audio, all types of images, video, and all types of files, with the same security, accuracy, and quality of the original data, provided that the size of the embedded data is smaller than the size of the cover. The proposed method hides embedded data in digital media without changing or affecting the quality of the cover data; that is, the difference rate between the cover before the hiding operation and the stego is zero. The proposed steg
The Web service security challenge is to understand and assess the risk involved in securing a web-based service today, based on our existing security technology, and at the same time track emerging standards and understand how they will be used to offset the risk in new web services. Any security model must illustrate how data can flow through an application and network topology to meet the requirements defined by the business without exposing the data to undue risk. In this paper we propose
This is a review of the conversion of vegetable oils into glycidyl ethers, focusing on their roles in achieving sustainability and improving epoxy resin performance. The process involves functionalization of triglycerides via epoxidation followed by glycidylation, and yields bio-based monomers with improved mechanical and thermal properties. The review covers the underlying chemistry, production drivers, industrial applications, and future issues, supported by quantitative data and comparative studies. In addition, it integrates recent data on catalyst choice, feedstock flexibility, and environmental performance factors of bio-based resins, indicating their suitability for replacing traditional petroleum-based components.
The current research is concerned with methods of formation and their effect on the sintering process of ceramic materials. The research is divided into a number of chapters. The first chapter addresses the research structure (the research problem, importance, objective, and limits, and it also defines the terms used in the research). The second chapter addresses the theoretical framework, which is divided into three sections. The first section deals with methods of forming ceramic materials: 1- the plasticizing method, 2- the semi-dry pressing method, 3- the dry pressing method, 4- the extrusion method, 5- the casting method.
The researcher found that there is a clear difference between the methods through her formati
Abstract
Metal cutting processes still represent the largest class of manufacturing operations, and turning is the most commonly employed material removal process. This research focuses on analysis of the thermal field of the oblique machining process. The finite element method (FEM) software DEFORM 3D V10.2 was used together with experimental work carried out using infrared imaging equipment, covering both hardware and software simulations. The thermal experiments were conducted on AA6063-T6, using different tool obliquities, cutting speeds, and feed rates. The results show that the temperature decreases somewhat as tool obliquity increases at different cutting speeds and feed rates, also it
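As a minimal illustration of the kind of thermal-field computation involved (not the DEFORM 3D model, which solves a far more general coupled problem), here is an explicit finite-difference step for the 1-D heat equation; the grid and material values in the test are purely illustrative.

```python
def heat_1d_step(T, alpha, dx, dt):
    """One explicit FTCS step of dT/dt = alpha * d2T/dx2 on a uniform grid.
    Stable only when r = alpha*dt/dx**2 <= 0.5; the two boundary
    temperatures are held fixed (Dirichlet conditions)."""
    r = alpha * dt / dx ** 2
    assert r <= 0.5, "explicit scheme unstable for this r"
    new = T[:]  # copy; boundaries stay unchanged
    for i in range(1, len(T) - 1):
        new[i] = T[i] + r * (T[i + 1] - 2 * T[i] + T[i - 1])
    return new
```

Repeating this step marches the temperature field forward in time; heat diffuses inward from the hot boundaries toward the cold interior.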