Flexure members such as reinforced concrete (RC) simply supported beams subjected to two-point loading were analyzed numerically. The Extended Finite Element Method (XFEM) was employed to treat non-smooth behaviour such as discontinuities and singularities; it is a powerful technique for analyzing the fracture process and crack propagation in concrete. Concrete is a heterogeneous material consisting of coarse aggregate, cement mortar, and air voids distributed in the cement paste. Its numerical modeling was carried out at two scales, using mesoscale and macroscale models. The effectiveness and validity of the Meso-Scale Approach (MSA) in modeling reinforced concrete beams with minimum reinforcement were studied. The ABAQUS program was utilized for Finite Element (FE) modeling and analysis of the beams, while mesoscale modeling of the concrete constituents was executed with the aid of ABAQUS Python scripting and Excel sheets. The concrete beams under flexure were investigated both experimentally and numerically. The comparison between experimental and numerical results showed that the mesoscale model better represents concrete in the numerical approach and yields results closer to the experimental ones.
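Mesoscale models of the kind described above typically scatter coarse aggregate particles randomly inside the specimen before meshing. The following is a minimal, hypothetical sketch of that "take-and-place" step in plain Python; the function name and parameters are illustrative and are not the paper's actual ABAQUS script, which would additionally build the geometry through the ABAQUS scripting interface.

```python
import math
import random

def place_aggregates(width, height, radii, min_gap=2.0, max_tries=500):
    """Take-and-place: randomly drop non-overlapping circular aggregates
    (2-D idealization) inside a width x height concrete section."""
    placed = []  # list of (x, y, r)
    for r in sorted(radii, reverse=True):          # place largest particles first
        for _ in range(max_tries):
            x = random.uniform(r, width - r)       # keep particle inside section
            y = random.uniform(r, height - r)
            if all(math.hypot(x - px, y - py) >= r + pr + min_gap
                   for px, py, pr in placed):      # no overlap, keep mortar gap
                placed.append((x, y, r))
                break
    return placed

random.seed(1)
aggs = place_aggregates(150.0, 150.0, [20, 15, 15, 10, 10, 8, 8])
```

The resulting circle centres and radii could then be fed to an ABAQUS Python script to partition the section into aggregate, mortar, and interface regions.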
Removal of solar brown and direct black dyes by coagulation with two aluminum-based coagulants was conducted. The main objective is to examine the efficiency of these coagulants in the treatment of dye-polluted water discharged from Al-Kadhymia Textile Company (Baghdad, Iraq). The performance of these coagulants was investigated through the jar test by comparing dye percent removal at different wastewater pH values, coagulant doses, and initial dye concentrations. Results show that alum works better than PAC in acidic media (pH 5-6), while PAC works better in basic media (pH 7-8), in the removal of both solar brown and direct black dyes. Higher doses of PAC were required to achieve the maximum removal efficiency under optimum pH conditions.
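The dye percent removal compared in the jar tests above is the standard ratio of concentration reduction to initial concentration. A small sketch of that calculation (the numbers below are illustrative, not the study's measured data):

```python
def percent_removal(c0, c):
    """Percent dye removal from initial (c0) and residual (c)
    concentrations, both in the same units (e.g. mg/L)."""
    return (c0 - c) / c0 * 100.0

# illustrative jar-test reading: 50 mg/L reduced to 6 mg/L
efficiency = percent_removal(50.0, 6.0)   # 88.0 %
```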
Human Interactive Proofs (HIPs) are automatic reverse Turing tests intended to differentiate between people and malicious computer programs. Building a good HIP system is a challenging task, since the resulting HIP must be secure against attacks and at the same time practical for humans. Text-based HIPs are one of the most popular HIP types; they exploit the superior capability of humans to read text images compared with Optical Character Recognition (OCR). However, current text-based HIPs have not kept pace with the rapid development of computer vision techniques, since they are either very easily passed or very hard to solve, and this motivates ...
Using a neural network as a type of associative memory is introduced in this paper through the problem of mobile position estimation, where the mobile estimates its location from the signal strengths reaching it from several surrounding base stations; the neural network can be implemented inside the mobile. The traditional methods of time of arrival (TOA) and received signal strength (RSS) are used and compared with two analytical methods, the optimal positioning method and the average positioning method. The training data are ideal, since they can be obtained from the geometry of the CDMA cell topology. The TOA and RSS methods are tested over many cases along a nonlinear path through which the MS can move in that region. The results ...
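A TOA fix of the kind compared above reduces, in the ideal-data case, to trilateration from the base-station distances. The sketch below shows one common linearized least-squares formulation in plain Python; it is a generic illustration under assumed base-station coordinates, not the paper's neural-network or analytical method.

```python
import math

def trilaterate(anchors, dists):
    """2-D TOA position fix from >= 3 base stations.
    Subtracting the first range equation from the rest linearizes the
    problem; the 2x2 normal equations are then solved directly."""
    (x1, y1), d1 = anchors[0], dists[0]
    A, b = [], []
    for (xi, yi), di in zip(anchors[1:], dists[1:]):
        A.append((2 * (xi - x1), 2 * (yi - y1)))
        b.append(d1**2 - di**2 + xi**2 - x1**2 + yi**2 - y1**2)
    a11 = sum(ax * ax for ax, ay in A)
    a12 = sum(ax * ay for ax, ay in A)
    a22 = sum(ay * ay for ax, ay in A)
    b1 = sum(ax * bi for (ax, ay), bi in zip(A, b))
    b2 = sum(ay * bi for (ax, ay), bi in zip(A, b))
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

# illustrative geometry: MS at (30, 40), three base stations
bs = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0)]
d = [math.hypot(30 - x, 40 - y) for x, y in bs]
```

With noise-free ranges such as these, the fix reproduces the true position; with noisy RSS-derived ranges the same least-squares machinery gives an approximate fix, which is where a trained network can improve on the geometric estimate.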
Regression analysis is the foundation stone of statistics, mostly depending on the ordinary least squares method. However, as is well known, this method requires several conditions to operate accurately, and without them its results can be unreliable; moreover, the absence of certain conditions makes it impossible to complete the analysis. Among those conditions is the multicollinearity problem, which we detect among the independent variables using the Farrar-Glauber test. In addition to the linearity requirement of the data, the failure of the last condition led to resorting to the ...
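The Farrar-Glauber detection step mentioned above tests the determinant of the regressors' correlation matrix R with the chi-square statistic chi2 = -[n - 1 - (2k + 5)/6] ln|R|: the closer |R| is to zero, the stronger the collinearity and the larger the statistic. A toy sketch for three regressors (illustrative only, not the paper's data or code):

```python
import math

def pearson(u, v):
    """Pearson correlation of two equal-length samples."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    su = math.sqrt(sum((a - mu) ** 2 for a in u))
    sv = math.sqrt(sum((b - mv) ** 2 for b in v))
    return cov / (su * sv)

def farrar_glauber_chi2(X):
    """Farrar-Glauber chi-square statistic for multicollinearity.
    X: list of k = 3 regressor columns, each of length n.
    Returns -[n - 1 - (2k + 5)/6] * ln(det R), df = k(k-1)/2."""
    k, n = len(X), len(X[0])
    R = [[pearson(X[i], X[j]) for j in range(k)] for i in range(k)]
    det = (R[0][0] * (R[1][1] * R[2][2] - R[1][2] * R[2][1])
           - R[0][1] * (R[1][0] * R[2][2] - R[1][2] * R[2][0])
           + R[0][2] * (R[1][0] * R[2][1] - R[1][1] * R[2][0]))
    return -(n - 1 - (2 * k + 5) / 6) * math.log(det)
```

A value exceeding the chi-square critical point at k(k-1)/2 degrees of freedom signals collinearity severe enough to destabilize the OLS estimates.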
Spraying pesticides is one of the most common procedures conducted to control pests. However, excessive use of these chemicals adversely affects the surrounding environment, including the soil, plants, animals, and the operators themselves. Therefore, researchers have been encouraged to ...
Content-based image retrieval (CBIR) is a technique used to retrieve images from an image database. However, the CBIR process suffers from low accuracy when retrieving images from an extensive image database and from difficulty ensuring the privacy of images. This paper aims to address the accuracy issue using deep learning techniques, namely the convolutional neural network (CNN) method, and to provide the necessary privacy for images using the fully homomorphic encryption scheme of Cheon, Kim, Kim, and Song (CKKS). To achieve these aims, a system named RCNN_CKKS has been proposed, which includes two parts. The first part (offline processing) extracts automated high-level features based on a flattening layer in a convolutional neural network (CNN) and then stores these features in a ...
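Once CNN flatten-layer features are stored, retrieval in a system like the one described usually reduces to ranking stored vectors by similarity to the query's feature vector. The sketch below shows that ranking step with cosine similarity in plain Python; the feature vectors are toy placeholders, and the homomorphic (CKKS) evaluation over encrypted features is omitted.

```python
import math

def cosine(u, v):
    """Cosine similarity of two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def retrieve(query, database, top_k=3):
    """Return the names of the top_k stored vectors most similar to query."""
    ranked = sorted(database.items(),
                    key=lambda kv: cosine(query, kv[1]),
                    reverse=True)
    return [name for name, _ in ranked[:top_k]]

# toy 3-D "features" standing in for real flatten-layer outputs
db = {"img_a": [1.0, 0.0, 0.0],
      "img_b": [0.0, 1.0, 0.0],
      "img_c": [0.9, 0.1, 0.0]}
```

In the encrypted setting, the same dot products would be evaluated homomorphically on CKKS ciphertexts rather than on the plaintext vectors shown here.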
An oil spill is a leakage from pipelines, vessels, oil rigs, or tankers that releases petroleum products into the marine environment or onto land, whether occurring naturally or through human action, resulting in severe damage and financial loss. Satellite imagery is one of the powerful tools currently utilized for capturing vital information from the Earth's surface, but the complexity and vast amount of the data make it challenging and time-consuming for humans to process. With the advancement of deep learning techniques, however, these processes are now computerized to extract vital information from real-time satellite images. This paper applied three deep-learning algorithms for satellite image classification ...
The need for an efficient method to find the most appropriate document corresponding to a particular search query has become crucial due to the exponential growth in the number of papers readily available on the web. The vector space model (VSM), a classic model used in information retrieval, represents words as vectors in space and weights them via a popular weighting method known as term frequency-inverse document frequency (TF-IDF). In this research, an approach is proposed to retrieve the most relevant documents by representing documents and queries as vectors comprising average term frequency-inverse sentence frequency (TF-ISF) weights instead of representing them as v ...
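TF-ISF replaces the document-level counts of TF-IDF with sentence-level ones: a term's inverse sentence frequency is log(S / s_t), where S is the number of sentences and s_t the number of sentences containing the term. The sketch below shows one plausible formulation of the average TF-ISF weight per term; the exact averaging used in the paper may differ, so treat this as an illustration.

```python
import math
from collections import Counter

def tf_isf_weights(sentences):
    """sentences: list of token lists for one document.
    Returns {term: average TF-ISF weight across the document's sentences},
    where TF is the term's relative frequency within a sentence and
    ISF = log(S / sentence_frequency)."""
    S = len(sentences)
    sf = Counter()                       # in how many sentences each term occurs
    for sent in sentences:
        sf.update(set(sent))
    weights = {}
    for term, f in sf.items():
        isf = math.log(S / f)
        tf_sum = sum(sent.count(term) / len(sent) for sent in sentences)
        weights[term] = (tf_sum / S) * isf
    return weights
```

As with TF-IDF, a term appearing in every sentence gets zero weight, while terms concentrated in few sentences are emphasized; documents and queries are then compared as vectors of these weights.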