This paper investigates the behavior and shear strength of full-scale T-section reinforced concrete deep beams with various large web openings, designed according to the strut-and-tie approach of the ACI 318-19 code. A total of seven deep beam specimens with identical shear span-to-depth ratios were tested under a monotonically applied mid-span concentrated load until failure. The main variables studied were the width and depth of the web openings and their effects on deep beam performance. The experimental results were calibrated against the strut-and-tie approach adopted by the ACI 318-19 code for the design of deep beams. The strut-and-tie design model provided in ACI 318-19 was assessed and found to be unsatisfactory for deep beams with large web openings. A simplified empirical equation, based on the strut-and-tie model, was therefore proposed for estimating the shear strength of deep T-beams with large web openings and was verified by numerical analysis. The numerical study used three-dimensional finite element models, developed in ABAQUS, to simulate and predict the performance of the deep beams; the simulation results were in good agreement and exhibited close correlation with the experimental data. The tests showed that enlarging the web openings substantially reduces the members' shear capacity, and that increasing the width of an opening reduces the load-carrying capacity more than increasing its depth.
With the growing use of social media, many researchers have become interested in topic extraction from Twitter. Tweets are short, unstructured, and noisy, which makes discovering topics in them difficult. Topic modeling algorithms such as Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) were originally designed to derive topics from long documents such as articles and books, and they are often less effective when applied to short texts such as tweets. Twitter, however, has many features that capture the interaction between users, and tweets carry rich user-generated hashtags that act as keywords. In this paper, we exploit the hashtag feature to improve the topics learned from tweets.
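The abstract does not spell out how hashtags are used, but a common way to exploit them for short-text topic modeling is hashtag pooling: tweets sharing a hashtag are merged into one longer pseudo-document before a model like LDA is trained, which mitigates the sparsity of individual tweets. A minimal sketch of that pooling step (function and variable names are illustrative, not from the paper):

```python
import re
from collections import defaultdict

def pool_by_hashtag(tweets):
    """Group tweets into longer pseudo-documents keyed by hashtag.

    Tweets sharing a hashtag are concatenated into one document, giving
    a topic model longer texts to work with; tweets without any hashtag
    are kept as individual documents so no content is lost.
    """
    pools = defaultdict(list)
    untagged = []
    for tweet in tweets:
        tags = re.findall(r"#(\w+)", tweet.lower())
        if tags:
            for tag in tags:
                pools[tag].append(tweet)
        else:
            untagged.append(tweet)
    # Each hashtag pool becomes one pseudo-document for the topic model.
    docs = [" ".join(group) for group in pools.values()] + untagged
    return pools, docs
```

The resulting `docs` list would then be tokenized and fed to LSA or LDA in place of the raw tweets.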
Authentication is the process of determining whether someone or something is, in fact, who or what it declares itself to be. As dependence on computers and computer networks grows, the need for user authentication increases. A user's claimed identity can be verified by one of several methods; one of the most popular is based on something the user knows, such as a password or Personal Identification Number (PIN). Biometrics is the science and technology of authenticating a living individual by identifying physiological or behavioral attributes. Keystroke authentication is a behavioral access control technique that identifies legitimate users by their typing behavior. The objective of this paper is to provide user
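The abstract does not describe which typing features are used, but keystroke-dynamics systems commonly build a user profile from dwell times (how long a key is held) and flight times (the gap between releasing one key and pressing the next). A minimal sketch of that feature extraction, assuming a hypothetical event format of `(key, press_time, release_time)` tuples ordered by press time:

```python
def keystroke_features(events):
    """Extract dwell and flight times from keystroke events.

    events: list of (key, press_time, release_time) tuples, ordered by
    press time. Dwell = release - press for each key; flight = time from
    one key's release to the next key's press. These timing vectors are
    what a keystroke-authentication system would compare against a
    stored user template.
    """
    dwell = [release - press for _key, press, release in events]
    flight = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    return dwell, flight
```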
Iris research is focused on developing techniques for identifying and locating relevant biometric features, with accurate segmentation and efficient computation, while lending themselves to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics, which in turn reduces the effectiveness of the system when used as a real-time system. This paper introduces a novel parameterized technique for iris segmentation. The method consists of several steps: converting the grayscale eye image to a bit-plane representation, selecting the most significant bit planes, and then parameterizing the iris location, resulting in an accurate segmentation of the iris from the original image.
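The first step the abstract names, converting a grayscale image to a bit-plane representation, decomposes each 8-bit pixel into its individual bits; the most significant planes carry most of the structural information and can then be selected for segmentation. A minimal sketch of that decomposition (the selection and parameterization steps are specific to the paper and not reproduced here):

```python
def bit_planes(image):
    """Decompose an 8-bit grayscale image (list of rows of ints 0-255)
    into 8 binary bit-plane images. planes[7] is the most significant
    bit plane, which retains the coarse structure of the image;
    planes[0] is the least significant and is mostly noise.
    """
    return [
        [[(pixel >> bit) & 1 for pixel in row] for row in image]
        for bit in range(8)
    ]
```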
The quality of Global Navigation Satellite System (GNSS) networks is considerably influenced by the configuration of the observed baselines. This study aims to find an optimal configuration of GNSS baselines, in terms of their number and distribution, that improves the quality criteria of GNSS networks. The first order design (FOD) problem was applied to optimize the GNSS network baseline configuration, with the sequential adjustment method used to solve its objective functions.
The proposed model, FOD for optimum precision (FOD-p), is based on the design criteria of A-optimality and E-optimality. These design criteria were selected as objective functions of precision, which
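For context on the two criteria named above: in network design, A-optimality minimizes the trace of the cofactor (covariance) matrix, i.e. the average parameter variance, while E-optimality minimizes its largest eigenvalue, i.e. the worst-case error direction. A minimal sketch of both scores for a symmetric 2x2 cofactor matrix, using the closed-form eigenvalue of a 2x2 matrix (illustrative only; the paper's matrices come from the sequential adjustment of the full network):

```python
import math

def a_optimality(q):
    """A-optimality score of a 2x2 cofactor matrix: the trace.
    Smaller trace means better average precision of the estimates."""
    return q[0][0] + q[1][1]

def e_optimality(q):
    """E-optimality score of a symmetric 2x2 cofactor matrix: the
    largest eigenvalue, computed in closed form. Smaller means the
    worst-case (largest) error ellipse axis is better bounded."""
    a, b, d = q[0][0], q[0][1], q[1][1]
    mean = (a + d) / 2.0
    radius = math.sqrt(((a - d) / 2.0) ** 2 + b * b)
    return mean + radius
```

A design comparison would evaluate these scores for each candidate baseline configuration and keep the configuration with the smaller values.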
A novel median filter based on the crow search optimization algorithm (OMF) is proposed to reduce random salt-and-pepper noise and improve the quality of RGB color and grayscale images. The fundamental idea of the approach is that the crow optimization algorithm first detects the noisy pixels, and then replaces them with an optimal median value chosen by maximizing a fitness function. The standard measures peak signal-to-noise ratio (PSNR), structural similarity (SSIM), absolute error, and mean square error were used to evaluate the performance of the filters (the original and the improved median filter) in removing noise from images. The simulations were carried out in MATLAB R2019b, and the results
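The detect-then-replace idea above can be illustrated with a simplified sketch: here, simple extreme-value detection (pixels at 0 or 255) stands in for the paper's crow-search noise-detection step, and each detected pixel is replaced with the median of its non-noisy 3x3 neighbours. Unlike a plain median filter, uncorrupted pixels are left untouched:

```python
def denoise_salt_pepper(image):
    """Replace suspected salt (255) and pepper (0) pixels with the
    median of their non-noisy 3x3 neighbours. image is a list of rows
    of ints 0-255. Extreme-value detection is a simplification standing
    in for the optimization-based detection described in the paper.
    """
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for i in range(h):
        for j in range(w):
            if image[i][j] in (0, 255):  # suspected noise pixel
                neigh = sorted(
                    image[x][y]
                    for x in range(max(0, i - 1), min(h, i + 2))
                    for y in range(max(0, j - 1), min(w, j + 2))
                    if image[x][y] not in (0, 255)
                )
                if neigh:  # replace with the median of clean neighbours
                    out[i][j] = neigh[len(neigh) // 2]
    return out
```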
This paper is based on Sentinel-2 satellite data: the thermal, red, and NIR bands. The city of Babylon was chosen for this study for several reasons: its location in the middle of Iraq, and because it represents one of the largest capitals of the Mesopotamian civilization in the world. The Land Surface Temperature (LST) was determined using a method that combines remote sensing, geographic information systems, and statistics. This made it possible to monitor the relationship between land use and land surface temperature over the four seasons of the year 2021. The maps were processed and analyzed using ArcGIS software. Five maps of the LST were constructed. Each map represents a different
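The abstract does not give the retrieval equation, but a widely used single-channel formulation (due to Artis and Carnahan) corrects at-sensor brightness temperature for surface emissivity: LST = BT / (1 + (λ·BT/ρ)·ln ε), with ρ = h·c/σ ≈ 1.438×10⁻² m·K. A minimal sketch under that assumption; the default wavelength below is illustrative and sensor-dependent, not taken from the paper:

```python
import math

RHO = 1.438e-2  # h*c/sigma in m·K (Planck const., speed of light, Boltzmann const.)

def lst_from_brightness(bt_kelvin, emissivity, wavelength_m=10.9e-6):
    """Emissivity-corrected land surface temperature in Kelvin from
    at-sensor brightness temperature, using the common single-channel
    form LST = BT / (1 + (lambda*BT/rho) * ln(emissivity)).
    wavelength_m is the effective thermal-band wavelength (illustrative
    default; it must match the sensor actually used)."""
    return bt_kelvin / (
        1.0 + (wavelength_m * bt_kelvin / RHO) * math.log(emissivity)
    )
```

Since ln ε is negative for ε < 1, the corrected LST is always slightly above the brightness temperature for natural surfaces.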