The behavior and shear strength of full-scale (T-section) reinforced concrete deep beams with various large web openings, designed according to the strut-and-tie approach of ACI 318-19, were investigated in this paper. A total of 7 deep beam specimens with identical shear span-to-depth ratios were tested under a monotonically applied mid-span concentrated load until failure. The main variables studied were the width and depth of the web openings and their effects on deep beam performance. The experimental results were compared with the strut-and-tie approach adopted by ACI 318-19 for the design of deep beams. The strut-and-tie design model provided in ACI 318-19 was assessed and found to be unsatisfactory for deep beams with large web openings. A simplified empirical equation, based on the strut-and-tie model, was proposed to estimate the shear strength of deep T-beams with large web openings and was verified by numerical analysis. The numerical study used three-dimensional finite element models developed in ABAQUS to simulate and predict the performance of the deep beams. The numerical simulations were in good agreement and exhibited close correlation with the experimental data. The test results showed that enlarging the web openings substantially reduces shear capacity, and that increasing the opening width reduces the load-carrying capacity more than increasing its depth.
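As a point of reference for the strut-and-tie checks discussed above, the following is a minimal sketch, not the paper's proposed equation, of the nominal capacity of a single concrete strut per the ACI 318-19 relation Fns = fce·Acs with fce = 0.85·βc·βs·f'c; the factor values and dimensions used here are illustrative assumptions and should be verified against the code.

```python
# Minimal sketch (not the paper's proposed equation): nominal capacity of a
# single concrete strut per the ACI 318-19 strut-and-tie method,
# F_ns = f_ce * A_cs with f_ce = 0.85 * beta_c * beta_s * f'c.
# Factor values below are illustrative assumptions; verify against the code.

def strut_capacity(fc_MPa, width_mm, thickness_mm, beta_s=0.75, beta_c=1.0):
    """Nominal strut capacity in kN for a prismatic strut of given cross-section."""
    f_ce = 0.85 * beta_c * beta_s * fc_MPa   # effective compressive strength, MPa
    A_cs = width_mm * thickness_mm           # strut cross-sectional area, mm^2
    return f_ce * A_cs / 1000.0              # N -> kN

# Example: 40 MPa concrete, 150 mm wide strut in a 200 mm thick web
print(f"F_ns = {strut_capacity(40.0, 150.0, 200.0):.0f} kN")
```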
The purpose of this research is to study the quality of scientific research at the University of Baghdad in light of scientific piracy and plagiarism, that is, the intentional or unintentional appropriation of others' research and results, including practices such as stealing ideas or passing off the findings of one researcher as another's, and their negative impact on the quality of scientific outputs and the reputation of educational organizations, through an exploratory study in the scientific and humanities faculties of the University of Baghdad. The aim of the study was to determine the negative impact of piracy on scientific research. A five-point Likert scale was used in this research. The research community c
Generally, statistical methods are used in various fields of science, especially in research, where statistical analysis is carried out using several techniques according to the nature of the study and its objectives. One of these techniques is the building of statistical models through regression models. This technique is considered one of the most important statistical methods for studying the relationship between a dependent variable, also called the response variable, and other variables, called covariate (explanatory) variables. This research describes the estimation of the partial linear regression model, as well as the estimation of values that are missing at random (MAR). Regarding the
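As a rough illustration of one common way a partially linear model y = xβ + g(t) + ε can be estimated, the sketch below applies Robinson's double-residual idea with a simple kernel smoother; it is not the estimator developed in this research, it does not treat MAR values, and the simulated data and bandwidth are assumptions made for the example.

```python
# A minimal sketch (not this paper's estimator) of fitting a partially linear
# model y = x*beta + g(t) + e: smooth y and x on t, then regress the residuals.
import numpy as np

def nw_smooth(t, values, bandwidth=0.1):
    """Nadaraya-Watson kernel estimate of E[values | t] at each observed t."""
    w = np.exp(-0.5 * ((t[:, None] - t[None, :]) / bandwidth) ** 2)
    return (w @ values) / w.sum(axis=1)

rng = np.random.default_rng(0)
n = 300
t = rng.uniform(0, 1, n)                      # covariate entering nonparametrically
x = rng.normal(size=n)                        # covariate entering linearly
y = 2.0 * x + np.sin(2 * np.pi * t) + rng.normal(scale=0.3, size=n)

# Regress the residual of y on the residual of x (Robinson's double residual)
beta_hat = np.polyfit(x - nw_smooth(t, x), y - nw_smooth(t, y), 1)[0]
print(f"estimated beta = {beta_hat:.3f}")     # should be close to the true value 2.0
```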
The aim of this study was to determine the effect of using the McCarthy Model (4MAT) on developing creative writing skills and reflective thinking among undergraduate students. The quasi-experimental approach was adopted. In order to achieve the study objective, the educational content of Teaching Ethics (Approach 401), from the plan of the primary-grades teacher preparation program, was taught using a teaching program based on the McCarthy Model (4MAT).
The study was based on an academic achievement test for creative writing skills and a reflective thinking test. The validity and reliability of the study tools were confirmed. The study was applied to a sample consisting of
The aim of this study was to identify the effect of using Daniel's model and Driver's model in learning a kinetic chain on the uneven bars in artistic gymnastics for female students. The researchers used the experimental method with an equivalent-groups design and pre- and post-tests. The research community was identified as the third-stage students in the college for the academic year 2020-2021; (3) classes were randomly selected, giving (30) students distributed into (3) groups. Post-testing was conducted after implementing the curriculum for (4) weeks, and the Statistical Package for the Social Sciences (SPSS) was used to process the results of the research. A set of conclusions was reached, the most important of which is t
Computer models are used in the study of electrocardiography to provide insight into physiological phenomena that are difficult to measure in the lab or in a clinical environment.
The electrocardiogram is an important tool for the clinician in that it changes characteristically in a number of pathological conditions. Many illnesses can be detected by this measurement. By simulating the electrical activity of the heart one obtains a quantitative relationship between the electrocardiogram and different anomalies.
Because of the inhomogeneous fibrous structure of the heart and the irregular geometry of the body, the finite element method is used to study the electrical properties of the heart.
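As a minimal illustration of the finite element idea mentioned above, not the authors' heart model, the sketch below solves a one-dimensional potential problem d/dx(σ dφ/dx) = -f with fixed boundary potentials; the piecewise conductivity profile and source term are illustrative assumptions.

```python
# A minimal 1-D finite element sketch of a potential-field problem of the kind
# arising in forward electrocardiography. Conductivity and source are assumed.
import numpy as np

n_el = 50                                     # linear elements on [0, 1]
x = np.linspace(0.0, 1.0, n_el + 1)
h = x[1] - x[0]
sigma = np.where(x[:-1] < 0.5, 1.0, 0.2)      # piecewise "inhomogeneous" conductivity
f = np.ones(n_el)                             # uniform distributed source

K = np.zeros((n_el + 1, n_el + 1))            # global stiffness matrix
F = np.zeros(n_el + 1)                        # global load vector
for e in range(n_el):
    ke = sigma[e] / h * np.array([[1.0, -1.0], [-1.0, 1.0]])
    fe = f[e] * h / 2.0 * np.array([1.0, 1.0])
    K[e:e + 2, e:e + 2] += ke
    F[e:e + 2] += fe

# Dirichlet boundary conditions: phi(0) = phi(1) = 0
K[0, :] = K[-1, :] = 0.0
K[0, 0] = K[-1, -1] = 1.0
F[0] = F[-1] = 0.0

phi = np.linalg.solve(K, F)
print(f"peak potential = {phi.max():.4f}")
```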
This work describes t
Background: The present study was conducted to evaluate the effects of different bleaching methods on the shear bond strength of orthodontic sapphire brackets bonded to human premolar teeth using light-cured composite resin, and to determine the predominant site of bond failure. Materials and Methods: Thirty freshly extracted human premolars were selected and randomly divided into three groups (10 per group): a control (unbleached) group, a 37.5% hydrogen peroxide (HP) group representing the in-office bleaching method, and a 16% carbamide peroxide (CP) group representing the at-home bleaching method. After the bleaching process was performed, all the teeth were stored in distilled water in a sealed container at room temperature for
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance. Unfortunately, many applications have small or inadequate data to train DL frameworks. Manual labeling is usually needed to provide labeled data, which typically involves human annotators with extensive background knowledge. This annotation process is costly, time-consuming, and error-prone. Every DL framework must be fed a significant amount of labeled data to automatically learn representations; ultimately, more data generally yields a better DL model, and performance is also application dependent. This issue is the main barrier for
In this paper, an ecological model with stage structure in the prey population, fear, anti-predator behavior, and harvesting is suggested. Lotka-Volterra and Holling type II functional responses are assumed to describe the feeding processes. The local and global stability of the steady-state points of this model are established. Finally, the global dynamics are studied numerically to investigate the influence of the parameters on the solutions of the system, especially the effects of fear and anti-predation.
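To illustrate the kind of system such a numerical study involves, the sketch below integrates a stage-structured prey (juvenile/adult) and predator model with a Holling type II functional response, a fear factor on reproduction, and harvesting; it is not the paper's exact system, and all parameter values and term forms are illustrative assumptions.

```python
# A minimal sketch (not the paper's exact system): stage-structured prey
# (juvenile x1, adult x2) and predator (y) with Holling type II predation,
# a fear factor on reproduction, and harvesting. Parameters are assumed.
import numpy as np
from scipy.integrate import solve_ivp

r, m, d1, d2, d3 = 1.0, 0.4, 0.1, 0.1, 0.3    # birth, maturation, death rates
a, b, c, k, h = 0.8, 0.5, 0.4, 1.5, 0.2        # attack, half-saturation, conversion, fear, harvesting

def rhs(t, state):
    x1, x2, y = state
    fear = 1.0 / (1.0 + k * y)                 # fear of predators lowers reproduction
    predation = a * x2 * y / (b + x2)          # Holling type II functional response
    dx1 = r * fear * x2 - m * x1 - d1 * x1
    dx2 = m * x1 - d2 * x2 - predation - h * x2   # harvesting applied to adult prey
    dy = c * predation - d3 * y
    return [dx1, dx2, dy]

sol = solve_ivp(rhs, (0.0, 200.0), [0.5, 0.5, 0.2], dense_output=True)
print("state near t = 200:", sol.y[:, -1])     # inspect long-run behaviour numerically
```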