The behavior and shear strength of full-scale reinforced concrete deep beams with T-sections and various large web openings, designed according to the strut-and-tie approach of the ACI 318-19 code, were investigated in this paper. A total of 7 deep beam specimens with identical shear span-to-depth ratios were tested under a monotonically applied mid-span concentrated load until failure. The main variables studied were the width and depth of the web openings and their effect on deep beam performance. The experimental results were compared with the strut-and-tie approach adopted by the ACI 318-19 code for the design of deep beams. The strut-and-tie design model provided in the ACI 318-19 code provisions was assessed and found to be unsatisfactory for deep beams with large web openings. A simplified empirical equation, based on the strut-and-tie model, was proposed to estimate the shear strength of deep T-beams with large web openings and was verified with numerical analysis. The numerical study used three-dimensional finite element models, developed in ABAQUS, to simulate and predict the performance of the deep beams. The numerical simulations were in good agreement and exhibited close correlation with the experimental data. The test results showed that enlarging the web openings substantially reduces the members' shear capacity, and that increasing the width of the openings reduces the load-carrying capacity more than increasing their depth.
The proliferation of editing programs based on artificial intelligence techniques has contributed to the emergence of deepfake technology. Deepfakes fabricate and falsify facts by making a person appear to do actions or say words that they never did or said, so developing an algorithm for deepfake detection that discriminates real from fake media is very important. Convolutional neural networks (CNNs) are among the most powerful classifiers, but the nature of the data fed to these networks is extremely important. For this reason, we capture fine texture details of the input data frames using 16 Gabor filters in different directions and then feed them to a binary CNN classifier instead of using the raw red-green-blue (RGB) frames.
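The Gabor preprocessing described in this abstract can be sketched as follows. This is a minimal numpy-only illustration; the kernel size, sigma, wavelength, and aspect ratio below are illustrative assumptions, not the paper's actual settings.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def gabor_kernel(size, theta, sigma=4.0, lambd=8.0, gamma=0.5):
    """Real part of a 2-D Gabor kernel oriented at angle theta (radians)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotate coordinates into the filter's orientation
    xt = x * np.cos(theta) + y * np.sin(theta)
    yt = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xt**2 + (gamma * yt)**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * xt / lambd)

# Bank of 16 orientations spanning 0..pi, as stated in the abstract
bank = [gabor_kernel(31, k * np.pi / 16) for k in range(16)]

def texture_maps(frame_gray):
    """Correlate a grayscale frame with each filter; stack responses as CNN input channels."""
    windows = sliding_window_view(frame_gray, bank[0].shape)
    responses = [np.einsum('ijkl,kl->ij', windows, k) for k in bank]
    return np.stack(responses, axis=-1)  # (H-30, W-30, 16)

maps = texture_maps(np.random.default_rng(0).random((64, 64)))
```

The 16-channel tensor produced by `texture_maps` replaces the 3-channel RGB frame as the classifier's input.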
In this paper, new methods based on the differencing technique were presented: the difference-based modified jackknifed generalized ridge regression estimator (DMJGR) and the difference-based generalized jackknifed ridge regression estimator (DGJR), for estimating the parameters of the linear part of the partially linear model. The nonlinear part, represented by the nonparametric function, was estimated using the Nadaraya-Watson smoother. These proposed methods were compared with other difference-based estimators of the partially linear model through the MSE comparison criterion in a simulation study.
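The Nadaraya-Watson smoother used for the nonparametric part is a kernel-weighted local average, ŷ(x) = Σᵢ K((x−xᵢ)/h) yᵢ / Σᵢ K((x−xᵢ)/h). A minimal sketch with a Gaussian kernel; the bandwidth and the test function are arbitrary illustrative choices:

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_eval, h=0.5):
    """Kernel-weighted local average with a Gaussian kernel and bandwidth h."""
    u = (x_eval[:, None] - x_train[None, :]) / h  # pairwise scaled distances
    w = np.exp(-0.5 * u**2)                       # Gaussian kernel weights
    return (w @ y_train) / w.sum(axis=1)          # weighted average per point

# Example: recover a smooth curve from noisy samples
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 2 * np.pi, 200))
y = np.sin(x) + rng.normal(0, 0.2, 200)
y_hat = nadaraya_watson(x, y, x, h=0.3)
```

In the partially linear setting, this smoother is applied to the residual nonparametric component after the linear part has been handled by the difference-based estimators.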
The prediction of time series for time-related phenomena, in particular with autoregressive integrated moving average (ARIMA) models, is one of the important topics in time series analysis in applied statistics. Its importance lies in the basic stages of analyzing the structure, modeling, and the conditions that must hold for the stochastic process. This paper deals with two prediction methods: the first is a special case of the autoregressive integrated moving average model, ARIMA(0,1,1), which reduces to the Random Walk model when the moving-average parameter equals zero; the second is the exponentially weighted moving average (EWMA). Both methods were applied to monthly traffic data.
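The connection between the two methods can be made explicit. For ARIMA(0,1,1), yₜ = yₜ₋₁ + εₜ − θ εₜ₋₁, the one-step forecast is ŷₜ₊₁ = yₜ − θ eₜ = (1 − θ) yₜ + θ ŷₜ, which is exactly exponential smoothing with α = 1 − θ. A small sketch verifying this identity on an arbitrary series (the series and θ value are arbitrary; the identity holds for any):

```python
import numpy as np

def ewma_forecasts(y, alpha):
    """One-step-ahead forecasts from simple exponential smoothing."""
    f = np.empty_like(y, dtype=float)
    f[0] = y[0]                         # initialise with the first observation
    for t in range(1, len(y)):
        f[t] = alpha * y[t - 1] + (1 - alpha) * f[t - 1]
    return f

def arima011_forecasts(y, theta):
    """One-step-ahead forecasts from ARIMA(0,1,1): f_{t+1} = y_t - theta * e_t."""
    f = np.empty_like(y, dtype=float)
    f[0] = y[0]
    for t in range(1, len(y)):
        e = y[t - 1] - f[t - 1]         # last one-step forecast error
        f[t] = y[t - 1] - theta * e
    return f

rng = np.random.default_rng(2)
y = np.cumsum(rng.normal(size=200))     # any series works for the identity
theta = 0.6
assert np.allclose(ewma_forecasts(y, 1 - theta), arima011_forecasts(y, theta))
```

This equivalence is why the two methods produce matching forecasts when the smoothing constant is chosen as α = 1 − θ.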
Prediction of the rate of penetration (ROP) is an important process in drilling optimization due to its crucial role in lowering drilling operation costs. The process is complex because many interrelated factors affect the rate of penetration, which makes prediction difficult. This paper presents a new technique for predicting the rate of penetration using an artificial neural network. A three-layer model composed of two hidden layers and an output layer was built using drilling parameter data extracted from mud logging and wireline logs for the Alhalfaya oil field. These drilling parameters include mechanical (WOB, RPM), hydraulic (HSI), and travel transit time (DT) measurements. Five data sets, representing five formations, were gathered.
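The described architecture (two hidden layers plus an output layer, fed mechanical, hydraulic, and travel-time inputs) can be sketched with a tiny numpy network. The synthetic data, layer widths, activation, and learning rate below are all illustrative assumptions, not the paper's configuration or field data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-ins for the four inputs (WOB, RPM, HSI, DT) -- hypothetical
X = rng.uniform(0, 1, size=(500, 4))
# Hypothetical smooth input-to-ROP relation, plus measurement noise
y = (2 * X[:, 0] + X[:, 1] - X[:, 3] + 0.5 * np.sin(3 * X[:, 2]))[:, None]
y += rng.normal(0, 0.05, size=y.shape)

# Two hidden layers (as in the paper's architecture) with tanh activations
sizes = [4, 8, 8, 1]
W = [rng.normal(0, 0.5, (a, b)) for a, b in zip(sizes[:-1], sizes[1:])]
b = [np.zeros(n) for n in sizes[1:]]

def forward(X):
    h1 = np.tanh(X @ W[0] + b[0])
    h2 = np.tanh(h1 @ W[1] + b[1])
    return h1, h2, h2 @ W[2] + b[2]     # linear output layer

lr = 0.03
for _ in range(3000):                   # plain batch gradient descent on MSE
    h1, h2, out = forward(X)
    d_out = 2 * (out - y) / len(X)      # d(MSE)/d(out)
    d_h2 = (d_out @ W[2].T) * (1 - h2**2)   # back through tanh
    d_h1 = (d_h2 @ W[1].T) * (1 - h1**2)
    W[2] -= lr * h2.T @ d_out; b[2] -= lr * d_out.sum(0)
    W[1] -= lr * h1.T @ d_h2;  b[1] -= lr * d_h2.sum(0)
    W[0] -= lr * X.T @ d_h1;   b[0] -= lr * d_h1.sum(0)

mse = float(np.mean((forward(X)[2] - y) ** 2))
```

A trained network of this shape maps each logged record to a single ROP estimate; in practice the inputs would be the field measurements rather than this synthetic stand-in.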
This section included the introduction to the research and its importance. The knee joint is one of the important joints in the human body that is susceptible to injury; among these injuries is knee roughness, which occurs as a result of weakness and imbalance in the quadriceps muscle, so it is treated through rehabilitation exercises that address the weakness and restore flexibility and strength. Hence the importance of this research, which develops rehabilitation exercises with different resistances in an aquatic medium to restore flexibility and muscular strength in patients with knee roughness aged 30-40 years. The experimental method was used to address the research problem, and the research sample included (6) patients.
This research studies the linear regression model when the random errors, assumed normally distributed, are autocorrelated. Linear regression analysis describes the relationship between variables, and through this relationship the value of one variable can be predicted from the values of the others. Four methods were compared (the method of least squares, the unweighted average method, the Theil method, and the Laplace method) using the mean square error (MSE) criterion in a simulation study covering four sample sizes (15, 30, 60, 100). The results showed that the least-squares method is best. The four methods were then applied to data on buckwheat production and cultivated area in the provinces of Iraq for the years (2010), (2011), and (2012).
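The simulation design can be sketched as below. The AR(1) error structure, true slope, and replication count are illustrative assumptions, and only two of the four methods are shown: least squares and the Theil estimator (median of all pairwise slopes).

```python
import numpy as np
from itertools import combinations

def theil_slope(x, y):
    """Theil estimator: median of all pairwise slopes."""
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i, j in combinations(range(len(x)), 2)]
    return np.median(slopes)

def simulate(n, reps=200, beta=2.0, rho=0.7, seed=0):
    """Compare slope-estimate MSE of OLS and Theil under AR(1) errors."""
    rng = np.random.default_rng(seed)
    mse_ols = mse_theil = 0.0
    x = np.arange(1, n + 1, dtype=float)
    for _ in range(reps):
        e = np.empty(n)                       # AR(1) autocorrelated errors
        e[0] = rng.normal()
        for t in range(1, n):
            e[t] = rho * e[t - 1] + rng.normal()
        y = 1.0 + beta * x + e
        b_ols = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean())**2)
        mse_ols += (b_ols - beta)**2 / reps
        mse_theil += (theil_slope(x, y) - beta)**2 / reps
    return mse_ols, mse_theil

mse_ols_30, mse_theil_30 = simulate(30)
```

Repeating `simulate` over the study's sample sizes (15, 30, 60, 100) reproduces the kind of MSE table used to rank the methods.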
Legal discourse in the Qur'an and Sunnah is rarely devoid of one of the formulas of generality, and because these formulas recur so often on the tongue of the Legislator, their members may overlap in apparently contradictory provisions, so that an individual member of the general class appears to the observer to be covered by two conflicting rulings. This research presents what the interpreter of the legal text may rely on when weighing two opposing texts: the strength of generality established by the generality formula, so that the stronger of the two formulas in encompassing its members outweighs and takes precedence over the weaker. The research concluded that the formulas vary in the strength with which they encompass their members.
In this article, we developed a new loss function as a modification of the linear exponential (LINEX) loss function, obtained by weighting it. We derived the scale parameter, reliability, and hazard functions based on upper record values of the Lomax distribution (LD). To study the small-sample performance of the proposed loss function, we carried out a Monte Carlo simulation comparing the maximum likelihood estimator, the Bayesian estimator under the LINEX loss function, and the Bayesian estimator under the squared error (SE) loss function. The results showed that the modified method is the best for estimating the scale parameter, reliability, and hazard functions.
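For reference, the standard (unweighted) LINEX loss has the form L(Δ) = e^{aΔ} − aΔ − 1, and the Bayes estimator of θ under it is θ̂ = −(1/a) ln E[e^{−aθ} | data]. A minimal sketch; the normal posterior used in the example is an arbitrary stand-in, not the Lomax record-value posterior from the paper.

```python
import numpy as np

def linex_loss(delta, a=1.0):
    """Asymmetric LINEX loss: exp(a*delta) - a*delta - 1."""
    return np.exp(a * delta) - a * delta - 1.0

def linex_bayes_estimate(posterior_draws, a=1.0):
    """Bayes estimator under LINEX: -(1/a) * log E[exp(-a*theta)]."""
    return -np.log(np.mean(np.exp(-a * posterior_draws))) / a

# Example with a normal posterior stand-in, theta ~ N(2, 0.5^2); the
# analytic LINEX-Bayes estimate here is mu - a*sigma^2/2 = 1.875
rng = np.random.default_rng(4)
draws = rng.normal(2.0, 0.5, 200_000)
est = linex_bayes_estimate(draws, a=1.0)
```

Note the asymmetry: for a > 0, overestimation (Δ > 0) is penalised exponentially while underestimation grows only linearly, which is why the LINEX-Bayes estimate sits below the posterior mean.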