The introduction of concrete damage plasticity material models has significantly improved the accuracy with which the structural response of concrete elements can be predicted. Research into this method's accuracy in analyzing complex concrete forms, however, has been limited. A damage model combined with a plasticity model, based on continuum damage mechanics, is recommended for effectively predicting and simulating concrete behaviour. Damage parameters, such as compressive and tensile damage, can be defined to simulate concrete behaviour accurately in a damaged-plasticity model. This research proposes an analytical model for assessing concrete compressive damage based on stiffness deterioration. With the proposed method, the damage variable can be determined from the start of the loading process and continues to increase as the load progresses until complete failure. The results obtained using this method were assessed against previous studies, considering three case studies of concrete specimens and reinforced concrete structural elements (columns and gable beams). Finite element models were also developed and verified. The results revealed good agreement in each case. Furthermore, the results show that the proposed method outperforms other methods in terms of damage prediction, particularly when damage is calculated using the stress ratio. DOI: 10.28991/CEJ-2022-08-02-03
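The abstract does not reproduce the formulation itself; as a hedged illustration, two damage definitions commonly contrasted in the concrete damage plasticity literature (not necessarily the authors' exact expressions) are:

    d_c = 1 - E_sec / E_0,  with  E_sec = sigma_c / eps_c      (stiffness-deterioration form)
    d_c = 1 - sigma_c / f_cm                                   (stress-ratio form, post-peak branch)

where E_0 is the undamaged elastic modulus, sigma_c and eps_c are the current compressive stress and strain, and f_cm is the peak compressive strength. Under the stiffness-based definition, d_c equals zero at the onset of loading and grows toward one at complete failure, which matches the monotonic increase of the damage variable described above.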
The existence of the Internet, networking, and cloud computing supports a wide range of new technologies. Blockchain is one of these technologies, and it has increased the interest of researchers concerned with providing a safe environment for the circulation of important information via the Internet. Maintaining the solidity and integrity of a blockchain's transactions is an important issue that must always be borne in mind. Transactions in a blockchain rely on asymmetric cryptography with public and private keys. This work proposes using users' DNA as a supporting technology for storing and recovering their keys if those keys are lost, serving as an effective bio-cryptographic recovery method. The RSA private key is
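The abstract does not give the exact DNA-to-key scheme. As one hedged reading, a digest of the user's DNA string could wrap (encrypt) the stored RSA private key so that it can be recovered later; the sketch below illustrates that idea only, and the SHA-256/AES-GCM choices and every helper name are assumptions, not the paper's method.

import os
import hashlib
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def derive_wrapping_key(dna_sequence: str) -> bytes:
    # Reduce the user's DNA string (e.g. "ACGTTGCA...") to a 256-bit symmetric key.
    return hashlib.sha256(dna_sequence.encode("ascii")).digest()

def backup_private_key(private_key, dna_sequence: str):
    # Serialize the RSA private key and encrypt it under the DNA-derived key.
    pem = private_key.private_bytes(
        serialization.Encoding.PEM,
        serialization.PrivateFormat.PKCS8,
        serialization.NoEncryption(),
    )
    nonce = os.urandom(12)
    return nonce, AESGCM(derive_wrapping_key(dna_sequence)).encrypt(nonce, pem, None)

def recover_private_key(nonce: bytes, wrapped: bytes, dna_sequence: str):
    # Recover the stored key later from the ciphertext plus the user's DNA.
    pem = AESGCM(derive_wrapping_key(dna_sequence)).decrypt(nonce, wrapped, None)
    return serialization.load_pem_private_key(pem, password=None)

key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
nonce, wrapped = backup_private_key(key, "ACGTTGCAAGGCT")
restored = recover_private_key(nonce, wrapped, "ACGTTGCAAGGCT")  # same key material

Wrapping the stored key rather than regenerating it from the DNA digest preserves the original RSA key pair, so existing blockchain addresses and signatures would remain valid after recovery.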
Photonic crystal fiber interferometers are widely used for sensing applications. In this work, a solid-core photonic crystal fiber (PCF) Mach-Zehnder modal interferometer for refractive index sensing is presented. The sensor is constructed by splicing a short length of PCF between conventional single-mode fibers (SMF-28) on both sides. To realize modal interference, a collapsing technique based on fusion splicing is used to excite the LP01 and LP11 modes. A laser diode at 1550 nm was used as the pump light source, and a highly sensitive optical spectrum analyzer (OSA) was used to monitor and record the transmitted spectrum. The experimental work shows that the interference spectrum of the photonic crystal fiber interferometer
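For context, the transmitted power of such a two-mode (LP01/LP11) Mach-Zehnder modal interferometer is usually described by the standard two-beam interference relation, given here as a textbook expression rather than the paper's exact model:

    I_out(lambda) = I_01 + I_11 + 2 * sqrt(I_01 * I_11) * cos(2 * pi * delta_n_eff * L / lambda)

where I_01 and I_11 are the powers carried by the two modes, L is the length of the PCF section, and delta_n_eff is the effective-index difference between the modes. A change in the refractive index surrounding the PCF alters delta_n_eff and shifts the interference fringes, which is what the OSA records as the sensing signal.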
Background: Obesity is increasingly prevalent in modern societies and constitutes a significant public health problem, carrying an increased risk of cardiovascular disease.
Objective: This study aims to determine the agreement between actual and perceived body image in the general population.
Methods: A descriptive cross-sectional study was conducted with a sample size of 300. The data were collected from eight heavily populated areas of the Northern district of Karachi, Sindh, over a period of six months (10th January 2020 to 21st June 2020). The Figure Rating Scale (FRS) questionnaire was used to collect demographic data and perceptions of body weight. Body mass index (BMI) was used for assessing actual body weight status.
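For reference, BMI is the ratio of body mass to the square of height:

    BMI = weight (kg) / height (m)^2

with the widely used WHO categories underweight (below 18.5), normal (18.5 to 24.9), overweight (25.0 to 29.9), and obese (30.0 and above); the abstract does not state which cut-offs were applied in this study.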
The conventional procedures of clustering algorithms are incapable of managing and analyzing the rapid growth of data generated from different sources. Parallel clustering is one of the robust solutions to this problem. The Apache Hadoop architecture is one of the ecosystems that provides the capability to store and process data in a distributed and parallel fashion. In this paper, a parallel model is designed to run the k-means clustering algorithm in the Apache Hadoop ecosystem by connecting three nodes, one serving as the name (master) node and the other two as data (worker) nodes. The aim is to reduce the time needed to manage the massive sc
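The abstract does not include the implementation. A minimal Hadoop-Streaming-style sketch of one k-means iteration, with the mapper assigning each point to its nearest current centroid and the reducer averaging the points assigned to each cluster, might look like the following; the centroids.txt side file, the comma-separated input format, and the map/reduce dispatch are illustrative assumptions rather than the paper's design.

import sys

def load_centroids(path="centroids.txt"):
    # The current centroids are assumed to be shipped to every data node as a small side file.
    with open(path) as f:
        return [[float(x) for x in line.split(",")] for line in f if line.strip()]

def nearest(point, centroids):
    # Index of the centroid with the smallest squared Euclidean distance.
    return min(range(len(centroids)),
               key=lambda i: sum((p - c) ** 2 for p, c in zip(point, centroids[i])))

def mapper():
    # Emits "cluster_id <TAB> point" for every comma-separated point read from stdin.
    centroids = load_centroids()
    for line in sys.stdin:
        if line.strip():
            point = [float(x) for x in line.strip().split(",")]
            print(f"{nearest(point, centroids)}\t{line.strip()}")

def emit_centroid(cluster, points):
    # New centroid = per-dimension mean of all points assigned to the cluster.
    print(cluster + "\t" + ",".join(str(sum(d) / len(points)) for d in zip(*points)))

def reducer():
    # Hadoop sorts mapper output by key, so all points of one cluster arrive contiguously.
    current, points = None, []
    for line in sys.stdin:
        cluster, point = line.strip().split("\t")
        if cluster != current and points:
            emit_centroid(current, points)
            points = []
        current = cluster
        points.append([float(x) for x in point.split(",")])
    if points:
        emit_centroid(current, points)

if __name__ == "__main__":
    # Run as "python kmeans_step.py map" for the map phase, anything else for the reduce phase.
    mapper() if len(sys.argv) > 1 and sys.argv[1] == "map" else reducer()

One Hadoop job corresponds to one k-means iteration; a small driver script redistributes the newly emitted centroids and resubmits the job until the centroids stop moving.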
Professional learning societies (PLS) are a systematic method for improving teaching and learning performance through the deliberate design and building of such societies, which helps overcome the culture of isolation and fragmentation in the work of educational supervisors. Many studies show that constructing and developing strong professional learning societies focused on improving education, curriculum, and evaluation increases the cooperation and participation of educational supervisors and teachers, as well as the application of effective educational practices in the classroom.
The roles of the educational supervisor in ensuring the optimal implementation and activation of professional learning soci