The influence and hazard of fire flame are among the most important parameters affecting the durability and strength of structural members. This research studied the influence of fire flame on the behavior of reinforced concrete beams subjected to repeated load. Nine self-compacted reinforced concrete beams were cast, all with the same geometric layout (0.15 x 0.15 x 1.00) m, reinforcement details, and compressive strength (50 MPa). To simulate the effect of a fire disaster, four temperatures were adopted (200, 300, 400 and 500) °C and two methods of cooling were used (gradual and sudden). In the first method, gradual cooling, the tested beams were left to cool in air, while in the second method, sudden cooling, a water splash was used to reduce the temperature. Eight of the tested beams were divided into four groups; each group was burned at one of the adopted temperatures for about half an hour and cooled by the two adopted methods (one beam by sudden cooling and the other by gradual cooling). After burning and cooling, the beams were tested under repeated load (loading-unloading) for five cycles and then loaded up to failure. Compared with the non-burned beam, the results indicated that the ultimate load capacity of the tested beams was reduced by (16, 23, 54 and 71)% after being burned at (200, 300, 400 and 500) °C, respectively, in the case of sudden cooling, and by (8, 14, 36 and 64)%, respectively, in the case of gradual cooling. It was also found that the effect of sudden cooling was greater than that of gradual cooling. Regarding the failure mode, there was a difference between the non-burned beam and the burned ones, even though all of them had the same geometric layout, compressive strength, and reinforcement details.
The failure mode of all burned beams was a combined shear-flexure failure, attributed to the reduction in the compressive strength of the concrete caused by the rise in temperature, while the failure mode of the non-burned beam was a flexure failure, consistent with the preliminary design. It was also found that the residual deflection is directly proportional to the temperature: as the temperature increased to (200, 300, 400 and 500) °C, the residual deflection increased, compared with the non-burned beam, by (32, 48, 326 and 358)% in the case of sudden cooling and by (13, 29, 303 and 332)% in the case of gradual cooling. The cooling method had a further effect: the results showed that sudden cooling increased the residual deflection by approximately (6-15)% more than gradual cooling. A number of loading-unloading cycles was required to eliminate the residual deflection; this number increased as the burning temperature increased, and it was also larger in the case of sudden cooling.
The concept of the active contour model has been extensively utilized in the segmentation and analysis of images. This technology has been effectively employed to identify contours in object recognition, computer graphics and vision, and biomedical image processing, covering both normal images and medical images such as Magnetic Resonance Images (MRI), X-rays, and ultrasound imaging. Kass, Witkin, and Terzopoulos developed this energy-minimizing "Active Contour Model" (also known as the Snake) back in 1987. Snakes are curves defined within an image domain that can be set in motion by internal forces arising from the curve itself and external forces derived from the image data. The present s
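The energy-minimizing formulation mentioned above is the standard snake functional from Kass, Witkin, and Terzopoulos (1987); a brief sketch of its usual form, for a parametric curve v(s) = (x(s), y(s)) with s in [0, 1]:

```latex
E_{\text{snake}}
  = \int_{0}^{1} \Big[
      \tfrac{1}{2}\big(\alpha(s)\,\lvert v_{s}(s)\rvert^{2}
                     + \beta(s)\,\lvert v_{ss}(s)\rvert^{2}\big)
    + E_{\text{image}}\big(v(s)\big)
    + E_{\text{con}}\big(v(s)\big)
    \Big]\, ds
```

The first (internal) term penalizes stretching and bending of the curve, $E_{\text{image}}$ attracts the snake to image features such as edges, and $E_{\text{con}}$ encodes external constraint forces.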
Radiation measuring devices need a periodic calibration process to examine their sensitivity and the extent of their response. This study evaluates the radiation doses received by workers in the laboratories of the Directorate of Safety as a result of the use of point sources in calibrating the devices, in two ways: the first is direct measurement with the FAG device, and the second uses the RESRAD and RAD PRO programs. The total dose values measured with FAG were (2.57 μSv/y, 102.3 μSv/y and 20.75 μSv/y) for the TLD laboratory, the Gamma spectroscopy analyses (GSA) laboratory, and the equipment store, respectively, and the total doses calculated using RESRAD and RAD PRO were (1.518 μSv/y, 76.65 μSv/y and 21.2 μSv/y) for the above laboratories. t
A remarkable correlation between chaotic systems and cryptography has been established, based on sensitivity to initial states, unpredictability, and complex behavior. In one line of development, the stages of a chaotic stream cipher apply a discrete chaotic dynamic system to the generation of pseudorandom bits. Some of these generators are based on 1D chaotic maps and others on 2D ones. In the current study, a pseudorandom bit generator (PRBG) based on a new 2D chaotic logistic map is proposed that runs side-by-side and commences from random independent initial states. The structure of the proposed model consists of three components: a mouse input device, the proposed 2D chaotic system, and an initial permutation (IP) table. Statist
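The abstract does not give the exact form of the proposed 2D chaotic logistic map, so the following is only a minimal sketch of the general idea, using a common textbook 2D coupled logistic map and a simple comparison rule to extract bits; the map form, parameters, and initial states are all assumptions for illustration.

```python
# Sketch of a PRBG driven by a 2D coupled logistic map (assumed form, not
# the paper's actual map). Two trajectories run side-by-side from
# independent initial states; one bit is emitted per iteration.

def coupled_logistic(x, y, r=3.99, beta=0.05):
    """One iteration of a 2D coupled logistic map (illustrative form)."""
    xn = r * x * (1.0 - x) + beta * y * y
    yn = r * y * (1.0 - y) + beta * x * x
    return xn % 1.0, yn % 1.0  # keep both states inside [0, 1)

def prbg(x0, y0, n_bits, burn_in=1000):
    """Generate n_bits by comparing the two chaotic trajectories."""
    x, y = x0, y0
    for _ in range(burn_in):           # discard the transient phase
        x, y = coupled_logistic(x, y)
    bits = []
    for _ in range(n_bits):
        x, y = coupled_logistic(x, y)
        bits.append(1 if x > y else 0)  # comparison-based bit extraction
    return bits

bits = prbg(0.3612, 0.7245, 64)        # 64 pseudorandom bits
```

In the proposed model the initial states would come from the mouse input device rather than fixed constants, and the output stream would still need to pass statistical randomness tests before cryptographic use.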
Sustainable vegetative management plays a significant role in improving soil quality in degraded agricultural landscapes by enhancing soil microbial biomass. This study investigated the effects of grass buffers (GBs), biomass crops (BCs), grass waterways (GWWs), and agroforestry buffers (ABs) on soil microbial biomass and soil organic C (SOC) compared with continuous corn (
Wireless Body Area Sensor Networks (WBASNs) have garnered significant attention due to the implementation of self-automation and modern technologies. Within a healthcare WBASN, certain sensed data hold greater significance than others in light of their critical nature. Such vital data must be delivered within a specified time frame; data loss and delay cannot be tolerated in such systems. Intelligent algorithms are distinguished by their superior ability to interact with various data systems. Machine learning methods can analyze the gathered data and uncover previously unknown patterns and information. These approaches can also diagnose and report critical conditions in patients under monitoring. This study implements two s
This paper proposes a new encryption method. It combines two cipher algorithms, DES and AES, to generate hybrid keys. This combination strengthens the proposed W-method by generating highly randomized keys. The reliability of any encryption technique can be represented by two points. The first is key generation: our approach merges 64 bits of DES with 64 bits of AES to produce 128 bits as a root key for the remaining 15 keys. This complexity increases the level of the ciphering process. Moreover, it shifts the operation only one bit to the right. The second is the nature of the encryption process: it includes two keys and mixes one round of DES with one round of AES to reduce the processing time. The W-method deals with
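The key-merging step can be sketched as follows. This is a hypothetical reading of the abstract, assuming the 128-bit root key is the concatenation of the two 64-bit halves and that each of the 15 remaining keys is obtained by rotating the previous key one bit to the right; the function names and sample key values are illustrative only.

```python
# Hypothetical sketch of the W-method key schedule: 64 bits from a DES key
# and 64 bits from an AES key form a 128-bit root key, and the 15 remaining
# round keys are each derived by a one-bit right rotation of the previous key.

MASK128 = (1 << 128) - 1

def make_root_key(des_bits, aes_bits):
    """Concatenate two 64-bit values into a single 128-bit root key."""
    assert des_bits < (1 << 64) and aes_bits < (1 << 64)
    return (des_bits << 64) | aes_bits

def rotate_right_1(key):
    """Rotate a 128-bit key one bit to the right (wrap the low bit to the top)."""
    return ((key >> 1) | ((key & 1) << 127)) & MASK128

def round_keys(root, n=16):
    """Return the root key plus the 15 derived keys (16 in total)."""
    keys = [root]
    for _ in range(n - 1):
        keys.append(rotate_right_1(keys[-1]))
    return keys

keys = round_keys(make_root_key(0x0123456789ABCDEF, 0xFEDCBA9876543210))
```

Because a one-bit rotation is invertible and cycles after 128 applications, every round key preserves the full entropy of the root key.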
Social media platforms are known as detectors that can be used to measure the activities of users in the real world. However, the huge and unfiltered feed of messages posted on social media triggers social warnings, particularly when these messages contain hate speech towards a specific individual or community. The negative effect of these messages on individuals or society at large is of great concern to governments and non-governmental organizations. Word clouds provide a simple and efficient means of visually conveying the most common words in text documents. This research aims to develop a word cloud model based on hateful words in online social media environments such as Google News. Several steps are involved, including data acq
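The core of such a word cloud model is a frequency count of lexicon words over the collected text; the sketch below shows only that counting step, with a tiny placeholder lexicon and sample documents (the paper's actual lexicon, data source, and pipeline are not given in the abstract).

```python
# Minimal sketch of the frequency-counting stage of a hate-speech word
# cloud: tokenize each document, keep only words from a (placeholder)
# hate-speech lexicon, and count them. The counts would then drive the
# font sizes in the rendered word cloud.
from collections import Counter
import re

HATE_LEXICON = {"hate", "attack", "threat"}   # illustrative placeholder

def hateful_word_counts(documents):
    """Count occurrences of lexicon words across a list of documents."""
    counts = Counter()
    for doc in documents:
        tokens = re.findall(r"[a-z']+", doc.lower())
        counts.update(t for t in tokens if t in HATE_LEXICON)
    return counts

docs = ["They spread hate and more hate online.", "A threat was posted."]
freq = hateful_word_counts(docs)   # Counter({'hate': 2, 'threat': 1})
```

A real pipeline would add the data-acquisition, cleaning, and rendering stages around this counting step.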
In this article, we design an optimal neural network based on a new LM training algorithm. The traditional LM algorithm requires high memory, storage, and computational overhead because it must update the Hessian approximation in each iteration. The suggested design converts the original problem into a minimization problem, using a feed-forward network to solve non-linear 3D PDEs. An optimal design is also obtained by computing the learning parameters with high precision. Examples are provided to portray the efficiency and applicability of this technique. Comparisons with other designs are also conducted to demonstrate the accuracy of the proposed design.
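For context, a minimal sketch of one classical Levenberg-Marquardt (LM) iteration is shown below; it is not the article's modified algorithm, but it makes visible the damped Hessian approximation J^T J that must be rebuilt every iteration, which is the overhead the abstract refers to. The model y = a*exp(b*x), the data, and the damping schedule are all illustrative assumptions.

```python
# Classical LM on a toy nonlinear least-squares problem: fit y = a*exp(b*x).
# Each iteration forms J^T J (the Gauss-Newton Hessian approximation), damps
# it with mu*I, and solves for the parameter update.
import numpy as np

def residuals(p, x, y):
    a, b = p
    return y - a * np.exp(b * x)

def lm_fit(p, x, y, iters=50, mu=1e-2):
    """Classical LM with a simple adaptive damping parameter mu."""
    for _ in range(iters):
        a, b = p
        r = residuals(p, x, y)
        # Jacobian of the residuals with respect to (a, b)
        J = np.column_stack([-np.exp(b * x), -a * x * np.exp(b * x)])
        H = J.T @ J + mu * np.eye(2)          # damped Hessian approximation
        step = np.linalg.solve(H, -J.T @ r)   # LM parameter update
        if np.sum(residuals(p + step, x, y) ** 2) < np.sum(r ** 2):
            p, mu = p + step, mu * 0.5        # accept step, trust model more
        else:
            mu *= 2.0                         # reject step, increase damping
    return p

x = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(1.5 * x)                     # noiseless target: a=2, b=1.5
p = lm_fit(np.array([1.8, 1.4]), x, y)        # converges to about (2, 1.5)
```

For a network with n weights, H is n-by-n, which is exactly why storing and updating this approximation dominates memory for large models.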
Finding communities of connected individuals in complex networks is challenging, yet crucial for understanding different real-world societies and their interactions. Recently, attention has turned to discovering the dynamics of such communities. However, detecting accurate community structures that evolve over time adds additional challenges. Almost all state-of-the-art algorithms are designed on seemingly the same principle, treating the problem as a coupled optimization model that simultaneously identifies community structures and their evolution over time. Unlike these studies, the current work aims to consider each of three measures individually, i.e. intra-community score, inter-community score, and evolution of community over