Experimental activity coefficients at infinite dilution are particularly useful for calculating the parameters needed in an expression for the excess Gibbs energy. If reliable values of γ∞1 and γ∞2 are available, either from direct experiment or from a correlation, they can be used to evaluate the two adjustable constants in any desired expression for GE, which in turn makes it possible to predict the composition of the azeotrope and the vapor-liquid equilibrium over the entire composition range. In this study, two different methods, the MOSCED model and the SPACE model, were used to calculate γ∞1 and γ∞2.
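The abstract does not specify which GE expression the two constants belong to. As one common illustration (an assumption, not necessarily the model used in the study), the two Wilson parameters can be recovered from γ∞1 and γ∞2 through the standard infinite-dilution limits ln γ∞1 = -ln Λ12 + 1 - Λ21 and ln γ∞2 = -ln Λ21 + 1 - Λ12. The sketch below solves these two equations by fixed-point iteration using only the standard library:

```python
import math

def wilson_params(g1_inf, g2_inf, tol=1e-10, max_iter=500):
    """Solve for the Wilson parameters (L12, L21) from the
    infinite-dilution activity coefficients via the relations
        ln g1_inf = -ln L12 + 1 - L21
        ln g2_inf = -ln L21 + 1 - L12
    using simple fixed-point iteration."""
    L12, L21 = 0.5, 0.5  # initial guess
    for _ in range(max_iter):
        L12_new = math.exp(1.0 - L21 - math.log(g1_inf))
        L21_new = math.exp(1.0 - L12_new - math.log(g2_inf))
        if abs(L12_new - L12) < tol and abs(L21_new - L21) < tol:
            return L12_new, L21_new
        L12, L21 = L12_new, L21_new
    raise RuntimeError("fixed-point iteration did not converge")

def gamma_wilson(x1, L12, L21):
    """Activity coefficients of both components at mole fraction x1
    from the two-parameter Wilson equation."""
    x2 = 1.0 - x1
    s12 = x1 + L12 * x2
    s21 = x2 + L21 * x1
    term = L12 / s12 - L21 / s21
    g1 = math.exp(-math.log(s12) + x2 * term)
    g2 = math.exp(-math.log(s21) - x1 * term)
    return g1, g2
```

Once Λ12 and Λ21 are fixed, `gamma_wilson` gives activity coefficients at any composition, which is what allows the full vapor-liquid equilibrium curve and the azeotrope to be predicted from the two infinite-dilution values alone.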
The aim of this research is to investigate the effect of including crack incidence in the 2D numerical model of the masonry units and bonding mortar on the behavior of unreinforced masonry walls supporting a loaded reinforced concrete slab. The finite element method was implemented for the modeling and analysis of the walls. In this paper, ABAQUS, an FE package with an implicit solver, was used to model and analyze unreinforced masonry walls subjected to a vertical load. A detailed micro-modeling technique was used to model the masonry units, the mortar, and the unit-mortar interface separately. It was found that considering potential pure tensional cracks located vertically in the middle of the mortar and units show
The main aim of this paper is to study the punching shear behavior of reinforced concrete slabs exposed to fire, and the possibility of punching shear failure occurring as a result of fire and the slabs' inability to withstand the loads. Finite element simulation is used to predict the type of failure, the temperature distribution through the thickness of the slabs, the deformation, and the punching strength. Nonlinear transient thermal-structural finite element analyses under fire conditions are carried out with the ANSYS package. The validity of the modeling is checked against the mechanical and thermal properties of materials from earlier works in the literature to decrea
Four simply supported reinforced concrete (RC) beams were tested experimentally and analyzed using the extended finite element method (XFEM). This method is used to treat the discontinuities resulting from the fracture process and crack propagation that occur in concrete. The Meso-Scale Approach (MSA) was used to model concrete as a heterogeneous, three-phase material (coarse aggregate, mortar, and air voids in the cement paste). The coarse aggregate used in casting these beams was of rounded and crushed shape with a maximum size of 20 mm. The compressive strengths used in these beams are 17 MPa and 34 MPa, respectively. These RC beams are designed to fail in flexure when subjected to lo
The analysis of rigid pavements is a complex task for many reasons. First, the loading conditions include the repetition of parts of the applied loads (cyclic loads), which produces fatigue in the pavement materials. Additionally, the climatic conditions play an important role in the performance of the pavement, since the expansion or contraction induced by temperature differences may significantly change its supporting conditions. There is an extra difficulty because the pavement structure is made of completely different materials, such as concrete, steel, and soil, with problems related to their interfaces, such as contact or friction. Because of the problem's difficulty, the finite element simulation is
Big data of different types, such as texts and images, is rapidly generated from the internet and other applications. Dealing with this data using traditional methods is not practical, since it comes in various sizes and types and with different processing-speed requirements. Data analytics has therefore become an important tool, because it allows only meaningful information to be analyzed and extracted, which makes it essential for big data applications. This paper presents several innovative methods that use data analytics techniques to improve the analysis process and data management. Furthermore, this paper discusses how the revolution of data analytics based on artificial intelligence algorithms might provide
Methods of estimating statistical distributions have attracted many researchers when it comes to fitting a specific distribution to data. However, when the data belong to more than one component, a single standard distribution cannot be fitted to such data. To tackle this issue, mixture models are fitted by choosing the correct number of components that represent the data. This is evident in lifetime processes, which are involved in a wide range of engineering applications as well as biological systems. In this paper, we introduce an application of estimating a finite mixture of Inverse Rayleigh distributions in a Bayesian framework, treating the model with Markov chain Monte Carlo (MCMC) methods. We employed the Gibbs sampler and
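As a sketch of the kind of sampler this abstract describes (with hypothetical details: a Gamma(a, b) prior on each rate θj and a uniform Dirichlet prior on the mixture weights, which need not match the paper's choices), the Gibbs updates exploit the fact that if X follows an Inverse Rayleigh distribution with density f(x; θ) = (2θ/x³)exp(-θ/x²), then 1/X² is exponential with rate θ, so the gamma prior is conjugate:

```python
import math
import random

def ir_logpdf(x, theta):
    # Inverse Rayleigh log-density: f(x) = (2*theta/x**3) * exp(-theta/x**2)
    return math.log(2.0 * theta) - 3.0 * math.log(x) - theta / x**2

def gibbs_ir_mixture(data, k=2, n_iter=1000, a=1.0, b=1.0, seed=0):
    """Gibbs sampler for a k-component Inverse Rayleigh mixture.
    Returns posterior-mean rate estimates (sorted, as a simple guard
    against label switching)."""
    rng = random.Random(seed)
    theta = [rng.gammavariate(a, 1.0 / b) for _ in range(k)]
    pi = [1.0 / k] * k
    draws = []
    for it in range(n_iter):
        # 1) sample component labels z_i given theta and pi
        counts = [0] * k
        suff = [0.0] * k                 # per-component sum of 1/x^2
        for x in data:
            w = [pi[j] * math.exp(ir_logpdf(x, theta[j])) for j in range(k)]
            u = rng.random() * sum(w)
            z, acc = 0, w[0]
            while z < k - 1 and u > acc:
                z += 1
                acc += w[z]
            counts[z] += 1
            suff[z] += 1.0 / x**2
        # 2) sample weights pi given labels (Dirichlet via normalized gammas)
        g = [rng.gammavariate(1.0 + counts[j], 1.0) for j in range(k)]
        pi = [gj / sum(g) for gj in g]
        # 3) sample theta_j given labels and data (gamma conjugacy,
        #    since 1/X^2 ~ Exponential(theta))
        theta = [rng.gammavariate(a + counts[j], 1.0 / (b + suff[j]))
                 for j in range(k)]
        if it >= n_iter // 2:            # keep post-burn-in draws
            draws.append(sorted(theta))
    n = len(draws)
    return [sum(d[j] for d in draws) / n for j in range(k)]
```

Each iteration alternates three conditional draws: labels given parameters, weights given labels, and rates given labels; averaging the post-burn-in draws gives the Bayesian point estimates.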
A database is an organized, distributed collection of data arranged so that the user can access the stored data in a simple and convenient way. However, in the era of big data, traditional methods of data analytics may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique to handle big data distributed on the cloud. This approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear enhancement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r
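The abstract does not give the exact Map-Reduce jobs used. A minimal stand-in in plain Python (with a made-up per-channel averaging task loosely echoing EEG records, not the paper's actual pipeline) shows the map / shuffle / reduce stages that Hadoop distributes across nodes:

```python
from collections import defaultdict
from itertools import chain

def map_phase(record):
    """Map: emit (channel, (value, 1)) pairs from one (channel, value) record."""
    channel, value = record
    yield channel, (value, 1)

def shuffle(pairs):
    """Shuffle: group mapper output by key, as the framework would."""
    groups = defaultdict(list)
    for key, val in pairs:
        groups[key].append(val)
    return groups

def reduce_phase(key, values):
    """Reduce: combine partial (sum, count) pairs into a per-channel mean."""
    total = sum(v for v, _ in values)
    count = sum(c for _, c in values)
    return key, total / count

# Toy records: (EEG channel name, sample value)
records = [("C3", 1.0), ("C4", 2.0), ("C3", 3.0), ("C4", 6.0)]
mapped = chain.from_iterable(map_phase(r) for r in records)
result = dict(reduce_phase(k, vs) for k, vs in shuffle(mapped).items())
# result: {"C3": 2.0, "C4": 4.0}
```

The response-time gain reported above comes from the framework running many such map and reduce tasks in parallel over partitions of the data, not from the per-record logic itself.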
In this study, we briefly review the ARIMA(p, d, q), EWMA, and DLM (dynamic linear modelling) procedures in order to accommodate the autocorrelation structure of the data. We consider recursive estimation and prediction algorithms based on Bayes and Kalman filtering (KF) techniques for correlated observations. We investigate the MSE of these procedures and compare them using generated data.
Data compression offers an attractive approach to reducing communication costs by using the available bandwidth effectively. It therefore makes sense to pursue research on developing algorithms that can use the available network capacity most effectively. It is also important to consider security, since the data being transmitted is vulnerable to attacks. The basic aim of this work is to develop a module that combines compression and encryption on the same set of data, performing the two operations simultaneously. This is achieved by embedding encryption into compression algorithms, since both cryptographic ciphers and entropy coders bear a certain resemblance in the sense of secrecy. First, in the secure compression module, the given text is p
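The module above embeds encryption inside the compression stage itself; as a simplified stand-in (a hypothetical compress-then-encrypt pipeline, not the paper's scheme), the sketch below compresses with zlib and then XORs the result with a SHA-256-derived keystream. Compressing first matters, because well-encrypted output looks random and is effectively incompressible:

```python
import hashlib
import zlib

def keystream(key: bytes, n: int) -> bytes:
    """Toy pseudo-random keystream: SHA-256 over key||counter blocks."""
    out = bytearray()
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:n])

def compress_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Compress first, then XOR-encrypt the compressed stream."""
    compressed = zlib.compress(plaintext, level=9)
    ks = keystream(key, len(compressed))
    return bytes(a ^ b for a, b in zip(compressed, ks))

def decrypt_decompress(ciphertext: bytes, key: bytes) -> bytes:
    """Invert the pipeline: XOR with the same keystream, then decompress."""
    ks = keystream(key, len(ciphertext))
    return zlib.decompress(bytes(a ^ b for a, b in zip(ciphertext, ks)))
```

A production design would replace the toy XOR construction with an authenticated cipher; the sketch only illustrates why the two operations are ordered as they are.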
This study sought to investigate the impacts of big data, artificial intelligence (AI), and business intelligence (BI) on firms' e-learning and business performance in the Jordanian telecommunications industry. After the samples were checked, a total of 269 were collected. All of the information gathered throughout the investigation was analyzed using the PLS software. The results show that a network of interconnections can improve both e-learning and corporate effectiveness. This research concluded that the integration of big data, AI, and BI has a positive impact on e-learning infrastructure development and organizational efficiency. The findings indicate that big data has a positive and direct impact on business performance, including Big