Experimental activity coefficients at infinite dilution are particularly useful for calculating the parameters needed in an expression for the excess Gibbs energy. If reliable values of γ∞1 and γ∞2 are available, either from direct experiment or from a correlation, they can be used to evaluate the two adjustable constants in any desired expression for G^E, and it is then possible to predict the azeotropic composition and the vapor-liquid equilibrium over the entire composition range. In this study, two different methods, the MOSCED model and the SPACE model, were used to calculate γ∞1 and γ∞2.
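As an illustration of this use of γ∞1 and γ∞2 (not stated in the abstract, but a standard procedure), with the two-parameter van Laar expression for G^E the infinite-dilution activity coefficients fix both constants directly:

\[
A_{12} = \ln \gamma_1^{\infty}, \qquad A_{21} = \ln \gamma_2^{\infty}, \qquad
\frac{G^E}{RT} = \frac{A_{12}\,A_{21}\,x_1 x_2}{A_{12}\,x_1 + A_{21}\,x_2}.
\]

Analogous relations hold for other two-constant models such as the Margules and Wilson equations.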
A novel robust finite-time disturbance observer (RFTDO) based on an independent output finite-time composite control (FTCC) scheme is proposed for temperature and humidity regulation in an air-conditioning system. The variable air volume (VAV) system is represented by two first-order mathematical models for the temperature and humidity dynamics. In the temperature loop, an RFTDO for temperature (RFTDO-T) and an FTCC for temperature (FTCC-T) are designed to estimate and reject the lumped disturbances of the temperature subsystem. In the humidity loop, a robust output FTCC for humidity (FTCC-H) and an RFTDO for humidity (RFTDO-H) are likewise designed to estimate and reject the lumped disturbances of the humidity subsystem. Based on Lyapunov theory, …
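A rough sketch of the kind of first-order loop described above is given below. It uses a generic linear disturbance observer and a disturbance-cancelling composite control law, not the paper's RFTDO/FTCC design, and all plant parameters, gains, and the disturbance signal are assumptions chosen only for illustration.

```python
import numpy as np

# First-order temperature loop  dT/dt = -a*T + b*u + d(t)  with a lumped
# disturbance d(t), a basic linear disturbance observer, and a composite
# control law that cancels the disturbance estimate (illustrative values).
a, b = 0.1, 0.05            # assumed plant parameters
L_obs = 2.0                 # observer gain (assumption)
k = 0.5                     # feedback gain (assumption)
dt, T_ref = 0.1, 22.0       # time step and temperature set point (assumed)

T, d_hat, z = 28.0, 0.0, 0.0   # state, disturbance estimate, observer state
for step in range(3000):
    t = step * dt
    d = 0.3 * np.sin(0.01 * t) + 0.2              # unknown lumped disturbance
    # composite control: nominal feedback plus disturbance cancellation
    u = (-k * (T - T_ref) + a * T - d_hat) / b
    # plant update (explicit Euler)
    T += dt * (-a * T + b * u + d)
    # linear disturbance observer:  d_hat = z + L*T,  z' = -L*(-a*T + b*u + d_hat)
    z += dt * (-L_obs * (-a * T + b * u + d_hat))
    d_hat = z + L_obs * T

print(f"T = {T:.2f}, d_hat = {d_hat:.3f}, true d = {d:.3f}")
```

The estimate d_hat converges to the slowly varying disturbance, so the temperature settles to the set point despite the unmodelled load.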
This work is concerned with building a three-dimensional (3D) ab initio model capable of predicting the thermal distribution of laser direct joining processes between Polymethylmethacrylate (PMMA) and stainless steel 304 (st.st.304). ANSYS® simulation based on finite element analysis (FEA) was implemented for joining the materials in two modes: laser transmission joining (LTJ) and conduction joining (CJ). The ANSYS® simulator was used to explore the thermal environment of the joints during joining (heating time) and after joining (cooling time). For both modes, the investigation was carried out when the laser spot is at the middle of the joint width, 15 mm from the commencement point (joint edge), at a traveling time of 3.75 s. Process parameters …
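For context, thermal FEA models of this kind typically solve the transient heat-conduction equation with a laser heat-source term (the specific ANSYS® settings of the study are not reproduced here):

\[
\rho c_p \frac{\partial T}{\partial t} = \nabla \cdot \left( k\,\nabla T \right) + Q(x,y,z,t),
\]

where ρ, c_p, and k are the density, specific heat, and thermal conductivity of PMMA or st.st.304 in the corresponding region, and Q is the volumetric heat input from the moving laser spot.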
A database is an organized and distributed collection of data that allows the user to access the stored data in a simple and convenient way. However, in the era of big data, traditional data-analytics methods may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique to handle big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear enhancement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r…
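A minimal sketch of the Map-Reduce pattern the work builds on is shown below. It is an in-process illustration rather than the Hadoop job used in the study; the per-channel averaging task, channel names, and sample values are assumptions made only to show the map, shuffle, and reduce phases.

```python
from collections import defaultdict

# Toy dataset: (EEG channel, amplitude) records to be averaged per channel.
records = [("C3", 1.2), ("C4", 0.8), ("C3", 1.6), ("Cz", 0.5), ("C4", 1.0)]

def mapper(record):
    channel, value = record
    yield channel, (value, 1)            # emit partial sum and count

def reducer(channel, values):
    total = sum(v for v, _ in values)
    count = sum(c for _, c in values)
    return channel, total / count        # mean amplitude for this channel

# Shuffle/sort phase: group mapper output by key.
shuffled = defaultdict(list)
for record in records:
    for key, value in mapper(record):
        shuffled[key].append(value)

results = dict(reducer(k, v) for k, v in shuffled.items())
print(results)   # {'C3': 1.4, 'C4': 0.9, 'Cz': 0.5}
```

On a Hadoop cluster the mapper and reducer run in parallel across the distributed data, which is where the reported reduction in response time comes from.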
In the lifetime processes of some systems, most data cannot be attributed to a single population; in fact, they may represent several subpopulations. In such a case, a single known distribution cannot be used to model the data. Instead, a mixture of distributions is used to model the data and classify them into several subgroups. The mixture of Rayleigh distributions is well suited to lifetime processes. This paper aims to infer the model parameters by the expectation-maximization (EM) algorithm through the maximum likelihood function. The technique is applied to simulated data following several scenarios. The accuracy of estimation was examined by the average mean square error (AMSE) and the average classification success rate (ACSR). …
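A minimal sketch of the EM recursion for a Rayleigh mixture follows; it uses an illustrative two-component example with assumed scales and sample sizes, not the paper's simulation scenarios.

```python
import numpy as np

# EM for a two-component Rayleigh mixture, f(x; s) = (x/s^2) exp(-x^2 / (2 s^2)).
rng = np.random.default_rng(0)
x = np.concatenate([rng.rayleigh(1.0, 400), rng.rayleigh(3.0, 600)])

def rayleigh_pdf(x, sigma):
    return (x / sigma**2) * np.exp(-x**2 / (2 * sigma**2))

pi = np.array([0.5, 0.5])        # mixing proportions (initial guess)
sigma = np.array([0.5, 2.0])     # Rayleigh scale parameters (initial guess)

for _ in range(200):
    # E-step: responsibility of each component for each observation
    dens = np.vstack([p * rayleigh_pdf(x, s) for p, s in zip(pi, sigma)])
    resp = dens / dens.sum(axis=0)
    # M-step: closed-form maximum-likelihood updates
    pi = resp.mean(axis=1)
    sigma = np.sqrt((resp * x**2).sum(axis=1) / (2 * resp.sum(axis=1)))

labels = resp.argmax(axis=0)     # classify observations into subgroups
print("pi =", pi.round(3), "sigma =", sigma.round(3))
```

The classification step (the argmax of the responsibilities) is what the ACSR criterion evaluates, while the AMSE compares the estimated parameters with the true simulated values.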
In this study, we briefly review the ARIMA (p, d, q), EWMA, and DLM (dynamic linear modelling) procedures in order to accommodate the autocorrelation (AC) structure of the data. We consider recursive estimation and prediction algorithms based on Bayesian and Kalman filtering (KF) techniques for correlated observations. We investigate the effect of these procedures on the MSE and compare them using generated data.
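A minimal sketch of the Kalman-filter recursions for the simplest DLM, the local-level model, is given below; the noise variances and prior are illustrative values, not the models compared in the study.

```python
import numpy as np

# Local-level DLM:  y_t = theta_t + v_t,   theta_t = theta_{t-1} + w_t.
rng = np.random.default_rng(1)
n, V, W = 100, 1.0, 0.1                              # obs. and state variances
theta = np.cumsum(rng.normal(0, np.sqrt(W), n))      # latent level
y = theta + rng.normal(0, np.sqrt(V), n)             # autocorrelated observations

m, C = 0.0, 10.0                       # prior mean and variance
filtered = []
for t in range(n):
    a, R = m, C + W                    # prediction of the state
    f, Q = a, R + V                    # one-step-ahead forecast
    K = R / Q                          # Kalman gain
    m = a + K * (y[t] - f)             # posterior mean (recursive update)
    C = R - K * R                      # posterior variance
    filtered.append(m)

mse = np.mean((np.array(filtered) - theta) ** 2)
print(f"filtering MSE = {mse:.3f}")
```

The same recursion, with larger state vectors, covers the DLM representations of ARIMA and EWMA, which is what allows the MSE comparison on generated data.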
Data compression offers an attractive approach to reducing communication costs by using the available bandwidth effectively, so it makes sense to pursue research on algorithms that use the available network capacity most effectively. It is also important to consider security, since the data being transmitted are vulnerable to attacks. The basic aim of this work is to develop a module that combines compression and encryption of the same set of data, performing the two operations simultaneously. This is achieved by embedding encryption into compression algorithms, since cryptographic ciphers and entropy coders bear a certain resemblance in the sense of secrecy. First, in the secure compression module, the given text is p…
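For contrast, a plain sequential compress-then-encrypt pipeline is sketched below. The abstract's module embeds encryption inside the entropy coder itself, which is not reproduced here; the hash-based XOR keystream is a toy used only to keep the example self-contained and is not a secure cipher.

```python
import zlib, hashlib, os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy keystream from SHA-256 in counter mode (illustration only).
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def secure_compress(text: bytes, key: bytes) -> bytes:
    compressed = zlib.compress(text, level=9)          # entropy-style compression
    nonce = os.urandom(16)
    ct = bytes(a ^ b for a, b in zip(compressed,
                                     keystream(key, nonce, len(compressed))))
    return nonce + ct                                  # transmit nonce with data

def secure_decompress(blob: bytes, key: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    compressed = bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))
    return zlib.decompress(compressed)

key = b"shared-secret-key"
message = b"compression and encryption applied to the same data " * 10
blob = secure_compress(message, key)
assert secure_decompress(blob, key) == message
print(f"original {len(message)} bytes -> transmitted {len(blob)} bytes")
```

Note that compression must precede encryption, because ciphertext has near-uniform statistics and would leave an entropy coder nothing to exploit; the embedded approach described in the abstract avoids the two separate passes over the data.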
In this paper, a least-squares group finite element method for solving the coupled Burgers' problem in 2-D is presented. A fully discrete formulation of the least-squares finite element method is analyzed: the backward-Euler scheme is used for the time variable, and the discretization with respect to the space variables uses biquadratic quadrangular elements with nine nodes per element. The continuity, ellipticity, stability condition, and error estimate of the least-squares group finite element method are proved, and the theoretical results establish the error estimate of the method. The numerical results are compared with the exact solution and other available literature for the convection-dominated case to illustrate the efficiency …
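For reference, the 2-D coupled Burgers' system usually written in this setting (assumed here, since the abstract does not state the equations) is

\[
\begin{aligned}
u_t + u\,u_x + v\,u_y &= \nu\,(u_{xx} + u_{yy}),\\
v_t + u\,v_x + v\,v_y &= \nu\,(v_{xx} + v_{yy}),
\end{aligned}
\]

where ν is the viscosity (small ν giving the convection-dominated case). The backward-Euler scheme replaces u_t by (u^{n+1} - u^n)/Δt and evaluates the remaining terms at the new time level t_{n+1}.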
Seepage occurs under or inside hydraulic structures, or where they come into contact with their abutments, under the pressure caused by the difference in water level between the upstream (U/S) and downstream (D/S) sides of the structure. This paper models seepage analysis for the Kongele earth dam, given its importance in providing water for agricultural projects and in supporting the tourism sector. For this purpose, analyses were carried out to study seepage through the dam under various conditions. Using the finite element method in the Geo-Studio computer program, the dam was analysed in its actual design with the SEEP/W 2018 program. Several analyses were performed to study the seepage across the Kongele dam …
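For reference, SEEP/W solves a two-dimensional flow equation of the general form shown below (the boundary conditions and material functions of the Kongele model are not reproduced here):

\[
\frac{\partial}{\partial x}\!\left(k_x \frac{\partial H}{\partial x}\right)
+ \frac{\partial}{\partial y}\!\left(k_y \frac{\partial H}{\partial y}\right)
+ Q = \frac{\partial \theta}{\partial t},
\]

where H is the total head, k_x and k_y are the hydraulic conductivities, Q is an applied boundary flux, and θ is the volumetric water content; for steady-state seepage the right-hand side is zero.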
Two-dimensional meso-scale concrete modeling was used in the finite element analysis of a plain concrete beam subjected to bending. Plane-stress 4-noded quadrilateral elements were utilized to model the coarse aggregate and the cement mortar. The effects of the aggregate fraction distribution and of the pore percentage of the total area, resulting from air voids entrapped in the concrete during placement, on the flexural behavior of the plain concrete beam were investigated. Aggregate size fractions were randomly distributed across the profile area of the beam. The Extended Finite Element Method (XFEM) was employed to treat the discontinuity problems arising from the two phases of the concrete and from the cracking encountered during the finite element analysis of the concrete beam. Crack …
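For context, the standard XFEM displacement approximation with Heaviside enrichment used to represent such discontinuities (the study's specific enrichment choices are not stated in the abstract) is

\[
\mathbf{u}(\mathbf{x}) = \sum_{i \in I} N_i(\mathbf{x})\,\mathbf{u}_i
+ \sum_{j \in J} N_j(\mathbf{x})\,H(\mathbf{x})\,\mathbf{a}_j,
\]

where N_i are the standard shape functions, H is the Heaviside function across the crack or material interface, and a_j are the enriched degrees of freedom of the nodes whose support is cut; crack-tip (branch) enrichment terms can be added for the element containing the tip.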