Experimental activity coefficients at infinite dilution are particularly useful for calculating the parameters needed in an expression for the excess Gibbs energy. If reliable values of γ∞1 and γ∞2 are available, either from direct experiment or from a correlation, they can be used to evaluate the two adjustable constants in any desired expression for GE, making it possible to predict the azeotropic composition and vapor-liquid equilibrium over the entire composition range. In this study, the MOSCED and SPACE models were used as two different methods to calculate γ∞1 and γ∞2.
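For illustration of how two infinite-dilution values fix a two-constant GE model (a standard textbook relation, not a result specific to this study), consider the two-parameter Margules expression; at infinite dilution its constants reduce directly to the logarithms of γ∞1 and γ∞2:

\frac{G^{E}}{RT} = x_1 x_2\,(A_{21}x_1 + A_{12}x_2),\qquad
\ln\gamma_1 = x_2^{2}\,[A_{12} + 2(A_{21}-A_{12})x_1],\qquad
\ln\gamma_2 = x_1^{2}\,[A_{21} + 2(A_{12}-A_{21})x_2],

so that A_{12} = \ln\gamma_1^{\infty} and A_{21} = \ln\gamma_2^{\infty}. With the two constants fixed, the activity coefficients, and hence the azeotropic composition and the vapor-liquid equilibrium, can be computed over the whole composition range.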
The communication media have formed an essential foundation for influencing individuals and recipients, whether negatively or positively, through the messages they publish and present, with multiple themes and viewpoints covering all parts of the world and all age groups. Such content is directed at children across the various stages of childhood and serves many goals, including those pursued through the digital use of educational data in television production, which is considered an intellectual and mental vehicle for delivering ideas and expressive and aesthetic connotations to children, where songs and cartoons carry educational content within adjacent relations and in a mutual direction …
Data mining is one of the most popular analysis methods in medical research. It involves finding previously unknown patterns and correlations in datasets. Data mining encompasses various areas of biomedical research, including data collection, clinical decision support, illness or safety monitoring, public health, and inquiry research. Health analytics frequently uses computational methods for data mining, such as clustering, classification, and regression. Studies of large numbers of diverse, heterogeneous documents, including biological and electronic information, have provided extensive material for medical and health research.
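As a minimal illustration of the kind of computational methods mentioned above (the dataset, model, and library choices are assumptions for illustration, not taken from the study), the sketch below fits a classifier to a public medical dataset with scikit-learn:

# Minimal sketch (assumed example, not from the study): classification on a
# public medical dataset with scikit-learn.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

X, y = load_breast_cancer(return_X_y=True)            # 569 samples, 30 features
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))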
Characterization of a heterogeneous reservoir is a complex representation and evaluation of petrophysical properties, and the application of porosity-permeability relationships within the framework of hydraulic flow units is used to estimate permeability in un-cored wells. The hydraulic flow unit (HFU) technique divides the reservoir laterally and vertically into zones, each of which can be managed, controls fluid flow within the flow unit, and differs considerably from the other flow units throughout the reservoir. Each flow unit can be distinguished by applying the flow zone indicator (FZI) method. Supporting the relationship between porosity and permeability by using the flow zone indicator is …
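For context, the standard flow-zone-indicator relations commonly used in HFU analysis are given below (quoted as background, with permeability k in mD and effective porosity φe as a fraction; the study's exact workflow may differ):

RQI = 0.0314\sqrt{\frac{k}{\phi_e}},\qquad
\phi_z = \frac{\phi_e}{1-\phi_e},\qquad
FZI = \frac{RQI}{\phi_z},

which rearranges to the permeability predictor k = 1014\,FZI^{2}\,\dfrac{\phi_e^{3}}{(1-\phi_e)^{2}}, so a mean FZI assigned to each flow unit allows permeability to be estimated from porosity logs in un-cored wells.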
The purpose of this research is to find the estimator of the average proportion of defectives based on attribute samples that have been curtailed, either with rejection of a lot on finding the kth defective or with acceptance on finding the kth non-defective.
The maximum likelihood estimator (MLE) is derived, and the average sample number (ASN) in single curtailed sampling is also derived, for which a simplified formula is obtained. All the notation needed is explained.
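A minimal numerical sketch of the rejection case (an illustrative assumption, not the paper's derivation): when inspection is curtailed at the kth defective, the number of items inspected follows a negative binomial distribution, and maximizing that likelihood numerically reproduces the familiar closed-form estimate k/n.

# Minimal sketch (illustrative assumption, not the paper's derivation):
# inspection curtailed at the k-th defective after n items in total.
from scipy.optimize import minimize_scalar
from scipy.stats import nbinom

k, n = 3, 40          # hypothetical: 3rd defective found on the 40th item

def neg_log_lik(p):
    # nbinom counts non-defectives observed before the k-th defective
    return -nbinom.logpmf(n - k, k, p)

res = minimize_scalar(neg_log_lik, bounds=(1e-6, 1 - 1e-6), method="bounded")
print("numerical MLE:", res.x, " closed form k/n:", k / n)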
This paper deals with finding the approximate solution of a nonlinear parabolic boundary value problem (NLPBVP) by using the Galerkin finite element method (GFEM) in space and the Crank-Nicolson (CN) scheme in time; the problem then reduces to solving a Galerkin nonlinear algebraic system (GNLAS). The predictor-corrector technique (PCT) is applied here to solve the GNLAS by transforming it into a Galerkin linear algebraic system (GLAS). This GLAS is solved once using the Cholesky method (CHM), as it appears in the MATLAB package, and once again using the Cholesky reduction order technique (CHROT), which we employ here to save a massive amount of time. The results for CHROT are given in tables and figures and show …
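For context, a generic Crank-Nicolson Galerkin step for a model nonlinear parabolic problem u_t - ∇·(a(u)∇u) = f, with a predictor-corrector linearization, can be written as follows (a standard formulation given only as an illustration; the paper's exact problem and linearization may differ):

\Big(\frac{u_h^{n+1}-u_h^{n}}{\Delta t},\,v_h\Big)
+\Big(a(\tilde u_h)\,\nabla\frac{u_h^{n+1}+u_h^{n}}{2},\,\nabla v_h\Big)
=\big(f^{\,n+1/2},\,v_h\big)\qquad\forall\, v_h\in V_h,

where the predictor \tilde u_h (for example \tilde u_h = u_h^{n}, corrected afterwards using the newly computed u_h^{n+1}) freezes the nonlinear coefficient. Each corrector pass is then a linear system with a symmetric positive definite matrix, which is what makes a Cholesky-type solver applicable at every time step.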
Merging biometrics with cryptography has become more familiar, and a great scientific field has been born for researchers. Biometrics adds a distinctive property to security systems, since biometric features are unique and individual for every person. In this study, a new method is presented for ciphering data based on fingerprint features. The research is done by placing the plaintext message, according to the positions of minutiae extracted from a fingerprint, into a generated random text file, regardless of the size of the data. The proposed method can be explained in three scenarios. In the first scenario, the message is placed inside the random text directly at the minutiae positions; in the second scenario, the message is encrypted with a chosen word before ciphering …
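A minimal sketch of the first scenario as described above (the minutiae coordinates, buffer size, and position mapping are illustrative assumptions, not the paper's exact procedure): message characters are written into a random text buffer at indices derived from minutiae (x, y) positions, and can be read back from the same positions.

# Minimal sketch (illustrative assumptions, not the paper's exact method):
# hide message characters in a random text buffer at indices derived from
# fingerprint minutiae coordinates.
import random
import string

def embed(message, minutiae, buffer_len=4096, width=256):
    """Place message[i] at an index derived from the i-th minutia (x, y)."""
    buf = [random.choice(string.ascii_letters) for _ in range(buffer_len)]
    positions = []
    for ch, (x, y) in zip(message, minutiae):
        idx = (y * width + x) % buffer_len   # assumed mapping from (x, y) to an index
        buf[idx] = ch
        positions.append(idx)
    return "".join(buf), positions

def extract(buf, positions):
    return "".join(buf[i] for i in positions)

minutiae = [(12, 40), (87, 3), (55, 21), (9, 93), (70, 70)]   # hypothetical points
cover, pos = embed("hello", minutiae)
assert extract(cover, pos) == "hello"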
This paper considers the maximum number of weekly cases and deaths caused by the COVID-19 pandemic in Iraq from its outbreak in February 2020 until the first of July 2022. Some probability distributions were fitted to the data. Maximum likelihood estimates were obtained and goodness-of-fit tests were performed. Results revealed that the maximum weekly cases were best fitted by the Dagum distribution, which was accepted by three goodness-of-fit tests. The generalized Pareto distribution best fitted the maximum weekly deaths and was also accepted by the goodness-of-fit tests. The statistical analysis was carried out using the Easy-Fit software and Microsoft Excel 2019.
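As a minimal illustration of this kind of fitting (placeholder data, not the study's data; the study itself used Easy-Fit and Excel), scipy can fit a generalized Pareto distribution by maximum likelihood and check the fit with a Kolmogorov-Smirnov test:

# Minimal sketch (placeholder data; the study used Easy-Fit and Excel):
# ML fit of a generalized Pareto distribution and a Kolmogorov-Smirnov test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
weekly_max_deaths = stats.genpareto.rvs(c=0.2, loc=50, scale=120, size=120, random_state=rng)

c, loc, scale = stats.genpareto.fit(weekly_max_deaths)            # MLE
ks = stats.kstest(weekly_max_deaths, "genpareto", args=(c, loc, scale))
print(f"shape={c:.3f} loc={loc:.1f} scale={scale:.1f}  KS p-value={ks.pvalue:.3f}")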
Secure data communication across networks is always threatened by intrusion and abuse. A network Intrusion Detection System (IDS) is a valuable tool for in-depth defense of computer networks. Most research and applications in the field of intrusion detection systems have been built on analysing several datasets containing attack types using batch machine-learning classification. The present study presents an intrusion detection system based on data stream classification. Several data stream algorithms were applied to the CICIDS2017 dataset, which contains several new types of attacks. The results were evaluated to choose the best algorithm that satisfies high accuracy and low computation time.
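A minimal sketch of data stream classification in the spirit described above (the river library, the Hoeffding tree model, and the toy feature dictionaries are illustrative assumptions, not necessarily the tools or features the study used): each record is learned one at a time, as a stream, rather than in a batch.

# Minimal sketch (illustrative assumptions, not necessarily the study's tools):
# incremental (stream) classification with a Hoeffding tree from the river library.
from river import tree, metrics

model = tree.HoeffdingTreeClassifier()
metric = metrics.Accuracy()

# Toy stream of (features, label) records standing in for CICIDS2017 flows.
stream = [
    ({"duration": 0.2, "pkts": 10,  "bytes": 1200},  "BENIGN"),
    ({"duration": 9.7, "pkts": 900, "bytes": 65000}, "DoS"),
    ({"duration": 0.1, "pkts": 8,   "bytes": 900},   "BENIGN"),
    ({"duration": 8.9, "pkts": 850, "bytes": 61000}, "DoS"),
]

for x, y in stream:
    y_pred = model.predict_one(x)        # test-then-train evaluation
    if y_pred is not None:
        metric.update(y, y_pred)
    model.learn_one(x, y)

print(metric)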
In the last two decades, networks have changed according to the rapid changes in their requirements. Current Data Center Networks have a large number of hosts (tens to thousands) with special bandwidth needs, as cloud networking and multimedia content computing increase. Conventional Data Center Networks (DCNs) are strained by the increased number of users and bandwidth requirements, which in turn exposes many implementation limitations. Current networking devices, with their coupled control and forwarding planes, result in network architectures that are not suitable for dynamic computing and storage needs. Software Defined Networking (SDN) is introduced to change this notion of traditional networks by decoupling control and …
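As a purely conceptual sketch of the decoupling that SDN introduces (hypothetical names only; no real controller API is used here): the forwarding element keeps only a match-action flow table, while a logically separate controller decides policy and installs the rules.

# Conceptual sketch (hypothetical names, no real SDN controller API):
# SDN separates the control plane (rule decisions) from the data plane (table lookups).

class Switch:
    """Data plane: forwards packets only by looking up an installed flow table."""
    def __init__(self):
        self.flow_table = {}                       # match -> action

    def install_rule(self, match, action):         # called by the controller
        self.flow_table[match] = action

    def forward(self, packet):
        return self.flow_table.get(packet["dst"], "send_to_controller")

class Controller:
    """Control plane: holds policy and pushes rules down to switches."""
    def __init__(self, policy):
        self.policy = policy

    def handle_miss(self, switch, packet):
        action = self.policy.get(packet["dst"], "drop")
        switch.install_rule(packet["dst"], action)
        return action

sw, ctrl = Switch(), Controller(policy={"10.0.0.2": "port_2"})
pkt = {"dst": "10.0.0.2"}
if sw.forward(pkt) == "send_to_controller":         # first packet: table miss
    ctrl.handle_miss(sw, pkt)
print(sw.forward(pkt))                              # subsequent packets: port_2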