Experimental activity coefficients at infinite dilution are particularly useful for calculating the parameters needed in an expression for the excess Gibbs energy. If reliable values of γ∞1 and γ∞2 are available, either from direct experiment or from a correlation, they can be used to evaluate the two adjustable constants in any desired expression for G E, and hence to predict the azeotropic composition and vapor-liquid equilibrium over the entire composition range. In this study, two different methods, the MOSCED model and the SPACE model, were used to calculate γ∞1 and γ∞2.
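As a minimal sketch of how the two infinite-dilution values fix a two-constant G E model, the van Laar equations can be fitted directly, since ln γ∞1 = A12 and ln γ∞2 = A21 in that model. The numerical γ∞ values below are illustrative, not taken from the study:

```python
import math

def van_laar_constants(gamma1_inf, gamma2_inf):
    """Two-constant van Laar model: at infinite dilution,
    ln(gamma1_inf) = A12 and ln(gamma2_inf) = A21."""
    return math.log(gamma1_inf), math.log(gamma2_inf)

def van_laar_gammas(x1, A12, A21):
    """Activity coefficients at mole fraction x1 of a binary mixture."""
    x2 = 1.0 - x1
    ln_g1 = A12 * (A21 * x2 / (A12 * x1 + A21 * x2)) ** 2
    ln_g2 = A21 * (A12 * x1 / (A12 * x1 + A21 * x2)) ** 2
    return math.exp(ln_g1), math.exp(ln_g2)

# Illustrative (not measured) infinite-dilution values:
A12, A21 = van_laar_constants(2.5, 3.0)
g1, g2 = van_laar_gammas(0.3, A12, A21)
```

Sweeping x1 from 0 to 1 with these two constants then yields γ1 and γ2 over the whole composition range, which is all that is needed to locate an azeotrope.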
Background: Type 2 diabetes mellitus (T2DM) is characterized by insulin resistance (IR) and a progressive decline in functional beta (β) cell mass, partly due to an increased rate of β-cell apoptosis. Pancreatic stone protein/regenerating protein (PSP/reg) is produced mainly by the pancreas and is drastically elevated during pancreatic disorders. β cells undergoing apoptosis stimulate expression of the PSP/reg gene in surviving neighboring cells, and the PSP/reg protein subsequently secreted from these cells may play a role in their regeneration.
Objectives: To analyze serum levels of PSP/reg protein in T2DM patients and to evaluate its correlation with microvascular complications.
The research uses data produced by the Local Urban Management Directorate in Najaf and imagery from the Landsat 9 satellite, processed with GIS tools. The research follows a descriptive and analytical approach; we integrated Markov chain analysis with a cellular automata approach to predict transformations in the city's structure resulting from changes in land use. The research also aims to identify post-classification change-detection approaches in order to determine changes in land use. To predict future land use in the city of Kufa and to evaluate data accuracy, we used the Kappa indicator to determine the potential applicability of the probability matrix that resulted from
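The two quantitative ingredients mentioned above can be sketched compactly: projecting land-use shares forward with a Markov transition matrix, and scoring classification accuracy with Cohen's Kappa. The class names, shares, and transition probabilities below are hypothetical, not the study's values:

```python
def project_land_use(state, transitions, steps=1):
    """Project land-use area fractions forward with a Markov
    transition matrix: next[j] = sum_i state[i] * P[i][j]."""
    for _ in range(steps):
        state = [sum(state[i] * transitions[i][j] for i in range(len(state)))
                 for j in range(len(state))]
    return state

def kappa(confusion):
    """Cohen's Kappa from a square confusion matrix
    (observed agreement corrected for chance agreement)."""
    n = sum(sum(row) for row in confusion)
    po = sum(confusion[i][i] for i in range(len(confusion))) / n
    pe = sum(sum(confusion[i]) * sum(row[i] for row in confusion)
             for i in range(len(confusion))) / n ** 2
    return (po - pe) / (1 - pe)

# Hypothetical classes: built-up, agricultural, barren
state = [0.45, 0.35, 0.20]
P = [[0.90, 0.05, 0.05],
     [0.15, 0.80, 0.05],
     [0.10, 0.10, 0.80]]
projected = project_land_use(state, P, steps=2)
```

In a cellular-automata coupling, the same transition probabilities are applied per cell with neighborhood constraints rather than to aggregate shares.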
Empirical and statistical methodologies have been established to obtain accurate permeability identification and reservoir characterization based on rock type and reservoir performance. Rock facies are usually identified either by using core analysis to interpret lithofacies visually or indirectly from well-log data. Traditional facies prediction from well-log data is subject to uncertainty and can be time-consuming, particularly when working with large datasets. Machine learning can therefore be used to predict such patterns more efficiently on large data. Taking the electrofacies distribution into account, this work was conducted to predict permeability for the four wells, FH1, FH2, F
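The abstract does not specify which learner was used, so as a neutral illustration of log-based facies prediction, here is a tiny k-nearest-neighbours classifier over two hypothetical log features (gamma-ray, neutron porosity); all samples and labels are invented:

```python
import math
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """Classify a log sample by majority vote of its k nearest
    neighbours in feature space (Euclidean distance)."""
    dists = sorted((math.dist(x, xi), yi) for xi, yi in zip(train_X, train_y))
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Hypothetical labelled log samples: (gamma-ray API, neutron porosity)
train_X = [(20, 0.25), (25, 0.22), (80, 0.10), (90, 0.08), (85, 0.12)]
train_y = ["sand", "sand", "shale", "shale", "shale"]
facies = knn_predict(train_X, train_y, (22, 0.24))
```

A permeability model can then be fitted per predicted electrofacies class rather than to all samples at once.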
Malicious software (malware) performs a malicious function that compromises a computer system's security. Many methods have been developed to improve the security of computer system resources, among them firewalls, encryption, and Intrusion Detection Systems (IDS). An IDS can detect a newly unrecognized attack attempt and raise an early alarm to inform the system of the suspicious intrusion attempt. This paper proposes a hybrid IDS for intrusion detection, especially malware detection, that considers both network-packet and host features. The hybrid IDS is designed using Data Mining (DM) classification methods for their ability to detect new, previously unseen intrusions accurately and automatically. It uses both anomaly and misuse detection.
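The combination of misuse and anomaly detection can be sketched as follows; the field names, signature set, and threshold are assumptions for illustration, not the paper's design:

```python
def hybrid_ids(packet, signatures, baseline_rate, threshold=3.0):
    """Flag a packet if it matches a known malware signature (misuse
    detection) or if its traffic rate deviates strongly from the
    learned baseline (anomaly detection)."""
    if packet["payload_hash"] in signatures:
        return "misuse"
    if packet["rate"] > threshold * baseline_rate:
        return "anomaly"
    return "normal"

signatures = {"deadbeef"}  # known-bad payload hashes (illustrative)
verdict = hybrid_ids({"payload_hash": "cafebabe", "rate": 500.0},
                     signatures, baseline_rate=100.0)
```

The complementary strengths are visible even in this toy: the signature branch catches known malware exactly, while the baseline branch catches novel attacks that merely behave abnormally.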
Cryptography is the process of transforming a message to prevent unauthorized access to data. One of the main problems, and an important part of cryptography with secret-key algorithms, is the key, which plays a central role in achieving a higher level of secure communication. To increase the level of security in any communication, both parties must hold a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weakened by weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong; a stronger encryption key enhances the security of the Triple Data Encryption Standard algorithm. This paper proposes a combination of two efficient encryption algorithms to
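The excerpt does not say how the key is reconfigured, but one common way to avoid weak or repeated Triple DES keys is to derive three distinct subkeys from one master secret with a hash function. This is a hedged sketch of that idea, not the paper's scheme:

```python
import hashlib

def derive_3des_keys(master_key: bytes):
    """Derive three distinct 8-byte DES subkeys (K1, K2, K3) from one
    master key by hashing it with a per-key counter, so a weak or
    repeated user key never collapses into K1 == K2 == K3."""
    keys = []
    for i in range(3):
        digest = hashlib.sha256(master_key + bytes([i])).digest()
        keys.append(digest[:8])  # DES keys are 8 bytes (56 effective bits)
    return keys

k1, k2, k3 = derive_3des_keys(b"user secret")
```

With three independent subkeys, the effective keyspace of the EDE construction is preserved instead of degenerating to single DES.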
Longitudinal data are becoming increasingly common, especially in the medical and economic fields, and various methods have been developed to analyze this type of data.
In this research, the focus was on grouping and analyzing these data: cluster analysis plays an important role in identifying and grouping co-expressed sub-profiles over time, which are then fitted with the nonparametric smoothing cubic B-spline model. This model provides continuous first and second derivatives, yielding a smoother curve with fewer abrupt changes in slope; it is also more flexible and can capture more complex patterns and fluctuations in the data.
The longitudinal balanced data profile was compiled into subgroup
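As a sketch of the basis machinery behind a cubic B-spline smoother, the Cox-de Boor recursion evaluates the degree-k basis functions; the uniform knot vector below is illustrative. A key property, checked here, is that the cubic basis functions sum to one inside the valid interval, which is what makes the fitted curve a stable local average of the data:

```python
def bspline_basis(i, k, t, knots):
    """Cox-de Boor recursion for the i-th B-spline basis of degree k."""
    if k == 0:
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    left = 0.0
    if knots[i + k] != knots[i]:
        left = ((t - knots[i]) / (knots[i + k] - knots[i])
                * bspline_basis(i, k - 1, t, knots))
    right = 0.0
    if knots[i + k + 1] != knots[i + 1]:
        right = ((knots[i + k + 1] - t) / (knots[i + k + 1] - knots[i + 1])
                 * bspline_basis(i + 1, k - 1, t, knots))
    return left + right

# Uniform knot vector; 5 cubic (degree-3) basis functions live on it
knots = [0, 1, 2, 3, 4, 5, 6, 7, 8]
value = sum(bspline_basis(i, 3, 4.5, knots) for i in range(len(knots) - 4))
```

A smoothing fit then chooses coefficients for these basis functions by penalized least squares; being piecewise cubic, the result has the continuous first and second derivatives noted above.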
Underwater Wireless Sensor Networks (UWSNs) have emerged as a promising technology for a wide range of ocean monitoring applications. UWSNs face the unique challenges of the underwater environment, such as a dynamic and sparse network topology, which can easily lead to a partitioned network. This results in hotspot formation and the absence of a routing path from source to destination. Therefore, to optimize network lifetime and limit the possibility of hotspot formation along the data transmission path, a traffic-aware protocol needs to be designed. In this research, we propose a traffic-aware routing protocol called PG-RES, which is predicated on the ideas of a Pressure Gradient and a RESistance concept. The proposed
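The internals of PG-RES are not given in this excerpt, but the resistance metaphor maps naturally onto least-cost routing: if each link is weighted by a "resistance" that grows with congestion, traffic follows the path of least cumulative resistance and flows around forming hotspots. A generic sketch with a hypothetical topology:

```python
import heapq

def least_resistance_path(graph, source, sink):
    """Dijkstra over link 'resistances': the chosen route minimises
    cumulative resistance, steering flow away from congested
    (high-resistance) nodes."""
    dist, prev = {source: 0.0}, {}
    pq = [(0.0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == sink:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, r in graph.get(u, []):
            nd = d + r
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [], sink
    while node != source:
        path.append(node)
        node = prev[node]
    path.append(source)
    return list(reversed(path)), dist[sink]

# Hypothetical topology: resistance grows with node congestion
graph = {"S": [("A", 1.0), ("B", 4.0)],
         "A": [("D", 5.0)],
         "B": [("D", 1.0)]}
path, cost = least_resistance_path(graph, "S", "D")
```

Here the route avoids the congested relay A (total resistance 6) in favour of S-B-D (total resistance 5), illustrating how a resistance metric balances load.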
The advancements in Information and Communication Technology (ICT) within the previous decades have significantly changed how people transmit or store their information over the Internet or networks, so one of the main challenges is to keep this information safe against attacks. Many researchers and institutions have realized the importance and benefits of cryptography in achieving efficient and effective secure communication. This work adopts a novel technique for a secure data cryptosystem based on chaos theory. The proposed algorithm generates a 2-dimensional key matrix with the same dimensions as the original image, containing random numbers obtained from the 1-dimensional logistic chaotic map for given con
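The logistic-map key generation can be sketched in a few lines: iterate x_{k+1} = r·x_k·(1 − x_k), quantise each iterate to a byte, and combine the resulting key matrix with the image. The XOR combination, pixel values, and map parameters below are illustrative assumptions, not the paper's exact construction:

```python
def logistic_keystream(x0, r, n):
    """Generate n key bytes from the 1-D logistic map
    x_{k+1} = r * x_k * (1 - x_k), quantised to 0..255."""
    x, stream = x0, []
    for _ in range(n):
        x = r * x * (1.0 - x)
        stream.append(int(x * 256) % 256)
    return stream

def xor_image(pixels, key):
    """Combine each pixel with the chaotic key matrix (flattened)."""
    return [p ^ k for p, k in zip(pixels, key)]

# Illustrative parameters: r near 4 keeps the map in its chaotic regime
pixels = [12, 200, 33, 97]
key = logistic_keystream(x0=0.4137, r=3.9999, n=len(pixels))
cipher = xor_image(pixels, key)
plain = xor_image(cipher, key)  # XOR is its own inverse
```

The initial value x0 and control parameter r act as the secret key: tiny changes in either produce an entirely different keystream, which is the sensitivity property chaos-based ciphers rely on.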
Today, cloud computing plays a very prominent role in our day-to-day lives. The cloud computing paradigm makes it possible to provide resources on demand, and it has changed the way organizations manage resources due to its robustness, low cost, and pervasive nature. Data security is usually achieved using methods such as encryption. However, data privacy is another important challenge that should be considered when transporting, storing, and analyzing data in the public cloud. In this paper, a new method is proposed to track malicious users who use their private keys to decrypt data in a system and share it with others, causing system information leakage. Security policies are also considered to be int
In data mining, classification is a form of data analysis that can be used to extract models describing important data classes. Two well-known algorithms used in data mining classification are the Backpropagation Neural Network (BNN) and Naïve Bayesian (NB) classifiers. This paper investigates the performance of these two classification methods on the Car Evaluation dataset. A model was built for each algorithm and the results were compared. Our experimental results indicate that the BNN classifier yields higher accuracy than the NB classifier but is less efficient, because it is time-consuming and difficult to analyze due to its black-box nature.
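To make the NB side of the comparison concrete, here is a minimal categorical Naïve Bayes classifier with Laplace smoothing. The two-feature records mimic the style of the Car Evaluation attributes but are invented for illustration, and the smoothing denominator is a simplified choice:

```python
from collections import defaultdict, Counter

def train_nb(rows, labels):
    """Count class frequencies and per-class feature-value frequencies."""
    class_counts = Counter(labels)
    feat_counts = defaultdict(Counter)  # (feature_idx, class) -> value counts
    for row, y in zip(rows, labels):
        for j, v in enumerate(row):
            feat_counts[(j, y)][v] += 1
    return class_counts, feat_counts

def predict_nb(model, row):
    """Pick the class maximising P(class) * prod_j P(value_j | class),
    with add-one (Laplace) smoothing for unseen values."""
    class_counts, feat_counts = model
    total = sum(class_counts.values())
    best, best_p = None, -1.0
    for y, cy in class_counts.items():
        p = cy / total
        for j, v in enumerate(row):
            counts = feat_counts[(j, y)]
            p *= (counts[v] + 1) / (cy + len(counts) + 1)
        if p > best_p:
            best, best_p = y, p
    return best

# Toy car-style records: (buying price, safety) -> acceptability (illustrative)
rows = [("high", "low"), ("low", "high"), ("med", "high"), ("high", "med")]
labels = ["unacc", "acc", "acc", "unacc"]
model = train_nb(rows, labels)
pred = predict_nb(model, ("low", "high"))
```

Training is a single counting pass, which is why NB is so much faster than backpropagation; the trade-off, as the comparison above found, is the strong feature-independence assumption that can cost accuracy.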