This paper presents a simulation of the Linguistic Fuzzy Trust Model (LFTM) over oscillating Wireless Sensor Networks (WSNs), in which the goodness of the servers belonging to them may change over time. The outcomes achieved with the LFTM over oscillating WSNs are compared with those obtained by applying the model over static WSNs, where the servers always keep the same goodness, in terms of the selection percentage of trustworthy servers (the accuracy of the model) and the average path length. The paper also compares the LFTM with the Bio-inspired Trust and Reputation Model for Wireless Sensor Networks (BTRM-WSN) in terms of the accuracy and the average path length suggested by each model. Both models give quite good and accurate outcomes over oscillating WSNs. The evaluation environment used here is the Trust and Reputation Model Simulator for WSNs.
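The two evaluation metrics named above can be summarized directly from per-run simulation output. The following is a minimal sketch, assuming each run is reduced to a pair (whether the selected server was trustworthy, length of the path to it); these field names are illustrative and not part of the simulator's actual output format.

```python
from statistics import mean

def summarize_runs(runs):
    """runs: iterable of (selected_is_trustworthy, path_length) pairs,
    one per simulated transaction (hypothetical reduced format)."""
    runs = list(runs)
    accuracy = 100.0 * mean(1.0 if ok else 0.0 for ok, _ in runs)  # % of selections hitting a trustworthy server
    avg_path_length = mean(length for _, length in runs)           # average hops to the selected server
    return accuracy, avg_path_length

# Example: 3 of 4 selections hit a trustworthy server -> 75 % accuracy
print(summarize_runs([(True, 3), (True, 4), (False, 5), (True, 3)]))
```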
The need to create an optimal water quality management process has motivated researchers to pursue the development of prediction models. One widely used forecasting model is the seasonal autoregressive integrated moving average (SARIMA) model. In the present study, a SARIMA model was developed in R software to fit a time series of monthly fluoride content collected from six stations on the Tigris River for the period from 2004 to 2014. The adequate SARIMA model with the least Akaike's information criterion (AIC) and mean squared error (MSE) was found to be SARIMA (2,0,0)(0,1,1). The model parameters were identified and diagnosed to derive the forecasting equations at each selected location. The correlation coefficient ...
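The study fitted its models in R; as a rough illustration of the same model form, the sketch below fits a SARIMA (2,0,0)(0,1,1) structure in Python's statsmodels. The 12-month seasonal period and the synthetic example series are assumptions; the authors' fluoride data and R code are not reproduced here.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Placeholder monthly series standing in for a fluoride record (mg/L);
# the real study used 2004-2014 observations from six Tigris River stations.
idx = pd.date_range("2004-01-01", periods=132, freq="MS")
y = pd.Series(0.3 + 0.05 * np.sin(2 * np.pi * idx.month / 12)
              + np.random.default_rng(0).normal(0, 0.02, len(idx)), index=idx)

# SARIMA (2,0,0)(0,1,1) with an assumed 12-month seasonal period
model = SARIMAX(y, order=(2, 0, 0), seasonal_order=(0, 1, 1, 12))
fit = model.fit(disp=False)
print(fit.aic)                  # criterion used for model selection
print(fit.forecast(steps=12))   # 12-month-ahead forecast
```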
Introduction: Although the soap industry has been known for hundreds of years, the development that accompanied it has been limited, covering mainly the mechanical equipment and the additive materials needed to produce soap with the best specifications of shape and physical and chemical properties. Objectives: This research studies the use of the vacuum reactive distillation (VRD) technique for soap production. Methods: Olein and palmitin in a ratio of 3 to 1 were mixed in a flask with a stoichiometric amount of NaOH solution under different vacuum pressures from -0.35 to -0.5 bar. Total conversion was reached by using the VRD technique. The soap produced by the VRD method was compared with soap prepared by the reaction-only method, which ...
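Since saponification consumes three moles of NaOH per mole of triglyceride, the stoichiometric NaOH charge for the olein:palmitin feed can be estimated as below. This is only an illustrative sketch: it assumes the 3:1 ratio is molar, and the batch size and approximate molar masses of triolein and tripalmitin are not figures taken from the study.

```python
# Stoichiometric NaOH for an assumed 3:1 (molar) olein:palmitin feed.
# Saponification: 1 triglyceride + 3 NaOH -> 3 soap + glycerol.
M_TRIOLEIN = 885.4      # g/mol (approximate)
M_TRIPALMITIN = 807.3   # g/mol (approximate)
M_NAOH = 40.0           # g/mol

def naoh_required(total_triglyceride_mol, olein_fraction=0.75):
    """Return (feed mass in g, stoichiometric NaOH mass in g)."""
    n_olein = total_triglyceride_mol * olein_fraction
    n_palmitin = total_triglyceride_mol * (1 - olein_fraction)
    feed_mass = n_olein * M_TRIOLEIN + n_palmitin * M_TRIPALMITIN
    naoh_mass = 3 * total_triglyceride_mol * M_NAOH
    return feed_mass, naoh_mass

print(naoh_required(0.1))  # e.g. 0.1 mol mixed triglycerides -> ~86.6 g feed, 12 g NaOH
```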
In many scientific fields, Bayesian models are commonly used in recent research. This research presents a new Bayesian model for parameter estimation and forecasting using the Gibbs sampler algorithm. Posterior distributions are generated using the inverse gamma distribution and the multivariate normal distribution as prior distributions. The new method was used to investigate and summarize the posterior distribution, and the theory and derivation of the posterior distribution are explained in detail in the paper. The proposed approach is applied to three simulated datasets of 100, 300, and 500 sample sizes. The procedure was also extended to a real dataset, the rock intensity dataset, which was collected ...
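With exactly these prior families (a multivariate normal prior on the regression coefficients and an inverse gamma prior on the error variance), a Gibbs sampler alternates between two closed-form conditional posteriors. The sketch below is a generic normal/inverse-gamma sampler for linear regression, not the authors' implementation; the prior hyperparameters are placeholders.

```python
import numpy as np

def gibbs_linear_regression(X, y, n_iter=2000, a0=2.0, b0=1.0, tau2=100.0, seed=0):
    """Gibbs sampler for y = X beta + eps, eps ~ N(0, sigma2 I),
    with priors beta ~ N(0, tau2 I) and sigma2 ~ InvGamma(a0, b0)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    sigma2 = 1.0
    XtX, Xty = X.T @ X, X.T @ y
    draws = {"beta": np.empty((n_iter, p)), "sigma2": np.empty(n_iter)}
    for t in range(n_iter):
        # beta | sigma2, y ~ N(m, V)
        V = np.linalg.inv(XtX / sigma2 + np.eye(p) / tau2)
        m = V @ (Xty / sigma2)
        beta = rng.multivariate_normal(m, V)
        # sigma2 | beta, y ~ InvGamma(a0 + n/2, b0 + RSS/2)
        resid = y - X @ beta
        sigma2 = 1.0 / rng.gamma(a0 + n / 2, 1.0 / (b0 + resid @ resid / 2))
        draws["beta"][t], draws["sigma2"][t] = beta, sigma2
    return draws

# Tiny usage example with synthetic data
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.5, size=200)
draws = gibbs_linear_regression(X, y)
print(draws["beta"][500:].mean(axis=0))   # posterior means after burn-in
```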
The paper generates a geological model of a giant Middle East oil reservoir, constructed from the field data of 161 wells. The main aim of the paper was to recognize the value of the reservoir and to investigate the feasibility of reservoir modeling prior to the final investment decision for further development of this oilfield. Well logs, deviation surveys, 2D/3D interpreted seismic structural maps, facies, and core tests were utilized to construct the geological model through comprehensive interpretation and correlation processes using the PETREL platform. The geological model mainly aims to estimate the stock-tank oil initially in place (STOIIP) of the reservoir. In addition, three scenarios were applied ...
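Stock-tank oil initially in place is conventionally estimated from the volumetric relation STOIIP = 7758 * A * h * phi * (1 - Sw) / Bo, with A in acres, h in ft, and the result in stock-tank barrels. The sketch below evaluates that textbook relation only; the input values are arbitrary placeholders, not properties of the reservoir studied here, and the paper's own estimate comes from the full PETREL grid rather than a single-tank calculation.

```python
def stoiip_stb(area_acres, net_pay_ft, porosity, water_saturation, bo_rb_per_stb):
    """Volumetric stock-tank oil initially in place (stock-tank barrels).
    The constant 7758 converts acre-ft to reservoir barrels."""
    return 7758.0 * area_acres * net_pay_ft * porosity * (1.0 - water_saturation) / bo_rb_per_stb

# Placeholder inputs only: 10,000 acres, 50 ft net pay, 20 % porosity,
# 30 % water saturation, Bo = 1.2 rb/STB
print(f"{stoiip_stb(10_000, 50, 0.20, 0.30, 1.2):,.0f} STB")
```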
Many problems were encountered during drilling operations in the Zubair oilfield. Stuck pipe, wellbore instability, breakouts, and washouts, which aggravated the problems at critical limits, were observed in many wells in this field; this added extra non-productive time to the total drilling time and therefore extra cost. A 1D Mechanical Earth Model (1D MEM) was built to suggest solutions to such problems. An overpressured zone was noticed, and an alternative mud weight window was predicted based on the results of the 1D MEM. The results of this study are diagnosed and wellbore instability problems are predicted in an efficient way using the 1D MEM, and suitable alternative solutions are presented.
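A mud weight window derived from a 1D MEM is generally bounded below by the larger of the pore pressure and shear failure (collapse) gradients and above by the fracture gradient. The check below expresses that general rule only; the gradient names and values are illustrative assumptions, not outputs of the Zubair study.

```python
def safe_mud_window(pore_grad, collapse_grad, fracture_grad):
    """Return the (lower, upper) mud weight bounds in consistent units
    (e.g. ppg or g/cc): stay above the pore/collapse gradients to avoid
    kicks and breakouts, and below the fracture gradient to avoid losses."""
    lower = max(pore_grad, collapse_grad)
    upper = fracture_grad
    if lower >= upper:
        raise ValueError("no stable mud weight window at this depth")
    return lower, upper

# Illustrative gradients in ppg equivalent
print(safe_mud_window(pore_grad=9.2, collapse_grad=10.1, fracture_grad=14.5))
```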
Survival analysis is widely applied to data describing the lifetime of an item until the occurrence of an event of interest, such as death or another event under study. The purpose of this paper is to use a dynamic approach within the deep learning neural network method, in which a dynamic neural network is built to suit the nature of discrete survival data and time-varying effects. This neural network is trained with the Levenberg-Marquardt (L-M) algorithm, and the method is called the Proposed Dynamic Artificial Neural Network (PDANN). A comparison was then made with another method that depends entirely on the Bayesian methodology, called the Maximum A Posteriori (MAP) method, which was carried out using numerical algorithms ...
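Discrete survival data with time-varying effects are commonly rearranged into person-period (long) format before being fed to a neural network, so that each row is a binary "event in this interval?" record. The sketch below shows only that generic preprocessing step, under assumed field names; it is not the paper's PDANN architecture or its Levenberg-Marquardt training routine.

```python
import numpy as np

def person_period_expand(times, events, covariates):
    """Expand discrete survival records (time, event, x) into person-period
    rows (x, interval, y), where y = 1 only in the final interval of a
    subject who experienced the event. This long format lets a binary
    classifier (e.g. a neural network) model the discrete-time hazard h(t | x)."""
    rows = []
    for t, d, x in zip(times, events, covariates):
        for k in range(1, int(t) + 1):
            y = 1 if (k == t and d == 1) else 0
            rows.append((np.asarray(x), k, y))
    return rows

# Subject observed for 3 periods with the event occurring in period 3:
print(person_period_expand([3], [1], [[0.7, 1.2]]))
```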
Sensibly highlighting the hidden structures of many real-world networks has attracted growing interest and triggered a vast array of techniques for what is nowadays called the community detection (CD) problem. Non-deterministic metaheuristics have proved able to competitively transcend the limits of their deterministic heuristic counterparts in solving the community detection problem. Despite the increasing interest, most existing metaheuristic-based community detection (MCD) algorithms speak one traditional language: they tend to explicitly project some features of real communities into different definitions of single- or multi-objective optimization functions, while the design of the other operators remains canonical, lacking any ...
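The single-objective function most commonly projected from community structure in MCD algorithms is Newman-Girvan modularity, Q = (1/2m) sum_ij [A_ij - k_i k_j / (2m)] delta(c_i, c_j). A minimal evaluation of that objective is sketched below; it is generic and not tied to any particular algorithm discussed in the paper.

```python
import numpy as np

def modularity(adj, labels):
    """Newman-Girvan modularity Q for an undirected graph.
    adj: dense adjacency matrix; labels: one community label per node."""
    adj = np.asarray(adj, dtype=float)
    labels = np.asarray(labels)
    k = adj.sum(axis=1)                      # node degrees
    two_m = adj.sum()                        # equals 2m for an undirected graph
    same = labels[:, None] == labels[None, :]
    return float(((adj - np.outer(k, k) / two_m) * same).sum() / two_m)

# Two triangles joined by a single edge, split into their natural communities
A = np.array([[0, 1, 1, 0, 0, 0],
              [1, 0, 1, 0, 0, 0],
              [1, 1, 0, 1, 0, 0],
              [0, 0, 1, 0, 1, 1],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1, 0]])
print(modularity(A, [0, 0, 0, 1, 1, 1]))  # positive Q indicates community structure
```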