Permeability data are of major importance in all reservoir simulation studies. Their importance increases in mature oil and gas fields because of their sensitivity to the requirements of specific improved-recovery methods. However, the industry holds a huge store of air permeability measurements against only a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study suggests a correlation for converting the air permeability data conventionally measured during laboratory core analysis into liquid permeability. The correlation provides a feasible estimate in cases of data loss, in poorly consolidated formations, or when old cores are unavailable for liquid permeability measurement. Moreover, the conversion formula enables better use of the large amount of legacy air permeability data obtained through routine core analysis in further reservoir and geological modeling studies.
The comparative analysis shows high accuracy and consistent results for the suggested conversion formula over a wide range of permeability values.
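The study's own correlation is not reproduced in the abstract. As an illustration of the general task, the sketch below applies the classical Klinkenberg gas-slippage correction, a standard starting point for converting air (gas) permeability into an equivalent liquid permeability; the slip factor `b` and mean pore pressure `p_mean` are assumed example inputs, not values from this work.

```python
# Illustrative sketch only: this is the classical Klinkenberg correction,
# not the correlation proposed in the study. All numbers are invented.

def klinkenberg_liquid_permeability(k_air, b, p_mean):
    """Return equivalent liquid permeability (same units as k_air).

    k_air  : measured air permeability (e.g. mD)
    b      : Klinkenberg slip factor (same pressure units as p_mean)
    p_mean : mean pore pressure during the measurement
    """
    return k_air / (1.0 + b / p_mean)

# Example: 100 mD air permeability, b = 0.8 atm, mean pressure 2 atm
k_liq = klinkenberg_liquid_permeability(100.0, 0.8, 2.0)
print(round(k_liq, 2))  # 71.43
```

The correction shrinks the measured air value toward the liquid value, with the effect vanishing (b → 0) at high mean pressure, which is why low-pressure air measurements overestimate liquid permeability most strongly.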
The chromatographic behaviour of the liquid crystalline compounds benzylidene-p-aminobenzoic acid and 4-(p-methyl benzylidene)-p-aminobenzoic acid as stationary phases for the separation of dimethylphenol isomers was investigated. The isomers were analysed on benzylidene-p-aminobenzoic acid within a nematic range of 169-194 °C at temperature intervals of 5 °C. The best peak resolution was obtained at a column temperature of 190 °C. The analysis was repeated on a 4-(p-methyl benzylidene)-p-aminobenzoic acid column at a nematic temperature of 256 °C, which represents the end of the nematic range and gave the optimum peak resolution. Better isomer separation was obtained at 20% loading for both liquid crystal materials. Other
In hybrid cooling solar systems, a solar collector is used to convert solar energy into a heat source in order to superheat the refrigerant leaving the compressor. This process helps transform the refrigerant from the gaseous state to the liquid state in the upper two-thirds of the condenser instead of the lower two-thirds, as in traditional air-conditioning systems, and thus reduces the energy needed to run the cooling process. In this research, two hybrid air-conditioning systems with evacuated-tube solar collectors were used; the refrigerant was R22 and the capacity was 2 tons each. The tilt angle of the evacuated-tube solar collector was varied, and the solar collector fluid was replaced with oil instead of water. A comparison was
In data transmission, a change in a single bit of the received data may lead to misunderstanding or even a disaster. Each bit of the transmitted information has high priority, especially information such as the receiver's address. Detecting errors caused by every single change is therefore a key issue in the data transmission field.
The ordinary single-parity detection method can detect an odd number of errors efficiently but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, showed better results but still failed to cope with an increasing number of errors.
Two novel methods were suggested to detect binary bit-change errors when transmitting data over a noisy medium. Those methods were: 2D-Checksum me
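The proposed 2D-Checksum variant is not specified in the abstract, so the sketch below only illustrates the classic two-dimensional (row/column) parity baseline that the abstract compares against; the bit block and the flipped position are invented for the example.

```python
# Illustrative baseline, not the paper's method: two-dimensional even
# parity. A parity bit is stored per row and per column of a bit block;
# a mismatch on reception signals an error.

def parity_bits(block):
    """block: list of equal-length rows of 0/1 ints.
    Returns (row_parities, column_parities) using even parity."""
    rows = [sum(r) % 2 for r in block]
    cols = [sum(c) % 2 for c in zip(*block)]
    return rows, cols

def detect_error(block, rows, cols):
    """True if the received block violates the stored parities."""
    r2, c2 = parity_bits(block)
    return r2 != rows or c2 != cols

sent = [[1, 0, 1, 1],
        [0, 1, 1, 0],
        [1, 1, 0, 0]]
rows, cols = parity_bits(sent)

received = [row[:] for row in sent]
received[1][2] ^= 1                        # flip one bit in transit
print(detect_error(received, rows, cols))  # True
```

A single flipped bit disturbs exactly one row parity and one column parity, which is also why plain 2D parity can locate (not just detect) a single-bit error but struggles as the error count grows, the weakness the abstract notes.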
It has increasingly been recognised that future developments in geospatial data handling will centre on geospatial data on the web: Volunteered Geographic Information (VGI). The evaluation of VGI data quality, including positional and shape similarity, has become a recurrent subject in the scientific literature over the last ten years. The OpenStreetMap (OSM) project is the most popular of the leading VGI platforms: an online geospatial database that produces and supplies free, editable geospatial datasets for worldwide use. The goal of this paper is to present a comprehensive overview of the quality assurance of OSM data. In addition, the credibility of open-source geospatial data is discussed, highlight
Cointegration is among the important concepts in applied macroeconomics. The idea is due to Granger (1981) and was explained in detail by Engle and Granger in Econometrica (1987). The introduction of cointegration analysis into econometrics in the mid-eighties of the last century is one of the most important developments in empirical modeling; its advantage is its computational simplicity, since using it requires only familiarity with ordinary least squares.
Cointegration captures equilibrium relations among time series in the long run, even if all the sequences contain, on t
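The OLS-only convenience mentioned above is the first step of the Engle-Granger procedure. The sketch below illustrates that step on synthetic data; the series, seed, and slope are invented assumptions, and a complete test would follow with an ADF unit-root test on the residuals.

```python
# Hedged sketch of the Engle-Granger first step using only ordinary
# least squares (numpy). Synthetic, not data from the study.
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = np.cumsum(rng.normal(size=n))   # random walk: an I(1) series
y = 2.0 * x + rng.normal(size=n)    # cointegrated with x by construction

# Step 1: estimate the long-run relation y = a + b*x by OLS.
A = np.column_stack([np.ones(n), x])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
resid = y - A @ coef

# Step 2 (only hinted at here): if y and x are cointegrated, the
# residuals are stationary. We merely show they are far less variable
# than the trending series x; a unit-root test would make this formal.
print(round(coef[1], 1))       # slope estimate near 2.0
print(resid.std() < x.std())   # True
```

The point of the example is the one the abstract makes: the estimation itself needs nothing beyond least squares, which is why the method spread so quickly in empirical work.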
Machine learning methods, which form one of the most promising branches of artificial intelligence, have great importance in all sciences, such as engineering and medicine, and have recently become widely involved in statistical sciences and their various branches, including survival analysis, where they can be considered a new branch used to estimate survival in parallel with the parametric, nonparametric and semi-parametric methods widely used to estimate survival in statistical research. In this paper, the estimation of survival based on medical images of patients with breast cancer who receive their treatment in Iraqi hospitals is discussed. Three algorithms for feature extraction are explained: the first principal compone
Reliable data transfer and energy efficiency are essential considerations for network performance in resource-constrained underwater environments. One efficient approach to data routing in underwater wireless sensor networks (UWSNs) is clustering, in which data packets are transferred from sensor nodes to a cluster head (CH). The data packets are then forwarded to a sink node in a single- or multi-hop manner, which can increase the energy depletion of the CH compared to other nodes. While several mechanisms have been proposed for cluster formation and CH selection to ensure efficient delivery of data packets, less attention has been given to massive data co
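The abstract does not specify its CH-selection mechanism, so the sketch below shows a common baseline for the problem it describes: within each cluster, elect the node with the most residual energy as CH, so the extra forwarding load falls on the best-provisioned node. Cluster and node names and energy values are invented for the example.

```python
# Illustrative baseline for cluster-head election in a sensor network,
# not the mechanism proposed in the paper: pick the highest-residual-
# energy node per cluster.

def select_cluster_heads(clusters):
    """clusters: {cluster_id: {node_id: residual_energy_joules}}
    Returns {cluster_id: node_id of the chosen CH}."""
    return {cid: max(nodes, key=nodes.get) for cid, nodes in clusters.items()}

clusters = {
    "c1": {"n1": 4.2, "n2": 5.0, "n3": 3.1},
    "c2": {"n4": 2.8, "n5": 2.9},
}
print(select_cluster_heads(clusters))  # {'c1': 'n2', 'c2': 'n5'}
```

In practice such election is repeated each round, precisely because the CH drains faster than its cluster members, the imbalance the abstract highlights.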
Modern civilization increasingly relies on sustainable and eco-friendly data centers as the core hubs of intelligent computing. However, these data centers, while vital, also face heightened vulnerability to hacking due to their role as the convergence points of numerous network connection nodes. Recognizing and addressing this vulnerability, particularly within the confines of green data centers, is a pressing concern. This paper proposes a novel approach to mitigate this threat by leveraging swarm intelligence techniques to detect prospective and hidden compromised devices within the data center environment. The core objective is to ensure sustainable intelligent computing through a colony strategy. The research primarily focusses on the
We studied the effect of Ca doping on the properties of Bi-based superconductors by adding different amounts of CaO to the Bi2Sr2La2-xCaxCu3O10+δ compound. Consequently, we obtained three samples A, B and C with x = 0.0, 0.4 and 0.8, respectively. The usual solid-state reaction method was applied under optimum conditions. X-ray diffraction analysis showed that samples A and B have tetragonal structures, whereas sample C has an orthorhombic structure. In addition, the XRD analysis showed a decrease in the c-axis lattice constant, and thus a decrease in the ratio c/a, for samples A, B and C, respectively. X-ray fluorescence proved that the compositions of samples A, B and C with the ra
Most medical datasets suffer from missing data, owing to the expense of some tests or to human error while recording them. This issue affects the performance of machine learning models because the values of some features will be missing, so there is a need for specific methods for imputing these missing data. In this research, the salp swarm algorithm (SSA) is used for generating and imputing the missing values in the Pima Indian Diabetes Disease (PIDD) dataset; the proposed algorithm is called ISSA. The obtained results showed that the classification performance of three different classifiers, which are support vector machine (SVM), K-nearest neighbour (KNN), and Naïve B
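The ISSA procedure itself is not given in the abstract. As a baseline for the same task, the sketch below fills missing feature values (encoded as NaN) with the per-column mean, the simplest imputation that swarm-optimized approaches like ISSA aim to improve on; the toy matrix is invented for illustration.

```python
# Illustrative baseline, not the paper's ISSA: per-column mean
# imputation of missing values marked as NaN.
import math

def mean_impute(rows):
    """rows: list of equal-length lists of floats, float('nan') = missing.
    Returns a new matrix with each NaN replaced by its column mean."""
    n_cols = len(rows[0])
    means = []
    for j in range(n_cols):
        vals = [r[j] for r in rows if not math.isnan(r[j])]
        means.append(sum(vals) / len(vals))
    return [[means[j] if math.isnan(r[j]) else r[j] for j in range(n_cols)]
            for r in rows]

data = [[1.0, float('nan')],
        [3.0, 4.0],
        [float('nan'), 8.0]]
print(mean_impute(data))  # [[1.0, 6.0], [3.0, 4.0], [2.0, 8.0]]
```

A downstream classifier (SVM, KNN, Naïve Bayes) can then be trained on the completed matrix, which is the evaluation setting the abstract describes.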