Petrophysical analysis is very important for understanding the factors controlling reservoir quality and well production. In the current study, a petrophysical evaluation for hydrocarbon assessment was carried out based on well log data from four wells penetrating the Early Cretaceous carbonate reservoir of the Yamama Formation in the Abu-Amood oil field in southern Iraq. The available well logs, such as sonic, density, neutron, gamma ray, SP, and resistivity logs for wells AAm-1, AAm-2, AAm-3, and AAm-5, were used to delineate the reservoir characteristics of the Yamama Formation. Lithologic and mineralogic studies were performed using porosity-log combination cross plots, such as the density vs. neutron cross plot and the M-N mineralogy plot. These cross plots show that the Yamama Formation consists mainly of limestone and that the essential mineral components are dominantly calcite with small amounts of dolomite. Petrophysical characteristics such as porosity, water and hydrocarbon saturation, and bulk water volume were determined and interpreted using Techlog software to carry out and build a full computer-processed interpretation of the reservoir properties. Based on the petrophysical properties of the studied wells, the Yamama Formation is divided into six units (YB-1, YB-2, YB-3, YC-1, YC-2 and YC-3) separated by dense, non-porous units (barrier beds). Units YB-1, YB-2, YC-2 and YC-3 represent the most important reservoir units and oil-bearing zones because they are characterized by good petrophysical properties, with high porosity and low to moderate water saturation. The other units are neither reservoirs nor oil-bearing units due to low porosity and high water saturation.
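A minimal sketch of how water saturation and bulk water volume can be derived from log readings, assuming the classic Archie relationship; the a, m, n constants and the formation-water resistivity Rw below are illustrative placeholders, not values from the study.

```python
# Hedged sketch: log-derived water saturation via Archie's equation.
# All parameter values are illustrative assumptions, not the study's calibration.

def archie_sw(phi, rt, rw=0.02, a=1.0, m=2.0, n=2.0):
    """Water saturation Sw from porosity (phi, v/v) and deep resistivity (rt, ohm.m)."""
    f = a / phi**m                      # formation resistivity factor
    return min(1.0, (f * rw / rt) ** (1.0 / n))

def bulk_water_volume(phi, sw):
    """Bulk water volume BVW = porosity * water saturation."""
    return phi * sw

# Example: a zone with 18% porosity and 20 ohm.m deep resistivity
phi, rt = 0.18, 20.0
sw = archie_sw(phi, rt)
print(f"Sw = {sw:.2f}, Sh = {1 - sw:.2f}, BVW = {bulk_water_volume(phi, sw):.3f}")
```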
Lattakia city faces many problems related to the mismanagement of solid waste, as the disposal process is limited to the uncontrolled Al-Bassa landfill without treatment. Therefore, solid waste management poses a special challenge to decision-makers in choosing the appropriate tool that supports strategic decisions on municipal solid waste treatment methods and on evaluating their management systems. Since humans are primarily responsible for the generation of waste, this study aims to measure the degree of environmental awareness in the Lattakia Governorate from the point of view of the research sample members and to discuss the effect of the studied variables (place of residence, educational level, gender, age, and professional status) o
Cryptography is the process of transforming messages to prevent unauthorized access to data. One of the main problems, and an important part of cryptography with secret-key algorithms, is the key. The key plays an important role in achieving a higher level of secure communication. To increase the level of security in any communication, both parties must have a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weak due to its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong. An enhanced encryption key improves the security of the Triple Data Encryption Standard algorithm. This paper proposes a combination of two efficient encryption algorithms to
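A minimal sketch of the standard Triple DES encrypt-decrypt-encrypt (EDE) construction and of why poor key generation matters: when two of the three subkeys coincide, the scheme collapses to single DES. This uses the pycryptodome DES primitive for illustration only and is not the paper's proposed scheme; the key values are arbitrary.

```python
# Hedged sketch of 3DES-EDE on one 8-byte block: C = E_k3(D_k2(E_k1(P))).
from Crypto.Cipher import DES

def ede_encrypt(block, k1, k2, k3):
    step1 = DES.new(k1, DES.MODE_ECB).encrypt(block)
    step2 = DES.new(k2, DES.MODE_ECB).decrypt(step1)
    return DES.new(k3, DES.MODE_ECB).encrypt(step2)

plaintext = b"8bytemsg"
k1, k2, k3 = b"key1AAAA", b"key2BBBB", b"key3CCCC"

# With three distinct subkeys the full EDE strength applies; if k1 == k2 the
# first two stages cancel and the result is plain single DES under k3 --
# one way weak key generation undermines the algorithm's security.
print(ede_encrypt(plaintext, k1, k2, k3).hex())
print(ede_encrypt(plaintext, k1, k1, k3) ==
      DES.new(k3, DES.MODE_ECB).encrypt(plaintext))   # True: degenerates to DES
```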
In this paper, a new modification is proposed to enhance the security level of the Blowfish algorithm by increasing the difficulty of cracking the original message, which makes it safer against unauthorized attacks. The algorithm is a symmetric, variable-length-key, 64-bit block cipher, and it is implemented using grayscale images of different sizes. Instead of using a single key in the cipher operation, another key (KEY2) of one byte in length is used in the proposed algorithm; it takes part in the Feistel function in the first round of both the encryption and decryption processes. In addition, the proposed modified Blowfish algorithm uses five S-boxes instead of four; the additional key (KEY2) is selected randomly from the additional S-box
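A sketch of the standard Blowfish round function F and of one hypothetical way a one-byte KEY2 and a fifth S-box could be folded into the first round. The abstract does not specify the exact placement, so the modified branch below is an assumption for illustration only, and the randomly filled S-boxes stand in for Blowfish's real key-dependent S-boxes.

```python
# Hedged sketch: standard Blowfish F-function plus a *hypothetical* first-round
# variant that mixes in a one-byte KEY2 via a fifth S-box (assumption, not the
# authors' exact design). S-box contents are random placeholders here.
import random

MASK32 = 0xFFFFFFFF
random.seed(0)
S = [[random.getrandbits(32) for _ in range(256)] for _ in range(5)]  # 5 S-boxes

def f_standard(x):
    a, b, c, d = (x >> 24) & 0xFF, (x >> 16) & 0xFF, (x >> 8) & 0xFF, x & 0xFF
    return ((((S[0][a] + S[1][b]) & MASK32) ^ S[2][c]) + S[3][d]) & MASK32

def f_modified_round1(x, key2):
    # Hypothetical: XOR KEY2 into the top byte, look up the fifth S-box,
    # and fold that output into the standard F result.
    a = (x >> 24) & 0xFF
    extra = S[4][(a ^ key2) & 0xFF]
    return (f_standard(x) ^ extra) & MASK32

print(hex(f_standard(0x12345678)), hex(f_modified_round1(0x12345678, key2=0x5A)))
```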
Variable selection is an essential and necessary task in the statistical modeling field. Several studies have tried to develop and standardize the process of variable selection, but it is difficult to do so. The first question researchers need to ask themselves is which variables are the most significant for describing the response of a given dataset. In this paper, a new method for variable selection using Gibbs sampler techniques has been developed. First, the model is defined and the posterior distributions for all the parameters are derived. The new variable selection method is tested using four simulation datasets. The new approach is compared with some existing techniques: Ordinary Least Squares (OLS), Least Absolute Shrinkage
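A minimal sketch of Gibbs-sampler-based variable selection in the spirit of stochastic search variable selection (spike-and-slab priors on the coefficients); the prior settings, the fixed noise variance, and the simulated data are simplifying assumptions, not the hierarchy derived in the paper.

```python
# Hedged sketch: spike-and-slab Gibbs sampler for variable selection.
import numpy as np

rng = np.random.default_rng(1)
n, p = 200, 6
X = rng.standard_normal((n, p))
beta_true = np.array([2.0, 0.0, -1.5, 0.0, 0.0, 1.0])
y = X @ beta_true + rng.standard_normal(n)

tau0, tau1, pi_incl, sigma2 = 0.01, 3.0, 0.5, 1.0   # spike sd, slab sd, P(include), noise var
gamma = np.ones(p, dtype=int)
keep = np.zeros(p)

for it in range(3000):
    # 1) beta | gamma, y : conjugate multivariate normal
    d = np.where(gamma == 1, tau1**2, tau0**2)
    cov = np.linalg.inv(X.T @ X / sigma2 + np.diag(1.0 / d))
    beta = rng.multivariate_normal(cov @ X.T @ y / sigma2, cov)

    # 2) gamma_j | beta_j : Bernoulli with posterior odds of slab vs spike
    slab = pi_incl * np.exp(-beta**2 / (2 * tau1**2)) / tau1
    spike = (1 - pi_incl) * np.exp(-beta**2 / (2 * tau0**2)) / tau0
    gamma = (rng.uniform(size=p) < slab / (slab + spike)).astype(int)

    if it >= 1000:                      # discard burn-in
        keep += gamma

print("posterior inclusion probabilities:", np.round(keep / 2000, 2))
```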
Multicast technology implements very efficient point-to-multipoint data transmission over IP networks (IPv4 and IPv6). Multicast reduces network load, eliminates traffic redundancy, and saves network bandwidth. Therefore, multicast is widely used in LAN/WAN applications such as online games, video conferencing, and IPTV. Multicast is implemented by various protocols, such as DVMRP (Distance Vector Multicast Routing Protocol), MOSPF (Multicast Open Shortest Path First), and PIM-DM (Protocol Independent Multicast - Dense Mode), which are considered source-tree protocols, while PIM-SM (Protocol Independent Multicast - Sparse Mode) and CBT (Core Based Tree) use a shared tree. The current paper focuses on the performance evaluation of the two multi
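A minimal host-side sketch of IPv4 multicast delivery: a receiver joins a group address and a sender transmits a single datagram to that group. The group address and port are illustrative; this only shows the group-membership mechanics, not the routing protocols (PIM, DVMRP, MOSPF, CBT) evaluated in the paper.

```python
# Hedged sketch: joining and sending to an IPv4 multicast group on one host.
import socket
import struct

GROUP, PORT = "239.1.1.1", 5007        # illustrative administratively scoped group

# Receiver: bind the port and ask the kernel to join the group (IGMP membership).
recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
recv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
recv.bind(("", PORT))
mreq = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))
recv.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

# Sender: one datagram addressed to the group reaches every joined receiver.
send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
send.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
send.sendto(b"hello group", (GROUP, PORT))

print(recv.recvfrom(1024)[0])          # b'hello group' when loopback delivery is enabled
```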
Entropy, defined as a measure of uncertainty, has been transformed using the cumulative distribution function and the reliability function of the Burr Type-XII distribution. For data that suffer from volatility, the aim is to build a probability distribution model for every failure of a sample, subject to the conditions of a probability distribution function. The formula of the new transformed probability distribution, obtained by applying entropy to the continuous Burr Type-XII distribution, has been derived; the new function was tested and found to satisfy the conditions of a probability distribution. The mean and the cumulative distribution function were also derived in order to be used in generating data for the purpose of implementing the simulation
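A minimal numerical sketch of the differential Shannon entropy of a Burr Type-XII distribution, computed directly from the definition and cross-checked against SciPy's built-in value; the shape parameters c and k below are illustrative, not values from the paper.

```python
# Hedged sketch: entropy of Burr Type-XII, CDF F(x) = 1 - (1 + x**c) ** (-k).
import numpy as np
from scipy import stats
from scipy.integrate import quad

c, k = 2.0, 3.0                        # illustrative shape parameters
rv = stats.burr12(c, k)

def neg_f_log_f(x):
    f = rv.pdf(x)
    return -f * np.log(f) if f > 0 else 0.0

entropy_numeric, _ = quad(neg_f_log_f, 0, np.inf)   # -∫ f ln f dx
print(f"numerical entropy = {entropy_numeric:.4f}")
print(f"scipy entropy     = {float(rv.entropy()):.4f}")   # should agree closely
```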
Network security is defined as a set of policies and actions taken by a network administrator in order to prevent unauthorized access, penetration of defenses, and infiltration of the network by unwanted intervention. Network security also involves granting access to data using a pre-defined policy. A network firewall, on the other hand, is a network appliance that controls incoming and outgoing traffic by examining the traffic flowing through the network. This security measure establishes a secure wall (firewall) between a trusted internal network and the outside world, where a security threat in the shape of a hacker or a virus might exist
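A toy sketch of the "pre-defined policy" idea mentioned above: each rule matches on direction, protocol, and destination port, and the first matching rule decides whether traffic is allowed, with a default-deny fallback. This only illustrates the concept and is not a real firewall implementation; all rules shown are assumptions.

```python
# Hedged sketch: first-match, default-deny packet filtering policy.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Rule:
    direction: str            # "in" or "out"
    protocol: str             # "tcp" or "udp"
    dst_port: Optional[int]   # None matches any port
    action: str               # "allow" or "deny"

POLICY = [
    Rule("in",  "tcp", 443,  "allow"),   # allow inbound HTTPS
    Rule("in",  "tcp", 22,   "allow"),   # allow inbound SSH
    Rule("in",  "tcp", None, "deny"),    # deny all other inbound TCP
    Rule("out", "tcp", None, "allow"),   # allow all outbound TCP
]

def decide(direction, protocol, dst_port):
    for rule in POLICY:
        if (rule.direction == direction and rule.protocol == protocol
                and rule.dst_port in (None, dst_port)):
            return rule.action
    return "deny"                         # default-deny when no rule matches

print(decide("in", "tcp", 22))    # allow
print(decide("in", "tcp", 8080))  # deny
```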
A two-time-step stochastic multi-variable, multi-site hydrological data forecasting model was developed and verified using a case study. The philosophy of this model is to use the cross-variable correlations, cross-site correlations, and two-step time-lag correlations simultaneously for estimating the parameters of the model, which are then modified using the mutation process of a genetic algorithm optimization model. The objective function to be minimized is the Akaike test value. The case study involves four variables and three sites. The variables are the monthly air temperature, humidity, precipitation, and evaporation; the sites are Sulaimania, Chwarta, and Penjwin, which are located in northern Iraq. The model performance was
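A minimal sketch of the kind of two-time-step (lag-2) multivariate fit such a model builds on, with the Akaike Information Criterion as the score to be minimized; the synthetic data and the plain least-squares fit are illustrative only, since the paper tunes its parameters with a genetic-algorithm mutation step rather than this direct fit.

```python
# Hedged sketch: lag-2 multivariate autoregressive fit scored with AIC.
import numpy as np

rng = np.random.default_rng(0)
n_months, n_series = 240, 4                  # e.g. 4 monthly variables
Z = rng.standard_normal((n_months, n_series))
for t in range(2, n_months):                 # inject some lag-1 / lag-2 structure
    Z[t] += 0.5 * Z[t - 1] + 0.2 * Z[t - 2]

# Design matrix: values at lags 1 and 2 predict the current values of all series.
Y = Z[2:]
X = np.hstack([Z[1:-1], Z[:-2]])
B, *_ = np.linalg.lstsq(X, Y, rcond=None)    # (2*n_series x n_series) coefficients

resid = Y - X @ B
n_obs = Y.shape[0]
k = B.size                                   # number of estimated coefficients
# Gaussian AIC (up to constants): n * ln|residual covariance| + 2 * parameters
aic = n_obs * np.log(np.linalg.det(resid.T @ resid / n_obs)) + 2 * k
print(f"AIC of the lag-2 fit: {aic:.1f}")
```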
Longitudinal data are becoming increasingly common, especially in the medical and economic fields, and various methods have been analyzed and developed to analyze this type of data.
In this research, the focus was on compiling and analyzing these data, as cluster analysis plays an important role in identifying and grouping co-expressed profiles over time and employing them in the nonparametric smoothing cubic B-spline model, which is characterized by providing continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope. It is also more flexible and can pick up more complex patterns and fluctuations in the data.
The longitudinal balanced data profile was compiled into subgroup
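A minimal sketch of the workflow described above: smooth each balanced longitudinal profile with a cubic smoothing spline (continuous first and second derivatives), then cluster subjects on their smoothed curves. The simulated profiles, the smoothing factor, and the use of k-means are illustrative assumptions, not the paper's exact procedure.

```python
# Hedged sketch: cubic smoothing spline per subject, then clustering of curves.
import numpy as np
from scipy.interpolate import UnivariateSpline
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 12)                      # 12 common (balanced) time points

# Two latent groups: rising profiles and falling profiles, plus noise.
profiles = np.vstack(
    [np.sin(np.pi * t) * s + rng.normal(0, 0.15, t.size)
     for s in np.r_[np.ones(15), -np.ones(15)]]
)

# Cubic (k=3) smoothing spline per subject, evaluated on the common grid.
smoothed = np.vstack(
    [UnivariateSpline(t, y, k=3, s=0.2)(t) for y in profiles]
)

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(smoothed)
print("cluster sizes:", np.bincount(labels))   # expect roughly 15 / 15
```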