This study aims to evaluate the reservoir characteristics of the Hartha Formation in the Majnoon oil field based on well log data from three wells (Mj-1, Mj-3, and Mj-11). Log interpretation was carried out using a full set of logs to calculate the main petrophysical properties, such as effective porosity and water saturation, as well as to estimate the volume of shale. The formation evaluation included computer processed interpretation (CPI) using Interactive Petrophysics (IP) software. Based on the CPI results, the Hartha Formation is divided into five reservoir units (A1, A2, A3, B1, B2), deposited in a ramp setting. Facies association analysis was added to the well log interpretation of the Hartha Formation and was inferred from a microfacies analysis of th…
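As context for the properties the abstract names, the following is a minimal sketch of common textbook relations used in CPI-style log evaluation (linear gamma-ray shale volume, shale-corrected porosity, and Archie water saturation). The cutoff values and log readings are hypothetical, not results from the study.

```python
# Illustrative petrophysical relations; all input values are hypothetical.

def shale_volume(gr, gr_clean, gr_shale):
    """Linear gamma-ray index as a first-pass shale volume estimate."""
    return (gr - gr_clean) / (gr_shale - gr_clean)

def effective_porosity(phi_total, vsh, phi_shale):
    """Total porosity corrected for the shale contribution."""
    return phi_total - vsh * phi_shale

def archie_sw(rw, rt, phi, a=1.0, m=2.0, n=2.0):
    """Archie water saturation: Sw = ((a * Rw) / (phi**m * Rt)) ** (1/n)."""
    return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)

vsh = shale_volume(gr=60.0, gr_clean=20.0, gr_shale=120.0)
phie = effective_porosity(phi_total=0.25, vsh=vsh, phi_shale=0.10)
sw = archie_sw(rw=0.03, rt=10.0, phi=phie)
```

In practice the IP software combines several log curves and zone-specific parameters; this sketch only shows the form of the equations.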
This paper presents a hybrid energy resources (HER) system consisting of solar PV, storage, and the utility grid. Extracting the maximum power point (MPP) from the solar PV in real time under varying irradiance is a challenge. This work addresses challenges in identifying the global MPP, dynamic algorithm behavior, tracking speed, adaptability to changing conditions, and accuracy. A shallow neural network using the NARMA-L2 controller is proposed and modeled to predict the reference voltage under different irradiance levels. The dynamic behavior and nonlinearity of the PV array have been learned in training so that the controller tracks the maximum power drawn from the PV system in real time. Moreover, the proposed controller i…
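For context only, here is a sketch of the classic perturb-and-observe MPPT baseline that neural controllers such as NARMA-L2 aim to improve on; this is not the paper's method, and the PV curve below is a deliberately simplified toy model.

```python
# Baseline perturb-and-observe MPPT sketch; the PV model is a toy
# approximation, not a single-diode model and not from the paper.

def pv_power(v, irradiance):
    """Toy PV curve: power peaks below the open-circuit voltage."""
    v_oc = 40.0 * (0.9 + 0.1 * irradiance)       # open-circuit voltage
    i = max(0.0, 8.0 * irradiance * (1.0 - v / v_oc))
    return v * i

def perturb_and_observe(irradiance, v=20.0, step=0.5, iters=200):
    """Climb the P-V curve: perturb V, reverse direction when power drops."""
    p = pv_power(v, irradiance)
    direction = 1.0
    for _ in range(iters):
        v_new = v + direction * step
        p_new = pv_power(v_new, irradiance)
        if p_new < p:          # power dropped: reverse next perturbation
            direction = -direction
        v, p = v_new, p_new
    return v, p

v_mpp, p_mpp = perturb_and_observe(irradiance=1.0)
```

P&O oscillates around the MPP by about one step size; the appeal of a trained controller is predicting the reference voltage directly instead of hunting for it.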
This paper proposes a new method for network self-fault management (NSFM) based on two technologies: intelligent agents to automate fault-management tasks, and Windows Management Instrumentation (WMI) to identify faults faster when resources are heterogeneous (different types of devices). The proposed NSFM reduces the network traffic load by reducing the requests and responses between server and client, which achieves less downtime for each node when a fault occurs at the client. The performance of the proposed system is measured by three metrics: efficiency, availability, and reliability. A high average efficiency is obtained, depending on the faults occurring in the system, which reaches to…
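The three metrics named in the abstract can be illustrated with their standard textbook definitions; the paper's exact formulas are not given here, so these are assumptions flagged as such.

```python
import math

# Textbook definitions of the three metrics (assumed, not the paper's own).

def availability(mtbf_hours, mttr_hours):
    """Steady-state availability = MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def reliability(failure_rate_per_hour, mission_hours):
    """Exponential reliability R(t) = exp(-lambda * t)."""
    return math.exp(-failure_rate_per_hour * mission_hours)

def efficiency(faults_resolved, faults_detected):
    """Fraction of detected faults the system resolved automatically."""
    return faults_resolved / faults_detected
```

Faster fault identification via WMI lowers MTTR, which directly raises availability in the first formula.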
Cryptography can be thought of as a toolbox in which potential attackers gain access to various computing resources and technologies to try to compute key values. In modern cryptography, the strength of an encryption algorithm is determined only by the size of its key. Therefore, our goal is to create a strong key value with a minimum bit length that is useful for lightweight encryption. Using elliptic curve cryptography (ECC) with a Rubik's-cube permutation and image density, the image colors are combined and distorted; then, using a chaotic logistic map and image density with a secret key, the Rubik's-cube scrambling of the image is encrypted, yielding an image that is secure against attacks. ECC itself is a powerful algorithm that generates a pair of p…
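One building block named in the abstract, the chaotic logistic map, can be sketched as a keystream generator; this is a minimal illustration of that component alone, not the paper's full ECC + Rubik's-cube scheme, and the parameters here stand in for the secret key.

```python
# Chaotic logistic map as a keystream generator for XOR encryption.
# x0 and r act as the secret key (hypothetical values below).

def logistic_keystream(x0, r, n):
    """Iterate x -> r*x*(1-x) and quantize each state to a byte."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(int(x * 256) % 256)
    return bytes(out)

def xor_bytes(data, key):
    return bytes(d ^ k for d, k in zip(data, key))

plaintext = b"pixel data"
ks = logistic_keystream(x0=0.54321, r=3.99, n=len(plaintext))
ciphertext = xor_bytes(plaintext, ks)
assert xor_bytes(ciphertext, ks) == plaintext   # XOR is its own inverse
```

Because the map is chaotic, tiny changes in x0 or r produce a completely different keystream, which is what makes the key sensitive.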
In this paper, a refractive index sensor based on a micro-structured optical fiber is proposed and analyzed using the Finite Element Method (FEM). The designed fiber has a hexagonal cladding structure with six rings of air holes running around its solid core. The air holes of the fiber were infiltrated with different liquids, such as water, ethanol, methanol, and toluene; then sensor characteristics such as the effective refractive index, confinement loss, beam profile of the fundamental mode, and sensor resolution were investigated by employing the FEM. The designed sensor is characterized by its low confinement loss and high resolution, so a small change in the analyte refractive index can be detected, which could be useful for detecting the change of…
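The resolution figure such sensors report is conventionally estimated from the spectral shift per unit index change; the following sketch uses that standard estimate with hypothetical numbers, not values from the paper.

```python
# Standard resolution estimate for spectral refractive-index sensors:
# R = dn_analyte * (dlambda_min / dlambda_peak). All numbers hypothetical.

def sensor_resolution(dn_analyte, dlambda_peak_nm, dlambda_min_nm=0.1):
    """Smallest detectable index change, given the resonance shift per dn
    and the spectrometer's minimum resolvable wavelength shift."""
    return dn_analyte * dlambda_min_nm / dlambda_peak_nm

# e.g. a 0.01 index change shifting the peak by 20 nm, read out with
# 0.1 nm spectral resolution:
resolution = sensor_resolution(dn_analyte=0.01, dlambda_peak_nm=20.0)
```

A larger peak shift per index change (higher sensitivity) directly yields a finer resolution in this estimate.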
The objective of this work is to combine human biometric characteristics with unique attributes of the computer in order to protect computer networks and resource environments through the development of authentication and authorization techniques. On the biometric side, the best available methods and algorithms were studied; the conclusion is that the fingerprint is the best, although it has some flaws. The fingerprint algorithm has been improved so that its performance can be adapted to enhance the clarity of the ridge and valley structures in fingerprint images, taking into account the estimation of local ridge orientation and frequency. On the side of computer features, a computer and its components, like a human, have uniqu…
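The general idea of binding a credential to both a biometric and the machine can be sketched as follows; every identifier, field name, and helper here is invented for illustration and is not the paper's actual scheme.

```python
import hashlib

# Hypothetical sketch: derive a token from both a biometric template
# hash and (assumed) stable hardware identifiers of the machine.

def machine_fingerprint(cpu_id, mac_address, disk_serial):
    """Hash a set of hardware identifiers into one machine attribute."""
    blob = "|".join([cpu_id, mac_address, disk_serial]).encode()
    return hashlib.sha256(blob).hexdigest()

def combined_credential(biometric_template: bytes, machine_fp: str):
    """Token that is only reproducible for this user on this machine."""
    return hashlib.sha256(biometric_template + machine_fp.encode()).hexdigest()

fp = machine_fingerprint("BFEBFBFF000906EA", "00:1A:2B:3C:4D:5E", "WD-123")
token = combined_credential(b"minutiae-template-bytes", fp)
```

Authentication then requires presenting something the user is (the template) from somewhere the policy allows (the machine), which is the combination the abstract describes.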
Most of today's techniques encrypt all of the image data, which consumes a tremendous amount of time and computational payload. This work introduces a selective image encryption technique that encrypts predetermined blocks of the original image data in order to reduce the encryption/decryption time and the computational complexity of processing the huge image data. The technique applies a compression algorithm based on the Discrete Cosine Transform (DCT). Two approaches are implemented based on color-space conversion as a preprocessing step for the compression phases, YCbCr and RGB, where the resultant compressed sequence is selectively encrypted using a randomly generated combined secret key. The results showed a significant reduct…
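The two preprocessing ingredients the abstract names can be sketched directly: the RGB to YCbCr conversion (coefficients below follow the common ITU-R BT.601 convention, which may differ from the paper's) and a naive 1-D DCT-II of the kind used inside JPEG-style coders. The selective-encryption stage itself is not reproduced here.

```python
import math

def rgb_to_ycbcr(r, g, b):
    """Convert one 8-bit RGB pixel to YCbCr (BT.601 coefficients)."""
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def dct_1d(signal):
    """Naive (unnormalized) DCT-II of a short block of samples."""
    n = len(signal)
    return [sum(x * math.cos(math.pi * k * (2 * i + 1) / (2 * n))
                for i, x in enumerate(signal))
            for k in range(n)]

y, cb, cr = rgb_to_ycbcr(255, 0, 0)                 # pure red pixel
coeffs = dct_1d([52, 55, 61, 66, 70, 61, 64, 73])   # one 8-sample block
```

Converting to YCbCr first concentrates most visual information in the Y channel, which is why it is a common preprocessing step before DCT compression and why encrypting only selected coefficients can still hide the image well.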
The emphasis of Master Production Scheduling (MPS), or tactical planning, is on the temporal and spatial disaggregation of the aggregate planning targets and forecasts, along with the provision and forecasting of the required resources. This procedure becomes considerably more difficult and slow as the number of resources, products, and periods considered increases. A number of studies have been carried out to understand these impediments and to formulate algorithms that optimise the production planning problem, or more specifically the master production scheduling (MPS) problem. These algorithms include an evolutionary algorithm called the Genetic Algorithm, a swarm-intelligence methodology called the Gravitational Search Algorithm (GSA), the Bat Algorithm (BAT), T…
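To make the algorithm family concrete, here is a toy genetic algorithm for a deliberately tiny one-product MPS: choose a production quantity per period to meet demand while penalizing inventory and backlog. The demand figures, cost weights, and operators are all assumptions for illustration, not any cited paper's formulation.

```python
import random

DEMAND = [40, 60, 50, 70]          # hypothetical demand per period
LEVELS = [0, 25, 50, 75, 100]      # allowed production quantities

def cost(plan):
    """Inventory holding cost, with backlog (negative inventory) doubled."""
    inv, total = 0, 0
    for produced, demand in zip(plan, DEMAND):
        inv += produced - demand
        total += 2 * abs(inv) if inv < 0 else inv
    return total

def evolve(pop_size=30, generations=60, seed=1):
    """Elitist GA: keep the best half, refill with crossover + mutation."""
    rng = random.Random(seed)
    pop = [[rng.choice(LEVELS) for _ in DEMAND] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, len(DEMAND))
            child = a[:cut] + b[cut:]            # one-point crossover
            if rng.random() < 0.2:               # occasional mutation
                child[rng.randrange(len(DEMAND))] = rng.choice(LEVELS)
            children.append(child)
        pop = parents + children
    return min(pop, key=cost)

best = evolve()
```

Real MPS instances add multiple products, capacities, and setup costs; the point of the sketch is only the encode-evaluate-select-recombine loop shared by GA, GSA, and BAT variants.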
In combinatorial testing development, the construction of covering arrays is the key challenge because of the multiple aspects that influence it. A wide range of combinatorial problems can be solved using metaheuristic and greedy techniques. Combining a greedy technique with a metaheuristic search technique such as hill climbing (HC) can produce feasible results for combinatorial tests. Metaheuristic-based methods are used to deal with tuples that may be left uncovered after the greedy strategy's redundancy elimination; the result is then assured to be near-optimal by the metaheuristic algorithm. As a result, the use of both greedy and HC algorithms in a single test-generation system is a good candidate, if constructed correctly. T…
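The hill-climbing half of such a hybrid can be sketched on a tiny pairwise-coverage example: mutate one cell of a candidate test row and keep the change when it leaves fewer tuples uncovered. The three-parameter model and step counts are hypothetical, not the paper's configuration.

```python
import itertools, random

VALUES = [2, 2, 2]                 # three binary parameters (assumed)

def uncovered_pairs(rows):
    """All (param-pair, value-pair) tuples not yet hit by `rows`."""
    needed = {(i, j, a, b)
              for i, j in itertools.combinations(range(len(VALUES)), 2)
              for a in range(VALUES[i]) for b in range(VALUES[j])}
    for row in rows:
        for i, j in itertools.combinations(range(len(row)), 2):
            needed.discard((i, j, row[i], row[j]))
    return needed

def hill_climb_row(rows, steps=100, seed=0):
    """Improve one new row: accept single-cell mutations that do not
    increase the number of still-uncovered pairs."""
    rng = random.Random(seed)
    row = [rng.randrange(v) for v in VALUES]
    best = len(uncovered_pairs(rows + [row]))
    for _ in range(steps):
        i = rng.randrange(len(row))
        old, row[i] = row[i], rng.randrange(VALUES[i])
        score = len(uncovered_pairs(rows + [row]))
        if score > best:            # coverage got worse: undo the move
            row[i] = old
        else:
            best = score
    return row

suite = [[0, 0, 0]]                 # seed row, e.g. from a greedy pass
suite.append(hill_climb_row(suite))
```

In the hybrid scheme the abstract describes, a greedy pass would build most of the array and hill climbing would then mop up the tuples the greedy stage left uncovered.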