The optimization of artificial gas lift techniques plays a crucial role in oil field development. This study investigates the impact of gas lift design and optimization on production outcomes in the Mishrif formation of the Halfaya oil field. A production network nodal-analysis model was built with the PIPESIM Optimizer, which is based on a genetic algorithm, and calibrated against field data from a network of seven wells: three directional wells currently on gas lift and four naturally flowing vertical wells. To increase productivity and improve network performance, a new gas lift design was proposed, and gas allocation was optimized to maximize the oil production rate while minimizing the injected gas volume, i.e., to reach the best oil rate at the most effective gas injection volume for the network. The study yielded an optimal oil production rate of 18,814 STB/d at a gas lift injection rate of 7.56 MMscf/d, underscoring the value of strategic gas lift design and optimization for enhancing oil recovery and operational efficiency in complex reservoir systems such as the Mishrif formation of the Halfaya oil field.
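The abstract does not give the optimizer internals, but a genetic algorithm for splitting a fixed volume of lift gas across wells can be sketched as follows. This is a minimal illustration in Python, not the PIPESIM Optimizer itself; the well performance curves, gas budget, and GA settings are hypothetical.

```python
import random

# Hypothetical gas-lift performance curves: oil rate (STB/d) as a concave
# function of injected gas (MMscf/d). Real curves would come from
# nodal-analysis runs in a simulator such as PIPESIM.
WELL_CURVES = [
    lambda q: 4000 * q / (q + 0.8),   # well 1
    lambda q: 3500 * q / (q + 1.0),   # well 2
    lambda q: 5200 * q / (q + 1.2),   # well 3
]
TOTAL_GAS = 7.5            # available lift gas, MMscf/d (assumed)
POP, GENS, MUT = 60, 200, 0.2

def normalise(alloc):
    """Scale an allocation vector so it sums to the available gas."""
    s = sum(alloc)
    return [TOTAL_GAS * a / s for a in alloc] if s else alloc

def fitness(alloc):
    """Total oil produced for a given gas split."""
    return sum(curve(q) for curve, q in zip(WELL_CURVES, alloc))

def evolve():
    pop = [normalise([random.random() for _ in WELL_CURVES]) for _ in range(POP)]
    for _ in range(GENS):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:POP // 2]
        children = []
        while len(children) < POP - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(WELL_CURVES))
            child = a[:cut] + b[cut:]                  # one-point crossover
            if random.random() < MUT:                  # mutate one gene
                child[random.randrange(len(child))] *= random.uniform(0.5, 1.5)
            children.append(normalise(child))
        pop = parents + children
    best = max(pop, key=fitness)
    return best, fitness(best)

alloc, oil = evolve()
print("gas split (MMscf/d):", [round(q, 2) for q in alloc], "oil (STB/d):", round(oil))
```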
Among metaheuristic algorithms, population-based algorithms are explorative search methods that are superior to local search algorithms in exploring the search space for globally optimal solutions. Their primary downside, however, is low exploitative capability, which prevents a thorough search of the neighborhood of promising solutions. The firefly algorithm (FA) is a population-based algorithm that has been widely used in clustering problems, but FA suffers from premature convergence when no neighborhood search strategy is employed to improve the quality of clustering solutions in the neighborhood region while exploring the global regions of the search space. …
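For reference, a minimal sketch of the standard firefly update is shown below: brighter (better) solutions attract dimmer ones, with attractiveness decaying with squared distance. The parameters and the toy objective are assumptions, not taken from the paper.

```python
import math
import random

def firefly(objective, dim, n=25, iters=200, alpha=0.2, beta0=1.0, gamma=1.0):
    """Minimise `objective` with the standard firefly update: dimmer
    fireflies move toward brighter ones, with attractiveness
    beta0 * exp(-gamma * r^2) plus a small random walk (alpha)."""
    X = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    light = [objective(x) for x in X]          # lower objective = brighter
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if light[j] < light[i]:        # j is brighter, move i toward j
                    r2 = sum((a - b) ** 2 for a, b in zip(X[i], X[j]))
                    beta = beta0 * math.exp(-gamma * r2)
                    X[i] = [a + beta * (b - a) + alpha * (random.random() - 0.5)
                            for a, b in zip(X[i], X[j])]
                    light[i] = objective(X[i])
        alpha *= 0.97                          # gradually reduce randomness
    best = min(range(n), key=lambda k: light[k])
    return X[best], light[best]

# Example: a toy objective standing in for a clustering criterion.
print(firefly(lambda x: sum(v * v for v in x), dim=3))
```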
Software testing is a vital part of the software development life cycle. In many cases the system under test has more than one input, making exhaustive testing of every combination impractical (the execution time of the full test set can be outrageously long). Combinatorial testing offers an alternative to exhaustive testing by considering the interaction of input values for every t-way combination of parameters. Combinatorial testing can be divided into three types: uniform strength interaction, variable strength interaction, and input-output-based relation (IOR). IOR combinatorial testing tests only the important combinations selected by the tester. Most research in combinatorial testing …
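To illustrate the IOR idea, the sketch below enumerates only the value combinations demanded by tester-defined input-output relations, rather than the full Cartesian product of all inputs. The parameters and relations are hypothetical.

```python
from itertools import product

# Hypothetical system inputs and their possible values.
params = {
    "browser":  ["chrome", "firefox"],
    "os":       ["windows", "linux", "mac"],
    "payment":  ["card", "paypal"],
    "currency": ["usd", "eur"],
}

# Input-output relations (IOR): for each output, the tester lists the inputs
# that actually influence it, so only those interactions need covering.
relations = {
    "render_output":  ["browser", "os"],
    "checkout_total": ["payment", "currency"],
}

def ior_test_frames(params, relations):
    """Enumerate the value combinations required by each relation."""
    frames = set()
    for output, inputs in relations.items():
        for combo in product(*(params[p] for p in inputs)):
            frames.add((output,) + tuple(zip(inputs, combo)))
    return sorted(frames)

for frame in ior_test_frames(params, relations):
    print(frame)
# 2*3 + 2*2 = 10 required frames, versus 2*3*2*2 = 24 for exhaustive testing.
```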
Churn of employees from organizations is a serious problem. Employee turnover, or churn, needs to be addressed because it has a negative impact on the organization. Manual detection of employee churn is quite difficult, so machine learning (ML) algorithms have frequently been used for employee churn detection as well as for categorizing employees according to turnover. To date, only one study has used machine learning to categorize employees. A novel multi-criteria decision-making (MCDM) approach coupled with the DE-PARETO principle is proposed to categorize employees; it is referred to as the SNEC scheme. An AHP-TOPSIS DE-PARETO PRINCIPLE model (AHPTOPDE) has been designed that uses a two-stage MCDM …
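As an illustration of the TOPSIS step that an AHP-TOPSIS pipeline typically relies on, the sketch below ranks alternatives by closeness to an ideal solution. The employee scores and the AHP-style weights are invented, and all criteria are treated as benefit criteria for simplicity.

```python
import math

# Hypothetical employee scores on three benefit criteria (e.g. performance,
# tenure, engagement) and AHP-derived weights; larger is better.
matrix  = [[7, 4, 8], [5, 9, 6], [8, 7, 5], [4, 6, 9]]
weights = [0.5, 0.3, 0.2]

def topsis(matrix, weights):
    # 1. Vector-normalise each criterion column and apply the weights.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(len(weights))]
    V = [[w * row[j] / norms[j] for j, w in enumerate(weights)] for row in matrix]
    # 2. Ideal (best) and anti-ideal (worst) reference points.
    ideal = [max(col) for col in zip(*V)]
    worst = [min(col) for col in zip(*V)]
    # 3. Relative closeness of each alternative to the ideal solution.
    scores = []
    for v in V:
        d_pos = math.dist(v, ideal)
        d_neg = math.dist(v, worst)
        scores.append(d_neg / (d_pos + d_neg))
    return scores

print(topsis(matrix, weights))   # rank employees by descending score
```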
Computer-aided diagnosis (CAD) has proved to be an effective and accurate method of diagnostic prediction over the years. This article focuses on the development of an automated CAD system intended to perform diagnosis as accurately as possible. Deep learning methods have produced impressive results on medical image datasets. This study employs deep learning in conjunction with metaheuristic algorithms and supervised machine-learning algorithms to perform an accurate diagnosis. Pre-trained convolutional neural networks (CNNs) or auto-encoders are used for feature extraction, whereas feature selection is performed using an ant colony optimization (ACO) algorithm. Ant colony optimization helps to search for the best …
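A rough sketch of wrapper-style feature selection with a simplified binary ant colony scheme is shown below. The evaluation function is a toy stand-in for a classifier trained on CNN or auto-encoder features, and the parameters are assumptions rather than the paper's settings.

```python
import random

def aco_select(n_features, evaluate, ants=20, iters=40, rho=0.2):
    """Simplified binary ACO: each feature carries a pheromone value used as
    its probability of being picked by an ant; pheromone evaporates and is
    reinforced along the best subset found so far."""
    tau = [0.5] * n_features                       # selection probabilities
    best = list(range(n_features))
    best_score = evaluate(best)
    for _ in range(iters):
        for _ in range(ants):
            subset = [i for i in range(n_features) if random.random() < tau[i]]
            if not subset:
                continue
            score = evaluate(subset)
            if score > best_score:
                best, best_score = subset, score
        tau = [(1 - rho) * t + (rho if i in set(best) else 0.0)
               for i, t in enumerate(tau)]         # evaporate + reinforce
    return best, best_score

# Toy evaluation: pretend features 0, 3 and 7 are informative and each extra
# feature carries a small cost (stands in for wrapper classifier accuracy).
informative = {0, 3, 7}
score = lambda s: len(informative & set(s)) - 0.05 * len(s)
print(aco_select(10, score))
```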
Nowadays it is convenient to use a search engine to find the information we need, but the results can sometimes be misleading because different media report the same information differently. Recommender systems (RS) are popular in business because they can provide users with information that attracts more revenue for companies, yet such systems sometimes recommend information users do not need. This paper therefore proposes a recommender system architecture based on user-oriented preferences, called UOP-RS. To make UOP-RS meaningful, the paper focuses on movie theatre information and collects a movie database from the IMDb website, which provides information …
Association rule mining (ARM) is a fundamental and widely used data mining technique for extracting useful information from data. Traditional ARM algorithms degrade computational efficiency by mining too many association rules that are not appropriate for a given user. Recent ARM research investigates metaheuristic algorithms that search for only a subset of high-quality rules. In this paper, a modified discrete cuckoo search algorithm for association rule mining (DCS-ARM) is proposed for this purpose. The effectiveness of the algorithm is tested against a set of well-known transactional databases, and the results indicate that the proposed algorithm outperforms existing metaheuristic methods.
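The paper's DCS-ARM operators are not reproduced here, but the rule-quality measures such a metaheuristic typically optimises, support and confidence, can be sketched as follows on a hypothetical transaction database.

```python
from itertools import permutations

# Hypothetical transaction database.
transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"butter", "milk"},
    {"bread", "butter"},
    {"bread", "butter", "milk"},
]

def support(itemset):
    """Fraction of transactions containing every item in `itemset`."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def rule_quality(antecedent, consequent):
    """Support and confidence of the rule antecedent -> consequent, the
    measures a metaheuristic searcher would score candidate rules by."""
    sup = support(antecedent | consequent)
    conf = sup / support(antecedent) if support(antecedent) else 0.0
    return sup, conf

# Evaluate every 1 -> 1 rule as a tiny example of the rule search space.
items = set().union(*transactions)
for a, c in permutations(items, 2):
    print(f"{{{a}}} -> {{{c}}}", rule_quality({a}, {c}))
```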
Stock markets move up and down over time, and some companies affect others because of their mutual dependence. In this work the stock market network is described as a complete weighted graph, and the paper investigates the Iraqi stock market using graph-theoretic tools. The vertices of this graph correspond to companies listed on the Iraqi market, and the edge weights are the ultrametric distances obtained from the minimum spanning tree.
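A minimal sketch of this construction, under the common assumptions of correlation-network analysis (distance d = sqrt(2(1 - rho)) from return correlations, then Kruskal's algorithm on the complete graph), is given below; the tickers and return series are synthetic, not Iraqi market data.

```python
import math
import random

# Synthetic daily returns for five hypothetical companies (stand-ins for
# listed tickers).
random.seed(1)
returns = {name: [random.gauss(0, 1) for _ in range(250)]
           for name in ["BANK1", "BANK2", "TELCO", "HOTEL", "INDUS"]}

def corr(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def distance(x, y):
    """Map correlation to a metric distance, d = sqrt(2 * (1 - rho))."""
    return math.sqrt(2 * (1 - corr(x, y)))

# Kruskal's algorithm on the complete weighted graph of companies.
names = list(returns)
edges = sorted((distance(returns[a], returns[b]), a, b)
               for i, a in enumerate(names) for b in names[i + 1:])
parent = {n: n for n in names}

def find(n):
    while parent[n] != n:
        parent[n] = parent[parent[n]]
        n = parent[n]
    return n

mst = []
for w, a, b in edges:
    ra, rb = find(a), find(b)
    if ra != rb:                       # adding the edge keeps the tree acyclic
        parent[ra] = rb
        mst.append((a, b, round(w, 3)))
print(mst)                             # the MST induces the subdominant ultrametric
```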
This work presents a comparison of Convolutional Encoding (CE), parallel Turbo codes and Low-Density Parity-Check (LDPC) coding schemes in a Multi-User Single Output (MUSO) Multi-Carrier Code Division Multiple Access (MC-CDMA) system over multipath fading channels. Iterative decoding was used in the simulation, since it gives maximum efficiency at higher iteration counts, and the modulation scheme is Quadrature Amplitude Modulation (QAM). Eight pilot carriers were used to compensate for the channel effect with the Least Squares estimation method. The channel model is the Long Term Evolution (LTE) channel per Technical Specification TS 25.101 v2.10 with 5 MHz bandwidth, including the indoor-to-outdoor/pedestrian channels.
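A minimal sketch of pilot-based least-squares channel estimation with linear interpolation, the compensation step the abstract describes, is given below. The subcarrier count, pilot placement, and channel taps are assumptions for illustration, not the LTE TS 25.101 profile.

```python
import numpy as np

rng = np.random.default_rng(0)
N_SC, N_PILOT = 64, 8                      # subcarriers, pilot carriers (assumed)
pilot_idx = np.arange(0, N_SC, N_SC // N_PILOT)
pilot_sym = np.ones(N_PILOT, dtype=complex)          # known pilot symbols

# Toy frequency-selective channel (stand-in for the simulated LTE profile).
h_time = np.array([1.0, 0.5 + 0.3j, 0.2j])
H_true = np.fft.fft(h_time, N_SC)

# Received pilots = channel * pilots + noise.
noise = 0.05 * (rng.standard_normal(N_PILOT) + 1j * rng.standard_normal(N_PILOT))
rx_pilots = H_true[pilot_idx] * pilot_sym + noise

# Least-squares estimate at the pilot positions, then linear interpolation
# across all subcarriers (real and imaginary parts interpolated separately).
H_ls = rx_pilots / pilot_sym
H_hat = (np.interp(np.arange(N_SC), pilot_idx, H_ls.real)
         + 1j * np.interp(np.arange(N_SC), pilot_idx, H_ls.imag))

print("mean estimation error:", np.mean(np.abs(H_hat - H_true)))
```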
People’s ability to quickly convey their thoughts and opinions on various services or items has improved as Web 2.0 has evolved, and it is natural to examine the public perceptions expressed in these reviews. Aspect-based sentiment analysis (ABSA) takes a set of texts (e.g., product reviews or online reviews) and identifies the opinion target (aspect) within each review. Contemporary ABSA systems, such as those for aspect categorization, rely predominantly on lexicons or manually labelled seeds incorporated into topic models, and use either handcrafted rules or pre-labelled clues to perform implicit aspect detection. These constraints are restricted to a particular domain or language, which is …