Sequence covering array (SCA) generation has been an active research area in recent years. Unlike conventional (sequence-less) covering arrays (CAs), the order of events matters during test case generation. This paper reviews the state of the art in SCA strategies; earlier works reported that finding a minimal test suite is an NP-hard problem. In addition, most existing SCA generation strategies have a high order of complexity, since they generate all combinatorial interactions in a one-test-at-a-time fashion. Reducing this complexity by adopting a one-parameter-at-a-time approach to SCA generation is challenging; moreover, this reduction facilitates support for higher strengths of coverage. Motivated by this challenge, this paper proposes a novel SCA strategy called Dynamic Event Order (DEO), in which test cases are generated in a one-parameter-at-a-time fashion. The details of DEO are presented with a step-by-step example to demonstrate the behavior and show the correctness of the proposed strategy. This paper also compares DEO with existing computational strategies. The practical results demonstrate that the proposed DEO strategy outperforms the existing strategies in terms of minimal test suite size in most cases. Moreover, the advantage of DEO increases as the number of sequences and/or the strength of coverage increases. Furthermore, the proposed DEO strategy succeeds in generating SCAs up to t=7 and in finding new upper bounds for SCAs. In fact, the proposed strategy can act as a research vehicle for future variant implementations.
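For readers unfamiliar with the coverage criterion behind this abstract, the defining property of a sequence covering array is that every t-way permutation of distinct events appears as a subsequence of at least one test, where each test is a full permutation of the events. The checker below is a minimal illustrative sketch of that property only; the function names `covers_all_t_sequences` and `is_subsequence` are ours and do not come from the DEO paper, and this is a verifier, not a generation strategy.

```python
from itertools import permutations

def is_subsequence(seq, test):
    """True if the events of seq appear in test in the same relative order."""
    it = iter(test)
    return all(e in it for e in seq)  # membership on an iterator consumes it

def covers_all_t_sequences(tests, events, t):
    """Check the SCA property: every t-way permutation of distinct events
    is a subsequence of at least one test in the suite."""
    return all(
        any(is_subsequence(seq, test) for test in tests)
        for seq in permutations(events, t)
    )

# A classic two-test suite (a permutation and its reverse) covers all
# 2-way sequences of four events, but not all 3-way sequences.
suite = [list("abcd"), list("dcba")]
print(covers_all_t_sequences(suite, "abcd", 2))  # True
print(covers_all_t_sequences(suite, "abcd", 3))  # False
```

The example illustrates why suite sizes grow with t: two tests suffice for t=2 regardless of the number of events, while higher strengths force additional permutations into the suite.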
The investigation of machine learning techniques for addressing missing well-log data has garnered considerable interest recently, especially as the oil and gas sector pursues novel approaches to improve data interpretation and reservoir characterization. However, for wells that have been in operation for several years, conventional measurement techniques frequently encounter availability challenges, including the lack of well-log data, cost considerations, and precision issues. This study's objective is to enhance reservoir characterization by automating well-log creation using machine-learning techniques, among them multi-resolution graph-based clustering and the similarity threshold method. By using cutti
The present study aims to present a proposed realistic and comprehensive cyber strategy for the Communications Directorate for the next five years (2022-2026), based on the extent to which cybersecurity measures are applied and documented in the Directorate and on the scientific bases for formulating the strategy. The study is significant in that it provides an accurate diagnosis of the Directorate's cyber capabilities in terms of the strengths and weaknesses of its internal environment and the opportunities and threats surrounding it in the external environment, based on the results of an assessment of the state of cybersecurity according to the Global Cybersecurity Index, which provides a strong basis for building its strategic dire
The research aims to develop a contemporary model for analyzing the reasons behind the delays in the investment plan projects of the North Oil Company. This model makes it possible to understand the environment surrounding project implementation in light of the changes the company currently faces, which in turn requires identifying the most important internal strengths and weaknesses and external opportunities and threats using the SWOT matrix, and selecting the appropriate strategic alternative based on clear policies, strategies, and programs to address weaknesses and look to future prospects, so that the company can become stronger and more flexible toward the environmental changes surrounding the reality of implementation
The performance of a synergistic combination of electrocoagulation (EC) and electro-oxidation (EO) for oilfield wastewater treatment has been studied. The effects of operating variables such as current density, pH, and electrolyte concentration on the reduction of chemical oxygen demand (COD) were studied and optimized using Response Surface Methodology (RSM). The results showed that current density had the highest impact on COD removal, with a contribution of 64.07%, while pH, NaCl addition, and the other interaction effects accounted for only 34.67%. The optimized operating parameters were a current density of 26.77 mA/cm² and a pH of 7.6 with no NaCl addition, which resulted in a COD removal efficiency of 93.43% and a specific energy c
A Multiple Biometric System Based on ECG Data
OpenStreetMap (OSM), recognised for its current and readily accessible spatial database, frequently serves regions lacking precise data at the necessary granularity. Global collaboration among OSM contributors presents challenges to data quality and uniformity, exacerbated by the sheer volume of input and indistinct data annotation protocols. This study presents a methodological improvement in the spatial accuracy of OSM datasets centred over Baghdad, Iraq, utilising data derived from OSM services and satellite imagery. An analytical focus was placed on two geometric correction methods: a two-dimensional polynomial affine transformation and a two-dimensional polynomial conformal transformation. The former involves twelve coefficients for ad
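As a generic point of reference for the geometric correction discussed above, a standard six-parameter 2D affine transformation (x' = a0 + a1·x + a2·y, y' = b0 + b1·x + b2·y) can be fitted to matched control points by least squares; the twelve-coefficient form mentioned in the abstract presumably extends the same design matrix with higher-order polynomial terms. The sketch below is an illustrative least-squares fit under these assumptions, not the authors' implementation, and the function names are ours.

```python
import numpy as np

def fit_affine_2d(src, dst):
    """Least-squares fit of a six-parameter 2D affine transformation
    mapping control points src -> dst:
        x' = a0 + a1*x + a2*y
        y' = b0 + b1*x + b2*y
    Returns the coefficient vectors (a0,a1,a2) and (b0,b1,b2)."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    # Design matrix: one row [1, x, y] per control point.
    A = np.column_stack([np.ones(len(src)), src[:, 0], src[:, 1]])
    coeff_x, *_ = np.linalg.lstsq(A, dst[:, 0], rcond=None)
    coeff_y, *_ = np.linalg.lstsq(A, dst[:, 1], rcond=None)
    return coeff_x, coeff_y

def apply_affine_2d(coeff_x, coeff_y, pts):
    """Apply a fitted affine transformation to an array of points."""
    pts = np.asarray(pts, dtype=float)
    A = np.column_stack([np.ones(len(pts)), pts[:, 0], pts[:, 1]])
    return np.column_stack([A @ coeff_x, A @ coeff_y])

# Recover a known transform x' = 2 + 3x - y, y' = -1 + x + 2y
src = [(0, 0), (1, 0), (0, 1), (2, 3)]
dst = [(2 + 3 * x - y, -1 + x + 2 * y) for x, y in src]
cx, cy = fit_affine_2d(src, dst)
print(cx, cy)  # ~[2, 3, -1] and ~[-1, 1, 2]
```

A conformal (four-parameter) transformation is the special case that preserves angles (uniform scale plus rotation plus translation), which is why it needs fewer control points than the affine or higher-order polynomial fits.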
Survival analysis is widely applied to data described by the length of time until the occurrence of an event of interest, such as death or other important events. The purpose of this paper is to use a dynamic methodology, which provides a flexible method especially for the analysis of discrete survival time, to estimate the effects of covariates over time in the survival analysis of dialysis patients with kidney failure until death occurs. The estimation process is based entirely on the Bayesian approach, using two estimation methods: maximum a posteriori (MAP) estimation involving Iteratively Weighted Kalman Filter Smoothing (IWKFS), in combination with the Expectation-Maximization (EM) algorithm. While the other
An experiment was conducted in the second semester of the academic year (2022-2023), and the data were processed statistically (using the t-test for two independent samples, the Pearson correlation coefficient, and the Spearman correlation coefficient). The following results were reached: there is a statistically significant difference at the (0.05) level between the mean achievement of the third-grade students who studied according to the cluster-questions strategy and the mean of those who studied the same material according to the traditional method, in favor of the experimental group, because the strategy of asking cluster questions is one of the strategies that...