Test case prioritization is a key part of system testing, intended to surface defects early in the development cycle. Traditional prioritization techniques frequently fail to account for the complexities of large-scale test suites, evolving systems, and time constraints, and therefore cannot fully address the problem. This study proposes a hybrid meta-heuristic method that targets these modern challenges. The strategy combines a genetic algorithm with the black hole algorithm to strike a balance between exploring many candidate orderings and exploiting the best ones found. The proposed hybrid genetic black hole (HGBH) algorithm uses search criteria such as code coverage, fault detection rate, and execution time to iteratively refine the prioritization of test cases. The approach was evaluated through experiments on a large-scale industrial software project. The hybrid meta-heuristic outperformed conventional techniques, yielding higher code coverage, which in turn enables critical defects to be detected earlier and testing resources to be allocated more effectively. In particular, the best APFD value of 0.9321 was achieved in 6 generations with a computation time of 4.879 seconds; across runs, the approach achieved mean APFD values of 0.9247 to 0.9302, with computation times ranging from 10.509 to 30.372 seconds. The experiments demonstrate the feasibility of the approach on complex systems and its ability to adapt to rapidly changing systems.
In conclusion, this research provides a new hybrid meta-heuristic approach to test case prioritization and optimization that tackles the obstacles posed by large-scale test suites and constantly changing systems.
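The abstract above reports APFD (Average Percentage of Faults Detected) scores. For readers unfamiliar with the metric, the following is a minimal sketch of the standard APFD formula, not the paper's HGBH implementation; the fault positions and suite size are purely illustrative:

```python
def apfd(fault_positions, n_tests):
    """Average Percentage of Faults Detected for a prioritized test suite.

    fault_positions: for each fault, the 1-based position in the ordered
    suite of the first test case that reveals it.
    n_tests: total number of test cases in the suite.

    APFD = 1 - (TF_1 + ... + TF_m) / (n * m) + 1 / (2n)
    """
    m = len(fault_positions)
    return 1 - sum(fault_positions) / (n_tests * m) + 1 / (2 * n_tests)

# An ordering that reveals all faults early scores close to 1.
print(round(apfd([1, 2, 2, 3], n_tests=10), 4))  # 0.85
```

Higher APFD means faults are exposed earlier in the execution order, which is why the metric is the usual objective in prioritization studies such as this one.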
Linguistic research in light of modern approaches:
This is one of the important matters occupying those concerned with linguistic studies, whether of Arabic or of other languages. Recent years have witnessed the advancement of this methodological approach, and books and studies have been written in Arabic on important, multifaceted issues concerning grammatical and linguistic origins, weighing them against new developments and ideas drawn mostly from Western studies.
The comparative approach, as it is called, is one of the modern approaches; it is based on comparing a language with the sister languages of its family in order to identify the similarities and differences between them, and to know the c
In this research, the effect of fiber type, stacking sequence, and layer orientation on fatigue properties was studied for three types of fibers (Kevlar-49 woven roving, E-glass woven roving, and random E-glass), using epoxy as the matrix. The test specimens were prepared by the hand lay-up method, with an epoxy resin (Quick Mast 105) used as the matrix of the prepared composite. A sinusoidal wave of variable stress amplitude at 15 Hz was employed in the fatigue test, with deflection values of (10 mm) and (15 mm) applied until the number of cycles to failure was reached. Using the rotary bending method, the (S-N) curves were determined ( life , limit and fa
Gas-lift techniques play an important role in sustaining oil production, especially from a mature field where the reservoirs' natural energy becomes insufficient. However, optimal allocation of the gas injection rate across a large field's gas-lift network system toward maximization of the oil production rate is a challenging task. Conventional gas-lift optimization methods may become inefficient and incapable of modelling gas-lift optimization in a large network system with multiple objectives, multiple constraints, and a limited gas injection rate. The key objective of this study is to assess the feasibility of utilizing the Genetic Algorithm (GA) technique to optimize t
A substantial portion of today’s multimedia data exists in the form of unstructured text. However, the unstructured nature of text poses a significant challenge to meeting users’ information requirements. Text classification (TC) has been extensively employed in text mining to facilitate multimedia data processing. However, accurately categorizing texts becomes challenging due to the increasing presence of non-informative features within the corpus. Several reviews on TC, encompassing various feature selection (FS) approaches to eliminate non-informative features, have been published previously. However, these reviews do not adequately cover recently explored approaches to TC problem-solving that utilize FS, such as optimization techniques.
Voice denoising is the process of removing undesirable noise from a voice signal. In the presence of environmental noise, the discriminative model of a speech recognition system finds it difficult to recognize the waveform of the voice signal, because the environmental noise calls for a suitable filter that does not distort the waveform captured at the microphone input. This paper develops a procedure for a discriminative model using an infinite impulse response filter (Butterworth filter) and a local polynomial approximation (Savitzky-Golay) smoothing filter, which performs polynomial regression on the signal values. The signal-to-noise ratio (SNR) was calculated after filtering to compare the results
Advances in gamma imaging technology mean that it is now technologically feasible to conduct stereoscopic gamma imaging in a hand-held unit. This paper derives an analytical model for stereoscopic pinhole imaging which can be used to predict performance for a wide range of camera configurations. Investigation of this concept through Monte Carlo and benchtop studies, for an example configuration, shows camera-source distance measurements with a mean deviation between calculated and actual distances of <5 mm for imaging distances of 50–250 mm. By combining this technique with stereoscopic optical imaging, we are then able to calculate the depth of a radioisotope source beneath a surfa
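The stereoscopic distance measurement described above rests on the same principle as ordinary stereo triangulation: the disparity between two views encodes depth. The sketch below shows the generic pinhole-stereo relation Z = f·b/d, not the paper's own analytical model; the focal length, baseline, and disparity values are hypothetical:

```python
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Generic pinhole stereo triangulation.

    For two parallel pinhole views separated by a baseline b, a point
    imaged with disparity d (pixels) lies at depth Z = f * b / d,
    where f is the focal length in pixels.
    """
    return focal_px * baseline_mm / disparity_px

# Hypothetical numbers: f = 500 px, baseline = 60 mm, disparity = 20 px.
print(depth_from_disparity(500.0, 60.0, 20.0))  # 1500.0 (mm)
```

Smaller disparities map to larger depths, which is why depth resolution degrades with imaging distance, consistent with the distance-dependent deviations reported in the abstract.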
Mining association rules is a popular and well-studied data mining task whose primary aim is the discovery of correlations among sets of items in transactional databases. However, generating high-quality association rules from a given database in a reasonable time has been considered an important and challenging problem, especially with the rapid increase in database sizes. Many algorithms for association rule mining have already been proposed, with promising results. In this paper, a new association rule mining algorithm based on the Bees Swarm Optimization metaheuristic, named Modified Bees Swarm Optimization for Association Rules Mining (MBSO-ARM), is proposed. Results show that the proposed algorithm can
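The rule quality that algorithms such as the one above optimize is conventionally measured by support and confidence. The following is a minimal sketch of those two standard measures on a toy transaction set, not the MBSO-ARM algorithm itself; the items and transactions are purely illustrative:

```python
# Hypothetical transactional database: each transaction is a set of items.
transactions = [
    {"bread", "milk"},
    {"bread", "butter"},
    {"bread", "milk", "butter"},
    {"milk", "butter"},
]

def support(itemset):
    """Fraction of transactions containing every item in itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent):
    """Conditional frequency of the rule antecedent -> consequent."""
    return support(antecedent | consequent) / support(antecedent)

print(support({"bread", "milk"}))       # 0.5
print(confidence({"bread"}, {"milk"}))  # ≈ 0.667
```

A metaheuristic miner searches the space of candidate rules for those whose support and confidence exceed user-set thresholds, avoiding the exhaustive enumeration that becomes infeasible as databases grow.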
This paper proposes a new strategy to enhance the performance and accuracy of the spiral dynamic algorithm (SDA) for solving real-world problems by hybridizing the SDA with the bacterial foraging optimization algorithm (BFA). The dynamic step size of the SDA makes it a useful exploitation approach; however, its limited exploration during the diversification phase can leave it trapped at local optima. The chemotactic strategy of the BFA has been utilized to determine better initial positions for the SDA and thereby improve its exploration. The proposed Hybrid Adaptive Spiral Dynamic Bacterial Foraging (HASDBF)
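The SDA exploitation step referred to above rotates and contracts each candidate point toward the current best solution. Below is a minimal 2-D sketch of the generic spiral update x(k+1) = r·R(θ)·x(k) − (r·R(θ) − I)·x*, not the proposed HASDBF hybrid; the contraction rate r and rotation angle θ are chosen purely for illustration:

```python
import math

def spiral_step(x, best, r=0.95, theta=math.pi / 4):
    """One 2-D spiral-dynamics move of point x toward the current best.

    Implements x_next = r*R(theta)*x - (r*R(theta) - I)*best, i.e. a
    rotation by theta and contraction by r about the best point, which
    is the fixed point of the update.
    """
    c, s = math.cos(theta), math.sin(theta)
    rot = [[r * c, -r * s], [r * s, r * c]]  # r * R(theta)

    def mul(m, v):
        return [m[0][0] * v[0] + m[0][1] * v[1],
                m[1][0] * v[0] + m[1][1] * v[1]]

    rx, rb = mul(rot, x), mul(rot, best)
    return [rx[0] - rb[0] + best[0], rx[1] - rb[1] + best[1]]
```

Because r < 1, repeated steps spiral every point in toward the best-known solution; the hybrid's role, per the abstract, is to seed these spirals from better starting positions found by BFA chemotaxis.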