Test case prioritization is a key part of system testing, intended to surface and resolve issues early in the development cycle. Traditional prioritization techniques frequently fail to account for the complexities of large-scale test suites, evolving systems, and time constraints, and therefore cannot fully address this problem. The study proposed here presents a hybrid meta-heuristic method that targets these challenges. The strategy combines a genetic algorithm with the black hole algorithm to strike a balance between exploring a wide range of candidate orderings and exploiting the best ones found so far. The proposed hybrid genetic black hole algorithm (HGBH) uses criteria such as code coverage, fault detection rate, and execution time to refine test case orderings iteratively. The approach was evaluated through experiments on a large-scale industrial software project. The hybrid meta-heuristic technique outperformed conventional techniques: it achieved higher code coverage, which in turn enabled crucial defects to be detected at an early stage and testing resources to be allocated more effectively. In particular, the best APFD value was 0.9321, achieved in 6 generations with a computation time of 4.879 seconds. In addition, the approach achieved mean APFD values between 0.9247 and 0.9302, with execution times ranging from 10.509 seconds to 30.372 seconds. The experiments demonstrate the feasibility of the approach for complex systems and its ability to adapt to rapidly changing systems by consistently accommodating change. Overall, this research provides a new hybrid meta-heuristic approach to test case prioritization and optimization that helps tackle the obstacles posed by large-scale test suites and constantly changing systems.
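The abstract does not spell out the internals of HGBH, so the following Python sketch is only illustrative: it assumes a permutation-encoded population, an APFD-based fitness (the paper's fitness also weighs code coverage and execution time, which are omitted here for brevity), order crossover toward the current best ordering acting as the "black hole", swap mutation, and an event-horizon-style re-randomization. All function names, parameters, and the fault_matrix structure are hypothetical, not taken from the paper.

```python
import random

def apfd(order, fault_matrix):
    """Average Percentage of Faults Detected for one test ordering.

    fault_matrix[t] is the set of fault ids revealed by test case t.
    APFD = 1 - (TF_1 + ... + TF_m) / (n * m) + 1 / (2 * n),
    where TF_i is the 1-based position of the first test exposing fault i.
    """
    n = len(order)
    first_pos = {}
    for pos, t in enumerate(order, start=1):
        for fault in fault_matrix[t]:
            first_pos.setdefault(fault, pos)
    m = len(first_pos)
    if m == 0:
        return 0.0
    return 1 - sum(first_pos.values()) / (n * m) + 1 / (2 * n)

def hgbh(fault_matrix, pop_size=20, generations=50, mutation_rate=0.1):
    """Hypothetical hybrid genetic / black hole prioritizer (not the paper's exact algorithm)."""
    tests = list(fault_matrix)
    population = [random.sample(tests, len(tests)) for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=lambda o: apfd(o, fault_matrix), reverse=True)
        black_hole = ranked[0]                      # best ordering attracts the others
        next_gen = [black_hole]
        for star in ranked[1:]:
            # genetic step: order crossover with the black hole, then a swap mutation
            cut = random.randrange(1, len(star))
            child = black_hole[:cut] + [t for t in star if t not in black_hole[:cut]]
            if random.random() < mutation_rate:
                i, j = random.sample(range(len(child)), 2)
                child[i], child[j] = child[j], child[i]
            # black hole step: stars that collapse onto the black hole are re-randomized
            if sum(a == b for a, b in zip(child, black_hole)) > 0.9 * len(child):
                child = random.sample(tests, len(tests))
            next_gen.append(child)
        population = next_gen
    return max(population, key=lambda o: apfd(o, fault_matrix))
```

For example, with fault_matrix = {"t1": {1}, "t2": {1, 2}, "t3": {3}}, an ordering that runs t2 first scores a higher APFD than one that runs t1 first, which is exactly the behaviour the fitness is meant to reward.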
This article presents a polynomial-based image compression scheme that uses the YUV color model to represent color content and two-dimensional first-order polynomial coding with a variable block size chosen according to the correlation between neighboring pixels. The polynomial residual for all bands is split into two parts: a most-important (big) part and a least-important (small) part. Because of the significant subjective importance of the big group, lossless compression (based on run-length spatial coding) is used to represent it. A lossy compression scheme, based on an error-limited adaptive coding system and transform coding, is used to approximately represent the small group.
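As a rough illustration of the block-level modelling step described above, the NumPy sketch below fits a first-order (planar) polynomial to one image block and splits the residual into a "big" part and a "small" part using a simple magnitude threshold. The least-squares fit, the threshold rule, and all names are assumptions made for illustration; the paper's variable-block-size selection and error-limited lossy coder are not reproduced here.

```python
import numpy as np

def polynomial_block_code(block, threshold=8):
    """Hypothetical sketch: first-order 2-D polynomial model of one image block.

    Fits I(x, y) ~ a0 + a1*(x - xc) + a2*(y - yc), returns the coefficients,
    and splits the residual into a "big" part (|residual| >= threshold,
    candidate for lossless run-length coding) and a "small" part (coded lossily).
    """
    h, w = block.shape
    y, x = np.mgrid[0:h, 0:w].astype(float)
    xc, yc = (w - 1) / 2.0, (h - 1) / 2.0
    # least-squares fit of the plane (centered coordinates keep the fit stable)
    A = np.column_stack([np.ones(h * w), (x - xc).ravel(), (y - yc).ravel()])
    coeffs, *_ = np.linalg.lstsq(A, block.astype(float).ravel(), rcond=None)
    prediction = (A @ coeffs).reshape(h, w)
    residual = block.astype(float) - prediction
    big_mask = np.abs(residual) >= threshold
    big_part = np.where(big_mask, residual, 0)     # kept exactly
    small_part = np.where(big_mask, 0, residual)   # approximated later
    return coeffs, big_part, small_part
```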
Realistic implementation of nanofluids in subsurface projects, including carbon geosequestration and enhanced oil recovery, requires a full understanding of nanoparticle (NP) adsorption behaviour in porous media. The physicochemical interactions between NPs, and between NPs and the grain surface of the porous medium, control the adsorption behaviour of NPs. This study investigates the reversible and irreversible adsorption of silica NPs onto oil-wet and water-wet carbonate surfaces at reservoir conditions. Each carbonate sample was treated with different concentrations of silica nanofluid to investigate NP adsorption in terms of initial nanoparticle size and hydrophobicity at different temperatures and pressures. Aggregation behaviour and the
With the freedom offered by the Deep Web, people have the opportunity to express themselves freely and discreetly, and sadly, this is one of the reasons why people carry out illicit activities there. In this work, a novel dataset of active Dark Web domains, known as crawler-DB, is presented. To build crawler-DB, the Onion Routing Network (Tor) was sampled, and then a web crawler capable of following links was built. The link addresses gathered by the crawler are then classified automatically into five classes. The algorithm built in this study demonstrated good performance, achieving an accuracy of 85%. A popular text representation method was used with the proposed crawler-DB, combined with two different supervised classifiers.
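Since the abstract is cut off before naming the text representation or the classifiers, the snippet below is only a plausible stand-in: a TF-IDF representation feeding a linear SVM over five made-up class labels. The page texts, class names, and model choice are hypothetical and are not claimed to match the study.

```python
# Minimal sketch, not the study's exact pipeline: TF-IDF text representation
# plus one supervised classifier over five hypothetical page classes.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

# Placeholder page texts and labels; in the study these come from crawled onion domains.
pages = [
    "bitcoin escrow vendor shop listings",
    "forum thread about onion routing privacy",
    "leaked credential dump download",
    "hosting service for hidden services",
    "news mirror of a clearnet outlet",
]
labels = ["market", "forum", "leak", "hosting", "news"]  # hypothetical class names

model = make_pipeline(TfidfVectorizer(), LinearSVC())
model.fit(pages, labels)
print(model.predict(["another vendor shop selling goods for bitcoin"]))
```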
Data transmission over public communication systems is not safe because of interception and improper manipulation by attackers. An attractive solution to these problems is to design a highly secure system that reduces the attacker's ability to obtain sensitive information such as account IDs and passwords. The best way is to combine two highly secure techniques: steganography, which hides secret information such as data, passwords, or images behind a cover file, and cryptography, which converts data into an unreadable form. This paper proposes a crypto-stego authentication method to provide highly secure authentication. The proposed method uses audio steganography and AES cryptography.
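A minimal sketch of the crypto-stego idea, assuming AES-CBC (via pycryptodome) and least-significant-bit embedding in an uncompressed 16-bit PCM WAV cover; the paper's actual cipher mode, key management, and embedding scheme are not specified in the excerpt, so every name and parameter here is an assumption.

```python
# Sketch only: encrypt a secret with AES-CBC, then hide the ciphertext in the
# LSBs of a WAV cover file. Assumes 16-bit little-endian PCM audio.
import wave
from Crypto.Cipher import AES
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad

def embed_encrypted(message: bytes, key: bytes, cover_wav: str, stego_wav: str) -> bytes:
    cipher = AES.new(key, AES.MODE_CBC)
    ciphertext = cipher.iv + cipher.encrypt(pad(message, AES.block_size))

    with wave.open(cover_wav, "rb") as cover:
        params = cover.getparams()
        frames = bytearray(cover.readframes(cover.getnframes()))

    # Prefix a 4-byte length so an extractor knows how many bytes to read back.
    payload = len(ciphertext).to_bytes(4, "big") + ciphertext
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    if len(bits) * 2 > len(frames):
        raise ValueError("cover audio too small for this payload")
    for i, bit in enumerate(bits):
        # write into the LSB of the low byte of each 16-bit sample
        frames[2 * i] = (frames[2 * i] & 0xFE) | bit

    with wave.open(stego_wav, "wb") as stego:
        stego.setparams(params)
        stego.writeframes(bytes(frames))
    return ciphertext

# usage: key = get_random_bytes(16); embed_encrypted(b"account-id:password", key, "cover.wav", "stego.wav")
```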
We use Bayes estimators for the unknown scale parameter of the Erlang distribution when the shape parameter is known, assuming different informative priors for the unknown scale parameter. We derive the posterior density, together with the posterior mean and posterior variance, under different informative priors for the unknown scale parameter: the inverse exponential distribution, the inverse chi-square distribution, the inverse gamma distribution, and the standard Levy distribution. We also derive Bayes estimators based on the general entropy loss function (GELF) and use simulation to obtain the results. We generated different cases for the parameters of the Erlang model and different sample sizes. The estimates have been compared
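To make the estimator concrete, here is a worked sketch under one of the listed priors (the inverse gamma), assuming the scale parameterization of the Erlang density with known shape k; the hyperparameters a, b and the loss constant c are generic symbols, and the paper's own derivations for the other priors follow the same pattern.

```latex
% Sketch under assumed notation: scale parameterization, inverse-gamma prior IG(a, b).
\[
  f(x\mid k,\theta)=\frac{x^{k-1}e^{-x/\theta}}{(k-1)!\,\theta^{k}},\qquad
  \pi(\theta)\propto \theta^{-(a+1)}e^{-b/\theta}
\]
\[
  \pi(\theta\mid \underline{x})\propto
  \theta^{-(nk+a+1)}\exp\!\Big(-\frac{b+\sum_{i=1}^{n}x_i}{\theta}\Big)
  \;\Longrightarrow\;
  \theta\mid \underline{x}\sim \mathrm{IG}\Big(a+nk,\; b+\textstyle\sum_{i=1}^{n}x_i\Big)
\]
\[
  L(\hat\theta,\theta)\propto\Big(\tfrac{\hat\theta}{\theta}\Big)^{c}
  -c\ln\Big(\tfrac{\hat\theta}{\theta}\Big)-1
  \;\Longrightarrow\;
  \hat\theta_{\mathrm{GE}}=\big[E(\theta^{-c}\mid \underline{x})\big]^{-1/c}
  =\Big(b+\textstyle\sum_{i=1}^{n}x_i\Big)
   \left[\frac{\Gamma(a+nk)}{\Gamma(a+nk+c)}\right]^{1/c}
\]
```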
In light of the increasing demand for energy driven by the complexity of modern life and its requirements, which is reflected in the type and size of architecture, environmental challenges have emerged around the need to reduce emissions and power consumption within the construction sector. This has urged designers to improve the environmental performance of buildings by adopting new design approaches and investing in digital technology to facilitate design decision-making in less time and with less effort and cost. This does not stop at the limits of acceptable efficiency, but extends to the level of the highest performance, which is not provided by the traditional approaches adopted by researchers and local institutions in their studies and architectural practices, limit