Test case prioritization is a key part of system testing, intended to surface issues early in the development stage. Traditional prioritization techniques frequently fail to account for the complexities of large-scale test suites, evolving systems, and time constraints, and therefore cannot fully solve this problem. The study proposed here deals with a hybrid meta-heuristic method that addresses these modern challenges. The strategy combines a genetic algorithm with the black hole algorithm to strike a smooth tradeoff between exploring numerous candidate orderings and exploiting the best ones found so far. The proposed hybrid genetic black hole (HGBH) algorithm uses search criteria such as code coverage, fault detection rate, and execution time to refine test case orderings iteratively. The approach was evaluated through experiments on a large-scale industrial software project. The hybrid meta-heuristic technique outperformed conventional techniques: it achieved higher code coverage, which in turn enabled crucial defects to be detected at an early stage and testing resources to be allocated more effectively. In particular, the best APFD value was 0.9321, achieved in 6 generations with a CPU time of 4.879 seconds. Beyond this, the approach yielded mean APFD values between 0.9247 and 0.9302, with execution times ranging from 10.509 to 30.372 seconds. The experiments carried out demonstrate the feasibility of this approach on complex systems and its ability to consistently detect changes, enabling it to adapt to rapidly evolving systems. Overall, this research provides a new hybrid meta-heuristic approach to test case prioritization and optimization that helps tackle the obstacles posed by large-scale test suites and constantly changing systems.
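For reference, APFD (Average Percentage of Faults Detected) is the standard prioritization metric used above; its usual definition from the prioritization literature, not anything specific to HGBH, is:

$$\mathrm{APFD} = 1 - \frac{\sum_{i=1}^{m} TF_i}{n\,m} + \frac{1}{2n}$$

where $n$ is the number of test cases, $m$ the number of faults, and $TF_i$ the rank, in the prioritized suite, of the first test case that reveals fault $i$; values closer to 1 mean faults are exposed earlier in the suite.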
Most medical datasets suffer from missing data, owing to the expense of some tests or to human error while recording them. This issue degrades the performance of machine learning models because the values of some features are missing; therefore, methods specifically designed for imputing these missing data are needed. In this research, the salp swarm algorithm (SSA) is used to generate and impute the missing values in the Pima Indian Diabetes Disease (PIDD) dataset; the proposed algorithm is called ISSA. The obtained results showed that the classification performance of three different classifiers, namely support vector machine (SVM), K-nearest neighbour (KNN), and Naïve Bayes …
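As a rough illustration of how an SSA-style imputer might work, here is a minimal Python sketch; the fitness function, parameter names, and the way candidate values are written into the missing cells are illustrative assumptions, not the authors' ISSA implementation:

```python
import numpy as np

def ssa_impute(X, fitness, n_salps=20, n_iter=50, seed=0):
    """Minimal salp-swarm imputation sketch (illustrative, not the paper's ISSA).

    X       : 2-D float array with np.nan marking missing cells.
    fitness : callable scoring a fully imputed copy of X (lower is better),
              e.g. cross-validated classification error.
    """
    rng = np.random.default_rng(seed)
    miss = np.isnan(X)
    cols = np.where(miss)[1]                 # column of each missing cell
    lb = np.nanmin(X, axis=0)[cols]          # search bounds taken from the
    ub = np.nanmax(X, axis=0)[cols]          # observed values of each column
    dim = int(miss.sum())

    def filled(candidate):
        Xc = X.copy()
        Xc[miss] = candidate                 # write candidate values into gaps
        return Xc

    salps = rng.uniform(lb, ub, size=(n_salps, dim))   # candidate imputations
    scores = np.array([fitness(filled(s)) for s in salps])
    best_i = scores.argmin()
    best, best_score = salps[best_i].copy(), scores[best_i]

    for t in range(n_iter):
        c1 = 2 * np.exp(-(4 * t / n_iter) ** 2)        # decreasing step schedule
        for i in range(n_salps):
            if i < n_salps // 2:                       # leaders move around the best
                c2, c3 = rng.random(dim), rng.random(dim)
                step = c1 * ((ub - lb) * c2 + lb)
                salps[i] = np.where(c3 < 0.5, best + step, best - step)
            else:                                      # followers trail their predecessor
                salps[i] = (salps[i] + salps[i - 1]) / 2
            salps[i] = np.clip(salps[i], lb, ub)
        scores = np.array([fitness(filled(s)) for s in salps])
        if scores.min() < best_score:
            best_i = scores.argmin()
            best, best_score = salps[best_i].copy(), scores[best_i]

    return filled(best)
```

The leader and follower updates follow the standard SSA formulation (leaders orbit the best solution with a shrinking step, followers average with their predecessor); what makes this an imputer is only the choice of search space, one dimension per missing cell.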
In order to conserve exhaustible natural resources, produce green energy, and preserve the environment, recycling necessarily becomes a top priority for all of us.
From this perspective, we attempt to valorize a waste of animal origin, largely neglected by the materials community, by industrially transforming it into a biological filler (bio-charge) for new, sustainable bio-composite materials.
Using a tensile test bench, we mechanically characterize this renewable-resource biomaterial which, unlike eco-composites, has so far been neglected by materials researchers.
Obtained from waste, with high recycling potential and drawn from renewable resources, the bio-charge to be analyzed will later be injected into different poly…
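For context, the tensile characterization mentioned above typically reduces to the standard engineering quantities below; these textbook definitions are given only as background, since the abstract does not specify which properties were measured:

$$\sigma = \frac{F}{A_0}, \qquad \varepsilon = \frac{\Delta L}{L_0}, \qquad E = \frac{\sigma}{\varepsilon}\bigg|_{\text{elastic region}}$$

where $F$ is the applied force, $A_0$ the initial cross-section, $L_0$ the initial gauge length, and $E$ the Young's modulus.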
Soil water retention curves (SWRCs) are crucial for characterizing soil moisture dynamics and are particularly relevant in the context of irrigation management. A study was carried out to obtain the SWRC, inflection point, S index, pore size distribution curve, macroporosity, and air capacity from samples submitted to saturation and re-saturation processes. Five disturbed soil samples of different textures (Sandy Loam, Loam, Sandy Clay Loam, Silt Loam, and Clay) were collected. After the SWRC was obtained, each air-dried soil sample was submitted to particle size distribution and water-dispersible clay analyses to verify the clay lost by the soil. The experimental design was completely randomized, with three replications, using the two SWRC processes (saturation and re-saturation).
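Although the abstract does not state the fitting model, the SWRC, its inflection point, and the S index are most commonly derived from the van Genuchten retention function, recalled here for context:

$$\theta(h) = \theta_r + \frac{\theta_s - \theta_r}{\left[1 + (\alpha h)^n\right]^m}, \qquad m = 1 - \frac{1}{n}$$

where $\theta_s$ and $\theta_r$ are the saturated and residual water contents, $h$ is the matric suction, and $\alpha$, $n$ are fitting parameters. Dexter's S index is then the slope of the curve at its inflection point when plotted against $\ln h$:

$$S = -n\,(\theta_s - \theta_r)\left(1 + \frac{1}{m}\right)^{-(1+m)}$$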
A modified version of the generalized standard addition method (GSAM) was developed and used for the quantitative determination of arginine (Arg) and glycine (Gly) in an arginine acetylsalicylate-glycine complex. According to this method, two linear equations are solved to obtain the amounts of Arg and Gly. The first equation was obtained by spectrophotometric measurement of the total absorbance of the colored complexes of Arg and Gly with ninhydrin. The second equation was obtained by measuring the total acid consumed by the total amino groups of Arg and Gly; the titration was carried out in non-aqueous media using perchloric acid in glacial acetic acid as the titrant. The developed method …
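In schematic form, the two measurements define a pair of simultaneous linear equations in the unknown concentrations; the coefficients below ($k$ for spectrophotometric sensitivity, $a$ for the number of titratable amino groups per molecule) are illustrative placeholders, since the abstract does not report their values:

$$A_{\text{total}} = k_{\text{Arg}}\,C_{\text{Arg}} + k_{\text{Gly}}\,C_{\text{Gly}}, \qquad n_{\mathrm{HClO_4}} = a_{\text{Arg}}\,C_{\text{Arg}} + a_{\text{Gly}}\,C_{\text{Gly}}$$

Solving this 2×2 system yields $C_{\text{Arg}}$ and $C_{\text{Gly}}$ simultaneously from one absorbance reading and one titration.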
An analytical model in the form of a hyperbolic function has been suggested for the axial potential distribution of an electrostatic einzel lens. With the aid of this hyperbolic model, the relative optical parameters have been computed and investigated in detail as a function of the electrode voltage ratio for various trajectories of an accelerated charged-particle beam. The electrode voltage ratio covered a wide range over which the lens may be operated in accelerating and decelerating modes. The results have shown that the proposed hyperbolic field has the advantage of producing low aberrations under various magnification conditions and operational modes. The electrode profiles and their three-dimensional diagram have been determined, which …
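For context, once an axial potential $V(z)$ is specified analytically, the lens optics follow from the standard paraxial ray equation for rotationally symmetric electrostatic fields; the hyperbolic profile shown alongside it is only indicative of the kind of symmetric einzel-lens model meant here, as the excerpt does not give the exact functional form:

$$r'' + \frac{V'(z)}{2V(z)}\,r' + \frac{V''(z)}{4V(z)}\,r = 0, \qquad V(z) = V_0 + \Delta V\,\tanh^2\!\left(\frac{z}{a}\right)$$

where $r(z)$ is the off-axis ray displacement, $V_0$ the potential on the outer electrodes, $\Delta V$ the excitation of the central electrode, and $a$ a geometric scale length.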
As a result of the significance of image compression in reducing the volume of data, compression remains a permanent requirement: compressed data can be transferred more quickly over communication channels and stored in less memory. In this study, an efficient compression system is suggested; it combines transform coding (the Discrete Cosine Transform or the bi-orthogonal tap-9/7 wavelet transform) with the LZW compression technique. The suggested scheme was applied to color and gray-scale models, with transform coding applied to decompose each color and gray sub-band individually. A quantization process is performed, followed by LZW coding, to compress the images. The suggested system was applied to a set of seven standard images …
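To make the pipeline concrete, here is a minimal Python sketch of the transform-quantize-LZW chain on a single gray-scale band; the block size, quantization step, and dictionary handling are illustrative assumptions, not the parameters used in the study:

```python
import numpy as np
from scipy.fft import dctn  # type-II DCT, applied per 8x8 block

def lzw_encode(symbols):
    """Plain LZW over a sequence of integer symbols -> list of dictionary codes.
    The initial single-symbol alphabet is returned so a decoder can rebuild it."""
    dictionary = {(s,): i for i, s in enumerate(sorted(set(symbols)))}
    w, out = (), []
    for s in symbols:
        ws = w + (s,)
        if ws in dictionary:
            w = ws                            # keep extending the current phrase
        else:
            out.append(dictionary[w])
            dictionary[ws] = len(dictionary)  # grow the dictionary
            w = (s,)
    if w:
        out.append(dictionary[w])
    return out, dictionary

def compress_band(band, block=8, qstep=16):
    """DCT each block, uniformly quantize, then LZW the coefficient stream."""
    H, W = band.shape
    h, w = H - H % block, W - W % block       # crop to whole blocks
    coeffs = []
    for i in range(0, h, block):
        for j in range(0, w, block):
            c = dctn(band[i:i+block, j:j+block].astype(float), norm='ortho')
            coeffs.extend(np.round(c / qstep).astype(int).ravel())
    return lzw_encode(coeffs)

# usage (hypothetical input array): codes, table = compress_band(gray_image)
```

Quantization discards the fine coefficient detail (the lossy step); LZW then exploits the long runs of zeros the quantized DCT blocks produce (the lossless step), mirroring the transform-then-entropy-code structure the abstract describes.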
The advancements in Information and Communication Technology (ICT) within the previous decades have significantly changed the way people transmit or store their information over the Internet and networks, so one of the main challenges is to keep this information safe against attacks. Many researchers and institutions have realized the importance and benefits of cryptography in achieving efficient and effective secure communication. This work adopts a novel technique for a secure data cryptosystem based on chaos theory. The proposed algorithm generates a 2-dimensional key matrix having the same dimensions as the original image, filled with random numbers obtained from the 1-dimensional logistic chaotic map for given con…
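A minimal sketch of such a key-matrix generator is shown below; the control parameter, initial condition, and the XOR combination step are illustrative assumptions, since the excerpt does not specify the key values or how the key matrix is applied:

```python
import numpy as np

def logistic_key_matrix(shape, x0=0.671, r=3.9999, burn_in=1000):
    """Fill a matrix of the image's shape with bytes from the 1-D logistic map
    x_{n+1} = r * x_n * (1 - x_n); x0 and r act as the secret key."""
    n = shape[0] * shape[1]
    x = x0
    for _ in range(burn_in):          # discard transients of the chaotic orbit
        x = r * x * (1 - x)
    stream = np.empty(n)
    for i in range(n):
        x = r * x * (1 - x)
        stream[i] = x                 # x stays in (0, 1) for 0 < r <= 4
    return (stream * 256).astype(np.uint8).reshape(shape)

def encrypt(image, key_matrix):
    """XOR cipher: applying it twice with the same key matrix decrypts."""
    return np.bitwise_xor(image, key_matrix)

# usage (hypothetical): cipher = encrypt(img, logistic_key_matrix(img.shape))
```

The sensitivity of the orbit to `x0` and `r` is what gives the key stream its cryptographic value: a tiny change in either produces an entirely different key matrix.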
Cu(II) was determined using a quick and uncomplicated procedure that involved reacting it with a freshly synthesized ligand to create an orange complex with an absorbance peak at 481.5 nm in acidic solution. The best conditions for the formation of the complex were studied: the concentration of the ligand, the medium, the effect of the addition sequence, the effect of temperature, and the time of complex formation. The results obtained show a scatter plot extending from 0.1–9 ppm and a linear range of 0.1–7 ppm. The relative standard deviation (RSD%) for n = 8 is less than 0.5, the recovery (R%) is within acceptable values, the correlation coefficient (r) equals 0.9986, the coefficient of determination (r²) equals 0.9973, and the percentage capita…
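The linear range reported above corresponds to Beer–Lambert behaviour of the complex, the standard relation underlying this type of calibration:

$$A = \varepsilon\,b\,C$$

where $A$ is the absorbance at 481.5 nm, $\varepsilon$ the molar absorptivity of the complex, $b$ the optical path length, and $C$ the Cu(II) concentration; $r^2 = 0.9973$ measures how closely the calibration points over 0.1–7 ppm follow this line.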