Test case prioritization is a key part of system testing, intended to surface issues early in the development stage. Traditional prioritization techniques frequently fail to account for the complexities of large-scale test suites, evolving systems, and time constraints, and therefore cannot fully solve this problem. The study proposed here presents a hybrid meta-heuristic method that focuses on addressing these modern challenges. The strategy combines a genetic algorithm with the black hole algorithm to strike a smooth trade-off between exploring numerous candidate orderings and exploiting the best ones. The proposed hybrid genetic black hole (HGBH) algorithm uses search criteria such as code coverage, fault detection rate, and execution time to refine test case orderings iteratively. The approach was evaluated through experiments on a large-scale industrial software project. The hybrid meta-heuristic technique outperforms routine techniques: it achieves higher code coverage, which in turn enables crucial defects to be detected at an early stage and testing resources to be allocated more effectively. In particular, the best APFD (Average Percentage of Faults Detected) value was 0.9321, achieved in 6 generations with an execution time of 4.879 seconds. Beyond this, the approach yielded mean APFD values of 0.9247 and 0.9302, with execution times ranging from 10.509 to 30.372 seconds. The experiments carried out demonstrate the feasibility of this approach for complex systems and its ability to consistently detect changes, enabling it to adapt to rapidly changing systems. In conclusion, this research provides a new hybrid meta-heuristic approach to test case prioritization and optimization, which helps tackle the obstacles caused by large-scale test suites and constantly changing systems.
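As a rough illustration of the APFD metric this abstract reports, the following Python sketch computes APFD for a test ordering and uses it as the fitness of a simple randomized search; it is not the HGBH algorithm itself, and the fault matrix, population size, and swap-mutation scheme are hypothetical assumptions.

```python
import random

def apfd(order, fault_matrix):
    """APFD = 1 - (sum of first-detection positions)/(n*m) + 1/(2n).

    order        : list of test indices in execution order
    fault_matrix : fault_matrix[t][f] is True if test t detects fault f
    """
    n = len(order)                      # number of test cases
    m = len(fault_matrix[0])            # number of faults
    total = 0
    for f in range(m):
        # 1-based position of the first test in `order` that detects fault f
        # (assumes every fault is detected by at least one test in the suite)
        pos = next(i + 1 for i, t in enumerate(order) if fault_matrix[t][f])
        total += pos
    return 1.0 - total / (n * m) + 1.0 / (2 * n)

def prioritize(fault_matrix, generations=6, pop_size=20, seed=0):
    """Toy search: evolve random orderings by swap mutation, keep the best APFD."""
    rng = random.Random(seed)
    n = len(fault_matrix)
    population = [rng.sample(range(n), n) for _ in range(pop_size)]
    best = max(population, key=lambda o: apfd(o, fault_matrix))
    for _ in range(generations):
        children = []
        for order in population:
            child = order[:]
            i, j = rng.randrange(n), rng.randrange(n)
            child[i], child[j] = child[j], child[i]   # swap two positions
            children.append(child)
        population = children
        best = max(population + [best], key=lambda o: apfd(o, fault_matrix))
    return best, apfd(best, fault_matrix)

# Hypothetical 4-test, 3-fault example: fault_matrix[t][f] is True if test t reveals fault f
faults = [
    [True,  False, False],
    [False, True,  False],
    [True,  False, True ],
    [False, False, True ],
]
order, score = prioritize(faults)
print(order, round(score, 4))
```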
The high cost of qualifying library standard cells on a silicon wafer limits the number of test circuits on the test chip. This paper proposes a technique for sharing common load circuits among test circuits to reduce silicon area. By enabling load sharing, the number of transistors needed for the common load can be reduced significantly. Results show up to an 80% reduction in silicon area due to the reduced load area.
In this paper, a cognitive system based on a nonlinear neural controller and an intelligent algorithm is proposed to guide an autonomous mobile robot during continuous path tracking while avoiding solid obstacles. The goal of the proposed structure is to plan and track the reference path equation for the autonomous mobile robot in a mining environment, avoiding obstacles and reaching the target position by using intelligent optimization algorithms. Particle Swarm Optimization (PSO) and Artificial Bee Colony (ABC) algorithms are used to solve the mobile robot navigation problem in the mine by searching for optimal paths and finding the reference path equation of the optimal
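To ground the PSO part of the description, here is a minimal, generic Particle Swarm Optimization sketch in Python that tunes the coefficients of a quadratic reference path y(x) between an assumed start and goal while penalizing passes through circular obstacles; the cost function, obstacle layout, and PSO parameters are illustrative assumptions, not the paper's formulation (the ABC variant is omitted).

```python
import random, math

# Assumed setup: path y(x) = a0 + a1*x + a2*x^2 from x = 0 (start) to x = 10 (goal)
OBSTACLES = [(4.0, 2.0, 1.0), (7.0, -1.0, 1.5)]   # (x, y, radius), hypothetical layout
START_Y, GOAL_Y = 0.0, 0.0

def path_cost(coeffs, samples=50):
    """Approximate path length plus penalties for entering obstacles and missing endpoints."""
    a0, a1, a2 = coeffs
    cost, prev = 0.0, (0.0, a0)
    for i in range(1, samples + 1):
        x = 10.0 * i / samples
        y = a0 + a1 * x + a2 * x * x
        cost += math.hypot(x - prev[0], y - prev[1])      # segment length
        for ox, oy, r in OBSTACLES:
            d = math.hypot(x - ox, y - oy)
            if d < r:
                cost += 100.0 * (r - d)                   # obstacle-penetration penalty
        prev = (x, y)
    cost += abs(a0 - START_Y) + abs(a0 + a1 * 10 + a2 * 100 - GOAL_Y)  # endpoint terms
    return cost

def pso(n_particles=30, iters=100, seed=1):
    rng = random.Random(seed)
    dim, w, c1, c2 = 3, 0.7, 1.5, 1.5                     # inertia and attraction weights
    pos = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=path_cost)
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if path_cost(pos[i]) < path_cost(pbest[i]):
                pbest[i] = pos[i][:]
        gbest = min(pbest + [gbest], key=path_cost)
    return gbest

best_coeffs = pso()   # coefficients of the lowest-cost (obstacle-avoiding) reference path
```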
Waste is one of the most important problems affecting a city's environment and its urban landscape; it results from human activities and the natural environment. Its sources vary among residential, commercial, industrial, medical, and hazardous waste, and its spread in cities, on roads, and on abandoned open land has led to significant negative effects and risks to human health and the environment.
Therefore, there have been serious attempts to deal with waste by following sequential steps that form a waste management system (collection, sorting, transport, then treatment and disposal): preventing and reducing waste, then recycling and recovering by composting or burning, and ending with burial.
In this work, the performance of the receiver in a quantum cryptography system based on the BB84 protocol is assessed by calculating the Quantum Bit Error Rate (QBER) of the receiver. To apply this performance test, an optical setup was arranged and a circuit was designed and implemented to calculate the QBER. This electronic circuit counts the events per second generated by the avalanche photodiodes in the receiver, and the measured counts per second are used to calculate the QBER of the receiver, which gives an indication of its performance. A minimum QBER of 6% was obtained with an avalanche photodiode excess voltage of 2 V and a laser diode power of 3.16 nW at an avalanche photodiode temperature of -10 °C.
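For context, QBER is conventionally the ratio of erroneous detections to total detections in the sifted key. The short Python sketch below shows that calculation from photon-count rates; the count values in the example are hypothetical and unrelated to the measurement reported above.

```python
def qber(error_counts, total_counts):
    """Quantum bit error rate: erroneous detections / total sifted detections."""
    if total_counts <= 0:
        raise ValueError("total_counts must be positive")
    return error_counts / total_counts

# Hypothetical numbers: 60 erroneous counts out of 1000 sifted counts per second
print(f"QBER = {qber(60, 1000):.1%}")   # -> QBER = 6.0%
```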
A mathematical method and a new algorithm, implemented with the aid of the Matlab language, are proposed to compute the linear equivalence (or recursion length) of pseudo-random key-stream periodic sequences using the Fourier transform. The proposed method enables computation of the linear equivalence in order to determine the degree of complexity of any binary or real periodic sequence produced by linear or nonlinear key-stream generators. The procedure can be applied with comparatively greater computational ease and efficiency. The results of this algorithm are compared with the Berlekamp-Massey (BM) method, and good results are obtained; the Fourier transform results are more accurate than those of the BM method for computing the linear equivalence.
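The Fourier-transform route to linear complexity is usually justified by Blahut's theorem: the linear complexity (linear equivalence) of a periodic sequence equals the number of nonzero coefficients in the DFT of one period. The Python/NumPy sketch below illustrates that relation for real-valued sequences; it is a generic illustration rather than the paper's Matlab implementation, and the zero-threshold `tol` is an assumption.

```python
import numpy as np

def linear_equivalence(period_samples, tol=1e-9):
    """Estimate the linear equivalence (linear complexity) of a periodic sequence.

    By Blahut's theorem, the linear complexity of a sequence with period N
    equals the number of nonzero coefficients in the DFT of one period.
    `tol` (assumed) decides when a coefficient counts as nonzero.
    """
    spectrum = np.fft.fft(np.asarray(period_samples, dtype=float))
    return int(np.sum(np.abs(spectrum) > tol))

# Example: one period of s_n = cos(2*pi*n/8) has exactly 2 nonzero DFT bins,
# so its linear equivalence is 2 (it satisfies a second-order linear recursion).
n = np.arange(8)
print(linear_equivalence(np.cos(2 * np.pi * n / 8)))   # -> 2
```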
Standards play a vital role in documenting the values of new test results in the form of tables. They are one of the basic requirements that the standardization process aims for, as a complement to standardizing test procedures, and they contribute to knowing a student's current state, degree of readiness, and level as a result of practicing different exercises in sports activities, in addition to the possibility of using them for comparison with the student's own group or similar groups, and for classification, prediction, and selection. Developing the skill of handling the football in the educational field is important for achieving distinguished performance among students. This skill requires a level of accuracy, speed, and control, and
Cancer has a complicated pathophysiology and is one of the major causes of death and morbidity. Classical cancer therapies include chemotherapy, radiation therapy, and immunotherapy. A typical treatment is chemotherapy, which delivers cytotoxic medications to patients to suppress the uncontrolled growth of cancerous cells. Conventional oral medication has a number of drawbacks, including a lack of selectivity, cytotoxicity, and multi-drug resistance, all of which pose significant obstacles to effective cancer treatment. Multidrug resistance (MDR) remains a major challenge for effective cancer chemotherapeutic interventions. The advent of nanotechnology has advanced the field of tumor diagnosis and treatment. Cancer nanotechnology
Speech is the essential means of interaction between humans or between a human and a machine. However, it is always contaminated with different types of environmental noise. Therefore, speech enhancement algorithms (SEA) have emerged as a significant approach in the speech processing field to suppress background noise and recover the original speech signal. In this paper, a new efficient two-stage SEA with low distortion is proposed in the minimum mean square error sense. The clean signal is estimated by taking advantage of Laplacian modeling of speech and noise based on the distribution of the orthogonal transform (Discrete Krawtchouk-Tchebichef transform) coefficients. The Discrete Krawtchouk-Tchebichef transform
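As a rough, generic illustration of transform-domain speech enhancement (not the paper's two-stage Laplacian MMSE estimator or its Krawtchouk-Tchebichef transform), the Python sketch below applies a Wiener-type gain per frame in the DCT domain; the frame length, the assumption that the first frames are noise-only, and the gain floor are all illustrative choices.

```python
import numpy as np
from scipy.fft import dct, idct

def enhance(noisy, frame_len=256, noise_frames=10, gain_floor=0.1):
    """Frame-wise noise suppression with a Wiener-style gain in the DCT domain.

    Assumes the first `noise_frames` frames contain noise only; they are used
    to estimate the noise power in each transform coefficient.
    """
    n_frames = len(noisy) // frame_len
    frames = noisy[:n_frames * frame_len].reshape(n_frames, frame_len)
    coeffs = dct(frames, norm='ortho', axis=1)                   # orthogonal transform per frame
    noise_power = np.mean(coeffs[:noise_frames] ** 2, axis=0)    # per-coefficient noise estimate
    signal_power = np.maximum(coeffs ** 2 - noise_power, 0.0)    # spectral-subtraction estimate
    gain = np.maximum(signal_power / (signal_power + noise_power + 1e-12), gain_floor)
    enhanced = idct(gain * coeffs, norm='ortho', axis=1)
    return enhanced.reshape(-1)

# Hypothetical usage: a clean tone preceded by a noise-only segment, buried in white noise
fs = 8000
t = np.arange(fs) / fs
clean = 0.5 * np.sin(2 * np.pi * 440 * t)
noisy = np.concatenate([np.zeros(2560), clean]) + 0.1 * np.random.randn(2560 + fs)
out = enhance(noisy)
```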