Test case prioritization is a key part of system testing, intended to expose defects early in the development cycle. Traditional prioritization techniques frequently fail to account for the complexities of large-scale test suites, evolving systems, and time constraints, and therefore cannot fully address this problem. The present study proposes a hybrid meta-heuristic method aimed at these challenges. The strategy combines a genetic algorithm with the black hole algorithm to strike a balance between exploring many candidate orderings and exploiting the best ones found so far. The proposed hybrid genetic black hole algorithm (HGBH) uses criteria such as code coverage, fault detection rate, and execution time to refine the ordering of test cases iteratively. The approach was evaluated through experiments on a large-scale industrial software project, where the hybrid meta-heuristic technique outperformed conventional techniques: it achieved higher code coverage, detected critical defects earlier, and allocated testing resources more effectively. In particular, the best APFD value was 0.9321, reached in 6 generations with a computation time of 4.879 seconds. In addition, the approach achieved mean APFD values between 0.9247 and 0.9302, with computation times ranging from 10.509 to 30.372 seconds. The experiments demonstrate the feasibility of the approach for complex systems and its ability to adapt to rapidly changing systems. Overall, this research contributes a new hybrid meta-heuristic approach to test case prioritization and optimization that addresses the obstacles posed by large-scale test suites and continuously changing systems.
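For reference, the APFD (Average Percentage of Faults Detected) metric cited above has a standard definition: with n test cases, m faults, and TF_i the position of the first test that reveals fault i, APFD = 1 - (sum of TF_i)/(n·m) + 1/(2n). Below is a minimal Python sketch with a hypothetical fault matrix (not data from the study):

```python
def apfd(ordering, fault_matrix):
    """Average Percentage of Faults Detected for a given test ordering.

    ordering     : list of test case indices in execution order
    fault_matrix : dict mapping fault id -> set of test indices that detect it
                   (each fault is assumed detectable by at least one test)
    """
    n = len(ordering)
    m = len(fault_matrix)
    position = {test: i + 1 for i, test in enumerate(ordering)}  # 1-based positions
    # TF_i: position of the first test in the ordering that reveals fault i
    tf_sum = sum(min(position[t] for t in tests) for tests in fault_matrix.values())
    return 1 - tf_sum / (n * m) + 1 / (2 * n)

# Hypothetical example: 5 test cases, 3 known faults
faults = {"f1": {2, 4}, "f2": {0}, "f3": {3}}
print(apfd([0, 3, 2, 1, 4], faults))  # prioritized order  -> 0.70
print(apfd([4, 1, 2, 3, 0], faults))  # weaker order       -> ~0.43
```

Orderings that reveal faults earlier in the sequence score closer to 1, which is why the metric is commonly used to compare prioritization strategies.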
Bark fiber has high potential as a reinforcement in biocomposite materials. The aim of this study is to evaluate the mechanical properties of bark fiber reinforced polyester composites with varying fiber weight fractions (0%, 5%, 10%, 20%, 30%, and 40%). The hand lay-up technique was used to prepare the composite specimens for tensile, flexural, and impact tests according to ASTM D638, ASTM D790, and ISO 179. The overall results showed that the composite reinforced with bark fiber at a weight fraction of 10% had the highest mechanical properties, with the clearest improvement in the flexural behaviour.
In this paper, we propose a new method for selecting the smoothing parameter of a kernel estimator used to estimate a nonparametric regression function in the presence of missing values. The proposed method is based on the golden ratio and Surah Al-E-Imran in the Qur'an. Simulation experiments were conducted to study small-sample behavior. The results show the superiority of the proposed method over the competing method for selecting the smoothing parameter.
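The abstract does not spell out the golden-ratio-based selector or its competitor, so the following is only a reference sketch of the setting: a Nadaraya-Watson kernel estimator whose smoothing parameter h is chosen by leave-one-out cross-validation, one common baseline selector. The data and the bandwidth grid are hypothetical.

```python
import numpy as np

def nw_estimate(x0, x, y, h):
    """Nadaraya-Watson estimate at x0 with a Gaussian kernel and bandwidth h."""
    w = np.exp(-0.5 * ((x0 - x) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

def loocv_bandwidth(x, y, grid):
    """Pick h from a grid by leave-one-out cross-validation."""
    best_h, best_err = None, np.inf
    for h in grid:
        err = 0.0
        for i in range(len(x)):
            mask = np.arange(len(x)) != i          # leave observation i out
            err += (y[i] - nw_estimate(x[i], x[mask], y[mask], h)) ** 2
        if err < best_err:
            best_h, best_err = h, err
    return best_h

# Hypothetical data: m(x) = sin(2*pi*x) observed with noise
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 80)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, 80)
print(loocv_bandwidth(x, y, np.linspace(0.02, 0.3, 15)))
```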
Water saturation is the most significant characteristic in reservoir characterization for assessing oil reserves. This paper reviews the concepts and applications of both classic and new approaches to determining water saturation, guiding the reader to understand and distinguish between the various strategies for obtaining an appropriate water saturation value. Electrical logging, in both its resistivity and dielectric forms, is examined, and the most well-known models for clean and shaly formations are demonstrated. Nuclear Magnetic Resonance in conventional and unconventional reservoirs is also reviewed, the major feature of this approach being the estimation of water saturation from the T2 distribution. Artific
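As a concrete example of the classic clean-formation resistivity models referred to above, Archie's equation relates water saturation to porosity and resistivity. A minimal sketch follows, with purely illustrative parameter values (a, m, n and the log readings are not taken from the paper):

```python
def archie_sw(rw, rt, phi, a=1.0, m=2.0, n=2.0):
    """Water saturation from Archie's equation:
        Sw = ((a * Rw) / (phi**m * Rt)) ** (1/n)
    rw  : formation water resistivity (ohm.m)
    rt  : true formation resistivity from the deep resistivity log (ohm.m)
    phi : porosity (fraction)
    a, m, n : tortuosity factor, cementation exponent, saturation exponent
    """
    return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)

# Illustrative values only
print(round(archie_sw(rw=0.05, rt=20.0, phi=0.22), 3))  # ~0.23
```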
This paper studies the performance of a statistical test of the hypothesis of independence of two variables (the hypothesis that there is no correlation between the variables under study), both when the data meet the requirement of normality and when they depart from that distribution due to the presence of outliers (contaminated values), and compares it with the performance of some other proposed and modified methods.
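The competing methods are not named in this abstract, so purely as an illustration of the kind of comparison described, the sketch below estimates the empirical rejection rate of a moment-based independence test (Pearson) and a rank-based one (Spearman) under the null hypothesis, with and without a few contaminated values:

```python
import numpy as np
from scipy import stats

def rejection_rate(contaminate, n=50, reps=2000, alpha=0.05, rng=None):
    """Empirical rejection rate of Pearson and Spearman independence tests
    under H0 (no correlation), with or without a few gross outliers."""
    rng = rng or np.random.default_rng(0)
    rej = np.zeros(2)
    for _ in range(reps):
        x = rng.normal(size=n)
        y = rng.normal(size=n)              # independent of x under H0
        if contaminate:
            idx = rng.choice(n, 3, replace=False)
            x[idx] += 10.0                  # a few contaminated values
            y[idx] += 10.0                  # hitting both variables distorts Pearson
        rej[0] += stats.pearsonr(x, y)[1] < alpha
        rej[1] += stats.spearmanr(x, y)[1] < alpha
    return rej / reps

print("clean data   (Pearson, Spearman):", rejection_rate(False))
print("contaminated (Pearson, Spearman):", rejection_rate(True))
```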
Long memory analysis is one of the most active areas in econometrics and time series analysis, and various methods have been introduced to identify and estimate the long memory parameter in fractionally integrated time series. One of the most common models used to represent time series with long memory is the ARFIMA (Autoregressive Fractionally Integrated Moving Average) model, in which the differencing order is a fractional number called the fractional parameter. To specify and fit an ARFIMA model, the fractional parameter must be estimated, and there are many methods for doing so. In this research, the estimation methods were divided into indirect methods, where the Hurst parameter is estimated fir
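One classical indirect route of the kind described is to estimate the Hurst exponent H first, for example by rescaled-range (R/S) analysis, and then take the fractional parameter as d = H - 1/2. A minimal sketch follows (the block sizes and test series are illustrative, not from the study):

```python
import numpy as np

def hurst_rs(x, block_sizes=(10, 20, 40, 80, 160)):
    """Rescaled-range (R/S) estimate of the Hurst exponent H.
    The ARFIMA fractional parameter is then d = H - 0.5."""
    x = np.asarray(x, dtype=float)
    log_n, log_rs = [], []
    for n in block_sizes:
        rs_vals = []
        for start in range(0, len(x) - n + 1, n):
            block = x[start:start + n]
            dev = np.cumsum(block - block.mean())   # cumulative deviations
            r = dev.max() - dev.min()               # range
            s = block.std(ddof=1)                   # standard deviation
            if s > 0:
                rs_vals.append(r / s)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs_vals)))
    # Slope of log(R/S) against log(n) estimates H
    return np.polyfit(log_n, log_rs, 1)[0]

# Hypothetical series: white noise should give H close to 0.5, i.e. d close to 0
rng = np.random.default_rng(0)
h = hurst_rs(rng.normal(size=2000))
print("H =", round(h, 3), " d =", round(h - 0.5, 3))
```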
In this study, He's parallel numerical algorithm based on neural networks is applied to a class of fractional integral equations, namely Abel's integral equations of the first and second kinds. A Levenberg-Marquardt training algorithm is used to train the network. To show the efficiency of the method, several Abel's integral equations are solved as numerical examples. Numerical results show that the new method handles these problems very efficiently and with high accuracy.
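The abstract gives no implementation details, so the sketch below only illustrates the general idea: a small feed-forward network trained with a Levenberg-Marquardt solver (here SciPy's, not a hand-written one) to satisfy a first-kind Abel equation at collocation points, using a test problem whose exact solution is known. The network size, quadrature rule, and collocation grid are all illustrative choices, not the paper's.

```python
import numpy as np
from scipy.optimize import least_squares

# First-kind Abel equation:  f(x) = integral_0^x y(t) / sqrt(x - t) dt.
# Test problem with a known answer: f(x) = 2*sqrt(x)  <=>  y(t) = 1.
f = lambda x: 2.0 * np.sqrt(x)

H = 6  # hidden units in a tiny single-hidden-layer network

def y_net(t, p):
    """Network approximation of the unknown y(t)."""
    w1, b1, w2, b2 = p[:H], p[H:2*H], p[2*H:3*H], p[3*H]
    return np.tanh(np.outer(t, w1) + b1) @ w2 + b2

def residuals(p, xs, nq=20):
    """Collocation residuals; the substitution t = x - s**2 removes the
    square-root singularity, giving  f(x) = 2 * integral_0^sqrt(x) y(x - s**2) ds."""
    res = []
    for x in xs:
        s = (np.arange(nq) + 0.5) / nq * np.sqrt(x)       # midpoint quadrature nodes
        integral = 2.0 * np.sqrt(x) / nq * np.sum(y_net(x - s**2, p))
        res.append(integral - f(x))
    return np.array(res)

xs = np.linspace(0.05, 1.0, 25)                            # collocation points
p0 = 0.1 * np.random.default_rng(0).normal(size=3*H + 1)
sol = least_squares(residuals, p0, args=(xs,), method="lm")  # Levenberg-Marquardt
print(y_net(np.array([0.2, 0.5, 0.9]), sol.x))              # should be close to 1
```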
In this paper, a mathematical model was built for the supply chain to reduce production, inventory, and transportation costs at the Baghdad Company for Soft Drinks. The linear programming method was used to solve this mathematical model. The cost of production was reduced by cutting daily working hours, so the company does not need overtime hours to maintain the same levels of production, and the costs of storage in the company's warehouses and agents' stores were reduced by making correct use of the stock, which guarantees lower costs and protects products from damage. The units transferred from the company were equal to the units demanded by the agents. The company's mathematical model also achieved profits of (84,663,769) by re
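To make the modelling approach concrete, here is a minimal linear-programming sketch in the same spirit, with a toy capacity/transportation structure solved by SciPy's linprog; all figures are hypothetical and are not the company's data:

```python
import numpy as np
from scipy.optimize import linprog

# Toy cost-minimization: 2 plants supply 3 agents.
supply = [400, 300]                     # production capacity per plant
demand = [250, 200, 180]                # units demanded by each agent
cost = np.array([[4.0, 6.0, 9.0],       # unit transport cost plant i -> agent j
                 [5.0, 3.0, 7.0]])

c = cost.ravel()                        # decision variables x_ij, row-major

# Capacity constraints: sum_j x_ij <= supply_i
A_ub = np.zeros((2, 6))
A_ub[0, 0:3] = 1
A_ub[1, 3:6] = 1

# Demand constraints: sum_i x_ij = demand_j
A_eq = np.zeros((3, 6))
for j in range(3):
    A_eq[j, [j, j + 3]] = 1

res = linprog(c, A_ub=A_ub, b_ub=supply, A_eq=A_eq, b_eq=demand,
              bounds=(0, None), method="highs")
print(res.x.reshape(2, 3))              # optimal shipment plan
print(res.fun)                          # minimum total transport cost
```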
The use of parametric models and their associated estimation methods requires many primary conditions to be met for those models to represent the population under study adequately, which has prompted researchers to search for more flexible alternatives to parametric models, namely nonparametric models.
In this manuscript, the so-called Nadaraya-Watson estimator was compared in two cases (fixed bandwidth and variable bandwidth) through simulation with different models and sample sizes. The simulation experiments showed that, for the first and second models, the NW estimator with fixed bandwidth was preferred fo
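The abstract does not state which variable-bandwidth rule was used, so the sketch below simply contrasts the two cases in a generic way: a fixed global bandwidth versus a local bandwidth taken as the distance to the k-th nearest design point, one common adaptive choice. Data and tuning values are hypothetical.

```python
import numpy as np

def nw_fixed(x0, x, y, h):
    """Nadaraya-Watson with one global (fixed) bandwidth h."""
    w = np.exp(-0.5 * ((x0 - x) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

def nw_variable(x0, x, y, k=10):
    """Nadaraya-Watson with a local bandwidth: the distance from x0 to its
    k-th nearest design point (one common variable-bandwidth choice)."""
    d = np.abs(x0 - x)
    h = np.sort(d)[k]                    # bandwidth adapts to the design density
    w = np.exp(-0.5 * (d / h) ** 2)
    return np.sum(w * y) / np.sum(w)

# Hypothetical comparison on unevenly spaced data
rng = np.random.default_rng(1)
x = np.sort(rng.beta(2, 5, 120))         # dense on the left, sparse on the right
y = np.cos(3 * x) + rng.normal(0, 0.2, 120)
grid = np.linspace(0.05, 0.9, 5)
print([round(nw_fixed(g, x, y, h=0.05), 3) for g in grid])
print([round(nw_variable(g, x, y, k=10), 3) for g in grid])
```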