Regression testing is expensive and therefore calls for optimization. Typically, optimizing test cases means selecting a reduced subset of test cases or prioritizing them so that potential faults are detected at an earlier phase. Many earlier studies relied on heuristic mechanisms to attain optimality while reducing or prioritizing test cases; nevertheless, those studies lacked systematic procedures for managing tied test cases. Moreover, evolutionary algorithms such as the genetic algorithm often help in reducing test cases, together with a concurrent decrease in computational runtime. However, when fault detection capacity must be examined along with other parameters, such methods fall short. Motivated by this gap, the current research proposes a multifactor algorithm incorporating genetic operators and additional discriminating features. A factor-based prioritizer is introduced to properly handle tied test cases that emerge during re-ordering. In addition, a Cost-based Fine Tuner (CFT) is embedded in the study to reveal stable test cases for processing. The effectiveness of the proposed minimization approach is analyzed and compared with a specific rule-based heuristic method and a standard genetic methodology. Intra-validation of the result achieved by the reduction procedure is performed graphically. For the proposed prioritization scheme, this study contrasts randomly generated sequences with the procured re-ordered test sequences over 10 benchmark codes. Experimental analysis showed that the proposed system achieved a reduction of 35-40% in testing effort by identifying and executing stable and coverage-effective test cases at an earlier phase.
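As an illustration only (not the paper's multifactor genetic algorithm), a greedy coverage-based prioritizer with an explicit tie-breaker shows the core idea: order tests by the additional coverage they contribute, and resolve ties between equally covering tests by a secondary factor such as cost. All names and the toy coverage data below are hypothetical.

```python
# Greedy test-case prioritization sketch: rank by additional statement coverage,
# break ties by lower execution cost (a stand-in for factor-based tie handling).

def prioritize(coverage, cost):
    """Order tests by new coverage gained; among ties, prefer the cheaper test."""
    remaining = {t: set(stmts) for t, stmts in coverage.items()}
    order, covered = [], set()
    while remaining:
        # key = (new statements covered, negated cost) so ties fall to lower cost
        best = max(remaining, key=lambda t: (len(remaining[t] - covered), -cost[t]))
        order.append(best)
        covered |= remaining.pop(best)
    return order

# Hypothetical suite: t1 and t3 cover the same statements but t3 is cheaper.
coverage = {"t1": {1, 2, 3}, "t2": {3, 4}, "t3": {1, 2, 3}, "t4": {5}}
cost = {"t1": 2.0, "t2": 1.0, "t3": 1.0, "t4": 3.0}
print(prioritize(coverage, cost))  # -> ['t3', 't2', 't4', 't1']
```

The tie between t1 and t3 is resolved in favor of the cheaper t3, mirroring (in a much simpler form) the role a factor-based prioritizer plays during re-ordering.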
In this research, the weights used are estimated with Generalized Least Squares to estimate the parameters of a simple linear regression when the dependent variable is a two-class attribute variable (the heteroscedastic problem), relying on a sequential Bayesian approach instead of the classical approach used before. The Bayesian approach provides a mechanism for tackling observations one by one in a sequential way: each new observation adds a new piece of information for estimating the success probability of the Bernoulli trials that govern the dependent variable in the simple linear regression equation, in addition to the information deduced from past experience.
This research aims to review the importance of estimating the nonparametric regression function using the so-called canonical kernel, which depends on re-scaling the smoothing parameter; the smoothing parameter plays a large and important role in kernel estimation and gives the right amount of smoothing.
We demonstrate the importance of this method by applying these concepts to real data on the exchange rate of the U.S. dollar against the Japanese yen for the period from January 2007 to March 2010. The results showed a preference for the nonparametric estimator with the Gaussian kernel over the other nonparametric and parametric regression estimators.
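The canonical-kernel idea can be sketched concretely: each kernel K gets a rescaling constant delta0(K) = (R(K) / mu2(K)^2)^(1/5), where R(K) is the kernel's roughness and mu2(K) its second moment, so that a bandwidth chosen on the canonical scale implies a comparable amount of smoothing for every kernel. The constants below are computed from these standard formulas for illustration; the paper's exchange-rate data is not reproduced here.

```python
# Canonical-kernel bandwidth rescaling sketch: delta0 = (R(K) / mu2(K)**2) ** (1/5).
import math

def delta0(roughness, second_moment):
    """Canonical rescaling factor from the kernel's roughness R(K) and mu2(K)."""
    return (roughness / second_moment ** 2) ** 0.2

# Gaussian kernel: R(K) = 1/(2*sqrt(pi)), mu2(K) = 1
d_gauss = delta0(1 / (2 * math.sqrt(math.pi)), 1.0)
# Epanechnikov kernel: R(K) = 3/5, mu2(K) = 1/5
d_epan = delta0(0.6, 0.2)

h = 0.05                      # a bandwidth chosen once, on the canonical scale
print(f"Gaussian h = {h * d_gauss:.4f}, Epanechnikov h = {h * d_epan:.4f}")
```

With a single canonical bandwidth h, each kernel's working bandwidth is h * delta0(K), which makes smoothing levels directly comparable across kernels.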
This study aims to derive a general relation between line loads acting on a two-way slab system and the equivalent uniformly distributed loads. This relation will be useful to structural designers who are accustomed to working with uniformly distributed loads, enabling them to use the traditional methods for the analysis of two-way systems (e.g., the Direct Design Method). Two types of slab systems, slab systems with beams and flat slab systems, have been considered in this study to include the effect of aspect ratio and slab type on the proposed relation. Five aspect ratios, l2/l1 of 0.5, 0.75, 1.0, 1.5 and 2.0, have been considered for both types of two-way systems.
All necessary finite element analyses have been executed with the SAFE software.
In this paper, we used four classification methods to classify objects and compared among them: K-Nearest Neighbors (KNN), Stochastic Gradient Descent learning (SGD), Logistic Regression (LR), and Multi-Layer Perceptron (MLP). We used the MCOCO dataset for classification and detection of objects; the dataset images were randomly divided into training and testing sets at a ratio of 7:3, respectively. The randomly selected training and testing images were converted from color to gray level, then enhanced using the histogram equalization method and resized to 20 x 20. Principal component analysis (PCA) was used for feature extraction, and finally the four classification methods were applied.
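The pipeline described above can be sketched with scikit-learn on synthetic data. The image-specific steps (grayscale conversion, histogram equalization, resizing) are replaced here by a toy feature matrix standing in for flattened 20 x 20 images; all parameter values are illustrative, not the paper's.

```python
# Sketch of the described pipeline: 7:3 split -> PCA features -> four classifiers.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import SGDClassifier, LogisticRegression
from sklearn.neural_network import MLPClassifier

# Toy stand-in for flattened 20x20 grayscale images (400 features per sample).
X, y = make_classification(n_samples=600, n_features=400, n_informative=20,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0)   # 7:3 split

pca = PCA(n_components=20).fit(X_tr)          # feature extraction on train only
X_tr, X_te = pca.transform(X_tr), pca.transform(X_te)

models = {
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "SGD": SGDClassifier(random_state=0),
    "LR":  LogisticRegression(max_iter=1000),
    "MLP": MLPClassifier(hidden_layer_sizes=(50,), max_iter=500, random_state=0),
}
for name, model in models.items():
    score = model.fit(X_tr, y_tr).score(X_te, y_te)
    print(f"{name}: {score:.3f}")
```

Fitting PCA on the training split only (then transforming the test split) avoids leaking test information into the feature extraction step.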
In this paper, the reliability of the stress-strength model is derived for the probability P(Y<X) that a component with strength X withstands one independent stress Y, when X and Y follow the Gompertz Fréchet distribution with unknown shape parameters and known remaining parameters. Different methods were used to estimate the reliability R and the Gompertz Fréchet distribution parameters: maximum likelihood, least squares, weighted least squares, regression, and ranked set sampling. A comparison of these estimators was made by a simulation study based on the mean square error (MSE) criterion. The comparison confirms that the maximum likelihood estimator performs better than the other estimators.
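The quantity R = P(Y < X) can be illustrated by Monte Carlo simulation. As a stand-in for the Gompertz Fréchet distribution of the paper, plain Fréchet variates are used below (common shape a, scales s_x and s_y), because in that special case the closed form R = s_x^a / (s_x^a + s_y^a) is available to check the estimate against; the distributions and parameter values are illustrative only.

```python
# Monte Carlo sketch of stress-strength reliability R = P(Y < X) with Fréchet
# stand-ins (inverse-transform sampling), checked against the known closed form.
import random, math

def frechet(a, s):
    """Inverse-transform sample from Fréchet(shape=a, scale=s): F(x)=exp(-(x/s)^-a)."""
    u = random.random()
    return s * (-math.log(u)) ** (-1.0 / a)

def reliability_mc(a, s_x, s_y, n=200_000, seed=1):
    """Estimate P(Y < X) by counting stress < strength over n simulated pairs."""
    random.seed(seed)
    hits = sum(frechet(a, s_y) < frechet(a, s_x) for _ in range(n))
    return hits / n

a, s_x, s_y = 2.0, 3.0, 1.5
exact = s_x**a / (s_x**a + s_y**a)    # closed form when shapes are equal: 0.8
r = reliability_mc(a, s_x, s_y)
print(r, exact)
```

The Monte Carlo estimate converges to the closed-form value at the usual 1/sqrt(n) rate, which is why the paper's simulation study can rank estimators by MSE reliably.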
This paper discusses the problem of near-perfect multicollinearity in a nonlinear regression model (the multiple logistic regression model), when the dependent variable is a qualitative binary-response variable equal to one if a response occurs and zero otherwise, through the use of Iterative Principal Component Estimators (IPCE) that rely on ordinary weights and conditional Bayesian weights.
The estimators of this approach were applied
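The basic idea behind principal-component estimation for a collinear logistic model can be sketched as follows. This is a plain, non-iterative, unweighted variant for illustration, not the paper's IPCE estimators with ordinary or conditional Bayesian weights; the data is synthetic.

```python
# Principal-component logistic regression sketch: fit the binary response on the
# leading principal components instead of the nearly collinear raw regressors.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
x1 = rng.normal(size=500)
x2 = x1 + rng.normal(scale=0.01, size=500)   # x2 nearly collinear with x1
x3 = rng.normal(size=500)
X = np.column_stack([x1, x2, x3])
y = (x1 + x3 + rng.normal(size=500) > 0).astype(int)   # binary response (0/1)

# Two components absorb the (x1, x2) direction and x3, sidestepping the
# ill-conditioning that near-perfect collinearity causes for the raw design.
model = make_pipeline(PCA(n_components=2), LogisticRegression()).fit(X, y)
print("in-sample accuracy:", model.score(X, y))
```

Because the collinear pair (x1, x2) collapses into a single component, the logistic fit on component scores is well conditioned even though the raw design matrix is nearly singular.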
In this paper, we introduce and study a new concept named couniform modules, a dual notion of uniform modules: an R-module M is said to be couniform if every proper submodule N of M is either zero or there exists a proper submodule N1 of N such that N/N1 is a small submodule of M/N1. Many relationships are given between this class of modules and other related classes of modules. Finally, we consider the hereditary property between the R-module M and the R-module R in case M is couniform.
A theoretical model is developed to determine the time evolution of temperature at the surface of an opaque target placed in air for cases characterized by the formation of laser-supported absorption wave (LSAW) plasmas. The model takes into account the temporal variation of power throughout an incident laser pulse (i.e., the pulse shape, or simply the pulse profile).
Three proposed profiles are employed, and the results are compared with the square-pulse approximation of constant power.
In this paper, we propose a method to estimate missing values of the explanatory variables for a nonparametric multiple regression model and compare it with the arithmetic-mean imputation method. The idea of this method is to employ the causal relationship between the variables to find an efficient estimate of the missing value. We rely on the kernel estimate given by the Nadaraya-Watson estimator, with least-squares cross-validation (LSCV) to estimate the bandwidth, and we use a simulation study to compare the two methods.
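The two named ingredients, the Nadaraya-Watson estimator and LSCV bandwidth selection, can be sketched directly. The synthetic data and the candidate bandwidth grid below are illustrative; the paper's missing-value imputation step itself is not reproduced.

```python
# Nadaraya-Watson kernel regression with least-squares cross-validation (LSCV)
# bandwidth selection, on synthetic one-dimensional data.
import numpy as np

def nw(x0, x, y, h):
    """Nadaraya-Watson estimate at x0: Gaussian-kernel weighted average of y."""
    w = np.exp(-0.5 * ((x0 - x) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

def lscv_bandwidth(x, y, grid):
    """Pick the bandwidth minimizing leave-one-out squared prediction error."""
    def cv(h):
        errs = [(y[i] - nw(x[i], np.delete(x, i), np.delete(y, i), h)) ** 2
                for i in range(len(x))]
        return np.mean(errs)
    return min(grid, key=cv)

rng = np.random.default_rng(0)
x = rng.uniform(0, 3, 120)
y = np.sin(2 * x) + rng.normal(scale=0.2, size=120)   # true curve: sin(2x)

grid = [0.05, 0.1, 0.2, 0.4, 0.8]
h = lscv_bandwidth(x, y, grid)
print("chosen bandwidth:", h, " fitted value at x=1.0:", nw(1.0, x, y, h))
```

In an imputation setting, the same estimator would be evaluated at the observed covariates of a record whose explanatory value is missing, replacing the crude arithmetic-mean fill with a locally weighted estimate.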
Non-stationary series are a persistent problem for statistical analysis: as theoretical work has explained, the properties of regression analysis are lost when non-stationary series are used, yielding a spurious relation between the variables under consideration. A non-stationary series can be made stationary by adding a time variable to the multivariate model to remove the general trend, adding seasonal dummy variables to remove the seasonal effect, transforming the data to exponential or logarithmic form, or applying differencing d times, in which case the series is said to be integrated of order d. The theoretical side of the research is divided into parts; the first part presents the research methodology.
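The differencing remedy can be shown on a small synthetic example: a series with a linear trend (integrated of order 1) has half-means that drift apart, while its first difference does not. The data below is illustrative only.

```python
# Differencing sketch: a trending (non-stationary) series becomes mean-stationary
# after one difference, which removes the general trend.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(200)
series = 0.5 * t + rng.normal(size=200)   # linear trend => non-stationary

diff1 = np.diff(series)                   # first difference removes the trend

# The raw series' half-means differ strongly because of the trend, while the
# differenced series' half-means are both close to the same constant (~0.5).
print(series[:100].mean(), series[100:].mean())
print(diff1[:100].mean(), diff1[100:].mean())
```

If one difference were not enough (a trend in the differences themselves), differencing would be applied again, and the series would be integrated of order 2.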