A fault is an error that affects system behaviour. A software metric is a value that indicates how well software processes work and where faults are more likely to occur. In this research, we study the effects of removing redundancy and of log transformation based on threshold values for identifying fault-prone classes of software. The study also compares the metric values of the original datasets with those obtained after removing redundancy and applying the log transformation. An e-learning dataset and a system dataset were taken as case studies. The fault ratio ranged from 1%-31% and 0%-10% for the original datasets, and from 1%-10% and 0%-4% after removing redundancy and log transformation, respectively. These results directly affected the number of classes detected, which ranged between 1-20 and 1-7 for the original datasets and 1-7 and 0-3 after removing redundancy and log transformation. The skewness of the datasets decreased after applying the proposed model. The classes classified as faulty need more attention in the next versions in order to reduce the fault ratio, or refactoring to increase the quality and performance of the current version of the software.
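The two preprocessing steps described above, removing redundant records and log-transforming metric values to reduce skewness, can be sketched as follows. This is a minimal illustration under assumptions, not the paper's exact pipeline: the metric column names (WMC, CBO, LOC) are hypothetical, log1p is used so that zero-valued metrics are tolerated, and the threshold-based fault classification is omitted.

```python
import numpy as np
import pandas as pd

def preprocess_metrics(df: pd.DataFrame, metric_cols: list) -> pd.DataFrame:
    """Remove redundant (duplicate) rows, then log-transform the metric columns."""
    deduped = df.drop_duplicates().copy()
    for col in metric_cols:
        # log1p tolerates zero-valued metrics; assumes metrics are non-negative
        deduped[col] = np.log1p(deduped[col])
    return deduped

# Hypothetical class-level metrics; one row is a redundant duplicate
df = pd.DataFrame({"WMC": [3, 3, 120, 7],
                   "CBO": [1, 1, 40, 2],
                   "LOC": [50, 50, 9000, 200]})
cleaned = preprocess_metrics(df, ["WMC", "CBO", "LOC"])
print(df.skew(numeric_only=True))       # skewness before
print(cleaned.skew(numeric_only=True))  # skewness after: reduced, as the study reports
```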
The aim of this research is to explore the temporal and spatial distribution of traffic volume demand and to investigate its vehicle composition. The four selected links represent the activity of transportation facilities and different congestion points by direction. The study area belongs to the Al-Rusafa sector in Baghdad city, which exhibits a high rate of traffic congestion on working days during the morning and evening peak periods due to its mixed land uses. The results showed that Link (1), from the Medical City intersection to the Sarafiya intersection, carried the highest traffic volume in both the morning (AM) and afternoon (PM) peak periods, where demand exceeds capacity along the link corridor. Also, higher values f…
Using a neural network as a type of associative memory is introduced in this paper through the problem of mobile position estimation, in which the mobile estimates its location from the signal strengths reaching it from several surrounding base stations; the neural network can be implemented inside the mobile itself. The traditional methods of time of arrival (TOA) and received signal strength (RSS) are used and compared with two analytical methods, the optimal positioning method and the average positioning method. The training data are ideal, since they can be obtained from the geometry of the CDMA cell topology. The two methods, TOA and RSS, were tested over many cases along a nonlinear path that the MS can move through…
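As a rough illustration of the idea, the sketch below trains a small network to map an RSS vector from several base stations to an (x, y) position, with training data generated from an assumed cell geometry. Everything here is an assumption for illustration: the base-station layout, the log-distance path-loss model, and the generic MLPRegressor standing in for the paper's associative-memory network; the TOA variant is omitted.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical base-station layout around a CDMA cell (units: km)
BS = np.array([[0.0, 0.0], [2.0, 0.0], [1.0, 1.7],
               [-1.0, 1.7], [-2.0, 0.0], [0.0, -2.0]])

def rss_from_position(pos, p0=-40.0, n=3.0):
    """Assumed log-distance path-loss model: RSS (dBm) at the MS from each BS."""
    d = np.linalg.norm(BS - pos, axis=1) + 1e-3  # avoid log(0)
    return p0 - 10.0 * n * np.log10(d)

# Ideal training data from the cell geometry: sample MS positions, compute RSS
rng = np.random.default_rng(0)
positions = rng.uniform(-2.5, 2.5, size=(5000, 2))
signals = np.array([rss_from_position(p) for p in positions])

# Small MLP as the position estimator: RSS vector -> (x, y)
net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=1000, random_state=0)
net.fit(signals, positions)

# Estimate the location of an unseen MS position
true_pos = np.array([0.7, 0.4])
print(net.predict(rss_from_position(true_pos).reshape(1, -1)))
```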
Background: Indeterminate colitis (IC) originally referred to the 10–15% of cases of inflammatory bowel disease (IBD) in which it was difficult to distinguish between ulcerative colitis (UC) and Crohn's disease (CD) in the colectomy specimen and on histopathological examination. However, IC is increasingly used when a definitive diagnosis of UC or CD cannot be made at colonoscopy, on colonic biopsies, or at colectomy. These diagnostic difficulties may explain the variably reported prevalence of IC. Clinically, most patients with IC evolve to a definite diagnosis of UC or CD on follow-up.
Patients and methods: The patients group consisted of 80 patients with indeterminate colitis (IC); their ag…
The present study is designed to evaluate the effect of low-level laser irradiation on the immune system when administered intravenously.
A dynamic experimental study of the thermal decomposition of low-density polyethylene (LDPE) has been carried out at two different heating rates. Conventionally, determining the activation energy of a polymer requires at least three to six TGA experiments, but in this work we estimate the activation energy of LDPE using only two TGA experiments.
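The abstract does not name the two-experiment method, but a common way to obtain an activation energy from two heating rates is the Kissinger relation, ln(β/Tp²) = const − Ea/(R·Tp), evaluated at the two peak-decomposition temperatures. The sketch below assumes that approach; the numerical values are hypothetical, not the paper's data.

```python
import math

R = 8.314  # gas constant, J/(mol·K)

def kissinger_two_point(beta1, Tp1, beta2, Tp2):
    """Activation energy from two TGA runs via the Kissinger relation:
    ln(beta / Tp^2) = const - Ea / (R * Tp), using only two (beta, Tp) pairs."""
    y1 = math.log(beta1 / Tp1**2)
    y2 = math.log(beta2 / Tp2**2)
    slope = (y2 - y1) / (1.0 / Tp2 - 1.0 / Tp1)  # slope of y vs. 1/Tp is -Ea/R
    return -slope * R  # J/mol

# Hypothetical example: heating rates 5 and 20 K/min, peak temperatures 740 and 763 K
print(kissinger_two_point(5.0, 740.0, 20.0, 763.0) / 1000.0, "kJ/mol")
```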
Conservative pipes conveying fluid, such as pinned-pinned (p-p), clamped-pinned (c-p), and clamped-clamped (c-c) pipes, lose their stability by buckling at certain critical fluid velocities. To evaluate these velocities experimentally, high flow-rate pumps that demand complicated fluid circuits must be used.
This paper studies a new experimental approach based on estimating the critical velocities from the measurement of several fundamental natural frequencies. In this approach, low flow-rate pumps and a simple fluid circuit can be used.
Experiments were carried out on two pipe models at three different boundary conditions. The results showed that the present approach is more accurate for est…
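A plausible reading of this approach (an assumption, since the abstract does not give the model) is that for a conservative pipe the square of the fundamental natural frequency falls approximately linearly with the square of the flow velocity and vanishes at the buckling velocity, so measurements at a few low flow rates can be extrapolated to f² = 0. A minimal sketch, with hypothetical measurements:

```python
import numpy as np

def critical_velocity(u, f):
    """Extrapolate f^2 vs. u^2 linearly to zero: assumes f^2 ≈ a + s*u^2 with
    s < 0, so the buckling (critical) velocity is u_c = sqrt(-a/s)."""
    s, a = np.polyfit(np.square(u), np.square(f), 1)  # slope, intercept
    return float(np.sqrt(-a / s))

# Hypothetical low flow-rate measurements: velocities (m/s), fundamental frequencies (Hz)
u = np.array([0.0, 1.0, 2.0, 3.0])
f = np.array([12.0, 11.6, 10.4, 8.3])
print(critical_velocity(u, f))  # estimated critical velocity, m/s
```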
Objective: To evaluate serum osteocalcin (OSN) in Iraqi infertile patients and examine whether osteocalcin insufficiency, which may lead to decreased testosterone production in males, can contribute to infertility. Methods: Forty-two newly diagnosed infertile males (age range 24–47 years) and thirty-two apparently healthy males as controls (age range 25–58 years). Serum levels of testosterone (TEST), follicle-stimulating hormone (FSH), luteinizing hormone (LH), prolactin (PROL), osteocalcin (OSN), and fasting blood sugar (FBS) were measured in both patients and controls. Serum OSN was estimated with an Immulite1000 auto-analyzer; TEST, FSH, LH, PROL, and FBS with an Immulite2000 auto-analyzer. Results: Infertile patients…
In this paper, we derive an estimator of the reliability function for the two-parameter Laplace distribution using the Bayes method with a squared-error loss function, Jeffreys' formula, and the conditional probability of the observed random variable. The main objective of this study is to determine the efficiency of the derived Bayes estimator compared with the maximum likelihood and moment estimators of this function, using Monte Carlo simulation under different Laplace distribution parameters and sample sizes. The results show that the Bayes estimator is more efficient than the maximum likelihood and moment estimators at all sample sizes.
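The paper's Bayes estimator itself is not reproduced here, but the Monte Carlo comparison it describes can be sketched for the two classical competitors: for a Laplace(μ, b) sample, the MLE uses the median and the mean absolute deviation, the moment method uses the mean and √(s²/2) (since Var = 2b²), and each estimator's mean squared error for R(t) = P(X > t) is accumulated over repeated samples. Parameter values below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

def reliability(t, mu, b):
    """R(t) = P(X > t) for a Laplace(mu, b) random variable."""
    z = (t - mu) / b
    return 1.0 - 0.5 * np.exp(z) if t < mu else 0.5 * np.exp(-z)

def mle(x):
    mu = np.median(x)                    # MLE of location
    return mu, np.mean(np.abs(x - mu))   # MLE of scale

def moments(x):
    return np.mean(x), np.sqrt(np.var(x) / 2.0)  # moment method: Var = 2 b^2

mu, b, t, n, reps = 0.0, 1.0, 1.5, 30, 5000
true_R = reliability(t, mu, b)
mse = {"mle": 0.0, "moment": 0.0}
for _ in range(reps):
    x = rng.laplace(mu, b, n)
    for name, est in (("mle", mle), ("moment", moments)):
        m, s = est(x)
        mse[name] += (reliability(t, m, s) - true_R) ** 2 / reps
print(mse)  # smaller MSE = more efficient estimator of R(t)
```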
Metaheuristics are one of the best-known fields of research used to find near-optimal solutions to non-deterministic polynomial-time hard (NP-hard) problems, for which it is difficult to find an optimal solution in polynomial time. This paper introduces metaheuristic-based algorithms and their classifications, as well as NP-hard problems. It also compares the performance of two metaheuristic-based algorithms, the Elephant Herding Optimization (EHO) algorithm and Tabu Search (TS), in solving the Traveling Salesman Problem (TSP), one of the best-known NP-hard problems and one widely used in performance evaluations of different metaheuristic-based optimization algorithms. The experimental results of Ele…
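Of the two compared algorithms, Tabu Search is the simpler to sketch. Below is a minimal TS for the TSP over the 2-opt neighbourhood with a fixed tenure and a standard aspiration criterion; EHO is omitted, and keying the tabu list by segment indices rather than edges is a simplification chosen for brevity, not the paper's exact configuration.

```python
import random

def tour_length(tour, dist):
    """Total length of a closed tour."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def tabu_search_tsp(dist, iters=300, tenure=15):
    """Tabu Search over the 2-opt neighbourhood with an aspiration criterion."""
    n = len(dist)
    current = list(range(n))
    random.shuffle(current)
    best, best_len = current[:], tour_length(current, dist)
    tabu = {}  # move (i, j) -> iteration until which it stays tabu
    for it in range(iters):
        move, move_len = None, float("inf")
        for i in range(1, n - 1):
            for j in range(i + 2, n + 1):
                cand = current[:i] + current[i:j][::-1] + current[j:]  # 2-opt reversal
                cand_len = tour_length(cand, dist)
                # skip tabu moves unless they improve on the global best (aspiration)
                if tabu.get((i, j), -1) >= it and cand_len >= best_len:
                    continue
                if cand_len < move_len:
                    move, move_len = (i, j), cand_len
        if move is None:
            break
        i, j = move
        current = current[:i] + current[i:j][::-1] + current[j:]
        tabu[(i, j)] = it + tenure
        if move_len < best_len:
            best, best_len = current[:], move_len
    return best, best_len

# Demo on random points in the unit square (Euclidean TSP)
random.seed(0)
pts = [(random.random(), random.random()) for _ in range(20)]
dist = [[((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5 for bx, by in pts] for ax, ay in pts]
print(tabu_search_tsp(dist)[1])
```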