A fault is an error that affects system behaviour. A software metric is a value that represents the degree to which software processes work properly and where faults are more likely to occur. In this research, we study the effect of removing redundancy and of applying a log transformation, based on threshold values, on identifying fault-prone classes of software. The study also compares the metric values of the original dataset with those obtained after removing redundancy and applying the log transformation. An e-learning dataset and a system dataset were taken as case studies. The fault ratio ranged from 1%-31% and 0%-10% for the original datasets, and from 1%-10% and 0%-4% after removing redundancy and log transformation, respectively. These results directly affected the number of detected classes, which ranged between 1-20 and 1-7 for the original datasets and between 1-7 and 0-3 after removing redundancy and log transformation. The skewness of the data decreased after applying the proposed model. The classes identified as faulty need more attention in the next versions, either to reduce the fault ratio or to refactor them, in order to increase the quality and performance of the current version of the software.
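A minimal sketch of this kind of preprocessing, under stated assumptions: the per-class metrics table in metrics.csv and the metric column "loc" are hypothetical, and the mean-plus-one-standard-deviation threshold is only an illustrative flagging rule, not the paper's thresholds.

```python
# Minimal sketch (not the authors' pipeline): remove duplicate rows, apply a
# log transform to a software-metric column, and compare skewness before/after.
# The file name "metrics.csv" and the column "loc" are hypothetical.
import numpy as np
import pandas as pd
from scipy.stats import skew

df = pd.read_csv("metrics.csv")          # one row per class, one column per metric
deduped = df.drop_duplicates()           # remove redundant (duplicate) observations

raw = deduped["loc"].to_numpy(dtype=float)
logged = np.log1p(raw)                   # log(1 + x) keeps zero-valued metrics defined

print("skewness before:", skew(raw))
print("skewness after :", skew(logged))

# Illustrative threshold rule for flagging potentially fault-prone classes:
threshold = logged.mean() + logged.std()
flagged = deduped[logged > threshold]
print("classes flagged:", len(flagged))
```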
Trimmed linear moments (TL-moments) are a natural generalization of L-moments that do not require the mean of the underlying distribution to exist. The sample TL-moments are known to be unbiased estimators of the corresponding population TL-moments. Since different choices of the amount of trimming give different values of the estimators, it is important to choose the estimator with the smallest mean squared error. Therefore, we derive an optimal choice of the amount of trimming for known distributions based on minimizing the errors of the estimators. Moreover, we study a simulation-based approach to choosing an optimal amount of trimming, together with the maximum likelihood method, by computing the estimators and their mean squared errors.
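A minimal sketch of the simulation-based comparison described above, using the standard Elamir-Seheult form of the first sample TL-moment; the target distribution (standard Cauchy, centre 0), sample size, and trimming values are illustrative choices, not the paper's design.

```python
# Compare, by simulation, the mean squared error of the first sample TL-moment
# for several trimming amounts t. The Cauchy mean does not exist, so the
# untrimmed case t = 0 is expected to perform worst.
import numpy as np
from math import comb

def tl_mean(x, t):
    """First sample TL-moment with symmetric trimming t (Elamir & Seheult form)."""
    x = np.sort(x)
    n = len(x)
    w = np.array([comb(i - 1, t) * comb(n - i, t) for i in range(1, n + 1)])
    return (w @ x) / comb(n, 2 * t + 1)

rng = np.random.default_rng(0)
n, reps, true_centre = 50, 5000, 0.0
for t in (0, 1, 2):
    est = np.array([tl_mean(rng.standard_cauchy(n), t) for _ in range(reps)])
    print(f"t = {t}: MSE = {np.mean((est - true_centre) ** 2):.4f}")
```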
Many organizations today are interested in implementing lean manufacturing principles that should enable them to eliminate waste and reduce manufacturing lead time. This paper concentrates on increasing the competitive level of the company in global markets and on improving productivity by reducing the manufacturing lead time. This is done using the main tool of lean manufacturing, value stream mapping (VSM), to identify all the activities of the manufacturing process (value-added and non-value-added activities) and to reduce and eliminate waste (non-value-added activities) by converting the manufacturing system from push to pull through the application of some pull-system strategies.
In this paper, a new procedure is introduced to estimate the solution of the three-point boundary value problem, based on the use of Morgan-Voyce polynomials. First, the Morgan-Voyce polynomials and their important properties are introduced. Next, these polynomials, with the aid of the collocation method, are used to convert the differential equation with its boundary conditions into an algebraic system. Finally, examples confirm the validity and accuracy of the proposed method.
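A minimal sketch of the polynomial basis involved, assuming the standard three-term recurrence for the Morgan-Voyce polynomials B_n(x); the paper's collocation construction itself is not reproduced here.

```python
# Generate Morgan-Voyce polynomials B_n(x) from the recurrence
# B_n = (x + 2) B_{n-1} - B_{n-2}, with B_0 = 1 and B_1 = x + 2.
import numpy as np
from numpy.polynomial import polynomial as P

def morgan_voyce_B(n_max):
    """Return coefficient arrays (lowest degree first) of B_0 ... B_{n_max}."""
    B = [np.array([1.0]), np.array([2.0, 1.0])]          # B_0 = 1, B_1 = x + 2
    for _ in range(2, n_max + 1):
        B.append(P.polysub(P.polymul([2.0, 1.0], B[-1]), B[-2]))
    return B[: n_max + 1]

for n, c in enumerate(morgan_voyce_B(4)):
    print(f"B_{n}(x) coefficients:", c)

# In a collocation scheme the trial solution is a linear combination
# sum_n a_n * B_n(x), with the coefficients a_n fixed by enforcing the ODE at
# collocation points together with the three boundary conditions.
```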
This paper deals with finding an approximate solution of a nonlinear parabolic boundary value problem (NLPBVP) using the Galerkin finite element method (GFEM) in space and the Crank-Nicolson (CN) scheme in time; the problem then reduces to solving a Galerkin nonlinear algebraic system (GNLAS). The predictor-corrector technique (PCT) is applied here to solve the GNLAS by transforming it into a Galerkin linear algebraic system (GLAS). This GLAS is solved once using the Cholesky method (CHM), as it appears in the MATLAB package, and once again using the Cholesky reduction order technique (CHROT), which is employed here to save a massive amount of time. The results for CHROT are given in tables and figures.
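A minimal sketch of the Crank-Nicolson plus Cholesky ingredients under simplifying assumptions: it solves the linear 1D heat equation with piecewise-linear finite elements, factorising the constant CN matrix once and reusing it at every step. The authors' nonlinear system, the predictor-corrector iteration, and the CHROT reduction are not reproduced here.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

nx, nt, T = 50, 100, 0.1
h, dt = 1.0 / nx, T / nt
x = np.linspace(0.0, 1.0, nx + 1)

# Standard 1D linear-FEM mass and stiffness matrices (interior nodes only,
# homogeneous Dirichlet boundary conditions).
n = nx - 1
M = h / 6.0 * (4.0 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1))
K = 1.0 / h * (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1))

A = M + 0.5 * dt * K          # left-hand Crank-Nicolson matrix (SPD)
B = M - 0.5 * dt * K          # right-hand Crank-Nicolson matrix
chol = cho_factor(A)          # factorise once, reuse at every time step

u = np.sin(np.pi * x[1:-1])   # initial condition on interior nodes
for _ in range(nt):
    u = cho_solve(chol, B @ u)

exact = np.exp(-np.pi ** 2 * T) * np.sin(np.pi * x[1:-1])
print("max error vs exact solution:", np.abs(u - exact).max())
```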
The paper proposes a methodology for predicting packet flow at the data plane in smart SDN based on an intelligent controller built from spiking neural networks (SNN). This methodology is applied to predict the subsequent step of the packet flow, consequently reducing the congestion that might occur. In the proposed model, the centralized controller acts as a reactive controller for managing the cluster-head process in the data layer of the software-defined network. The simulation results show the capability of the spiking neural network controller in the SDN control layer to improve the quality of service (QoS) of the whole network in terms of minimizing the packet loss ratio and increasing the buffer utilization ratio.
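For readers unfamiliar with spiking networks, here is a minimal sketch of the leaky integrate-and-fire (LIF) neuron, the basic building block of an SNN; the paper's controller architecture, its inputs, and its training procedure are not reproduced here, and all parameter values are illustrative.

```python
import numpy as np

def lif_spikes(input_current, dt=1e-3, tau=0.02, v_reset=0.0, v_thresh=1.0, r=1.0):
    """Simulate one LIF neuron and return its spike train (0/1 per time step)."""
    v, spikes = v_reset, np.zeros_like(input_current)
    for k, i_in in enumerate(input_current):
        v += dt / tau * (-v + r * i_in)       # leaky integration of the input
        if v >= v_thresh:                     # threshold crossing -> emit a spike
            spikes[k], v = 1.0, v_reset
    return spikes

# Example: a constant drive above threshold produces a regular spike train.
drive = np.full(1000, 1.5)
print("spike count:", int(lif_spikes(drive).sum()))
```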
Nuclear medicine is important for both diagnosis and treatment. The most common treatment for disease is radiation therapy used against cancer. The radiation intensity used in treatment is often lower than that which causes damage, so the radiation must be carefully controlled. The interactions of alpha particles with matter were studied, and the stopping powers of alpha particles in ovary tissue were calculated using the Bethe-Bloch equation, Ziegler's formula, and the SRIM software. The range, the linear energy transfer (LET), and the ovary thickness, as well as the dose and dose equivalent for these particles, were calculated using MATLAB for alpha energies of (0.01-200) MeV.
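For reference, one common non-relativistic form of the Bethe(-Bloch) stopping-power formula for a heavy charged particle of charge ze and speed v in a medium with atomic number density N, atomic number Z, and mean excitation energy I (Gaussian units); the exact form and corrections used in the paper may differ.

```latex
-\frac{dE}{dx} \;=\; \frac{4\pi e^{4} z^{2}}{m_e v^{2}}\, N Z\,
      \ln\!\left(\frac{2 m_e v^{2}}{I}\right)
```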
In this paper, the proposal is to use the extreme value distribution as the rate of occurrence of the non-homogeneous Poisson process, in order to improve the rate of occurrence of the non-homogeneous process; the result has been called the extreme value process. To estimate the parameters of this process, it is proposed to use the maximum likelihood method, the method of moments, and a smart method represented by the artificial bee colony (ABC) algorithm, in order to reach the estimator that best represents the data. The results of the three methods are compared through a simulation of the model, and it is concluded that the ABC estimator is better than the estimators of the maximum likelihood method and the method of moments.
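A minimal sketch of simulating a non-homogeneous Poisson process with an extreme-value-shaped rate, using Lewis-Shedler thinning; the Gumbel-density parameterisation of the intensity is hypothetical, and the paper's exact rate function and its estimation methods (MLE, moments, ABC) are not reproduced here.

```python
import numpy as np

def gumbel_intensity(t, scale=100.0, mu=5.0, beta=1.5):
    """Hypothetical rate: a Gumbel pdf scaled to roughly 'scale' expected events."""
    z = (t - mu) / beta
    return scale / beta * np.exp(-(z + np.exp(-z)))

def simulate_nhpp(intensity, t_end, lam_max, rng):
    """Thinning: propose homogeneous events at rate lam_max, accept with ratio."""
    t, events = 0.0, []
    while True:
        t += rng.exponential(1.0 / lam_max)
        if t > t_end:
            return np.array(events)
        if rng.random() < intensity(t) / lam_max:
            events.append(t)

rng = np.random.default_rng(1)
lam_max = gumbel_intensity(5.0) * 1.05          # upper bound on the rate over [0, T]
times = simulate_nhpp(gumbel_intensity, 15.0, lam_max, rng)
print("simulated event count:", len(times), "(roughly 100 expected)")
```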
In this paper, we study a nonparametric model when the response variable has missing data (non-response) in its observations under the MCAR missing mechanism. We then suggest kernel-based nonparametric single imputation to replace the missing values and compare it with nearest neighbour imputation through simulation over several models and different cases of sample size, variance, and rate of missing data.
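A minimal sketch of the two imputation ideas being compared, not the paper's simulation design: missing responses are imputed with a Nadaraya-Watson kernel estimator and, for comparison, with the response of the nearest observed neighbour; the data model, bandwidth, and missingness rate are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
x = rng.uniform(0, 1, n)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, n)
miss = rng.random(n) < 0.2                     # MCAR: 20% of responses missing

obs_x, obs_y = x[~miss], y[~miss]

def kernel_impute(x0, h=0.05):
    """Nadaraya-Watson estimate of y at x0 from the observed pairs."""
    w = np.exp(-0.5 * ((x0 - obs_x) / h) ** 2)  # Gaussian kernel weights
    return w @ obs_y / w.sum()

def nn_impute(x0):
    """Impute with the response of the nearest observed covariate value."""
    return obs_y[np.argmin(np.abs(obs_x - x0))]

kern = np.array([kernel_impute(v) for v in x[miss]])
nn = np.array([nn_impute(v) for v in x[miss]])
truth = np.sin(2 * np.pi * x[miss])
print("kernel imputation MSE :", np.mean((kern - truth) ** 2))
print("nearest-neighbour MSE :", np.mean((nn - truth) ** 2))
```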
Standardized uptake values (SUVs) are frequently used to quantify 18F-fluorodeoxyglucose (FDG) uptake in malignancies. In this work, we investigated the relationships between a wide range of parameters and the standardized uptake values (SUV) measured in the liver. 18F-FDG PET/CT examinations were performed on a total of 59 patients suffering from liver cancer. We determined the liver SUV in patients with a normal BMI (between 18.5 and 24.9) and a high BMI (above 30, obese). Each SUV was then adjusted based on the body mass index (BMI) and body surface area (BSA) calculations, which were determined for each patient from their height and weight.
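For reference, the commonly used body-weight and body-surface-area SUV normalisations, with BSA from the Du Bois formula (W in kg, H in cm, C_t the decay-corrected tissue activity concentration, A_inj the injected activity); the paper's exact adjustment may differ.

```latex
\mathrm{SUV_{bw}}  = \frac{C_t\,[\mathrm{kBq/mL}] \times W\,[\mathrm{kg}]}{A_{\mathrm{inj}}\,[\mathrm{MBq}]},
\qquad
\mathrm{SUV_{bsa}} = \frac{C_t\,[\mathrm{kBq/mL}] \times \mathrm{BSA}\,[\mathrm{m^2}]}{A_{\mathrm{inj}}\,[\mathrm{MBq}]},
\qquad
\mathrm{BSA} = 0.007184\, W^{0.425} H^{0.725}
```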
The energy expectation values for Li and Li-like ions have been calculated and examined for the ground state and the excited state in position space. The Hartree-Fock (H-F) partitioning technique has been used with existing wave functions.