A fault is an error that affects system behaviour. A software metric is a value that represents the degree to which software processes work properly and where faults are more likely to occur. In this research, we study the effects of redundancy removal and log transformation, based on threshold values, for identifying fault-prone classes of software. The study also compares the metric values of an original dataset with those obtained after redundancy removal and log transformation. An e-learning dataset and a system dataset were taken as case studies. The fault ratio ranged from 1%-31% and 0%-10% for the original datasets, and from 1%-10% and 0%-4% after redundancy removal and log transformation, respectively. These results directly affected the number of detected classes, which ranged between 1-20 and 1-7 for the original datasets, and between 1-7 and 0-3 after redundancy removal and log transformation. The skewness of the data decreased after applying the proposed model. The classes identified as fault-prone need more attention in subsequent versions, either to reduce the fault ratio or through refactoring, in order to increase the quality and performance of the current version of the software.
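As a rough illustration of the preprocessing pipeline this abstract describes, the sketch below deduplicates hypothetical class-level metric records, log-transforms the metric values (which reduces skewness), and flags classes above an assumed threshold. The metric names, values, and threshold are invented for illustration and are not taken from the study's datasets.

```python
import math

# Hypothetical class-level metric records: (class_name, loc, wmc, faults)
rows = [
    ("A", 120, 10, 3),
    ("A", 120, 10, 3),   # exact duplicate record (redundancy)
    ("B", 800, 35, 9),
    ("C", 45, 4, 0),
]

# Step 1: remove redundancy (drop exact duplicate records, keep order)
unique_rows = list(dict.fromkeys(rows))

# Step 2: log-transform the metric values to reduce skewness
transformed = [
    (name, math.log(loc + 1), math.log(wmc + 1), faults)
    for name, loc, wmc, faults in unique_rows
]

# Step 3: flag classes whose transformed LOC exceeds an assumed threshold
THRESHOLD = math.log(500 + 1)
fault_prone = [name for name, loc, _, _ in transformed if loc > THRESHOLD]
print(fault_prone)
```

Applying the same threshold in log space keeps the comparison consistent with the transformed metrics.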
The spatial assessment criteria system for hybridizing renewable energy sources, such as hybrid solar-wind farms, is critical in selecting ideal installation sites that maximize benefits, reduce costs, protect the environment, and serve the community. However, a systematic approach to designing indicator systems is rarely used in relevant site selection studies. Therefore, the current paper attempts to present an inclusive framework based on content validity to create an effective criteria system for siting wind-solar plants. To this end, the criteria considered in the related literature are captured, and the top 10 frequent indicators are identified. The Delphi technique is used to subject commonly used factors to expert judgme
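The step of capturing criteria from the literature and identifying the most frequent indicators can be sketched as a simple frequency ranking. The study lists and counts below are hypothetical stand-ins, not the paper's actual review data.

```python
from collections import Counter

# Hypothetical criteria lists extracted from reviewed siting studies
studies = [
    ["slope", "solar irradiance", "wind speed", "distance to grid"],
    ["slope", "wind speed", "land use", "distance to roads"],
    ["solar irradiance", "wind speed", "slope", "land use"],
]

# Capture criteria across the literature and rank by frequency of use
counts = Counter(c for study in studies for c in study)
top = [criterion for criterion, _ in counts.most_common(3)]
print(top)
```

The ranked shortlist would then be handed to the Delphi panel for expert judgment, as the abstract describes.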
The paper proposes the Teaching-Learning-Based Optimization (TLBO) algorithm to solve the 3-D packing problem in containers. The objective, which can be expressed as a mathematical model, is to optimize the space usage in a container. Besides the interaction between students and the teacher, the algorithm also models the learning process among students in the classroom, and it does not need any algorithm-specific control parameters. TLBO therefore uses a teacher phase and a learner phase as its main updating processes to find the best solution. More precisely, to validate the algorithm's effectiveness, it was implemented on three sample cases: small data with 5 size-types of items and 12 units, medium data with 10 size-types of items w
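The teacher and learner phases mentioned above can be sketched for a generic continuous minimization problem. This is a minimal illustration of the TLBO update rules, not the paper's packing-specific implementation; the population size, iteration count, and test function are assumptions.

```python
import random

def tlbo(f, dim, pop_size=10, iters=50, lo=-5.0, hi=5.0):
    """Minimal Teaching-Learning-Based Optimization sketch (minimization)."""
    clamp = lambda v: max(lo, min(hi, v))
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(iters):
        # Teacher phase: pull learners toward the best solution, away from the mean
        teacher = min(pop, key=f)
        mean = [sum(x[d] for x in pop) / pop_size for d in range(dim)]
        for i, x in enumerate(pop):
            tf = random.choice([1, 2])  # teaching factor
            cand = [clamp(x[d] + random.random() * (teacher[d] - tf * mean[d]))
                    for d in range(dim)]
            if f(cand) < f(x):  # greedy acceptance
                pop[i] = cand
        # Learner phase: learn through interaction with a random peer
        for i, x in enumerate(pop):
            j = random.randrange(pop_size)
            if j == i:
                continue
            sign = 1 if f(x) < f(pop[j]) else -1
            cand = [clamp(x[d] + random.random() * sign * (x[d] - pop[j][d]))
                    for d in range(dim)]
            if f(cand) < f(x):
                pop[i] = cand
    return min(pop, key=f)

best = tlbo(lambda v: sum(t * t for t in v), dim=3)
print(all(-5.0 <= t <= 5.0 for t in best))
```

Note that both phases use only random numbers and the population itself; no mutation rate or crossover probability has to be tuned, which is the parameter-free property the abstract highlights.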
Face identification has been an active research area in recent years. However, the accuracy and dependability of such systems in real-life settings are still questionable. Earlier research on face identification demonstrated that LBP-based face recognition systems are preferred over others and give adequate accuracy; LBP is robust against illumination changes and is considered a fast algorithm. Performance metrics for such systems are calculated from time delay and accuracy. This paper introduces an improved face recognition system built using the C++ programming language with the help of the OpenCV library. Accuracy can be increased if a filter, or a combination of filters, is applied to the images. The accuracy increases from 95.5% (without ap
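The LBP operator at the heart of such systems is simple enough to sketch directly. The following toy function computes the classic 8-bit LBP code for a single 3x3 pixel patch (the pixel values are made up); a real system, such as the C++/OpenCV one described above, would compute this over every pixel and build histograms of the codes.

```python
def lbp_code(patch):
    """Compute the 8-bit LBP code for a 3x3 patch (list of 3 rows)."""
    center = patch[1][1]
    # Clockwise neighbour order starting at the top-left corner
    coords = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    bits = [1 if patch[r][c] >= center else 0 for r, c in coords]
    code = 0
    for b in bits:           # pack the 8 comparison bits into one byte
        code = (code << 1) | b
    return code

patch = [[6, 5, 2],
         [7, 6, 1],
         [9, 8, 7]]
print(lbp_code(patch))  # -> 143
```

Because the code depends only on whether each neighbour is brighter or darker than the center, uniformly brightening or darkening the patch leaves it unchanged, which is why LBP is robust to illumination changes.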
Physical matter at high energy levels and under specific circumstances tends to behave in a harsh and complicated manner while sustaining the equilibrium or non-equilibrium thermodynamics of the system. Measuring temperature by ordinary techniques is not applicable at all in these cases. There is therefore a need to apply mathematical models in numerous critical applications to measure the temperature accurately at the atomic level of matter. These mathematical models follow statistical rules with different distributions of the energy quantities of the system, and these approaches have functional effects at both the microscopic and macroscopic levels of that system. Therefore, this research study represents an innovative of a wi
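One standard statistical route from level populations to temperature, of the kind this abstract alludes to, is the two-level Boltzmann relation n2/n1 = exp(-(E2-E1)/(kB*T)). The sketch below inverts it for an assumed pair of energies and measured populations; all numbers are invented for illustration, and the paper's actual model is not specified in this excerpt.

```python
import math

kB = 1.380649e-23        # Boltzmann constant, J/K (exact, SI 2019)
E1, E2 = 0.0, 2.0e-20    # assumed energies of the two levels, J
n1, n2 = 1000.0, 85.0    # assumed measured level populations

# Invert n2/n1 = exp(-(E2 - E1) / (kB * T)) for the temperature T
T = (E2 - E1) / (kB * math.log(n1 / n2))
print(round(T))  # temperature in kelvin
```

In practice more than two levels are used and the temperature is fitted to the whole population distribution, but the inversion above is the core idea.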
In this research, we discuss how to improve the work by addressing the factors that help a small IT organization produce software using a suitable development process, supported by experimental theories, to achieve its goals, starting with the selection of the methodology used to implement the software. The steps used should be compatible with the type of products the organization will produce, which here is Web-based project development.
The researcher suggests Extreme Programming (XP) as a methodology for Web-based project development and justifies this suggestion, which will show how important and effective the methodology is in software dev
Orthogonal Frequency Division Multiplexing (OFDM) is an efficient multi-carrier technique. The core operation in OFDM systems is the FFT/IFFT unit, which requires a large amount of hardware resources and processing delay. Developments in implementation technologies such as Field Programmable Gate Arrays (FPGAs) have made OFDM a feasible option. The goal of this paper is to design and implement an OFDM transmitter on an Altera FPGA using the Quartus software. The proposed transmitter simplifies the Fourier transform calculation by using a decoder instead of multipliers. After programming the Altera DE2 FPGA kit with the implemented design, several practical tests were carried out, starting from monitoring all the results of
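The IFFT step that dominates an OFDM transmitter can be sketched in software as a direct inverse DFT followed by a cyclic prefix. The 8-subcarrier BPSK symbol and 2-sample prefix below are assumptions for illustration; a hardware design would use FFT butterflies (or, as in this paper, a decoder-based shortcut) rather than this O(N^2) sum.

```python
import cmath

def ifft_naive(X):
    """Naive inverse DFT — the core operation of an OFDM transmitter."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

# Assumed 8-subcarrier OFDM symbol carrying BPSK-mapped data (+1 / -1)
subcarriers = [1, -1, 1, 1, -1, 1, -1, -1]
time_samples = ifft_naive(subcarriers)

# Cyclic prefix: copy the last 2 samples to the front to absorb multipath
cp = time_samples[-2:]
ofdm_symbol = cp + time_samples
print(len(ofdm_symbol))
```

The inverse transform turns the parallel frequency-domain data into one time-domain waveform, which is exactly the computation the FPGA unit implements.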
The study aimed to reveal the level of knowledge and the tendencies of postgraduate students specializing in curriculum and teaching methods at King Khalid University towards strategies compatible with brain-based learning (BBL), and then to put forward a proposed framework to develop that knowledge and those tendencies. To achieve this goal, a cognitive test and a tendency scale for applying strategies compatible with brain-based learning were prepared. The descriptive approach was used because it suits the goals of the study. The study sample consisted of (70) male and female students of postgraduate
The present study investigates deep eutectic solvents (DESs) as potential media for enzymatic hydrolysis. A series of ternary ammonium- and phosphonium-based DESs were prepared at different molar ratios by mixing with aqueous glycerol (85%). The physicochemical properties, including surface tension, conductivity, density, and viscosity, were measured over the temperature range 298.15 K – 363.15 K. The eutectic points were highly influenced by the variation of temperature. The eutectic points of choline chloride : glycerol : water (ratio 1 : 2.55 : 2.28) and methyltriphenylphosphonium bromide : glycerol : water (ratio 1 : 4.25 : 3.75) are 213.4 K and 255.8 K, respectively. The stability of the lipase enzyme isolated from porcine pancreas (PPL) a
The Internet of Things (IoT) is a network of devices used for interconnection and data transfer. There has been a dramatic increase in IoT attacks due to the lack of security mechanisms, and those mechanisms can be enhanced through the analysis and classification of such attacks. The multi-class classification of IoT botnet attacks (IBA) applied here uses a high-dimensional data set. High dimensionality is a challenge in the classification process because it demands a large amount of computational resources. Dimensionality reduction (DR) discards irrelevant information while retaining the essential parts of this high-dimensional data set. The DR technique proposed here is a classifier-based fe
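The abstract is truncated before the proposed classifier-based technique is fully named, so as a generic stand-in the sketch below performs filter-style feature selection: it scores each feature by how well it separates two classes (absolute difference of class means) and keeps the most discriminative ones. The toy samples and the choice of score are assumptions, not the paper's method.

```python
# Hypothetical 2-class samples (rows) with 4 features (columns)
X = [[5.0, 0.1, 3.2, 0.0],
     [5.1, 0.2, 3.1, 0.1],
     [1.0, 0.1, 3.3, 0.9],
     [0.9, 0.2, 3.0, 1.0]]
y = [0, 0, 1, 1]

def mean(vals):
    return sum(vals) / len(vals)

# Score each feature by the absolute difference of its per-class means
scores = []
for f in range(len(X[0])):
    m0 = mean([x[f] for x, label in zip(X, y) if label == 0])
    m1 = mean([x[f] for x, label in zip(X, y) if label == 1])
    scores.append(abs(m0 - m1))

# Keep the 2 most discriminative features (dimensionality reduction)
top2 = sorted(range(len(scores)), key=lambda f: scores[f], reverse=True)[:2]
print(sorted(top2))
```

Dropping the near-constant features shrinks the input that the downstream multi-class classifier must process, which is the computational saving DR targets.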
Extractive multi-document text summarization – summarization that removes redundant information in a document collection while preserving its salient sentences – has recently attracted considerable interest in automatic models. This paper proposes an extractive multi-document text summarization model based on a genetic algorithm (GA). First, the problem is modeled as a discrete optimization problem and a specific fitness function is designed to effectively fit the proposed model. Then, a binary-encoded representation, together with a heuristic mutation operator and a local repair operator, is proposed to characterize the adopted GA. Experiments are applied to ten topics from the Document Understanding Conference DUC2002 datas
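The binary encoding and fitness idea can be illustrated with a stripped-down, mutation-only search: each bit selects a sentence, and the fitness rewards total salience subject to a length budget. The sentence scores, lengths, and budget are invented, and this simple hill-climbing loop omits the paper's crossover, heuristic mutation, and repair operators.

```python
import random

random.seed(42)

# Hypothetical sentence salience scores and lengths for one topic
scores = [0.9, 0.2, 0.7, 0.4, 0.8, 0.1]
lengths = [20, 15, 25, 10, 30, 12]
BUDGET = 60  # summary length limit in words

def fitness(chrom):
    """Total salience of the selected sentences; over-budget summaries score 0."""
    length = sum(l for l, bit in zip(lengths, chrom) if bit)
    if length > BUDGET:
        return 0.0
    return sum(s for s, bit in zip(scores, chrom) if bit)

def mutate(chrom):
    """Flip one randomly chosen selection bit."""
    child = list(chrom)
    child[random.randrange(len(child))] ^= 1
    return child

# Tiny (1+1)-style search over binary chromosomes
best = [random.randint(0, 1) for _ in scores]
for _ in range(500):
    child = mutate(best)
    if fitness(child) >= fitness(best):
        best = child
print(fitness(best) > 0)  # -> True
```

A full GA would evolve a population of such chromosomes, with the repair operator fixing over-budget selections instead of zeroing their fitness.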