A fault is an error that affects system behaviour. A software metric is a value that indicates how well software processes work and where faults are more likely to occur. In this research, we study the effects of redundancy removal and threshold-based log transformation on identifying fault-prone classes of software. The study also compares the metric values of an original dataset with those obtained after redundancy removal and log transformation. E-learning and system datasets were taken as case studies. The fault ratio ranged from 1%-31% and 0%-10% for the original datasets, and 1%-10% and 0%-4% after redundancy removal and log transformation, respectively. These results directly affected the number of detected classes, which ranged between 1-20 and 1-7 for the original datasets and 1-7 and 0-3 after redundancy removal and log transformation. The skewness of the dataset decreased after applying the proposed model. The classes classified as faulty need more attention in the next versions, either to reduce the fault ratio or to refactor them to increase the quality and performance of the current version of the software.
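The skewness reduction that the abstract reports from a log transformation can be illustrated with a minimal sketch; the metric values below are hypothetical, not the study's data, and the skewness formula is the plain population moment ratio:

```python
import math

def skewness(xs):
    # Sample skewness: mean of cubed z-scores (population formula).
    n = len(xs)
    mean = sum(xs) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in xs) / n)
    return sum(((x - mean) / sd) ** 3 for x in xs) / n

# Hypothetical right-skewed software metric values (e.g., lines of code per class).
metrics = [5, 7, 8, 10, 12, 15, 20, 40, 90, 300]

# log1p handles zero-valued metrics safely.
transformed = [math.log1p(x) for x in metrics]

print(skewness(metrics) > skewness(transformed))  # skew shrinks after the log transform
```

The transform compresses the long right tail, which is why threshold-based fault detection behaves differently before and after it.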
This study aims to assess the formation evaluation of the Jeribe Formation in the Hamrin oilfield. The present study involves four selected wells of the (Early-Middle Miocene) Jeribe Formation in the Hamrin structure, Allas field (HR-2, HR-8, HR-9, and HR-16), located in northern Iraq. The work deals with the available data, which include the most relevant information for such studies. Techlog Software V.2015 was used to carry out a reliable calculation of petrophysical properties, utilizing conventional logs to determine the reservoir characteristics (lithology, porosity, and saturation). The computed CPI (software output) based on log information divided the Jeribe reservoir into two reservoir units (Jr-1 and Jr
The 2D imaging survey was carried out using the Wenner-Schlumberger array along (11) 2D survey lines distributed within and outside the Abu-Jir fault zone, southwest of Karbala City, central Iraq. The aim is to delineate subsurface fracture density. The total length of each 2D survey line is (600 m), with a unit electrode spacing (a) of (10 m). The results showed two types of fracture zones. The first type is formed by the dissolution of carbonate rocks, while the second is formed by tectonic movements and includes two fracture systems, oblique and vertical fractures.
This study includes a comparison of subsurface fracture density within and outside the Abu-Jir fault zone. This comparison showed that
The Islamic financial industry faces many challenges, most notably the lack of proper risk-management tools that meet the requirements of legality on one side and economic efficiency on the other, which requires the search for innovative ways to manage the risks of Islamic banking. The Islamic finance industry is a recent one compared with the traditional financial industry, which compounds the problem of risk management in the Islamic financial industry: its treatment must be compatible with Islamic law as well as economically efficient. Hence the importance of this research, which highlights the approach of Islamic financial engineering and the goals sought to be achieved through the use of
Background: Various fluids in the oral environment can affect the surface roughness of resin composites. This in vitro study was conducted to determine the influence of mouth rinses on the surface roughness of two methacrylate-based resins (nanofilled and packable composite) and a silorane-based resin composite.
Materials and methods: Disc-shaped specimens (12 mm in diameter and 2 mm in height) were prepared from three types of composi
In this article, a Convolutional Neural Network (CNN) is used to detect damage and no-damage images from satellite imagery using different classifiers. These classifiers are well-known models that are used with the CNN to detect and classify images on a specific dataset. The dataset used belongs to the Houston hurricane that caused several damages in the nearby areas. In addition, transfer learning is used to store the knowledge (weights) and reuse it in the next task. Moreover, each applied classifier is used to detect images from the dataset after it is split into training, testing, and validation sets. The Keras library is used to apply the CNN algorithm with each selected classifier to detect the images. Furthermore, the performa
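The training/testing/validation split mentioned above can be sketched in plain Python; the 70/20/10 ratios, the seed, and the file names are illustrative assumptions, not values taken from the paper:

```python
import random

def split_dataset(items, train=0.7, test=0.2, seed=42):
    # Shuffle deterministically, then slice into three disjoint parts;
    # whatever remains after train + test becomes the validation set.
    rng = random.Random(seed)
    shuffled = items[:]
    rng.shuffle(shuffled)
    n = len(shuffled)
    n_train = int(n * train)
    n_test = int(n * test)
    return (shuffled[:n_train],
            shuffled[n_train:n_train + n_test],
            shuffled[n_train + n_test:])

# Hypothetical satellite image file names (damage / no-damage labels would sit alongside).
images = [f"img_{i:03d}.png" for i in range(100)]
train_set, test_set, val_set = split_dataset(images)
print(len(train_set), len(test_set), len(val_set))  # 70 20 10
```

Keeping the three subsets disjoint is what makes the reported validation accuracy an honest estimate for each classifier.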
As an important resource, entangled light sources have been used in developing quantum information technologies such as quantum key distribution (QKD). Few experiments implement entanglement-based deterministic QKD protocols, since the security of existing protocols may be compromised in lossy channels. In this work, we report a loss-tolerant deterministic QKD experiment that follows a modified "Ping-Pong" (PP) protocol. The experimental results demonstrate for the first time that a secure deterministic QKD session can be fulfilled in a channel with an optical loss of 9 dB, based on a telecom-band entangled photon source. This exhibits a conceivable prospect of utilizing entangled light sources in real-life fiber-based
The estimation of the regular regression model requires several assumptions to be satisfied, such as linearity. One problem occurs when the regression curve is partitioned into two (or more) parts joined by threshold point(s). This situation is regarded as a violation of the linearity of regression. Therefore, the multiphase regression model has received increasing attention as an alternative approach that describes the changing behavior of the phenomenon through threshold-point estimation. The maximum likelihood estimator (MLE) has been used for both model and threshold-point estimation. However, MLE is not resistant to violations such as the existence of outliers or a heavy-tailed error distribution. The main goal of t
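One minimal (non-robust) way to sketch threshold-point estimation for a two-phase model is a grid search over candidate thresholds, fitting an ordinary least-squares line on each side and keeping the split with the smallest combined error; the data below are synthetic and this is an illustration, not the paper's estimator:

```python
def fit_line(pts):
    # Ordinary least-squares slope/intercept for (x, y) pairs.
    n = len(pts)
    sx = sum(x for x, _ in pts); sy = sum(y for _, y in pts)
    sxx = sum(x * x for x, _ in pts); sxy = sum(x * y for x, y in pts)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

def sse(pts, a, b):
    # Sum of squared residuals around the fitted line.
    return sum((y - (a + b * x)) ** 2 for x, y in pts)

def find_threshold(pts, candidates):
    # Pick the split point minimising the combined SSE of the two phase fits.
    best = None
    for t in candidates:
        left = [p for p in pts if p[0] <= t]
        right = [p for p in pts if p[0] > t]
        if len(left) < 2 or len(right) < 2:
            continue
        total = sse(left, *fit_line(left)) + sse(right, *fit_line(right))
        if best is None or total < best[1]:
            best = (t, total)
    return best[0]

# Synthetic two-phase data: slope 1 up to x = 5, then a jump and slope 3.
data = [(x, x if x <= 5 else 6 + 3 * (x - 5)) for x in range(11)]
print(find_threshold(data, range(2, 9)))  # → 5
```

Because least squares is used on each segment, a single outlier can pull the estimated threshold away from the true change point, which is exactly the fragility the abstract attributes to MLE.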
Cloud computing is a pay-as-you-go model that provides users with on-demand access to services or computing resources. It is a challenging issue to maximize the service provider's profit while also meeting the Quality of Service (QoS) requirements of users. Therefore, this paper proposes an admission control heuristic (ACH) approach that selects or rejects requests based on the budget, deadline, and penalty cost given by the user. A service level agreement (SLA) is then created for each selected request. The proposed work uses Particle Swarm Optimization (PSO) and the Salp Swarm Algorithm (SSA) to schedule the selected requests under budget and deadline constraints. The performances of PSO and SSA with and witho
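The accept/reject step of an admission control heuristic can be sketched as a simple filter over incoming requests; the field names and the profit rule below are assumptions for illustration, and the paper's actual ACH may weigh these factors differently:

```python
from dataclasses import dataclass

@dataclass
class Request:
    name: str
    budget: float      # what the user will pay
    deadline: float    # seconds allowed for completion
    est_cost: float    # provider's estimated execution cost
    est_time: float    # provider's estimated execution time
    penalty: float     # refunded to the user on SLA violation

def admit(requests):
    # Accept a request only if it can finish within its deadline and the
    # expected profit (budget - cost) still covers the penalty risk;
    # an SLA would then be drawn up for each accepted request.
    accepted, rejected = [], []
    for r in requests:
        feasible = r.est_time <= r.deadline
        profitable = r.budget - r.est_cost > r.penalty
        (accepted if feasible and profitable else rejected).append(r)
    return accepted, rejected

reqs = [
    Request("job-a", budget=10, deadline=60, est_cost=4, est_time=30, penalty=2),
    Request("job-b", budget=5, deadline=10, est_cost=4, est_time=30, penalty=1),
]
ok, no = admit(reqs)
print([r.name for r in ok], [r.name for r in no])  # ['job-a'] ['job-b']
```

Only the accepted set would then be handed to the PSO/SSA scheduler, which orders requests under the same budget and deadline constraints.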
This research includes the use of CdTe in the design of a solar cell. The SCAPS-1D program was used to simulate a CdTe/CdS thin-film cell by numerical analysis, with the addition of a buffer layer (Zn2SnO4) to enhance cell efficiency. The thickness of the window layer (n-CdS) was reduced to 25 nm, with the inclusion of an insulating layer of 50 nm thickness to prevent leakage toward the forward bias with respect to the lower charge carriers. The absorber layer (p-CdTe) thickness was varied between 0.5 µm and 6 µm; the preferable absorber thickness was 1.5 µm. Different operating temperatures (298 K-388 K) were used, while the highest conversion efficiency (η = 18.43%) was obtain