A fault is an error that affects system behaviour. A software metric is a value that represents the degree to which software processes work properly and where faults are more likely to occur. In this research, we study the effects of removing redundancy and of log transformation based on threshold values for identifying fault-prone classes of software. The study also compares the metric values of the original datasets with those obtained after removing redundancy and applying the log transformation. An e-learning dataset and a system dataset were taken as case studies. The fault ratio ranged from 1%-31% and 0%-10% for the original datasets, and from 1%-10% and 0%-4% after removing redundancy and log transformation, respectively. These results directly affected the number of classes detected, which ranged between 1-20 and 1-7 for the original datasets and between 1-7 and 0-3 after removing redundancy and log transformation. The skewness of the datasets decreased after applying the proposed model. The classes classified as faulty need more attention in the next versions in order to reduce the fault ratio, or should be refactored to increase the quality and performance of the current version of the software.
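The skewness reduction described above can be illustrated with a minimal sketch. The metric values and the plain Fisher-Pearson skewness formula here are hypothetical stand-ins, not the study's actual data or model; the point is only that a log transformation pulls in the long right tail typical of software metrics:

```python
import math

def skewness(xs):
    """Sample skewness (Fisher-Pearson coefficient m3 / m2^1.5)."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m3 = sum((x - mean) ** 3 for x in xs) / n
    return m3 / m2 ** 1.5

# Hypothetical right-skewed metric values (e.g. lines of code per class)
loc = [12, 15, 18, 20, 22, 25, 30, 40, 60, 400]

# Log transformation compresses the extreme values
log_loc = [math.log(x) for x in loc]

skew_before = skewness(loc)
skew_after = skewness(log_loc)
```

After the transformation the distribution is still right-skewed, but markedly less so, which matches the decrease in skewness the abstract reports.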
We propose a novel strategy to optimize the test suite required for testing both hardware and software in a production line. The strategy is based on two processes: a Quality Signing Process and a Quality Verification Process. Unlike earlier work, the proposed strategy integrates black-box and white-box techniques in order to derive an optimum test suite during the Quality Signing Process. In this case, the generated optimal test suite significantly improves the Quality Verification Process. Considering both processes, the novelty of the proposed strategy is that the optimization and reduction of the test suite is performed by selecting only mutant-killing test cases from the cumulated t-way test cases.
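The core reduction step, keeping only mutant-killing test cases from a larger t-way pool, can be sketched as a greedy set-cover heuristic. The test-case and mutant identifiers below are hypothetical, and the greedy strategy is a common illustration rather than the paper's exact algorithm:

```python
def reduce_suite(kills):
    """Greedy reduction: repeatedly pick the test case that kills the most
    not-yet-killed mutants until every killable mutant is covered.
    kills: dict mapping test-case id -> set of mutant ids it kills."""
    remaining = set().union(*kills.values())
    selected = []
    while remaining:
        best = max(kills, key=lambda t: len(kills[t] & remaining))
        if not kills[best] & remaining:
            break  # no test kills any remaining mutant
        selected.append(best)
        remaining -= kills[best]
    return selected

# Hypothetical t-way test pool annotated with the mutants each case kills
pool = {
    "t1": {"m1", "m2"},
    "t2": {"m2", "m3", "m4"},
    "t3": {"m4"},
    "t4": {"m1", "m5"},
}
suite = reduce_suite(pool)  # a reduced suite that still kills m1..m5
```

Here two of the four pooled cases suffice to kill all five mutants, which is the kind of reduction the Quality Signing Process exploits.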
To ensure fault tolerance and distributed management, distributed protocols are employed as one of the major architectural concepts underlying the Internet. However, inefficiency, instability and fragility could potentially be overcome with the help of a novel networking architecture called software-defined networking (SDN). The main property of this architecture is the separation of the control and data planes. To reduce congestion and thus improve latency and throughput, the traffic load must be distributed homogeneously over the different network paths. This paper presents a smart flow steering agent (SFSA) for data flow routing based on current network conditions. To enhance throughput and minimize latency, the SFSA distributes data flows across the available network paths according to those conditions.
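A minimal sketch of the steering idea, assuming a controller that tracks per-path load (the path names and load figures are invented for illustration, and the SFSA's actual decision logic is more elaborate than this least-loaded rule):

```python
def pick_path(path_load):
    """Steer a new flow to the currently least-loaded path.
    path_load: dict mapping path id -> current load (e.g. Mbps)."""
    return min(path_load, key=path_load.get)

loads = {"p1": 40.0, "p2": 10.0, "p3": 25.0}
chosen = pick_path(loads)   # least-loaded path gets the new flow
loads[chosen] += 5.0        # account for the newly steered flow's demand
```

Repeating this for each arriving flow pushes the per-path loads toward the homogeneous distribution the abstract describes.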
Building a 3D geological model from field and subsurface data is a typical task in geological studies involving natural resource evaluation and hazard assessment. In this paper, a 3D geological model for the Asmari Reservoir in the Fauqi oil field has been built using Petrel software. The Asmari Reservoir belongs to the Oligocene-Lower Miocene; it represents the second reservoir, after the Mishrif Reservoir, in the Fauqi field. Five wells, namely FQ6, FQ7, FQ15, FQ20 and FQ21, lying in Missan governorate have been selected in order to build structural and petrophysical (porosity and water saturation) models represented by a 3D static geological model in three directions. The structural model shows that the Fauqi oil field represents un cylin
Petrel is regarded as one of the most important software packages for delineating the subsurface petrophysical properties of a reservoir. In this study, a 3D integrated geological model has been built using Petrel software. The process integrates petrophysical properties and environmental approaches.
In terms of structural geology, the Noor oil field within the Mishrif Formation represents an asymmetrical anticlinal fold trending NW-SE. Porosity and water saturation models have been built. The reservoir was divided into several reservoir and non-reservoir units depending on the petrophysical properties of each zone. In addition, an intact model of the reservoir in terms of porosity and water saturation has been built.
The COVID-19 pandemic sweeping the world has rendered a large proportion of the workforce unable to commute to work. This has resulted in employees and employers seeking alternative work arrangements, including in the software industry. The global market and the international presence of many companies then create the need to implement global virtual teams (GVTs). GVT members are increasingly engaged in globalized business environments across space, time and organizational boundaries via information and communication technologies. Despite the advancement of technology, project managers still face many challenges in communication. Hence, becoming a successful project manager is still a big challenge for them. This study
The aim of this paper is to estimate the reliability R = P(Z > W) of a single system with strength Z subjected to stress W in a stress-strength model that follows a power Rayleigh distribution. It proposes, generates and examines eight methods and techniques for estimating the distribution parameters and reliability functions. These methods are maximum likelihood estimation (MLE), exact moment estimation (EMME), percentile estimation (PE), least-squares estimation (LSE), weighted least-squares estimation (WLSE) and three shrinkage estimation methods (sh1, sh2, sh3). We also use the mean square error (MSE), bias and the mean absolute percentage error (MAPE) to compare the estimation methods. Both theoretical c
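The quantity R = P(Z > W) can be illustrated with a Monte Carlo sketch. For simplicity this uses the plain Rayleigh distribution rather than the paper's power Rayleigh; for plain Rayleigh, R has the closed form sigma_z^2 / (sigma_z^2 + sigma_w^2), which lets us sanity-check the simulation (the parameter values 2.0 and 1.0 are arbitrary):

```python
import math
import random

def rayleigh(sigma, rng):
    """Inverse-transform sample from a Rayleigh(sigma) distribution."""
    return sigma * math.sqrt(-2.0 * math.log(1.0 - rng.random()))

def reliability_mc(sigma_z, sigma_w, n, seed=0):
    """Monte Carlo estimate of R = P(Z > W): strength Z vs. stress W."""
    rng = random.Random(seed)
    wins = sum(rayleigh(sigma_z, rng) > rayleigh(sigma_w, rng) for _ in range(n))
    return wins / n

exact = 2.0 ** 2 / (2.0 ** 2 + 1.0 ** 2)        # closed form: 0.8
approx = reliability_mc(2.0, 1.0, 100_000)       # simulation estimate
```

The estimation methods the paper compares (MLE, EMME, PE, LSE, WLSE, shrinkage) all target the distribution parameters that feed into this R.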
This paper concerns deriving and estimating the reliability of a multicomponent system in the stress-strength model R(s,k), when the stress and strength are independent and identically distributed (iid), following the two-parameter Exponentiated Pareto Distribution (EPD) with unknown shape and known scale parameters. A shrinkage estimation method, together with the maximum likelihood estimator (MLE), has been considered. Comparisons among the proposed estimators were made by simulation based on the mean squared error (MSE) criterion.
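R(s,k) is the probability that at least s of k iid strength components exceed the common stress. A minimal Monte Carlo sketch, assuming the standard EPD cdf F(x) = (1 - (1 + x)^-lam)^theta with scale fixed at 1 (the shape values below are arbitrary, and the paper's estimators are analytical rather than simulation-based):

```python
import random

def epd_sample(theta, lam, rng):
    """Inverse-transform sample from the Exponentiated Pareto Distribution
    with cdf F(x) = (1 - (1 + x)**-lam)**theta, x > 0, scale = 1."""
    u = rng.random()
    return (1.0 - u ** (1.0 / theta)) ** (-1.0 / lam) - 1.0

def r_s_k(s, k, theta_strength, theta_stress, lam, n=50_000, seed=1):
    """Monte Carlo R(s,k): at least s of k iid strengths exceed the stress."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        w = epd_sample(theta_stress, lam, rng)
        exceed = sum(epd_sample(theta_strength, lam, rng) > w for _ in range(k))
        if exceed >= s:
            hits += 1
    return hits / n

r1 = r_s_k(1, 3, theta_strength=2.0, theta_stress=1.0, lam=1.5)  # 1-out-of-3
r3 = r_s_k(3, 3, theta_strength=2.0, theta_stress=1.0, lam=1.5)  # 3-out-of-3
```

As expected, the 1-out-of-3 system is at least as reliable as the 3-out-of-3 (series-like) system.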
Merging biometrics with cryptography has become more familiar, and a great scientific field was born for researchers. Biometrics adds a distinctive property to security systems, because biometric features are unique and individual for every person. In this study, a new method is presented for ciphering data based on fingerprint features. This research works by placing the plaintext message, based on the positions of minutiae extracted from a fingerprint, into a generated random text file, regardless of the size of the data. The proposed method can be explained in three scenarios. In the first scenario, the message was placed inside the random text directly at the positions of the minutiae; in the second scenario, the message was encrypted with a chosen word before ciphering
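The first scenario, placing message characters at minutiae-derived positions inside a random cover text, can be sketched as follows. The positions here are hypothetical stand-ins, not values extracted from a real fingerprint, and the cover-text generation is simplified:

```python
import random

def embed(message, positions, length, seed=7):
    """Toy sketch: hide message characters at minutiae-derived positions
    inside a random cover text of the given length."""
    rng = random.Random(seed)
    cover = [chr(rng.randrange(97, 123)) for _ in range(length)]  # random a-z
    for ch, pos in zip(message, positions):
        cover[pos] = ch
    return "".join(cover)

def extract(cipher, positions, msg_len):
    """Recover the message by reading the same positions back."""
    return "".join(cipher[p] for p in positions[:msg_len])

positions = [3, 11, 19, 27, 35]          # stand-ins for minutiae locations
cipher = embed("hello", positions, length=40)
plain = extract(cipher, positions, 5)
```

Only a party who can reproduce the same minutiae positions from the fingerprint can locate and recover the message.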
Secure information transmission over the internet is becoming an important requirement in data communication. These days, authenticity, secrecy and confidentiality are the most important concerns in securing data communication. For that reason, information hiding methods such as cryptography, steganography and watermarking are used to secure data transmission: cryptography encrypts the information into an unreadable form, steganography hides the information within images, audio or video, and watermarking protects information from intruders. This paper proposed a new cryptography method by using thre
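The abstract's proposed method is truncated above, so as a generic illustration only of the cryptography step it describes (encrypting information into an unreadable form), here is a minimal symmetric XOR stream cipher; this is not the paper's method, and the key and message are invented:

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Symmetric XOR stream cipher: the same call encrypts and decrypts.
    Illustrative only -- not the paper's (truncated) proposed method."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

secret = xor_cipher(b"meet at noon", b"k3y")   # unreadable ciphertext
plain = xor_cipher(secret, b"k3y")             # XOR again to decrypt
```

Applying the same keyed XOR twice restores the plaintext, which is the defining property of such stream ciphers.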