A fault is an error that affects system behaviour. A software metric is a value that represents the degree to which software processes work properly and where faults are more likely to occur. In this research, we study the effects of removing redundancy and of log transformation, based on threshold values, on identifying fault-prone classes of software. The study also compares the metric values of an original dataset with those obtained after removing redundancy and applying log transformation. An e-learning dataset and a system dataset were taken as case studies. The fault ratio ranged from 1%-31% and 0%-10% for the original datasets, and from 1%-10% and 0%-4% after removing redundancy and log transformation, respectively. These results directly affected the number of classes detected, which ranged between 1-20 and 1-7 for the original datasets and between 1-7 and 0-3 after removing redundancy and log transformation. The skewness of the dataset decreased after applying the proposed model. The classes classified as fault-prone need more attention in the next versions, in order to reduce the fault ratio, or should be refactored to increase the quality and performance of the current version of the software.
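A minimal sketch of the preprocessing the abstract describes, assuming the class-level metrics live in a pandas DataFrame; the column names, threshold values, and function names below are illustrative assumptions, not taken from the paper.

```python
import numpy as np
import pandas as pd

def preprocess_metrics(df: pd.DataFrame, metric_cols: list) -> pd.DataFrame:
    """Remove redundancy (duplicate metric rows) and log-transform the
    metric columns, which is what reduces the skewness of the dataset."""
    deduped = df.drop_duplicates(subset=metric_cols).copy()
    for col in metric_cols:
        # log1p keeps zero-valued metrics defined
        deduped[col] = np.log1p(deduped[col])
    return deduped

def flag_fault_prone(df: pd.DataFrame, thresholds: dict) -> pd.Series:
    """Mark a class as fault-prone when any metric exceeds its threshold
    (hypothetical thresholds; the paper's actual values are not given here)."""
    mask = pd.Series(False, index=df.index)
    for col, limit in thresholds.items():
        mask |= df[col] > limit
    return mask
```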
Copula modeling is widely used in modern statistics. Boundary bias is one of the problems faced in nonparametric estimation, and kernel estimators, the most common nonparametric estimators, suffer from it. In this paper, the copula density function was estimated using the probit-transformation nonparametric method in order to remove the boundary bias problem that kernel estimators suffer from. A simulation study compares three nonparametric methods for estimating the copula density function, and we propose a new method that outperforms the others, using five types of copulas with different sample sizes, different levels of correlation between the copula variables, and different parameters for the function. The
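A minimal sketch of a probit-transformation kernel estimator of the kind the abstract names: pseudo-observations are mapped to the real line with the standard normal quantile function, a kernel density is fitted there (no boundary to bias it), and the estimate is mapped back with the Jacobian. The kernel, bandwidth choice, and grid below are illustrative assumptions, not the paper's exact method.

```python
import numpy as np
from scipy.stats import norm, gaussian_kde

def probit_copula_density(u, v, u_grid, v_grid):
    """Probit-transformation kernel estimate of a copula density.

    u, v           : pseudo-observations in (0, 1), e.g. ranks / (n + 1)
    u_grid, v_grid : points in (0, 1) where the density is evaluated
    """
    # Transform to the real line; the support becomes unbounded, so the
    # usual kernel boundary bias at 0 and 1 disappears.
    s, t = norm.ppf(u), norm.ppf(v)
    kde = gaussian_kde(np.vstack([s, t]))

    sg, tg = norm.ppf(u_grid), norm.ppf(v_grid)
    f_hat = kde(np.vstack([sg, tg]))

    # Back-transform: divide by the product of standard normal densities
    # (the Jacobian of the probit transformation).
    return f_hat / (norm.pdf(sg) * norm.pdf(tg))
```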
Within the framework of big data, energy issues are highly significant. Despite the significance of energy, theoretical studies focusing primarily on the issue of energy within big data analytics in relation to computational intelligent algorithms are scarce. The purpose of this study is to explore the theoretical aspects of energy issues in big data analytics in relation to computational intelligent algorithms, since this is critical for exploring the empirical aspects of big data. In this chapter, we present a theoretical study of energy issues related to applications of computational intelligent algorithms in big data analytics. This work highlights that big data analytics using computational intelligent algorithms generates a very high amo
Despite its wide utilization in microbial cultures, the one-factor-at-a-time method fails to find the true optimum because it does not take the interaction between the optimized parameters into account. Therefore, in order to find the true optimum conditions, the one-factor-at-a-time method must be repeated over many sequential experimental runs, which is extremely time-consuming and expensive when many variables are involved. This work is an attempt to enhance bioactive yellow pigment production by Streptomyces thinghirensis based on a statistical design, as sketched below. The yellow pigment demonstrated inhibitory effects against Escherichia coli and Staphylococcus aureus and was characterized by UV-vis spectroscopy, which showed a lambda maximum of
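To illustrate why a statistical design captures the interactions that one-factor-at-a-time misses, here is a minimal two-level full factorial sketch; the factors, levels, and number of variables are purely hypothetical and are not the design reported in the study.

```python
from itertools import product

# Hypothetical culture factors with low/high levels (illustrative only).
factors = {
    "temperature_C": (28, 37),
    "pH": (6.5, 8.0),
    "incubation_days": (5, 9),
}

# A 2^k full factorial design runs every combination of levels, so main
# effects and factor interactions can both be estimated, unlike
# one-factor-at-a-time runs that vary a single factor per experiment.
design = [dict(zip(factors, levels)) for levels in product(*factors.values())]
for run, settings in enumerate(design, start=1):
    print(f"run {run}: {settings}")
```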
There has been a growing interest in the use of chaotic techniques for enabling secure communication in recent years. This need has been motivated by the emergence of a number of wireless services which require the channel to provide low bit error rates (BER) along with information security. The aim of eavesdropping on such services is to steal or distort the information being conveyed. Optical wireless systems (basically free space optic systems, FSO) are no exception to this trend. Thus, there is an urgent need for techniques that can secure privileged information against unauthorized eavesdroppers while simultaneously protecting the information against channel-induced perturbations and errors. Conventional cryptographic techniques are not designed
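One common chaotic technique is masking the payload with a keystream generated by a chaotic map; the following sketch uses the logistic map purely as an illustration and is not claimed to be the scheme proposed in the paper. The seed x0 and parameter r act as the shared secret.

```python
def logistic_keystream(x0: float, r: float, n: int) -> bytes:
    """Generate n keystream bytes from the logistic map x_{k+1} = r*x_k*(1-x_k)."""
    x, out = x0, bytearray()
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(int(x * 256) & 0xFF)
    return bytes(out)

def chaotic_mask(data: bytes, x0: float = 0.654321, r: float = 3.99) -> bytes:
    """XOR the payload with the chaotic keystream; applying the same
    mask twice with the same (x0, r) recovers the original data."""
    ks = logistic_keystream(x0, r, len(data))
    return bytes(d ^ k for d, k in zip(data, ks))

assert chaotic_mask(chaotic_mask(b"secret")) == b"secret"
```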
Healthcare professionals routinely use audio signals generated by the human body to help diagnose disease or assess its progression. With new technologies, it is now possible to collect human-generated sounds, such as coughing, and audio-based machine learning technologies can be adopted for automatic analysis of the collected data. Valuable and rich information can be obtained from the cough signal by extracting effective characteristics from a finite-duration time interval that changes as a function of time. This article presents a proposed approach to the detection and diagnosis of COVID-19 through the processing of coughs collected from patients suffering from the most common symptoms of this pandemic. The proposed method is based on adopt
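A minimal sketch of frame-based feature extraction from a cough recording, assuming librosa is available; MFCCs, the frame and hop lengths, and the mean/std summary are illustrative assumptions, since the excerpt does not state which characteristics the proposed method actually uses.

```python
import numpy as np
import librosa

def cough_features(path: str, frame_s: float = 0.025, hop_s: float = 0.010) -> np.ndarray:
    """Extract MFCCs over short frames so the features follow how the
    cough signal changes as a function of time, then summarise them
    over the finite-duration interval."""
    y, sr = librosa.load(path, sr=None, mono=True)
    mfcc = librosa.feature.mfcc(
        y=y, sr=sr,
        n_mfcc=13,
        n_fft=int(frame_s * sr),
        hop_length=int(hop_s * sr),
    )
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])
```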
The aim of the research is to identify the effect of instructional design according to the Kagan structure on first intermediate school students, and how its skills could help in generating information in mathematics. In accordance with the research objectives, the researcher followed the experimental research method, adopting an experimental design with two equivalent groups and a post-test to measure skills in generating information. Accordingly, the researcher raised two main null hypotheses: there were no statistically significant differences at the level of significance (0.05) between the average scores of the experimental group, who studied the material according to the Kagan structure, and th
In the present study, composites were prepared by hand lay-up molding. The composite constituents were epoxy resin as a matrix, a 6% volume fraction of glass fibers (G.F) as reinforcement, and 3% and 6% volume fractions of prepared natural materials (rice husk ash, carrot powder, and sawdust) as filler. The erosion wear behavior was studied, together with coating by natural waste (rice husk ash) with epoxy resin after erosion. The results showed that the non-reinforced epoxy has lower erosion resistance than the natural-material-based composites, and that the specimen (epoxy + 6% glass fiber + 6% RHA) has higher erosion resistance than the composites reinforced with carrot powder and sawdust at 30 cm, angle 60
... Show MoreA simple setup of random number generator is proposed. The random number generation is based on the shot-noise fluctuations in a p-i-n photodiode. These fluctuations that are defined as shot noise are based on a stationary random process whose statistical properties reflect Poisson statistics associated with photon streams. It has its origin in the quantum nature of light and it is related to vacuum fluctuations. Two photodiodes were used and their shot noise fluctuations were subtracted. The difference was applied to a comparator to obtain the random sequence.
Carbonate reservoirs are an essential source of hydrocarbons worldwide, and their petrophysical properties play a crucial role in hydrocarbon production. The most critical petrophysical properties of carbonate reservoirs are porosity, permeability, and water saturation. A tight reservoir is a reservoir with low porosity and permeability, which means it is difficult for fluids to move from one side to another. This study's primary goal is to evaluate the reservoir properties and lithological identification of the Sadi Formation in the Halfaya oil field, which is considered one of Iraq's most significant oilfields, 35 km south of Amarah. The Sadi Formation consists of four units: A, B1, B2, and B3. Sadi A was excluded as it was not filled with h
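As a worked illustration of one of the properties named above, water saturation is conventionally obtained from resistivity logs with Archie's equation; the exponents and the log readings below are hypothetical values, not results from the Halfaya study.

```python
def archie_sw(phi: float, rt: float, rw: float,
              a: float = 1.0, m: float = 2.0, n: float = 2.0) -> float:
    """Archie water saturation: Sw = ((a * Rw) / (phi**m * Rt)) ** (1/n).

    phi : porosity (fraction)          rt : deep resistivity (ohm.m)
    rw  : formation-water resistivity  a, m, n : tortuosity, cementation
          (ohm.m)                      and saturation exponents."""
    return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)

# Hypothetical reading from a tight carbonate interval.
print(round(archie_sw(phi=0.08, rt=20.0, rw=0.03), 3))
```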