A fault is an error that affects system behaviour. A software metric is a value that represents the degree to which software processes work properly and where faults are more likely to occur. In this research, we study the effects of removing redundancy and of threshold-based log transformation on identifying fault-prone classes of software. The study also compares the metric values of an original dataset with those obtained after removing redundancy and applying the log transformation. An e-learning dataset and a system dataset were taken as case studies. The fault ratio ranged from 1%-31% and 0%-10% for the original datasets, and from 1%-10% and 0%-4% after removing redundancy and log transformation, respectively. These results directly affected the number of detected classes, which ranged between 1-20 and 1-7 for the original datasets, and between 1-7 and 0-3 after removing redundancy and log transformation. The skewness of the data decreased after applying the proposed model. The classes classified as faulty need more attention in subsequent versions, either to reduce the fault ratio or through refactoring, in order to increase the quality and performance of the current version of the software.
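A minimal sketch of this preprocessing idea follows (pandas/SciPy; the metric name, values, and threshold are illustrative assumptions, not the study's data):

```python
import numpy as np
import pandas as pd
from scipy.stats import skew

# Illustrative metric data; "wmc" (weighted methods per class) and the
# threshold below are assumptions, not values from the study.
df = pd.DataFrame({"wmc": [3, 3, 7, 42, 42, 15, 88, 5],
                   "faulty": [0, 0, 0, 1, 1, 0, 1, 0]})

# Step 1: remove redundancy (duplicate rows).
df = df.drop_duplicates()

# Step 2: log transformation; log1p keeps zero-valued metrics defined.
df["wmc_log"] = np.log1p(df["wmc"])

# The transformation should reduce skewness, as the study reports.
print("skewness before:", skew(df["wmc"]), "after:", skew(df["wmc_log"]))

# Step 3: flag fault-prone classes against a metric threshold.
threshold = np.log1p(20)          # hypothetical threshold value
df["fault_prone"] = df["wmc_log"] > threshold
print(df)
```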
Digital forensics has become a fundamental requirement for law enforcement due to the growing volume of cyber and computer-assisted crime. Whilst existing commercial tools have traditionally focused upon string-based analyses (e.g., regular expressions, keywords), less effort has been placed towards the development of multimedia-based analyses. Within the research community, more attention has been given to the analysis of multimedia content, but it tends to focus upon highly specialised scenarios such as tattoo identification, number plate recognition, suspect face recognition and manual annotation of images. Given the ever-increasing volume of multimedia content, it is essential that a holistic Multimedia-Forensic Analysis Tool (M-FAT) be developed.
This study sheds light on the cultural dimension of the decentralized administration experience in Iraq as an evaluation tool, adding a new dimension to the study of decentralization alongside the administrative, political and legal dimensions. The study's proposition is that, without a civil political culture, administrative decentralization in Iraq remains weak and may be exposed to a set of problems. The study comprises an introduction, a conclusion, and a set of political and cultural issues and their components, together with an analysis of this political culture and its role in shaping the course of administrative decentralization in Iraq post-2003.
Watermarking can be defined as the process of embedding special, wanted and reversible information in important secure files to protect the ownership or information of the cover file, here based on a proposed singular value decomposition (SVD) watermark. The proposed digital watermarking method has a very large domain for constructing the final number, which protects the watermark from collisions. The cover file is the important image that needs to be protected. The hidden watermark is a unique number extracted from the cover file by performing the proposed related and successive operations, starting by dividing the original image into four parts of unequal size; each of these four parts is treated as a separate matrix to which SVD is applied.
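A rough sketch of this block-wise SVD step (NumPy; the unequal split ratios and the way singular values are combined into the final number are illustrative assumptions, since the abstract does not specify them):

```python
import numpy as np

def svd_signature(image: np.ndarray) -> int:
    """Derive a watermark number from the SVD of four unequal image parts."""
    h, w = image.shape
    # Divide the image into four parts of unequal size (the split point
    # here is a hypothetical choice, not the paper's exact ratios).
    r, c = h // 3, (2 * w) // 3
    parts = [image[:r, :c], image[:r, c:], image[r:, :c], image[r:, c:]]

    signature = 0
    for part in parts:
        # The largest singular value of each part feeds the final number.
        s = np.linalg.svd(part.astype(float), compute_uv=False)
        signature = signature * 100003 + int(s[0])  # hypothetical mixing step
    return signature

cover = np.random.randint(0, 256, size=(120, 160))  # stand-in cover image
print("watermark number:", svd_signature(cover))
```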
Today we are witnessing huge scientific and technical progress, so more skills and methods of thinking need to be acquired by the teacher. Despite the importance of computers in education, many teachers find it difficult to teach pupils effectively. The researchers tried to find a suitable way to harness current technology, namely computer-designed software and the introduction of enrichment activities into the curriculum, since this is one of the contemporary trends in developing Arabic-language teaching across the various levels of education, and to determine whether this program has a negative or positive impact.
This research aims to choose the appropriate probability distribution for the reliability analysis of an item, based on data collected on the operating and stoppage times of the case study. The appropriate probability distribution is the one for which the data lie on or close to the fitted line of the probability plot and pass a goodness-of-fit test. Minitab 17 software was used for this purpose after the collected data were arranged and entered into the program.
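The study performs this in Minitab 17; an equivalent check can be sketched in Python with SciPy (the candidate distribution set and the simulated lifetimes are assumptions standing in for the collected data):

```python
import numpy as np
from scipy import stats

# Stand-in operating/stoppage times; the real study uses collected data.
rng = np.random.default_rng(1)
times = rng.weibull(1.5, size=200) * 100.0

# Compare candidate lifetime distributions with a goodness-of-fit test.
for name, dist in [("weibull", stats.weibull_min),
                   ("lognormal", stats.lognorm),
                   ("exponential", stats.expon)]:
    params = dist.fit(times)
    ks = stats.kstest(times, dist.cdf, args=params)
    print(f"{name:12s} KS statistic={ks.statistic:.3f} p-value={ks.pvalue:.3f}")

# The distribution whose points fall closest to the probability-plot line
# (the largest p-value here) is taken as the reliability model.
```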
By deriving the moment estimator of the ARMA(1,1) model and using simulation, some important findings emerged. The most notable conclusion concerns the relation between the sign and the moment estimator for the ARMA(1,1) model: a positive sign means the root gives an invertible model, while a negative sign means the root gives a non-invertible model. An alternative method that can be suitable for the ARMA(0,1) model has also been suggested.
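A small simulation sketch of the moment estimator for ARMA(1,1) (statsmodels; the parameter values are illustrative, and the autocorrelation equations are the standard ones, not necessarily the paper's exact derivation):

```python
import numpy as np
from statsmodels.tsa.arima_process import arma_generate_sample

np.random.seed(0)
phi, theta = 0.6, 0.4                        # illustrative true parameters
# statsmodels lag-polynomial convention: AR [1, -phi], MA [1, theta]
y = arma_generate_sample(ar=[1, -phi], ma=[1, theta], nsample=20000)

def acf(x, k):
    x = x - x.mean()
    return x[:-k] @ x[k:] / (x @ x)

r1, r2 = acf(y, 1), acf(y, 2)
phi_hat = r2 / r1                            # since rho_2 = phi * rho_1

# Solve r1*(1 + 2*phi*theta + theta^2) = (1 + phi*theta)*(phi + theta)
# for theta: (r1 - phi)theta^2 + (2*r1*phi - 1 - phi^2)theta + (r1 - phi) = 0.
# The two roots are reciprocals; only one yields an invertible model.
roots = np.roots([r1 - phi_hat, 2 * r1 * phi_hat - 1 - phi_hat**2, r1 - phi_hat])
theta_hat = roots[np.abs(roots) < 1]         # keep the invertible root
print("phi_hat:", phi_hat, "theta_hat:", theta_hat.real)
```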
The need to create an optimal water-quality management process has motivated researchers to pursue the development of prediction models. One widely used forecasting model is the seasonal autoregressive integrated moving average (SARIMA) model. In the present study, a SARIMA model was developed in R software to fit a time series of monthly fluoride content collected from six stations on the Tigris River for the period from 2004 to 2014. The adequate SARIMA model, having the least Akaike information criterion (AIC) and mean squared error (MSE), was found to be SARIMA(2,0,0)(0,1,1). The model parameters were identified and diagnosed to derive the forecasting equations at each selected location.
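The study fits this model in R; an equivalent sketch in Python's statsmodels is shown below (monthly data, hence seasonal period 12; the simulated series is a stand-in for the fluoride record):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Stand-in monthly series; the study used fluoride data from 2004-2014.
rng = np.random.default_rng(0)
idx = pd.date_range("2004-01", periods=132, freq="MS")
y = pd.Series(0.3 + 0.05 * np.sin(2 * np.pi * idx.month / 12)
              + rng.normal(0, 0.02, len(idx)), index=idx)

# SARIMA(2,0,0)(0,1,1)[12], the order the study selected by minimum AIC.
model = SARIMAX(y, order=(2, 0, 0), seasonal_order=(0, 1, 1, 12))
result = model.fit(disp=False)
print("AIC:", result.aic)
print(result.forecast(steps=12))             # one year ahead
```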