Software Fault Estimation Tool Based on Object-Oriented Metrics

A fault is an error that affects system behaviour. A software metric is a value that represents the degree to which software processes work properly and indicates where faults are more likely to occur. In this research, we study the effects of removing redundancy and applying a log transformation, based on threshold values, for identifying fault-prone classes of software. The study also compares the metric values of an original dataset with those obtained after removing redundancy and applying the log transformation. E-learning and system datasets were taken as case studies. The fault ratio ranged from 1%-31% and 0%-10% for the original datasets and from 1%-10% and 0%-4% after removing redundancy and log transformation, respectively. These results directly affected the number of classes detected, which ranged between 1-20 and 1-7 for the original datasets and 1-7 and 0-3 after removing redundancy and log transformation. The skewness of the dataset decreased after applying the proposed model. The classes classified as faulty need more attention in the next versions in order to reduce the fault ratio, or refactoring, to increase the quality and performance of the current version of the software.
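The abstract does not spell out the exact pipeline, but a minimal sketch of the idea it describes (removing redundant rows, log-transforming metric values to reduce skewness, and flagging classes whose metrics exceed a threshold) might look as follows; the metric names, values, and threshold are illustrative assumptions, not the paper's data.

```python
# Illustrative sketch only: deduplicate, log-transform, and threshold
# object-oriented metrics to flag fault-prone classes.
import math

# Each row: (class_name, WMC, CBO, LOC) -- hypothetical OO metrics.
dataset = [
    ("LoginController", 42, 15, 980),
    ("LoginController", 42, 15, 980),   # redundant row to be removed
    ("ReportBuilder",   12,  4, 210),
    ("GradeService",    55, 22, 1500),
]

# 1) Remove redundancy (duplicate measurement rows), keeping order.
unique_rows = list(dict.fromkeys(dataset))

# 2) Log transformation to reduce the skewness of the metric distribution.
transformed = [
    (name, math.log1p(wmc), math.log1p(cbo), math.log1p(loc))
    for name, wmc, cbo, loc in unique_rows
]

# 3) Flag a class as fault-prone when any transformed metric exceeds a threshold.
THRESHOLD = 3.5  # assumed cut-off on the log scale
fault_prone = [
    name for name, *metrics in transformed if any(m > THRESHOLD for m in metrics)
]
print(fault_prone)
```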

Publication Date
Tue Jan 18 2022
Journal Name
Iraqi Journal Of Science
Image Encryption Based on Intelligent Session Mask Keys

The revolution of multimedia has been a driving force behind fast and secure data transmission techniques. Protecting image information from unapproved access is imperative. Encryption techniques are used to transfer data, and since each kind of data has its own special characteristics, different methods should be used to protect the distributed image. This paper presents image encryption improvements based on a proposed approach that generates efficient intelligent session (mask) keys by combining the robust algebraic features of ECC with the construction level of the Greedy Randomized Adaptive Search Procedure (GRASP), producing durable symmetric session mask keys consisting of ECC points. Symmetric behavior for ECC …
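The construction is only partially described in this truncated abstract, so the following is a toy sketch of one ingredient: deriving a symmetric mask from ECC point arithmetic. The tiny curve, the shared scalar, and the way the point is stretched into a mask are assumptions for illustration; they do not reproduce the GRASP-based key construction the paper proposes.

```python
# Toy sketch: derive a symmetric "mask key" from ECC point arithmetic.
p, a, b = 17, 2, 2          # tiny demo curve y^2 = x^3 + 2x + 2 (mod 17)
G = (5, 1)                  # generator point on that curve

def ec_add(P, Q):
    """Affine point addition; None stands for the point at infinity."""
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def ec_mul(k, P):
    """Double-and-add scalar multiplication."""
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P)
        P = ec_add(P, P)
        k >>= 1
    return R

# Both sides derive the same session point from a shared secret scalar,
# then (for the demo) expand its coordinates into a repeating XOR mask.
secret = 7                               # assumed pre-agreed scalar
session_point = ec_mul(secret, G)
mask = bytes(session_point) * 8          # toy expansion of the point into 16 bytes
image_bytes = bytes(range(16))           # stand-in for image data
cipher = bytes(m ^ x for m, x in zip(mask, image_bytes))
```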
Publication Date
Thu Dec 29 2016
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
Proposal for Exchange Text message Based on Image

Messages are an ancient method of exchanging information between people, and there have been many ways to send them with some security.

Encryption and steganography are the oldest approaches to message security, but there are still many problems in key generation, key distribution, finding a suitable cover image, and others. In this paper we present a proposed algorithm to exchange secure messages without any encryption or any image used as a cover for hiding. Our proposed algorithm depends on two copies of the same collection image set (CIS), one on the sender side and the other on the receiver side, between which messages are always exchanged.

To send any text message, the sender converts the message to ASCII c…
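A minimal sketch of the shared-collection idea follows: instead of sending the text itself, the sender transmits positions in the collection image set that both sides already hold. The tiny byte-array "images" and the lookup scheme are assumptions for illustration, since the abstract is cut off before the encoding details.

```python
# Sketch: exchange a message as pointers into a shared collection image set (CIS).
import random

# Both sides share the same CIS: here, two tiny grayscale "images" as byte strings.
CIS = [
    bytes(range(0, 256)),                     # image 0: every byte value once
    bytes((i * 7) % 256 for i in range(256)), # image 1: a different arrangement
]

def encode(message: str):
    """For each character, pick an (image_index, pixel_index) whose value equals its ASCII code."""
    pointers = []
    for ch in message:
        code = ord(ch)
        candidates = [
            (img, pos)
            for img, data in enumerate(CIS)
            for pos, value in enumerate(data)
            if value == code
        ]
        pointers.append(random.choice(candidates))
    return pointers

def decode(pointers):
    """Receiver looks the byte values up in its own copy of the CIS."""
    return "".join(chr(CIS[img][pos]) for img, pos in pointers)

sent = encode("Hi")
print(sent, "->", decode(sent))
```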
Publication Date
Thu Jun 30 2022
Journal Name
Iraqi Journal Of Science
Short Answers Assessment Approach based on Semantic Network

Finding similarities in texts is important in many areas such as information retrieval, automated article scoring, and short answer categorization. Evaluating short answers is not an easy task due to differences in natural language. Methods for calculating the similarity between texts depend on semantic or grammatical aspects. This paper discusses a method for evaluating short answers using semantic networks to represent the typical (correct) answer and the students' answers. A semantic network of nodes and relationships represents the text (answers). Moreover, grammatical aspects are captured by measuring the similarity of parts of speech between the answers. In addition, finding hierarchical relationships between nodes in netwo…
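As a rough illustration of scoring against a semantic network, the sketch below represents the model answer and a student answer as small sets of (node, relation, node) triples and scores their overlap. The triples and the Jaccard-style combination are assumptions, not the paper's exact measure.

```python
# Sketch: compare a reference semantic network with a student's network.

def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if a | b else 1.0

def similarity(reference_triples, student_triples):
    """Combine node overlap and edge (relation) overlap into one score."""
    ref_nodes = {n for s, _, o in reference_triples for n in (s, o)}
    stu_nodes = {n for s, _, o in student_triples for n in (s, o)}
    node_score = jaccard(ref_nodes, stu_nodes)
    edge_score = jaccard(set(reference_triples), set(student_triples))
    return 0.5 * node_score + 0.5 * edge_score

reference = {("cpu", "executes", "instructions"), ("ram", "stores", "data")}
student   = {("cpu", "executes", "instructions"), ("ram", "holds", "data")}
print(round(similarity(reference, student), 2))
```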
Publication Date
Thu Jan 14 2021
Journal Name
Iraqi Journal Of Science
Identifying Digital Forensic Frameworks Based on Processes Models

Digital forensics is the part of forensic science that covers crime related to computers and other digital devices. Academic studies have been interested in digital forensics for a while. Researchers aim to establish a discipline based on scientific structures and to define a model that reflects their observations. This paper suggests a model to improve the whole investigation process and obtain accurate and complete evidence, and it adopts securing the digital evidence with cryptographic algorithms so that reliable evidence can be presented in a court of law. The paper also presents the main and basic concepts of the frameworks and models used in digital forensics investigation.
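A minimal sketch of the "secure the digital evidence with cryptography" step is hashing the evidence at acquisition time and re-verifying the digest before it is presented; the file name below is a placeholder assumption.

```python
# Sketch: record and later verify a SHA-256 fingerprint of an evidence file.
import hashlib

def fingerprint(path: str) -> str:
    """Return the SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

acquired = fingerprint("disk_image.dd")    # recorded at acquisition time
presented = fingerprint("disk_image.dd")   # recomputed before presentation in court
assert acquired == presented, "evidence has been altered"
```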

Publication Date
Mon Jun 17 2019
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
Ebonite linings Based on Natural and Synthetic Rubber

The corrosion of metals is of great economic importance. Estimates show that a quarter of the iron and steel produced is destroyed in this way. Rubber lining has been used for protection against severe corrosion because NR and certain synthetic rubbers have a basic resistance to very corrosive chemicals, particularly acids. The present work includes producing ebonite from both natural and synthetic rubbers; therefore, the following materials were chosen to produce ebonite rubber: a) Natural rubber (NR). b) Styrene butadiene rubber (SBR). c) Nitrile rubber (NBR). d) Neoprene rubber (CR) [WRT]. The best ebonite vulcanizates are obtained in the presence of 30 pphr sulfur, with carbon black as a reinforcing filler. The relation between …
Publication Date
Wed May 01 2019
Journal Name
Iraqi Journal Of Science
Optical Images Fusion Based on Linear Interpolation Methods

Merging images is one of the most important technologies in remote sensing applications and geographic information systems. In this study, a simulation process was carried out using a camera, and the images to be fused were resized using interpolation methods (nearest, bilinear, and bicubic). Statistical techniques were used as an efficient merging technique in the image integration process, employing different models, namely Local Mean Matching (LMM) and Regression Variable Substitution (RVS), and spatial frequency techniques were applied, including the high-pass filter additive method (HPFA). Thus, in the current research, statistical measures have been used to check the quality of the merged images. This has been carried out by calculating the correlation a…
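The sketch below illustrates one step mentioned above: resizing an image with nearest-neighbour interpolation and checking quality with a correlation coefficient. The random test image and scale factor are assumptions, and bilinear/bicubic resizing would normally come from an imaging library rather than be hand-written.

```python
# Sketch: nearest-neighbour resize plus a Pearson correlation quality check.
import numpy as np

def resize_nearest(img: np.ndarray, new_h: int, new_w: int) -> np.ndarray:
    rows = np.arange(new_h) * img.shape[0] // new_h
    cols = np.arange(new_w) * img.shape[1] // new_w
    return img[rows][:, cols]

rng = np.random.default_rng(0)
original = rng.integers(0, 256, size=(64, 64)).astype(float)

upscaled = resize_nearest(original, 128, 128)
restored = resize_nearest(upscaled, 64, 64)

# Pearson correlation between the original and the round-tripped image.
corr = np.corrcoef(original.ravel(), restored.ravel())[0, 1]
print(f"correlation after resize round-trip: {corr:.3f}")
```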
Publication Date
Thu Jun 30 2022
Journal Name
Iraqi Journal Of Science
Telecom Churn Prediction based on Deep Learning Approach

The transition of customers from one telecom operator to another has a direct impact on a company's growth and revenue. Traditional classification algorithms fail to predict churn effectively. This research introduces a deep learning model for predicting customers who are planning to leave for another operator. The model works on a high-dimensional, large-scale dataset. The performance of the model was measured against other classification algorithms, such as Gaussian NB, Random Forest, and Decision Tree, in predicting churn. The evaluation was performed based on accuracy, precision, recall, F-measure, Area Under the Curve (AUC), and the Receiver Operating Characteristic (ROC) curve. The proposed deep learning model performs better than othe…
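A rough sketch of the baseline comparison described above is shown below, using scikit-learn classifiers and the listed metrics on synthetic data. The dataset, features, and hyperparameters are assumptions, and the paper's deep model itself is not reproduced here.

```python
# Sketch: evaluate churn baselines with accuracy, precision, recall, F1, and AUC.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import (accuracy_score, f1_score, precision_score,
                             recall_score, roc_auc_score)
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Imbalanced synthetic stand-in for a churn dataset (20% churners).
X, y = make_classification(n_samples=2000, n_features=20, weights=[0.8, 0.2], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

for model in (GaussianNB(), RandomForestClassifier(random_state=0), DecisionTreeClassifier(random_state=0)):
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    proba = model.predict_proba(X_test)[:, 1]
    print(type(model).__name__,
          f"acc={accuracy_score(y_test, pred):.3f}",
          f"prec={precision_score(y_test, pred):.3f}",
          f"rec={recall_score(y_test, pred):.3f}",
          f"f1={f1_score(y_test, pred):.3f}",
          f"auc={roc_auc_score(y_test, proba):.3f}")
```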
Publication Date
Sun Nov 01 2020
Journal Name
Journal Of Physics: Conference Series
Improve topic modeling algorithms based on Twitter hashtags
Today, with the increasing use of social media, many researchers have become interested in topic extraction from Twitter. Twitter text is short, unstructured, and messy, which makes it difficult to find topics in tweets. Topic modeling algorithms such as Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) were originally designed to derive topics from large documents such as articles and books, and they are often less efficient when applied to short text content like Twitter. Luckily, Twitter has many features that represent the interaction between users, and tweets carry rich user-generated hashtags as keywords. In this paper, we exploit the hashtag feature to improve the topics learned …
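One common way to exploit hashtags, sketched below, is to pool tweets that share a hashtag into a single pseudo-document before running LDA, so the model sees longer texts. The tweets, hashtags, and topic count are illustrative assumptions, not the paper's data or exact pooling scheme.

```python
# Sketch: hashtag pooling of tweets followed by LDA topic modeling.
from collections import defaultdict
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

tweets = [
    ("#ai", "training a neural network on tweets"),
    ("#ai", "deep learning beats the baseline again"),
    ("#football", "great match and a late winning goal"),
    ("#football", "the goalkeeper saved the penalty"),
]

# Pool tweets by hashtag into pseudo-documents.
pools = defaultdict(list)
for tag, text in tweets:
    pools[tag].append(text)
documents = [" ".join(texts) for texts in pools.values()]

vectorizer = CountVectorizer(stop_words="english")
counts = vectorizer.fit_transform(documents)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)

terms = vectorizer.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = [terms[i] for i in topic.argsort()[-3:]]
    print(f"topic {k}: {top}")
```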
Publication Date
Wed May 25 2022
Journal Name
Iraqi Journal Of Science
Developing a Heuristic Algorithm to Solve Uncertainty Problem of Resource Allocation in a Software Project Scheduling

In the project management process, the objective is to define and develop a model for planning, scheduling, controlling, and monitoring the different activities of a particular project. Time scheduling plays an important role in the successful implementation of the various activities and in the general outcome of the project. In practice, various factors cause projects to suffer from delays in accomplishing their activities. One important reason is imprecise knowledge about the durations of activities. This study addresses the problem of project scheduling in uncertain resource environments, which are characterized by uncertain activity durations. The study presents a solution to the levelling and allocation problems for projects that have some uncertain ac…
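A small sketch of handling uncertain activity durations is shown below, using three-point (PERT-style) estimates. The activities, estimates, and the simple serial schedule are assumptions; they do not reproduce the heuristic levelling and allocation algorithm the paper develops.

```python
# Sketch: expected durations from three-point estimates for uncertain activities.
activities = {
    # name: (optimistic, most_likely, pessimistic) durations in days
    "requirements": (3, 5, 10),
    "design":       (4, 6,  9),
    "coding":       (8, 12, 20),
    "testing":      (4, 6, 12),
}

def pert_expected(o: float, m: float, p: float) -> float:
    """Classic PERT expected duration: (o + 4m + p) / 6."""
    return (o + 4 * m + p) / 6

expected = {name: pert_expected(*est) for name, est in activities.items()}
total = sum(expected.values())   # serial schedule, for illustration only
for name, days in expected.items():
    print(f"{name}: {days:.1f} days")
print(f"expected project length: {total:.1f} days")
```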
Publication Date
Sun Oct 01 2006
Journal Name
Journal Of The Faculty Of Medicine Baghdad
Modified Automated Scoring system for Immunohistochemical staining using commercially available low cost software for image analysis

Background: During the past several years, there has been a rapidly escalating clinical need to perform IHC stains that require quantitative interpretation. The Automated Cellular Imaging System is used to analyze immunohistochemically stained slides, primarily for cancer-related diagnostics. Studies have shown that the device offers accuracy, precision, and reproducibility of immunostained slide analysis exceeding what is possible with manual evaluation, which was the prevailing method.

Aim of the study: In this article we demonstrate that meaningful image analysis of immunohistochemical staining studies can be performed using inexpensive, widely distributed graphics software (Adobe Photoshop) on a personal …
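The sketch below shows, in code form, the kind of pixel counting the article performs in Photoshop: thresholding "brown" (DAB-positive) pixels in an RGB image and reporting their fraction. The synthetic image and the colour thresholds are assumptions, not the article's calibrated settings.

```python
# Sketch: fraction of DAB-positive pixels in an RGB field by simple colour thresholds.
import numpy as np

rng = np.random.default_rng(1)
image = rng.integers(0, 256, size=(100, 100, 3))   # stand-in for a scanned IHC field

r, g, b = image[..., 0], image[..., 1], image[..., 2]
# Crude "brown" rule: strong red, moderate green, weak blue.
positive = (r > 120) & (g > 60) & (g < 160) & (b < 100)

fraction = positive.mean()
print(f"DAB-positive area: {fraction:.1%}")
```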