Improved VSM Based Candidate Retrieval Model for Detecting External Textual Plagiarism

Plagiarism has grown rapidly with the explosive growth of the Internet, where a massive volume of information is offered with effortless access, making plagiarism (taking someone else's work, whether ideas or even words, and presenting it as one's own) easy to perform. To ensure originality, plagiarism detection has become widely necessary in various areas, so that people who aim to plagiarize must expend considerable effort to present work based on their own research.

In this paper, an improved approach to textual plagiarism detection is proposed through a new model for the candidate retrieval phase. The proposed candidate retrieval model adopts the vector space model (VSM) as its retrieval model and represents documents as vectors of average term weights, which serve as queries for retrieval, instead of representing them as vectors of plain term weights. The detailed comparison task forms the second phase, in which fuzzy semantic-based string similarity is applied. Experiments were conducted on the PAN-PC-10 evaluation dataset. Since the problem statement in this paper is restricted to extrinsic plagiarism detection on English documents, experiments were performed only on the portion of the corpus dedicated to extrinsic detection and only on English-language documents. The performance of the proposed candidate retrieval model was evaluated using Precision, Recall, and F-measure, while the overall system was assessed with the five standard PAN measures: Precision, Recall, F-measure, Granularity, and Plagdet. The experimental results show that the proposed candidate retrieval model has a positive impact on overall system performance and that the system outperforms other state-of-the-art methods: it detected about 80% of the plagiarism cases, and about 90% of its detections were correct. The proposed model can detect literal plagiarism as well as cases containing paraphrasing. Performance comparison shows that the proposed system is comparable to or outperforms the baseline systems on all five evaluation metrics.
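The retrieval idea can be sketched in a few lines. This is an illustrative sketch of generic tf-idf VSM retrieval with cosine similarity, not the paper's exact average-term-weight formulation; the function names and weighting are hypothetical.

```python
# Minimal VSM candidate retrieval sketch: a suspicious document becomes a
# query vector of term weights, and source documents are ranked by cosine
# similarity. Illustrative only; not the paper's exact weighting scheme.
import math
from collections import Counter

def term_weights(doc, corpus):
    """tf-idf weights for one tokenized document (hypothetical weighting)."""
    tf = Counter(doc)
    n = len(corpus)
    return {t: (c / len(doc)) * math.log(n / sum(t in d for d in corpus))
            for t, c in tf.items()}

def cosine(a, b):
    num = sum(a[t] * b.get(t, 0.0) for t in a)
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

def retrieve_candidates(suspicious, sources, k=2):
    """Return the indices of the top-k candidate source documents."""
    corpus = [suspicious] + sources
    q = term_weights(suspicious, corpus)
    scored = [(cosine(q, term_weights(s, corpus)), i)
              for i, s in enumerate(sources)]
    return [i for _, i in sorted(scored, reverse=True)[:k]]
```

In the paper's variant, the query vector holds average term weights rather than raw term weights; the ranking step itself is unchanged.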

Publication Date
Thu Dec 29 2016
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
Proposed Steganography Method Based on DCT Coefficients

In this paper, a steganography algorithm is proposed that uses the DCT for the cover image and the DWT for the hidden image, with an embedding-order key. For more security and complexity, the cover image is converted from RGB to YIQ; the Y plane is used, divided into four equal parts, and transformed to the DCT domain. The four DWT coefficient sub-bands of the hidden image are embedded into the parts of the cover DCT, with the embedding order based on an order key that is stored along with the cover in a database table on both the sender and receiver sides. Experimental results show that the proposed algorithm successfully hides information in the cover image. Microsoft Office Access 2003 is used as the DBMS; the hiding and extracting algorithm…
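As one concrete step of the pipeline above, the RGB-to-YIQ conversion can be sketched with the standard NTSC matrix (coefficients rounded to three decimals; the function name is illustrative):

```python
# RGB -> YIQ conversion (NTSC matrix, inputs in [0, 1]).
# The Y plane carries luminance and is the plane used for embedding above.
def rgb_to_yiq(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b
    i = 0.596 * r - 0.274 * g - 0.322 * b
    q = 0.211 * r - 0.523 * g + 0.312 * b
    return y, i, q
```

Embedding in the Y plane only means the chrominance planes (I, Q) pass through untouched, which limits visible color distortion.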

Publication Date
Fri Sep 30 2022
Journal Name
Iraqi Journal Of Science
Network Traffic Prediction Based on Boosting Learning

Classification of network traffic is an important topic for network management, traffic routing, safe traffic discrimination, and better service delivery. Traffic examination is the entire process of examining traffic data, from intercepting it to discovering patterns, relationships, misconfigurations, and anomalies in a network. Within this field, traffic classification is a sub-domain whose purpose is to classify network traffic into predefined classes, such as normal or abnormal traffic and application type. Most Internet applications encrypt data in transit, and classifying encrypted traffic is not possible with traditional methods. Statistical and machine intelligence methods can find and model traffic…
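The boosting idea behind the paper's title can be sketched as a generic AdaBoost over decision stumps on statistical flow features. This is an illustration of boosting only, not the paper's exact method; the feature values and all names are hypothetical.

```python
# Generic AdaBoost with decision stumps for binary traffic classification.
# Labels: +1 = normal traffic, -1 = abnormal traffic (illustrative convention).
import math

def stump_predict(x, feat, thresh, polarity):
    """Predict `polarity` if feature >= threshold, else the opposite."""
    return polarity if x[feat] >= thresh else -polarity

def train_adaboost(X, y, rounds=3):
    n = len(X)
    w = [1.0 / n] * n            # start with uniform sample weights
    ensemble = []
    for _ in range(rounds):
        best = None              # (weighted error, feat, thresh, polarity)
        for feat in range(len(X[0])):
            for thresh in sorted({x[feat] for x in X}):
                for pol in (1, -1):
                    err = sum(wi for xi, yi, wi in zip(X, y, w)
                              if stump_predict(xi, feat, thresh, pol) != yi)
                    if best is None or err < best[0]:
                        best = (err, feat, thresh, pol)
        err, feat, thresh, pol = best
        err = max(err, 1e-10)    # avoid log/divide-by-zero on perfect stumps
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, feat, thresh, pol))
        # Reweight: misclassified samples gain weight for the next round.
        w = [wi * math.exp(-alpha * yi * stump_predict(xi, feat, thresh, pol))
             for xi, yi, wi in zip(X, y, w)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def classify(ensemble, x):
    score = sum(a * stump_predict(x, f, t, p) for a, f, t, p in ensemble)
    return 1 if score >= 0 else -1
```

Because boosting uses only statistical features (packet sizes, timings) rather than payload content, it remains applicable to encrypted traffic, which is the motivation stated above.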

Publication Date
Sat Jan 30 2021
Journal Name
Iraqi Journal Of Science
Image Compression Based on Arithmetic Coding Algorithm

The past years have seen rapid development in image compression techniques, mainly due to the need for fast and efficient methods of storing and transmitting data. Compression is the process of representing data in a compact form rather than its original, uncompacted form. In this paper, an integer implementation of Arithmetic Coding (AC) together with the Discrete Cosine Transform (DCT) is applied to color images. The DCT is applied in the YCbCr color model. The transformed image is then quantized with the standard quantization tables for luminance and chrominance, the quantized coefficients are scanned in zigzag order, and the output is encoded using AC. The results showed a decent compression ratio…
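For illustration, the zigzag scan used between quantization and entropy coding can be sketched as follows. This is the generic JPEG-style ordering for an N×N block; the paper's exact implementation is not shown in the abstract.

```python
def zigzag(block):
    """Scan an N x N block along anti-diagonals, alternating direction,
    so low-frequency coefficients come first and runs of zeros cluster
    at the end (which helps the entropy coder)."""
    n = len(block)
    coords = [(i, j) for i in range(n) for j in range(n)]
    # Diagonal index s = i + j; odd diagonals run top-right -> bottom-left,
    # even diagonals run bottom-left -> top-right (JPEG convention).
    coords.sort(key=lambda p: (p[0] + p[1],
                               p[0] if (p[0] + p[1]) % 2 else p[1]))
    return [block[i][j] for i, j in coords]
```

After quantization, most high-frequency coefficients are zero, so the zigzag output ends in long zero runs that arithmetic coding compresses very efficiently.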

Publication Date
Thu Jan 01 2015
Journal Name
Iet Colloquium On Millimetre-wave And Terahertz Engineering & Technology 2015
Millimetre wave semiconductor based isolators and circulators

Publication Date
Fri Aug 23 2013
Journal Name
International Journal Of Computer Applications
Image Compression based on Quadtree and Polynomial

Publication Date
Thu Jun 04 2020
Journal Name
Journal Of Discrete Mathematical Sciences And Cryptography
User authentication system based specified brain waves

A security system can be defined as a method of providing a form of protection for any type of data. Most security systems perform a sequential process in order to achieve good protection. Authentication is one part of such a sequence; it is used to verify a user's permission to enter and use the system. Several kinds of methods are used, including knowledge-based and biometric features. The electroencephalograph (EEG) signal is one of the most widely used signals in the bioinformatics field. EEG has five major wave patterns: Delta, Theta, Alpha, Beta, and Gamma. Every wave has five features: amplitude, wavelength, period, speed, and frequency. The linear…
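The five wave patterns mentioned are conventionally distinguished by frequency; a small sketch follows. The band boundaries vary slightly across the literature, and the helper names are illustrative; period is simply the reciprocal of frequency.

```python
# Approximate EEG frequency bands in Hz (boundary conventions differ by source).
BANDS = [("Delta", 0.5, 4.0), ("Theta", 4.0, 8.0), ("Alpha", 8.0, 13.0),
         ("Beta", 13.0, 30.0), ("Gamma", 30.0, 100.0)]

def band_of(freq_hz):
    """Name the EEG band a frequency falls in, or None if out of range."""
    for name, lo, hi in BANDS:
        if lo <= freq_hz < hi:
            return name
    return None

def period_s(freq_hz):
    """Period in seconds: the reciprocal of frequency."""
    return 1.0 / freq_hz
```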

Publication Date
Fri Jul 18 2014
Journal Name
International Journal Of Computer Applications
3-Level Techniques Comparison based Image Recognition

Image recognition is one of the most important applications of information processing. In this paper, a comparison between 3-level transform techniques for image recognition is carried out using the discrete wavelet transform (DWT) and the stationary wavelet transform (SWT) in the combinations stationary-stationary-stationary (sss), stationary-stationary-wavelet (ssw), stationary-wavelet-stationary (sws), stationary-wavelet-wavelet (sww), wavelet-stationary-stationary (wss), wavelet-stationary-wavelet (wsw), wavelet-wavelet-stationary (wws), and wavelet-wavelet-wavelet (www). The techniques are compared according to the peak signal-to-noise ratio (PSNR), root mean square error (RMSE), compression ratio (CR), and the coding noise e(n) of each third…
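The comparison metrics RMSE and PSNR can be computed as in the following sketch, which operates on flattened pixel sequences and assumes an 8-bit peak value of 255 (the paper's exact formulation is not shown in the abstract):

```python
import math

def rmse(orig, recon):
    """Root mean square error between two equal-length pixel sequences."""
    n = len(orig)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(orig, recon)) / n)

def psnr(orig, recon, peak=255.0):
    """Peak signal-to-noise ratio in dB; infinite for identical inputs."""
    e = rmse(orig, recon)
    return float("inf") if e == 0 else 20 * math.log10(peak / e)
```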

Publication Date
Tue Aug 23 2022
Journal Name
Int. J. Nonlinear Anal. Appl.
Face mask detection based on algorithm YOLOv5s

Determining whether a face is wearing a mask from visual data such as video and still images has been a fascinating research topic in recent years due to the spread of the Corona pandemic, which has changed the features of the entire world and forced people to wear masks as a means of prevention. Intelligent development based on artificial intelligence and computers plays a very important role in pandemic safety, and recognizing faces and identifying people who do or do not wear a mask has been among the most prominent deep learning applications in this area. Using deep learning techniques and the YOLO ("You Only Look Once")…
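Detectors in the YOLO family score candidate bounding boxes by intersection over union (IoU); a minimal sketch of that computation follows (the box format and function name are illustrative, not YOLOv5's internal API):

```python
def iou(a, b):
    """Intersection over union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0
```

IoU drives both training (matching predictions to ground-truth mask/no-mask boxes) and inference (non-maximum suppression of overlapping detections).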

Publication Date
Mon Oct 02 2023
Journal Name
Journal Of Engineering
Skull Stripping Based on the Segmentation Models

Skull stripping is one of the initial procedures used to detect brain abnormalities. In an MRI image of the brain, this process involves distinguishing the tissue that makes up the brain from the tissue that does not. Even for experienced radiologists, separating the brain from the skull is a difficult task, and the accuracy of the results can vary considerably from one individual to the next. Skull stripping of brain magnetic resonance volumes has therefore become increasingly important, given the requirement for a dependable, accurate, and thorough method for processing brain datasets. Furthermore, skull stripping must be performed accurately for neuroimaging diagnostic systems, since neither non-brain tissues nor…

Publication Date
Sat Jan 01 2011
Journal Name
Iraqi Journal Of Science
A Cryptographic Technique Based on AVL Tree