VSM Based Models and Integration of Exact and Fuzzy Similarity For Improving Detection of External Textual Plagiarism

Publication Date: Mon Oct 28 2019
Journal Name: Iraqi Journal Of Science
Improved VSM Based Candidate Retrieval Model for Detecting External Textual Plagiarism

Plagiarism has grown rapidly alongside the explosive growth of the Internet, where a massive volume of information offered with effortless access makes plagiarism (the act of taking someone else's work, whether ideas or even words, and presenting it as one's own) easy to commit. To ensure originality, plagiarism detection has become essential in various areas, so that people who aim to plagiarize must instead invest considerable effort in producing work based on their own research.

In this paper, an approach is proposed for improving the detection of textual plagiarism through a model for candidate retrieval …

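The abstract above describes a VSM-based candidate retrieval step. As a rough, hedged illustration (not the paper's actual model), the sketch below ranks candidate source documents against a suspicious document by cosine similarity over TF-IDF vectors; the example corpus, the THRESHOLD value, and all names are invented for demonstration.

```python
# Minimal VSM candidate-retrieval sketch (illustrative; not the paper's exact model).
# Assumes scikit-learn is available; corpus, query and threshold are made-up examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

source_docs = [
    "Plagiarism detection compares a suspicious text against candidate sources.",
    "The vector space model represents documents as weighted term vectors.",
    "Fuzzy similarity tolerates small lexical variations between matched terms.",
]
suspicious_doc = "A suspicious document is compared with candidate source texts."

# Build TF-IDF vectors for the sources and the suspicious document together.
vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform(source_docs + [suspicious_doc])
sources, query = matrix[:-1], matrix[-1]

# Rank candidate sources by cosine similarity and keep those above a threshold.
scores = cosine_similarity(query, sources).ravel()
THRESHOLD = 0.1  # illustrative cut-off
candidates = [(source_docs[i], s) for i, s in enumerate(scores) if s >= THRESHOLD]
for doc, score in sorted(candidates, key=lambda x: -x[1]):
    print(f"{score:.3f}  {doc}")
```
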
Publication Date: Mon Oct 28 2019
Journal Name: Journal Of Mechanics Of Continua And Mathematical Sciences
Heuristic Initialization And Similarity Integration Based Model for Improving Extractive Multi-Document Summarization

Publication Date: Tue Feb 01 2022
Journal Name: Int. J. Nonlinear Anal. Appl.
Computer-based plagiarism detection techniques: A comparative study

Plagiarism is becoming more of a problem in academia. It is made worse by the ease with which a wide range of resources can be found on the Internet, as well as the ease with which they can be copied and pasted. It is academic theft, since the perpetrator has "taken" the work of others and presented it as his or her own. Manual detection of plagiarism by a human being is difficult, imprecise, and time-consuming, because it is hard for anyone to compare a piece of work against all existing material. Plagiarism is a major problem in higher education, and it can happen in any subject. Plagiarism detection has been studied in many scientific articles, and recognition methods have been developed using plagiarism analysis, authorship identification, and …

Publication Date: Fri Jun 30 2023
Journal Name: Iraqi Journal Of Science
Using Retrieved Sources for Semantic and Lexical Plagiarism Detection

Plagiarism is described as using someone else's ideas or work without their permission. Using lexical and semantic text-similarity notions, this paper presents a plagiarism detection system for examining suspicious texts against available sources on the Web. The user can upload suspicious files in PDF or DOCX format. The system searches three popular search engines (Google, Bing, and Yahoo) for the source text and tries to identify the top five results from each engine on the first retrieved page. The corpus is made up of the downloaded files and the scraped web-page text of the search engines' results. The corpus text and the suspicious document are then encoded as vectors. For lexical plagiarism detection, the system will …

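The abstract above outlines the retrieval-then-compare pipeline but is cut off before naming the exact lexical measure. As a plausible stand-in only, the sketch below scores a suspicious sentence against retrieved source text using word-trigram Jaccard similarity; the sample sentences, n-gram size, and helper names are assumptions, not the system's implementation.

```python
# Illustrative lexical-overlap check (the abstract is truncated before naming the exact
# measure, so word-trigram Jaccard similarity is used here purely as an example).
import re

def word_ngrams(text: str, n: int = 3) -> set:
    """Lower-case the text, split on non-letters, and return the set of word n-grams."""
    words = re.findall(r"[a-z]+", text.lower())
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity of two n-gram sets (0.0 when both are empty)."""
    return len(a & b) / len(a | b) if (a or b) else 0.0

suspicious_sentence = "Plagiarism is described as using someone else's ideas without permission."
retrieved_source = "Using someone else's ideas without permission is commonly described as plagiarism."

score = jaccard(word_ngrams(suspicious_sentence), word_ngrams(retrieved_source))
print(f"lexical overlap: {score:.2f}")  # flag the pair if the score exceeds a chosen threshold
```
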
Publication Date: Mon May 28 2018
Journal Name: Iraqi Journal Of Science
A Modified Similarity Measure for Improving Accuracy of User-Based Collaborative Filtering: Nadia Fadhil

Production sites suffer from idle marketing of their products because they lack efficient systems that analyze and track customers' evaluations of products; as a result, some products remain untargeted despite their good quality. This research aims to build a modest model that takes two aspects into consideration. The first aspect is identifying dependable users on the site, based on the number of products they have evaluated and the positive impact of their ratings. The second aspect is identifying products with low weights (unknown products) to be generated and recommended to users, based on a logarithmic equation and the number of co-rated users. Collaborative filtering is one of the most widely used knowledge discovery techniques …

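The abstract above outlines a user-based collaborative-filtering model with a modified similarity measure. The sketch below shows a generic version that weights cosine similarity over co-rated items by a log-scaled co-rating count; the weighting scheme, the tiny ratings matrix, and the function name are illustrative assumptions, not the paper's actual formula.

```python
# Illustrative user-based CF with a co-rating-aware similarity weight.
# The log-based weighting below is a generic stand-in, not the paper's exact measure.
import math
import numpy as np

# Tiny made-up user x item ratings matrix; 0 means "not rated".
ratings = np.array([
    [5, 3, 0, 1],
    [4, 0, 0, 1],
    [1, 1, 0, 5],
    [0, 1, 5, 4],
], dtype=float)

def weighted_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity over co-rated items, damped by the log of the co-rating count."""
    co_rated = (u > 0) & (v > 0)
    n_co = int(co_rated.sum())
    if n_co == 0:
        return 0.0
    a, b = u[co_rated], v[co_rated]
    cos = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return cos * (math.log1p(n_co) / math.log1p(len(u)))  # more co-ratings -> more trust

# Similarity of user 0 to every other user.
for other in range(1, ratings.shape[0]):
    print(f"sim(user0, user{other}) = {weighted_similarity(ratings[0], ratings[other]):.3f}")
```
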
Publication Date: Tue Aug 31 2021
Journal Name: Iraqi Journal Of Science
Plagiarism Detection Methods and Tools: An Overview

Plagiarism detection systems play an important role in revealing instances of plagiarism, especially in the educational sector with scientific documents and papers. Plagiarism occurs when any content is copied without permission from, or citation of, its author. To detect such activity, extensive knowledge of the forms and classes of plagiarism is needed. Thanks to the tools and methods that have been developed, it is possible to reveal many types of plagiarism. The development of Information and Communication Technologies (ICT) and the availability of scientific documents online have made these documents easy to access. With the availability of many text-editing programs, plagiarism detection becomes a critical …

Publication Date: Wed Aug 01 2012
Journal Name: International Journal Of Geographical Information Science
Assessing similarity matching for possible integration of feature classifications of geospatial data from official and informal sources

Publication Date: Fri Dec 30 2022
Journal Name: Iraqi Journal Of Science
An Improved Outlier Detection Model for Detecting Intrinsic Plagiarism

In the task of detecting intrinsic plagiarism, the cases to be dealt with are those in which a reference corpus is absent; the task relies entirely on inconsistencies within the given document. Detection of internal plagiarism is treated as a classification problem, which can be estimated by taking into consideration self-based information drawn from the document itself.

The core contribution of the work proposed in this paper concerns the document representation: the document, as well as the disjoint segments generated from it, is represented as a weight vector describing its main content, where each element carries its average weight rather than its raw frequency. …

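The abstract above represents the document and its disjoint segments as weight vectors built from average weights, and treats intrinsic plagiarism detection as finding segments inconsistent with the rest. The sketch below is a loose, assumed interpretation: segments and the whole document are encoded as TF-IDF vectors (the document vector is the per-term average of the segment weights), and segments far from the document profile are flagged; the sample segments and the outlier rule are illustrative only.

```python
# Rough sketch of segment-level outlier scoring for intrinsic plagiarism detection.
# The averaged TF-IDF weights and the simple cut-off rule are illustrative choices only.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer

document_segments = [
    "The experimental setup follows the procedure described in the previous section.",
    "Results were averaged over ten independent runs of the proposed method.",
    "Ye olde manuscript speaketh in a tongue most unlike the rest of this work.",
    "The evaluation metrics include precision, recall, and the F1 score.",
]

# Represent each segment, and the whole document, as TF-IDF weight vectors.
vectorizer = TfidfVectorizer(stop_words="english")
segment_vectors = vectorizer.fit_transform(document_segments).toarray()
document_vector = segment_vectors.mean(axis=0)  # average weight per term

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# Segments whose content deviates strongly from the document profile are suspects.
similarities = np.array([cosine(v, document_vector) for v in segment_vectors])
threshold = similarities.mean() - similarities.std()  # simple outlier rule, for illustration
for segment, sim in zip(document_segments, similarities):
    flag = "SUSPECT" if sim < threshold else "ok"
    print(f"{sim:.3f} {flag}: {segment[:60]}")
```
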
Publication Date: Mon Jul 01 2019
Journal Name: Arpn Journal Of Engineering And Applied Sciences
Pseudo Random Number Generator Based on Neuro-Fuzzy Models

Producing pseudo-random numbers (PRN) with high performance is one of the important issues attracting many researchers today. This paper suggests pseudo-random number generator models that integrate a Hopfield Neural Network (HNN) with a fuzzy logic system to improve the randomness of the Hopfield pseudo-random generator. The fuzzy logic system is introduced to control the update of the HNN parameters. The proposed model is compared with three state-of-the-art baselines. Analysis of the results using the National Institute of Standards and Technology (NIST) statistical test suite and the ENT test shows that the proposed model is statistically significant in comparison to the baselines, demonstrating the competency of the neuro-fuzzy based model to produce …

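The abstract above integrates a Hopfield Neural Network with a fuzzy logic system that controls the update of the HNN parameters. The toy sketch below only mimics that structure under stated assumptions (an asynchronous Hopfield-style update whose gain is adjusted by a tiny rule-based "fuzzy" controller, with output bits read from the neuron states); it is not the paper's generator and makes no claim about statistical randomness quality.

```python
# Toy sketch of "Hopfield-style state updates with a fuzzy-controlled parameter".
# Everything here (network size, gain rule, bit extraction) is an invented illustration,
# not the generator proposed in the paper.
import numpy as np

rng = np.random.default_rng(42)          # seed of the toy generator
N = 16                                   # number of Hopfield-style neurons
weights = rng.standard_normal((N, N))
weights = (weights + weights.T) / 2.0    # symmetric weights
np.fill_diagonal(weights, 0.0)           # no self-connections
state = rng.choice([-1.0, 1.0], size=N)  # bipolar initial state
gain = 1.0                               # update parameter adjusted by the "fuzzy" rule

def fuzzy_gain(bit_imbalance: float) -> float:
    """Very small fuzzy-like rule: raise the gain when recent output bits look unbalanced."""
    if bit_imbalance > 0.6:    # "unbalanced" -> increase mixing
        return 1.5
    if bit_imbalance > 0.55:   # "slightly unbalanced"
        return 1.2
    return 1.0                 # "balanced" -> keep the nominal gain

bits = []
for step in range(256):
    i = step % N                                   # asynchronous neuron update
    activation = gain * (weights[i] @ state) + rng.normal(scale=0.1)
    state[i] = 1.0 if activation >= 0 else -1.0
    bits.append(1 if state[i] > 0 else 0)
    if len(bits) % 32 == 0:                        # periodically re-tune the gain
        ones_ratio = sum(bits[-32:]) / 32.0
        gain = fuzzy_gain(max(ones_ratio, 1.0 - ones_ratio))

print("".join(map(str, bits[:64])))  # first 64 output bits of the toy stream
```
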
Publication Date: Mon Feb 01 2016
Journal Name: Swarm And Evolutionary Computation
Improving the performance of evolutionary multi-objective co-clustering models for community detection in complex social networks
