Plagiarism is a growing problem in academia. It is made worse by the ease with which a wide range of resources can be found on the internet, and by the ease with which they can be copied and pasted. It is academic theft, since the perpetrator has "taken" and presented the work of others as his or her own. Manual detection of plagiarism by a human being is difficult, imprecise, and time-consuming, because it is hard for anyone to compare a document against all existing material. Plagiarism is a major problem in higher education, and it can occur in any discipline. Plagiarism detection has been studied in many scientific articles, and recognition methods have been developed using the Plagiarism analysis, Authorship identification, and Near-duplicate detection (PAN) datasets of 2009-2011. According to the researchers, verbatim plagiarism is simply copying and pasting. They then moved on to smart plagiarism, which is more challenging to spot, since it may involve rewording the text, taking ideas from other scholars, or translating them into another language, which is more difficult to trace. Other studies have found that plagiarism can obscure the scientific content of publications by swapping words, removing or adding material, or reordering or restructuring the original articles. This article presents a comparative study of plagiarism detection techniques.
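To make the verbatim/smart distinction concrete, the sketch below illustrates verbatim (copy-and-paste) detection using character-shingle Jaccard similarity. This is a generic baseline, not a method taken from the surveyed articles; the example strings, shingle size, and interpretation threshold are illustrative assumptions.

```python
# Minimal sketch of verbatim plagiarism detection via shingle overlap.

def shingles(text: str, k: int = 5) -> set:
    """Return the set of k-character shingles of a whitespace-normalized text."""
    text = " ".join(text.lower().split())
    return {text[i:i + k] for i in range(len(text) - k + 1)}

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity of two shingle sets (0.0 when both are empty)."""
    return len(a & b) / len(a | b) if a | b else 0.0

# Placeholder texts standing in for a suspicious passage and a candidate source.
suspicious = "Plagiarism is academic theft: presenting others' work as one's own."
source = "Plagiarism is academic theft, presenting the work of others as one's own."

score = jaccard(shingles(suspicious), shingles(source))
print(f"similarity = {score:.2f}")  # values near 1.0 suggest verbatim copying
```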
Plagiarism detection systems play an important role in revealing instances of plagiarism, especially in the educational sector with scientific documents and papers. Plagiarism occurs when any content is copied without permission or citation of the author. To detect such activity, extensive knowledge of plagiarism forms and classes is required. Thanks to the tools and methods developed so far, many types of plagiarism can be revealed. The development of Information and Communication Technologies (ICT) and the availability of online scientific documents have made these documents easy to access. With the availability of many software text editors, plagiarism detection becomes a critical task.
Face detection is one of the important applications of biometric technology and image processing. Convolutional neural networks (CNNs) have been used with great success in image processing as well as pattern recognition. In recent years, deep learning techniques, specifically CNNs, have achieved remarkable accuracy rates in face detection. Therefore, this study provides a comprehensive analysis of face detection research and applications that use various CNN methods and algorithms. The paper presents ten of the most recent studies and illustrates the performance achieved by each method.
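For readers unfamiliar with the basic architecture these studies build on, the following is a minimal face/non-face patch classifier in PyTorch. It is a generic sketch, not a reproduction of any of the ten reviewed models; the patch size, layer widths, and two-class formulation are arbitrary assumptions chosen for illustration.

```python
import torch
import torch.nn as nn

class TinyFaceCNN(nn.Module):
    """Binary face / non-face classifier over fixed-size 64x64 RGB patches."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 64 -> 32
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 32 -> 16
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 16 -> 8
        )
        self.classifier = nn.Linear(64 * 8 * 8, 2)  # logits: [non-face, face]

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = TinyFaceCNN()
patch = torch.randn(1, 3, 64, 64)           # one dummy RGB patch
probs = torch.softmax(model(patch), dim=1)  # face probability is probs[0, 1]
print(probs)
```

In a full detector, a classifier like this is typically slid over image windows at multiple scales, or replaced by a region-proposal architecture; the reviewed studies vary in exactly this design choice.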
Plagiarism is described as using someone else's ideas or work without their permission. Using lexical and semantic text-similarity notions, this paper presents a plagiarism detection system for examining suspicious texts against available sources on the Web. The user can upload suspicious files in PDF or DOCX format. The system searches three popular search engines (Google, Bing, and Yahoo) for the source text and tries to identify the top five results for each search engine on the first retrieved page. The corpus is made up of the downloaded files and the scraped web-page text of the search-engine results. The corpus text and the suspicious documents are then encoded as vectors. For lexical plagiarism detection, the system compares the vector of each suspicious document against the corpus vectors.
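One common way to realize this lexical comparison, sketched below for illustration, is TF-IDF vectors scored with cosine similarity; the paper's exact encoding is not specified here, and the corpus contents are placeholders.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Placeholder corpus standing in for the downloaded search-engine results.
corpus = [
    "text of the first downloaded source document ...",
    "text of the second downloaded source document ...",
]
suspicious = "text extracted from the uploaded PDF or DOCX file ..."

vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform(corpus + [suspicious])  # last row = suspicious
scores = cosine_similarity(matrix[-1], matrix[:-1])[0]    # one score per source
for i, s in enumerate(scores):
    print(f"source {i}: lexical similarity {s:.2f}")
```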
In the task of intrinsic plagiarism detection, the cases where a reference corpus is absent must be dealt with. This task is based entirely on inconsistencies within a given document. Intrinsic plagiarism detection has been treated as a classification problem, estimated by taking into consideration self-based information from the given document.
The core contribution of the work proposed in this paper concerns document representation. The document, as well as the disjoint segments generated from it, is represented as weight vectors describing its main content, where for each element of these vectors the average weight is used instead of the raw frequency.
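One plausible reading of this representation, sketched below purely for illustration: each segment vector holds each term's count divided by the segment length (its average weight per token rather than its raw frequency), and segments whose vectors diverge from the whole-document vector are flagged as stylistic outliers. The segment length and threshold are assumptions, not values from the paper.

```python
import numpy as np
from collections import Counter

def weight_vector(tokens, vocab):
    """Average-weight representation: each term's count divided by the
    segment length, i.e. its average contribution per token."""
    counts = Counter(tokens)
    n = max(len(tokens), 1)
    return np.array([counts[t] / n for t in vocab])

def cosine(u, v):
    d = np.linalg.norm(u) * np.linalg.norm(v)
    return float(u @ v / d) if d else 0.0

document = "full text of the suspicious document ..."  # placeholder
tokens = document.lower().split()
vocab = sorted(set(tokens))
doc_vec = weight_vector(tokens, vocab)

# Score each disjoint fixed-length segment against the whole document.
seg_len = 200  # assumed segment size
for i in range(0, len(tokens), seg_len):
    seg = tokens[i:i + seg_len]
    sim = cosine(weight_vector(seg, vocab), doc_vec)
    if sim < 0.5:  # illustrative threshold: a style outlier, possibly plagiarized
        print(f"segment at token {i}: similarity {sim:.2f}")
```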
Most Internet-tomography problems, such as shared congestion detection, depend on network measurements. Usually, such measurements are carried out at multiple locations inside the network and rely on local clocks. These clocks usually skew over time, leaving the measurements unsynchronized and thereby degrading the performance of most techniques. Recently, shared congestion detection has become an important issue in many networked applications such as multimedia streaming and
peer-to-peer file sharing. One of the most powerful techniques employed in the literature is based on the Discrete Wavelet Transform (DWT) combined with a cross-correlation operation to determine the state of the congestion. The wavelet transform is used as a de-noising tool.
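A rough sketch of that pipeline, assuming the PyWavelets library and synthetic delay measurements in place of real probes: the detail coefficients are soft-thresholded for de-noising, and the peak of the normalized cross-correlation between the two reconstructed signals indicates whether the flows share a bottleneck. The wavelet, level, and threshold rule are common defaults, not parameters from the literature being surveyed.

```python
import numpy as np
import pywt  # PyWavelets

def dwt_denoise(signal, wavelet="db4", level=3):
    """Soft-threshold the DWT detail coefficients to suppress measurement noise."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745     # noise level estimate
    thresh = sigma * np.sqrt(2 * np.log(len(signal)))  # universal threshold
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[:len(signal)]

# Two one-way-delay sequences from two flows (synthetic stand-ins here).
rng = np.random.default_rng(0)
common = np.sin(np.linspace(0, 20, 1024))  # shared congestion signature
a = common + 0.3 * rng.standard_normal(1024)
b = common + 0.3 * rng.standard_normal(1024)

da, db = dwt_denoise(a), dwt_denoise(b)
da = (da - da.mean()) / da.std()
db = (db - db.mean()) / db.std()
xcorr = np.correlate(da, db, mode="full") / len(da)
print(f"peak cross-correlation = {xcorr.max():.2f}")  # high peak -> likely shared
```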
Software-Defined Networking (commonly referred to as SDN) is a newer paradigm that develops the concept of a software-driven network by separating the data and control planes, and it can address many traditional network problems. However, this architecture is exposed to various security threats. One of these is the distributed denial-of-service (DDoS) attack, which is difficult to contain in this kind of software-based network. Several security solutions have been proposed recently to secure SDN against DDoS attacks. This paper analyzes and discusses machine-learning-based systems for protecting SDN networks from DDoS attacks. The results indicate that machine-learning algorithms can be used to detect DDoS attacks.
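As an illustration of the kind of classifier such systems use (not any specific system from the survey), the sketch below trains a random forest on synthetic flow statistics. The feature set (packet rate, byte rate, flow duration, source-IP entropy) and all data distributions are assumptions chosen for the example.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Synthetic flow-statistics dataset: each row is a flow described by
# [packet rate, byte rate, duration, source-IP entropy] (all made up).
rng = np.random.default_rng(1)
X_benign = rng.normal([100, 5e4, 10, 3.0], [30, 1e4, 4, 0.5], size=(500, 4))
X_ddos = rng.normal([900, 4e5, 1, 0.5], [200, 8e4, 0.5, 0.2], size=(500, 4))
X = np.vstack([X_benign, X_ddos])
y = np.array([0] * 500 + [1] * 500)  # 0 = benign, 1 = DDoS

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te), target_names=["benign", "ddos"]))
```

In an SDN deployment, the controller would extract such features from OpenFlow flow-table statistics and invoke the trained model on each reporting interval; the reviewed systems differ mainly in feature choice and algorithm.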
With the heavy use of computers and networks today, the number of security threats has increased. The study of intrusion detection systems (IDS) has received much attention throughout the computer-science field. The main objective of this study is to examine the existing literature on various approaches to intrusion detection. This paper presents an overview of different intrusion detection systems and a detailed analysis of multiple techniques used in these systems, including their advantages and disadvantages. These techniques include artificial neural networks, bio-inspired computing, evolutionary techniques, machine learning, and pattern recognition.
Digital tampering identification, which detects picture modification, is a significant area of image-analysis research. Over the last five years it has reached exceptional precision through machine-learning and deep-learning-based strategies. Synthesis- and reinforcement-based learning techniques must now evolve to keep pace with the research. However, before doing any experimentation, a scientist must first understand the current state of the art in the domain. Diverse paths, associated outcomes, and analysis lay the groundwork for successful experimentation and superior results. Before starting experiments, universal image-forensics approaches must be thoroughly researched. As a result, this review of various universal image-forensics approaches is presented.
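One concrete example of a universal forensics technique, sketched below with Pillow and not drawn from the reviewed papers, is error level analysis (ELA): the image is recompressed as JPEG and the amplified difference highlights regions whose compression history differs from the rest of the picture. The file names and quality setting are placeholders.

```python
from PIL import Image, ImageChops

def error_level_analysis(path, quality=90):
    """Recompress the image as JPEG and return the amplified difference image;
    regions edited after the original compression tend to stand out."""
    original = Image.open(path).convert("RGB")
    original.save("_ela_tmp.jpg", "JPEG", quality=quality)
    recompressed = Image.open("_ela_tmp.jpg")
    diff = ImageChops.difference(original, recompressed)
    max_diff = max(hi for _, hi in diff.getextrema()) or 1
    return diff.point(lambda p: min(255, p * 255 // max_diff))  # stretch contrast

# Hypothetical usage:
# error_level_analysis("suspect.jpg").save("suspect_ela.png")
```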
The objective of this work is to combine human biometric characteristics with unique attributes of the computer in order to protect computer networks and resource environments through the development of authentication and authorization techniques. On the human-biometric side, the best methods and algorithms have been studied, with the conclusion that the fingerprint is the best, although it has some flaws. The fingerprint algorithm has been improved so that its performance can be adapted to enhance the clarity of the ridge and valley structures of fingerprint images, taking into account the estimation of the direction and frequency of the nearby ridges. On the side of computer features, the computer and its components, like humans, have unique attributes.
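The ridge-direction estimation mentioned above can be illustrated with the standard gradient-structure-tensor computation of local ridge orientation. This is a generic sketch of that one step, not the authors' improved algorithm; the block size and the synthetic test pattern are assumptions.

```python
import cv2
import numpy as np

def ridge_orientation(gray, block=16):
    """Estimate the dominant local ridge direction per block via the gradient
    structure tensor (a standard fingerprint-enhancement step)."""
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)
    h, w = gray.shape
    theta = np.zeros((h // block, w // block))
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            bx = gx[i:i + block, j:j + block]
            by = gy[i:i + block, j:j + block]
            # Dominant gradient orientation of the block, rotated 90 degrees
            # so the angle runs along the ridges rather than across them.
            num = 2 * (bx * by).sum()
            den = (bx * bx - by * by).sum()
            theta[i // block, j // block] = 0.5 * np.arctan2(num, den) + np.pi / 2
    return theta  # radians, one angle per block

# Synthetic 45-degree stripe pattern standing in for a real fingerprint image.
yy, xx = np.mgrid[0:128, 0:128]
gray = ((np.sin((xx + yy) * 0.5) + 1) * 127).astype(np.uint8)
angles = ridge_orientation(gray)
print(np.degrees(angles[0, 0]))  # expected near 135: ridges run along 135 degrees
```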