Plagiarism is a growing problem in academia. It is made worse by the ease with which a wide range of resources can be found on the internet, and by the ease with which they can be copied and pasted. It constitutes academic theft, since the perpetrator has "taken" the work of others and presented it as his or her own. Manual detection of plagiarism is difficult, imprecise, and time-consuming, because it is impractical for a human to compare a document against all existing material. Plagiarism is a serious problem in higher education, and it can occur in any discipline. Plagiarism detection has been studied in many scientific articles, and recognition methods have been developed using the Plagiarism analysis, Authorship identification, and Near-duplicate detection (PAN) datasets of 2009-2011. According to the researchers, verbatim plagiarism is simply copying and pasting. They then examined intelligent plagiarism, which is more challenging to detect, since it may involve text alteration, the appropriation of other scholars' ideas, and translation into another language, which is harder to handle. Other studies have found that plagiarism can obscure the scientific content of publications by swapping words, removing or adding material, or reordering or altering the original articles. This article presents a comparative study of plagiarism detection techniques.
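The verbatim (copy-and-paste) case mentioned above is the simplest to detect automatically. A minimal sketch, assuming a word n-gram overlap approach with Jaccard similarity (the function names and the choice of trigrams are illustrative, not taken from the surveyed techniques):

```python
# Hedged sketch: verbatim-plagiarism scoring via word n-gram overlap.
# Trigram size and threshold choices here are illustrative assumptions.

def ngrams(text, n=3):
    """Return the set of word n-grams in a text (case-insensitive)."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard_similarity(doc_a, doc_b, n=3):
    """Jaccard similarity of the two documents' n-gram sets, in [0, 1]."""
    a, b = ngrams(doc_a, n), ngrams(doc_b, n)
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

source = "the quick brown fox jumps over the lazy dog"
suspect = "the quick brown fox jumps over a sleeping dog"
score = jaccard_similarity(source, suspect)
print(f"similarity: {score:.2f}")
```

A high score flags likely verbatim copying; the "smart" plagiarism forms described above (paraphrase, idea theft, translation) defeat this kind of surface matching and need semantic methods.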
The research is concerned with studying the characteristics of Sustainable Architecture and Green Architecture, as a general research methodology related to the specific field of architecture, based on the differentiation between two generic concepts, Sustainability and Greening, which forms the framework of the research's specific methodology, since the two concepts appear to overlap considerably for research centers, individuals, and relevant organizations. In this regard, the research tends toward examining their characteristics in order to clearly differentiate between the two terms, particularly in architecture, where the research seeks to understand sustainable and green architectures, how close or how far apart they are, and the
This paper presents the intricate issues and strategies related to the translation of children's books, focusing in particular on a comparative analysis of "The Tale of Peter Rabbit" by Beatrix Potter and "Le Petit Prince" (The Little Prince) by Antoine de Saint-Exupéry. The study finds that the typical problems in translation are idiomatic expressions, cultural references, and voice preservation, alongside the specific challenges that each text faces. The translator of Potter's work must be able to transpose the culturally specific features of the UK setting for an international audience while keeping the text accessible. By contrast, translating "Le Petit Prince" is a process of capturing the abstra
The research is concerned with studying approaches to the development of industrial products and designs: progressive development (the typical pattern) and radical development (leap design). The aim of the research was to determine the effectiveness of the typical and leap patterns in the development of industrial designs and products. After analyzing a research sample and two models of contemporary household electrical appliances, a set of findings and conclusions was reached, including: 1- Leaping designs changed many of the user's entrenched perceptions of how the product works and is used, and of its size and shape, revealing to the user the possibilities of sophisticated relationships with the product, while keeping the typical desi
The textbook is the primary means of fostering creativity and thinking, and it plays a major role in developing the student's reading and mental abilities. It is the basic tool of education in Iraq for both teacher and student, and it cannot be dispensed with in any educational program. The current study examined the sixth-grade science biology textbook in Iraq (a comparative study). It was compared with the twelfth-grade biology textbook of the Kingdom of Jordan to identify the degree of similarity and difference between them, and additionally to identify the weaknesses in the Iraqi curriculum and to develop appropriate solutions and suggestions to address them. The sample consisted of the biology textbooks (six-science cla
Image quality plays a vital role in improving and assessing image compression performance. Image compression maps large image data to a new image of smaller size suitable for storage and transmission. This paper aims to evaluate the implementation of hybrid techniques based on the tensor product mixed transform. Compression and quality metrics such as compression ratio (CR), rate distortion (RD), peak signal-to-noise ratio (PSNR), and structural content (SC) are used to evaluate the hybrid techniques. A comparison between the techniques is then carried out according to these metrics to determine the best one. The main contribution is the improvement of the hybrid techniques. The proposed hybrid techniques consist of discrete wavel
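Two of the metrics named above have standard textbook definitions that can be sketched directly. A minimal illustration, assuming 8-bit grayscale images (the array sizes and values are illustrative, not the paper's actual test data):

```python
# Hedged sketch of two of the metrics the abstract names: PSNR and CR.
# Test arrays are synthetic; they are not the paper's images.
import numpy as np

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio in dB: 10*log10(peak^2 / MSE)."""
    mse = np.mean((original.astype(np.float64)
                   - reconstructed.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(peak ** 2 / mse)

def compression_ratio(original_bytes, compressed_bytes):
    """CR = uncompressed size / compressed size."""
    return original_bytes / compressed_bytes

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64))
noisy = np.clip(img + rng.integers(-2, 3, size=img.shape), 0, 255)
print(f"PSNR: {psnr(img, noisy):.1f} dB, CR: {compression_ratio(4096, 1024):.1f}")
```

Higher PSNR and higher CR are both desirable, but they trade off against each other; the rate-distortion (RD) comparison in the paper captures exactly that trade-off.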
Face identification has been an active research area in recent years. However, its accuracy and its dependability in real-life systems are still questionable. Earlier research in face identification demonstrated that LBP-based face recognition systems are preferred over others and give adequate accuracy. LBP is robust against illumination changes and is considered a high-speed algorithm. Performance metrics for such systems are calculated from time delay and accuracy. This paper introduces an improved face recognition system built in C++ with the help of the OpenCV library. Accuracy can be increased if a filter or a combination of filters is applied to the images. The accuracy increases from 95.5% (without ap
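The LBP operator the abstract refers to assigns each pixel an 8-bit code by thresholding its 3x3 neighbourhood against the center value; histograms of these codes then feed the recognizer. A minimal sketch of the basic operator, in plain NumPy rather than the paper's C++/OpenCV setup:

```python
# Hedged sketch of the basic 3x3 Local Binary Pattern (LBP) operator.
# Pure NumPy; the paper's own system uses C++ and OpenCV.
import numpy as np

def lbp_3x3(gray):
    """Return the 8-bit LBP code image for the interior pixels."""
    center = gray[1:-1, 1:-1]
    # 8 neighbours, ordered clockwise from top-left; each contributes one bit
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(center, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = gray[1 + dy: gray.shape[0] - 1 + dy,
                         1 + dx: gray.shape[1] - 1 + dx]
        codes |= ((neighbour >= center).astype(np.uint8) << bit)
    return codes

img = np.array([[10, 20, 30],
                [40, 25, 10],
                [ 5, 60, 70]], dtype=np.uint8)
print(lbp_3x3(img))  # one interior pixel -> a single 8-bit code
```

Because each code depends only on the sign of local intensity differences, a uniform brightness change leaves the codes unchanged, which is the illumination robustness the abstract mentions.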
An oil spill is a leakage from pipelines, vessels, oil rigs, or tankers that releases petroleum products into the marine environment or onto land; it may occur naturally or as a result of human action, and it causes severe damage and financial loss. Satellite imagery is one of the powerful tools currently used to capture vital information from the Earth's surface. However, the complexity and the vast amount of the data make it challenging and time-consuming for humans to process. With the advancement of deep learning techniques, the extraction of vital information from real-time satellite images can now be automated. This paper applied three deep-learning algorithms for satellite image classification
Image fusion gathers important data from an array of input images and places it in a single output image, making the result more meaningful and usable than any of the input images alone. Image fusion boosts both the quality and the applicability of data. The accuracy of the fused image depends on the application. It is widely used in smart robotics, audio-camera fusion, photonics, system control and output, construction and inspection of electronic circuits, complex computer and software diagnostics, and smart line-assembly robots. This paper provides a literature review of different image fusion techniques in the spatial domain and the frequency domain, such as averaging, min-max, block substitution, Intensity-Hue-Saturation (IH
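Two of the spatial-domain rules the review names, averaging and maximum selection, can be sketched in a few lines. This assumes the inputs are already registered, same-size grayscale arrays (the sample values are illustrative):

```python
# Hedged sketch of two spatial-domain fusion rules from the review:
# pixel averaging and per-pixel maximum selection.
# Inputs are assumed registered, same-size 8-bit grayscale arrays.
import numpy as np

def fuse_average(img_a, img_b):
    """Average rule: each output pixel is the mean of the two inputs."""
    return ((img_a.astype(np.float64) + img_b.astype(np.float64)) / 2).astype(np.uint8)

def fuse_max(img_a, img_b):
    """Max rule: keep the brighter pixel from either input."""
    return np.maximum(img_a, img_b)

a = np.array([[100, 50], [0, 255]], dtype=np.uint8)
b = np.array([[200, 50], [40, 1]], dtype=np.uint8)
print(fuse_average(a, b))  # [[150  50] [ 20 128]]
print(fuse_max(a, b))      # [[200  50] [ 40 255]]
```

Averaging suppresses noise but can blur detail, while maximum selection preserves the locally brightest (often best-focused) content; this trade-off is what motivates the frequency-domain methods the review goes on to cover.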
Computer modeling has been used to investigate the Coulomb coupling parameter Γ. The effects of the structure parameter K, grain charge Z, plasma density N, and dust grain temperature Td on the Coulomb coupling parameter were studied. It was found that Γ increases with increasing Z and N, and decreases with increasing K and Td. The critical value of Γ at which the dusty plasma undergoes a phase transition from the liquid to the solid state was also studied.
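The trends reported above follow directly from the textbook definition of the coupling parameter, Γ = (Ze)² / (4πε₀ a k_B T_d) with a = (3/4πN)^(1/3) the interparticle (Wigner-Seitz) distance, multiplied by a Yukawa screening factor exp(-K) for the structure parameter K = a/λ_D. A minimal sketch under that standard definition (not necessarily the exact model of the paper; the sample values are illustrative):

```python
# Hedged sketch: standard screened Coulomb coupling parameter for a
# dusty plasma. Formula is the textbook definition, not necessarily
# the paper's exact model; sample inputs are illustrative only.
import math

E = 1.602176634e-19      # elementary charge, C
EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m
KB = 1.380649e-23        # Boltzmann constant, J/K

def coupling_parameter(Z, N, Td, K=0.0):
    """Gamma for grain charge Z (units of e), density N (m^-3),
    dust temperature Td (K), and screening parameter K = a/lambda_D."""
    a = (3.0 / (4.0 * math.pi * N)) ** (1.0 / 3.0)  # Wigner-Seitz radius, m
    gamma = (Z * E) ** 2 / (4.0 * math.pi * EPS0 * a * KB * Td)
    return gamma * math.exp(-K)

# Illustrative values only: Z = 1e4 charges, N = 1e10 m^-3, Td = 300 K
print(f"Gamma = {coupling_parameter(1e4, 1e10, 300):.3g}")
```

With this form the reported dependencies are immediate: Γ grows with Z (quadratically) and with N (through the shrinking interparticle distance), and falls with Td and with K, matching the abstract's findings.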