Plagiarism is becoming an increasing problem in academia. It is made worse by the ease with which a wide range of resources can be found on the internet, and by the ease with which they can be copied and pasted. It is academic theft, since the perpetrator has "taken" and presented the work of others as his or her own. Manual detection of plagiarism by a human being is difficult, imprecise, and time-consuming, because it is hard for anyone to compare a piece of work against all existing material. Plagiarism is a serious problem in higher education, and it can occur in any discipline. Plagiarism detection has been studied in many scientific articles, and recognition methods have been developed using the Plagiarism analysis, Authorship identification, and Near-duplicate detection (PAN) dataset 2009-2011. According to the researchers, verbatim plagiarism is simply copying and pasting. They then moved on to intelligent plagiarism, which is more challenging to spot, since it may involve text alteration, taking ideas from other scholars, and translation into another language, which is more difficult to handle. Other studies have found that plagiarism can obscure the scientific content of publications by swapping words, removing or adding material, or reordering or changing the original articles. This article presents a comparative study of plagiarism detection techniques.
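As a minimal illustration of the verbatim (copy-and-paste) case, detection can be sketched as word n-gram overlap between two documents. The function names, the trigram size, and the sample sentences below are illustrative assumptions, not taken from any specific system surveyed in the article.

```python
# Minimal sketch of verbatim-plagiarism detection via word n-gram overlap.

def ngrams(text, n=3):
    """Return the set of word n-grams in a lowercased text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard_similarity(doc_a, doc_b, n=3):
    """Jaccard similarity of the two documents' n-gram sets."""
    a, b = ngrams(doc_a, n), ngrams(doc_b, n)
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

source = "plagiarism is simply copying and pasting the work of others"
copy   = "plagiarism is simply copying and pasting someone else's work"
print(round(jaccard_similarity(source, copy), 2))  # -> 0.36
```

A high score flags likely verbatim copying; the smarter forms of plagiarism discussed above (paraphrase, translation) defeat this simple measure, which is why the surveyed techniques go further.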
There has been growing interest in recent years in the use of chaotic techniques for enabling secure communication. This need has been motivated by the emergence of a number of wireless services which require the channel to provide very low bit error rates (BER) along with information security. This paper investigates the feasibility of chaotic communications over Multiple-Input Multiple-Output (MIMO) channels by combining chaos modulation with a suitable Space Time Block Code (STBC). It is well known that the use of chaotic modulation techniques can enhance communication security; however, the BER performance of systems using chaos modulation has been observed to be inferior to that of conventional communication systems.
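Chaos-based modulation schemes typically derive their carrier or spreading sequences from a chaotic map. A common textbook example is the logistic map; the parameter values and function name below are illustrative assumptions, since the abstract does not specify the paper's actual chaotic modulator.

```python
# Sketch of the logistic map, a standard source of chaotic sequences.
# The sensitivity of the trajectory to its seed x0 is what makes a
# chaos-modulated waveform hard to intercept without knowing x0.

def logistic_map(x0, r=3.99, n=10):
    """Iterate x -> r*x*(1-x) n times and return the trajectory."""
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        xs.append(x)
    return xs

print(logistic_map(0.4, n=3))
```

Two seeds that differ only in a late decimal place produce trajectories that diverge after a few iterations, which is the property such schemes exploit for security.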
Image compression is a serious issue in computer storage and transmission; it makes efficient use of the redundancy embedded within an image itself and, in addition, may exploit the limitations of human vision or perception to discard imperceptible information. Polynomial coding is a modern image compression technique based on a modelling concept that effectively removes the spatial redundancy embedded within the image; it is composed of two parts, the mathematical model and the residual. In this paper, a two-stage technique is proposed: the first stage utilizes a lossy predictor model along with a multiresolution base and thresholding techniques, and the second incorporates near-lossless compression
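The model-plus-residual idea behind polynomial coding can be sketched as fitting a low-order polynomial surface to an image block and keeping only the coefficients and the residual. The first-order (plane) model, the block size, and the function name below are illustrative assumptions, not the paper's exact predictor.

```python
import numpy as np

# Fit a plane a0 + a1*x + a2*y to an image block by least squares;
# the block is then represented by 3 coefficients plus a residual.

def plane_model_and_residual(block):
    """Return (coefficients, residual) of a first-order fit to the block."""
    h, w = block.shape
    ys, xs = np.mgrid[0:h, 0:w]
    A = np.column_stack([np.ones(h * w), xs.ravel(), ys.ravel()])
    coef, *_ = np.linalg.lstsq(A, block.ravel().astype(float), rcond=None)
    model = (A @ coef).reshape(h, w)
    return coef, block - model   # residual = original - model
```

Smooth regions produce small residuals that compress well, which is where the spatial-redundancy savings come from; the thresholding stage mentioned above would then quantize or discard small residual values.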
In this research, fuzzy nonparametric methods based on smoothing techniques were applied to real data from the Iraqi stock market, specifically data on the Baghdad company for soft drinks for the period 1/1/2016-31/12/2016. A sample of 148 observations was obtained in order to construct a model of the relationship between the stock prices (low, high, modal) and the traded value. Comparing the results of the goodness-of-fit (G.O.F.) criterion for the three techniques, the lowest value of this criterion was obtained for the K-nearest neighbor with the Gaussian function.
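The best-performing combination reported above, a k-nearest-neighbour smoother with a Gaussian kernel, can be sketched as follows. The function name, the adaptive-bandwidth choice, and the default k are illustrative assumptions; the study's actual stock-price data are not reproduced here.

```python
import numpy as np

# Nonparametric smoothing: estimate y at a query point x0 as the
# Gaussian-kernel weighted mean of the k nearest training points.

def knn_gaussian_smoother(x_train, y_train, x0, k=5):
    """KNN estimate at x0, with the k-th neighbour distance as bandwidth."""
    d = np.abs(np.asarray(x_train, float) - x0)
    idx = np.argsort(d)[:k]              # indices of the k nearest points
    h = d[idx].max() or 1.0              # guard against zero bandwidth
    w = np.exp(-0.5 * (d[idx] / h) ** 2)
    y = np.asarray(y_train, float)
    return float(np.sum(w * y[idx]) / np.sum(w))
```

The goodness-of-fit comparison in the study would then be run by evaluating this estimator (and the competing smoothers) over the 148 observations.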
As computers become part of our everyday life, more and more people are experiencing a
variety of ocular symptoms related to computer use. These include eyestrain, tired eyes, irritation,
redness, blurred vision, and double vision, collectively referred to as computer vision syndrome.
CVS also affects the body, causing symptoms such as back and shoulder pain, wrist problems, and neck pain.
Many risk factors are identified in this paper.
Primary prevention strategies have largely been confined to addressing environmental
exposure to ergonomic risk factors, since to date, no clear cause for this work-related neck pain
has been acknowledged. Today, millions of children use computers on a daily basis. Extensive
viewing of the computer screen
This paper aims to develop a technique for helping disabled and elderly people with physical disabilities, such as those who are unable to move their hands and cannot speak. Computer vision, real-time video, and human-computer interaction are combined to provide a promising solution to assist disabled people. The main objective of the work is to design a wheelchair with a two-wheel drive. The project is based on real-time video for detecting and tracking a human face. The proposed design supports multiple speeds based on the pulse width modulation (PWM) technique, and responds quickly to detect and track the face direction with four movement operations (left, right, forward, and stop). These operations
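As a rough sketch of how the four movement operations could map to PWM duty cycles on a two-wheel drive, assuming differential steering. The duty-cycle values, speed names, and function name are illustrative assumptions, not taken from the paper's firmware.

```python
# Map a detected face direction to (left_duty, right_duty) PWM commands
# for a two-wheel-drive chair. Turning is done by driving one wheel only.

DUTY = {"slow": 40, "medium": 70, "fast": 100}   # duty cycle in percent

def wheel_command(direction, speed="medium"):
    """Return (left_duty, right_duty) for the four supported movements."""
    d = DUTY[speed]
    if direction == "forward":
        return (d, d)          # both wheels driven equally
    if direction == "left":
        return (0, d)          # right wheel only -> turn left
    if direction == "right":
        return (d, 0)          # left wheel only -> turn right
    if direction == "stop":
        return (0, 0)
    raise ValueError(f"unsupported direction: {direction}")
```

The multi-speed behaviour described in the abstract corresponds to switching the duty-cycle table entry, since motor speed scales with the PWM duty cycle.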
The aim of this work is to develop an axisymmetric, two-dimensional model based on a coupled simplified computational fluid dynamics (CFD) and Lagrangian method to predict the air flow patterns and the drying of particles. This predictive tool can then be used to design more efficient spray dryers. The approach is to model what particles experience in the drying chamber with respect to air temperature and humidity. These histories can be obtained by combining the particle trajectories with the air temperature/humidity pattern in the spray dryer. Results are presented and discussed in terms of the air velocity, temperature, and humidity profiles within the chambers, and compared for the drying of a 42.5% solids solution in a spray chamber.
Background: The risk of antibiotic resistance (AR) increases due to the excessive use of antibiotics, either by health care providers or by patients.
Objective: To assess the practice of self-medication with over-the-counter drugs and other prescription drugs and its associated risk factors.
Subjects and Methods: Study design: A descriptive study was conducted from 20th December 2019 to 8th January 2021. A pre-validated, structured questionnaire in both English and Urdu (to avoid a language barrier) was created, covering personal details, reasons, sources, and knowledge about over-the-counter drugs and antibiotics. The study sample was randomly selected.
The Hubble telescope is characterized by the accuracy of the image formed in it, because its surrounding environment is free of optical pollutants such as atmospheric gases and dust, as well as the light pollution emanating from industrial and natural light sources on the Earth's surface. The Hubble telescope has a relatively large primary mirror that admits enough light to form a good image, which matters because astronomical observation requires sufficient light intensity from celestial objects (galaxies, stars, planets, etc.). The Hubble telescope is classified as a Cassegrain reflecting telescope, which gives it the advantage of eliminating chromatic aberration.
The normalized difference vegetation index (NDVI) is an effective graphical indicator that can be used to analyze remote sensing measurements from a space platform in order to investigate the trend of live green vegetation in the observed target. In this research, vegetation change in Babylon city was detected by tracing the NDVI factor across temporal Landsat satellite images. The images were taken on two different dates: March 19th, 2015 and March 5th, 2020. The Arc-GIS program, ver. 10.7, was adopted to analyze the collected data. The final results indicate a spatial variation in the NDVI: the vegetated area increased from 1666.91 km² in 2015 to 1697.01 km² in 2020 between the two observation dates.
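The NDVI traced above is computed per pixel from the near-infrared and red bands as NDVI = (NIR - Red) / (NIR + Red). A minimal sketch, with small illustrative arrays standing in for the Babylon Landsat imagery:

```python
import numpy as np

# Per-pixel NDVI from NIR and red reflectance bands, with a guard
# against division by zero where both bands are zero.

def ndvi(nir, red):
    """Return (nir - red) / (nir + red) elementwise; 0 where undefined."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    denom = nir + red
    out = np.zeros_like(denom)
    np.divide(nir - red, denom, out=out, where=denom != 0)
    return out

nir = np.array([[0.50, 0.40], [0.80, 0.10]])
red = np.array([[0.10, 0.40], [0.20, 0.10]])
print(ndvi(nir, red))
```

Healthy vegetation reflects strongly in the near-infrared and absorbs red light, so NDVI values near 1 indicate dense green cover, values near 0 indicate bare ground or water, and change detection thresholds the index across the two dates.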
The current study was performed in order to detect and quantify epicatechin in two tea samples of Camellia sinensis (black and green tea) by thin layer chromatography (TLC) and high performance liquid chromatography (HPLC). Extraction of epicatechin from black and green tea was done using two different methods, maceration (cold extraction) and decoction (hot extraction), each involving three different solvents (absolute ethanol, 50% aqueous ethanol, and water), at room temperature and under direct heat respectively. The crude extracts of the two tea samples that were obtained from the two methods were fractionated using two solvents of different polarity (chloroform and