Social networking has come to dominate the world by providing a platform for information dissemination. People often share information without verifying its truthfulness, and social networks are now used to gain influence in many fields, such as elections and advertising. It is not surprising that social media has become a weapon for manipulating sentiment by spreading disinformation. Propaganda is a systematic and deliberate attempt to influence people for political or religious gain. In this research paper, efforts were made to classify propagandist text from non-propagandist text using supervised machine learning algorithms. Data were collected from news sources between July 2018 and August 2018. After annotating the text, feature engineering was performed using techniques such as term frequency-inverse document frequency (TF-IDF) and bag of words (BOW). The relevant features were supplied to support vector machine (SVM) and multinomial naive Bayes (MNB) classifiers. The SVM was fine-tuned over linear, polynomial, and RBF kernels. SVM showed better results than MNB, achieving a precision of 70%, recall of 76.5%, F1 score of 69.5%, and overall accuracy of 69.2%.
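The pipeline described above can be sketched with scikit-learn. The corpus, labels, and parameter values below are illustrative placeholders, not the authors' data or settings; only the overall structure (TF-IDF or BOW features feeding an SVM or MNB classifier, with the SVM kernel tuned among linear, polynomial, and RBF) follows the abstract.

```python
# Minimal sketch of the described pipeline; toy data, not the paper's corpus.
from sklearn.feature_extraction.text import TfidfVectorizer, CountVectorizer
from sklearn.svm import SVC
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy labeled corpus: 1 = propagandist, 0 = non-propagandist (placeholders).
texts = [
    "our glorious leader will crush the corrupt enemies of the nation",
    "the council approved the annual budget after a routine vote",
    "only traitors question the party's heroic struggle",
    "rainfall this week was slightly above the seasonal average",
]
labels = [1, 0, 1, 0]

# TF-IDF features into an SVM; swap kernel among "linear", "poly", and
# "rbf" to reproduce the fine-tuning step mentioned in the abstract.
svm_clf = make_pipeline(TfidfVectorizer(), SVC(kernel="linear"))
svm_clf.fit(texts, labels)

# Bag-of-words features into multinomial naive Bayes for comparison.
mnb_clf = make_pipeline(CountVectorizer(), MultinomialNB())
mnb_clf.fit(texts, labels)

print(svm_clf.predict(["the heroic struggle against traitors continues"]))
```

In practice the annotated news corpus would replace the toy sentences, and precision, recall, and F1 would be computed on a held-out split (e.g. with `sklearn.metrics.classification_report`).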
The research problem lies in the ambiguous use of propaganda content by two major media outlets, the Russian RT and the American Alhurra, in their websites' coverage of the Syrian crisis, and in the methods these outlets use to persuade users, taking into account the mutual propaganda conflict between the United States and Russia over the war in Syria. The objectives of the research are as follows:
• Investigating the content of American and Russian electronic propaganda concerning the Syrian crisis.
• Identifying the content that received the most coverage of the Syrian crisis by the two news outlets.
• Identifying the terms and phrases most used by the websites of RT and Alhurra.
Big data of different types, such as text and images, is rapidly generated by the internet and other applications. Dealing with this data using traditional methods is impractical, since it varies in size, type, and processing-speed requirements. Data analytics has therefore become an essential tool for big data applications, because it allows only meaningful information to be analyzed and extracted. This paper presents several innovative methods that use data analytics techniques to improve the analysis process and data management, and discusses how a revolution in data analytics based on artificial intelligence algorithms might provide …
Steganography is a technique for concealing secret data within other everyday files of the same or a different type. Hiding data has long been essential to digital information security. This work aims to design a stego method that can effectively hide a message inside the images of a video file. A video steganography model is proposed in which a model is trained to hide a video (or images) within another video using convolutional neural networks (CNNs). Using a CNN in this approach, two main goals of any steganographic method can be achieved; the first is increased security (difficulty of being observed or broken by steganalysis programs), which was achieved in this work because the weights and architecture are randomized. Thus, …
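The hide/reveal data flow of such a model can be illustrated with a tiny, untrained convolutional encoder/decoder pair. This is a structural sketch only: the kernel sizes, channel counts, and random weights below are assumptions, and no real hiding is learned without training; it shows only how a cover frame and a secret frame are combined into a stego frame and then decoded.

```python
# Structural sketch of CNN-based hiding (untrained, illustrative only).
import numpy as np

rng = np.random.default_rng(0)

def conv2d(x, w):
    """Valid 2D convolution of a (H, W, C_in) input with (k, k, C_in, C_out) weights."""
    k, _, c_in, c_out = w.shape
    h, wd = x.shape[0] - k + 1, x.shape[1] - k + 1
    out = np.zeros((h, wd, c_out))
    for i in range(h):
        for j in range(wd):
            patch = x[i:i + k, j:j + k, :]        # (k, k, C_in) window
            out[i, j] = np.tensordot(patch, w, axes=3)
    return out

# Hypothetical random 3x3 kernels; the paper randomizes weights and architecture.
w_hide   = rng.normal(0, 0.1, (3, 3, 6, 3))   # cover+secret (6 ch) -> stego (3 ch)
w_reveal = rng.normal(0, 0.1, (3, 3, 3, 3))   # stego (3 ch) -> recovered secret (3 ch)

cover  = rng.random((32, 32, 3))   # one cover video frame (placeholder)
secret = rng.random((32, 32, 3))   # frame/image to hide (placeholder)

# Encoder: stack cover and secret along the channel axis, then convolve.
stego = conv2d(np.concatenate([cover, secret], axis=2), w_hide)
# Decoder: convolve the stego frame to recover an estimate of the secret.
recovered = conv2d(stego, w_reveal)
print(stego.shape, recovered.shape)
```

A real implementation would use a deep-learning framework, train both networks jointly on a reconstruction loss, and pad convolutions so the stego frame keeps the cover's dimensions.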
Rationing is a commonly used solution for shortages of resources and goods that are vital to a country's citizens. This paper identifies common approaches and policies used in rationing, as well as their associated risks, in order to propose a fuel-rationing system that can work efficiently; it subsequently addresses all likely security risks and their solutions. The system should theoretically be applicable in emergency situations, requiring less than three months to implement, at low cost, and with minimal changes to infrastructure.
This research reviews the importance of estimating the nonparametric regression function using the so-called canonical kernel, which depends on rescaling the smoothing parameter; this parameter plays a large and important role in kernel methods and yields a sound amount of smoothing. We demonstrate the importance of this method by applying these concepts to real data on the exchange rate of the U.S. dollar against the Japanese yen for the period from January 2007 to March 2010. The results demonstrated a preference for the nonparametric estimator with a Gaussian kernel over the other nonparametric and parametric regression estimators.
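The kind of estimator compared above can be sketched as a Nadaraya-Watson regressor with a Gaussian kernel. The data, reference bandwidth, and evaluation point below are placeholders, not the paper's exchange-rate series; the canonical-kernel idea enters as a kernel-specific constant that rescales the smoothing parameter so different kernels smooth by comparable amounts (for the Gaussian kernel this constant is approximately 0.7764).

```python
# Illustrative Nadaraya-Watson kernel regression; synthetic data, not the
# paper's exchange-rate series.
import numpy as np

def gaussian_kernel(u):
    return np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)

def nw_estimate(x0, x, y, h):
    """Nadaraya-Watson regression estimate at x0 with bandwidth h."""
    w = gaussian_kernel((x0 - x) / h)
    return np.sum(w * y) / np.sum(w)

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 2 * np.pi, 200))
y = np.sin(x) + rng.normal(0, 0.1, x.size)   # noisy sine as stand-in data

# Canonical rescaling: multiply a reference bandwidth by the kernel-specific
# canonical constant (≈ 0.7764 for the Gaussian kernel). The 0.3 reference
# value here is an arbitrary placeholder.
h = 0.3 * 0.7764
print(nw_estimate(np.pi / 2, x, y, h))   # close to sin(pi/2) = 1
```

For the actual study, `x` would be time and `y` the dollar-yen exchange rate, and the bandwidth would be chosen by a data-driven rule before the canonical rescaling is applied.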