Social media and news agencies are major sources for tracking news and events. Given the massive amounts of data these sources produce, it is easy to spread false or misleading information. Because of the great dangers fake news poses to societies, previous studies have paid considerable attention to detecting it and limiting its impact. Accordingly, this work uses modern deep learning techniques to detect Arabic fake news. In the proposed system, an attention model is combined with bidirectional long short-term memory (Bi-LSTM) to identify the most informative words in a sentence. A multi-layer perceptron (MLP) is then applied to classify news articles as fake or real. The experiments are conducted on a newly launched Arabic dataset called the Arabic Fake News Dataset (AFND). The AFND dataset contains 606,912 news articles collected from multiple sources, making it suitable for deep learning requirements. Simple recurrent neural networks (S-RNN), long short-term memory (LSTM), and gated recurrent units (GRU) are used for comparison. According to the evaluation criteria, the proposed model achieved an accuracy of 0.8127, the highest among the deep learning methods used in this work. Moreover, the proposed model outperforms previous studies that used the AFND.
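The pipeline described above (Bi-LSTM hidden states, an attention layer that weights the most informative words, and an MLP classifier) can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the hidden states, dimensions, and randomly initialized weights are stand-ins for a trained Bi-LSTM, and the attention form shown is a common additive (tanh-score) variant.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical sizes: T time steps (words), 2*H-dim Bi-LSTM states.
T, H2 = 6, 8
hidden = rng.normal(size=(T, H2))        # stand-in for Bi-LSTM outputs

# Additive attention: score each word, normalize, pool the sentence.
W = rng.normal(size=(H2, H2)); b = np.zeros(H2); v = rng.normal(size=H2)
scores = np.tanh(hidden @ W + b) @ v     # one score per word
alpha = softmax(scores)                  # attention weights sum to 1
context = alpha @ hidden                 # attention-weighted sentence vector

# One-hidden-layer MLP with a sigmoid output for the fake/real decision.
W1 = rng.normal(size=(H2, 4)); W2 = rng.normal(size=4)
p_fake = 1.0 / (1.0 + np.exp(-(np.maximum(context @ W1, 0) @ W2)))
```

In a trained system, `alpha` indicates which words the model found most informative, which is the interpretability benefit attention adds over plain Bi-LSTM pooling.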
Merging images is one of the most important technologies in remote sensing applications and geographic information systems. In this study, a camera-based simulation process fuses images by resizing them with different interpolation methods (nearest-neighbor, bilinear, and bicubic). Statistical techniques have been used as efficient merging techniques in the image integration process, employing two models, namely Local Mean Matching (LMM) and Regression Variable Substitution (RVS), alongside spatial frequency techniques including the high-pass filter additive method (HPFA). Statistical measures have then been used to check the quality of the merged images. This has been carried out by calculating the correlation a
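The resizing step that the interpolation methods perform can be sketched directly in NumPy. The functions below implement nearest-neighbor and bilinear resampling from first principles (bicubic is omitted for brevity); a production pipeline would more likely call a library routine such as OpenCV's `cv2.resize`.

```python
import numpy as np

def resize_nearest(img, out_h, out_w):
    """Nearest-neighbour resize: copy the closest source pixel."""
    h, w = img.shape
    rows = np.arange(out_h) * h // out_h
    cols = np.arange(out_w) * w // out_w
    return img[rows][:, cols]

def resize_bilinear(img, out_h, out_w):
    """Bilinear resize: blend the four surrounding source pixels."""
    h, w = img.shape
    y = np.linspace(0, h - 1, out_h)
    x = np.linspace(0, w - 1, out_w)
    y0 = np.floor(y).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(x).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (y - y0)[:, None]; wx = (x - x0)[None, :]
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

img = np.arange(16, dtype=float).reshape(4, 4)   # toy 4x4 "band"
up_n = resize_nearest(img, 8, 8)
up_b = resize_bilinear(img, 8, 8)
```

Nearest-neighbor preserves exact pixel values (blocky output), while bilinear produces smoother gradients; this difference is what the study's quality measures quantify on the fused results.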
Human action recognition has gained popularity because of its wide applicability, such as in patient monitoring systems, surveillance systems, and a wide diversity of systems that involve interactions between people and electrical devices, including human-computer interfaces. The proposed method includes sequential stages of object segmentation, feature extraction, action detection, and then action recognition. Obtaining effective recognition of human actions using different features of unconstrained videos is a challenging task due to camera motion, cluttered backgrounds, occlusions, the complexity of human movements, and the variety of the same actions performed by distinct subjects. Thus, the proposed method overcomes such problems by using the fusion of featur
In this paper, an authentication-based fingerprint biometric system is proposed that uses the personal identity information of name and birthday. The generation of a National Identification Number (NIDN) is proposed by merging fingerprint features with the personal identity information to generate a Quick Response (QR) code image that is used in the access system. Two approaches are adopted: traditional authentication, and strong identification with the QR code and NIDN information. The system shows an accuracy of 96.153% with a threshold value of 50. The accuracy reaches 100% when the threshold value goes below 50.
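The NIDN-generation idea, merging fingerprint features with identity fields into one identifier, can be sketched as follows. This is an illustrative stand-in, not the paper's scheme: the hash-based derivation, the toy minutiae matcher, and all names here are hypothetical, and the final step of encoding the NIDN into a QR image (via any QR library) is not shown.

```python
import hashlib

def make_nidn(name, birthday, minutiae):
    """Derive a hypothetical NIDN by hashing identity fields together
    with fingerprint minutiae coordinates (illustrative only)."""
    payload = f"{name}|{birthday}|" + ";".join(f"{x},{y}" for x, y in minutiae)
    digest = hashlib.sha256(payload.encode("utf-8")).hexdigest()
    return digest[:16].upper()           # 16 hex chars as the identifier

def match_score(probe, template):
    """Toy matcher: percentage of probe minutiae that land near
    some template minutia (Manhattan distance <= 2)."""
    hits = sum(
        any(abs(px - tx) + abs(py - ty) <= 2 for tx, ty in template)
        for px, py in probe
    )
    return 100.0 * hits / max(len(probe), 1)

minutiae = [(10, 12), (34, 50), (70, 22)]
nidn = make_nidn("Alice", "1990-01-01", minutiae)
accepted = match_score(minutiae, minutiae) >= 50   # threshold of 50, as in the paper
```

The same inputs always yield the same NIDN, which is the property that lets the QR code act as a stable access credential.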
Digital forensics is the part of forensic science that covers crime related to computers and other digital devices. Academic studies have been interested in digital forensics for some time. Researchers aim to establish a discipline based on scientific structures by defining models that reflect their observations. This paper suggests a model to improve the whole investigation process and obtain accurate and complete evidence, and it adopts securing the digital evidence with cryptographic algorithms so that reliable evidence can be presented in a court of law. The paper also presents the main and basic concepts of the frameworks and models used in digital forensics investigation.
Today, with the increasing use of social media, many researchers have become interested in topic extraction from Twitter. Tweets are short, unstructured, and messy text, which makes it difficult to find topics in them. Topic modeling algorithms such as Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) were originally designed to derive topics from large documents such as articles and books, and they are often less effective when applied to short text content like tweets. Fortunately, Twitter has many features that represent the interaction between users, and tweets carry rich user-generated hashtags as keywords. In this paper, we exploit the hashtag feature to improve the topics learned
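One common way to exploit hashtags for topic modeling on short text is hashtag pooling: tweets sharing a hashtag are concatenated into one pseudo-document, so LDA sees article-length inputs rather than individual short tweets. A minimal sketch of the pooling step, with made-up example tweets (the subsequent LDA fit, e.g. with gensim or scikit-learn, is not shown):

```python
from collections import defaultdict

# Hypothetical (tweet text, hashtags) pairs.
tweets = [
    ("new gpu benchmarks out today", ["#ai", "#hardware"]),
    ("transformers keep getting bigger", ["#ai"]),
    ("cheap ssd deals this week", ["#hardware"]),
    ("match day lineup announced", ["#football"]),
]

# Pool tweets sharing a hashtag into one pseudo-document per hashtag.
pools = defaultdict(list)
for text, tags in tweets:
    for tag in tags:
        pools[tag].append(text)

pseudo_docs = {tag: " ".join(texts) for tag, texts in pools.items()}
```

The resulting `pseudo_docs` values are then fed to the topic model in place of raw tweets, which is the basic mechanism behind hashtag-based improvements to LDA on Twitter data.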
A novel optimized median filter (OMF) based on the crow search optimization algorithm is suggested to reduce random salt-and-pepper noise and improve the quality of RGB-colored and gray images. The fundamental idea of the approach is that the crow search optimization algorithm first detects noise pixels and then replaces them with an optimal median value chosen by maximizing a fitness function. Finally, the standard measures peak signal-to-noise ratio (PSNR), structural similarity, absolute square error, and mean square error have been used to test the performance of the suggested filters (the original and the improved median filter) used to remove noise from images. The simulation is based on MATLAB R2019b, and the resul
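The detect-then-replace idea behind the filter can be sketched in NumPy. This sketch does not reproduce the crow search step: as a stand-in, noise pixels are detected with the simple salt-and-pepper rule (intensity exactly 0 or 255), then replaced with the median of a 3x3 window, and PSNR is computed as the quality measure.

```python
import numpy as np

def denoise_salt_pepper(img, k=3):
    """Replace only suspected salt/pepper pixels (0 or 255) with the
    median of their k x k neighbourhood; clean pixels are untouched."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = img.copy()
    noisy = (img == 0) | (img == 255)
    for i, j in zip(*np.nonzero(noisy)):
        window = padded[i:i + k, j:j + k]   # k x k window centred at (i, j)
        out[i, j] = np.median(window)
    return out

def psnr(ref, test):
    """Peak signal-to-noise ratio in dB for 8-bit images."""
    mse = np.mean((ref.astype(float) - test.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

clean = np.full((8, 8), 100, dtype=np.uint8)
noisy = clean.copy(); noisy[2, 3] = 0; noisy[5, 5] = 255   # inject noise
restored = denoise_salt_pepper(noisy)
```

Selective replacement is the key design choice: unlike a plain median filter applied everywhere, only detected noise pixels are modified, which preserves edges in clean regions; the optimization step in the paper refines how the replacement value is chosen.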
The revolution in multimedia has been a driving force behind fast and secure data transmission techniques. Protecting image information from unapproved access is imperative. Encryption techniques are used to transfer data, and each kind of data has its own special characteristics; thus, different methods should be used to protect image distribution. This paper presents image encryption improvements based on a proposed approach for generating efficient intelligent session (mask) keys, built on combining the robust features of elliptic curve cryptography (ECC) algebra with the construction phase of the Greedy Randomized Adaptive Search Procedure (GRASP), to produce durable symmetric session mask keys consisting of ECC points. Symmetric behavior for ECC
Despite its wide use in microbial cultures, the one-factor-at-a-time method fails to find the true optimum because the interactions between the optimized parameters are not taken into account. To find the true optimum conditions, it is therefore necessary to repeat the one-factor-at-a-time method over many sequential experimental runs, which is extremely time-consuming and expensive when many variables are involved. This work is an attempt to enhance bioactive yellow pigment production by Streptomyces thinghirensis based on a statistical design. The yellow pigment demonstrated inhibitory effects against Escherichia coli and Staphylococcus aureus and was characterized by UV-vis spectroscopy, which showed a lambda maximum of