With the huge increase in modern communication and network applications, the speed of data transmission and the storage of data in compact form are pressing issues. An enormous number of images is stored and shared every moment, especially in the social media realm, yet even with these marvelous applications, the limited size of transmitted data remains the main restriction: essentially all of these applications utilize the well-known Joint Photographic Experts Group (JPEG) standard. Consequently, the construction of universally accepted standard compression systems is urgently required to play a key role in this immense revolution. This review is concerned with differential pulse code modulation (DPCM) and pixel-based techniques, in which the spatial domain is exploited to compress images efficiently in terms of compression performance while preserving quality. The new pixel-based method overcomes the constraints of predictive coding, producing fewer residues and higher compression ratios.
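As a minimal illustration of the predictive-coding idea behind DPCM, the sketch below encodes an image row by storing only the residue of a previous-pixel predictor; the 1-D setting and the choice of predictor are simplifying assumptions, not the review's exact scheme.

```python
import numpy as np

def dpcm_encode(row: np.ndarray) -> np.ndarray:
    """Encode one image row with previous-pixel prediction.

    Each pixel is predicted by its left neighbour; only the residue
    (prediction error) is stored, which is typically small and
    compresses well with an entropy coder.
    """
    row = row.astype(np.int16)          # avoid uint8 wrap-around
    residues = np.empty_like(row)
    residues[0] = row[0]                # first pixel sent as-is
    residues[1:] = row[1:] - row[:-1]   # e[i] = x[i] - x[i-1]
    return residues

def dpcm_decode(residues: np.ndarray) -> np.ndarray:
    """Invert the encoder by cumulative summation."""
    return np.cumsum(residues).astype(np.uint8)

row = np.array([52, 55, 61, 66, 70], dtype=np.uint8)
assert np.array_equal(dpcm_decode(dpcm_encode(row)), row)
```

The residues (52, 3, 6, 5, 4) concentrate near zero, which is exactly what makes predictive coding amenable to compact entropy coding.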
Social media platforms are known as sensors that measure users' real-world activities. However, the huge, unfiltered feed of messages posted on social media triggers social warnings, particularly when these messages contain hate speech toward a specific individual or community. The negative effect of these messages on individuals and on society at large is of great concern to governments and non-governmental organizations. Word clouds provide a simple and efficient means of visually conveying the most common words in text documents. This research aims to develop a word cloud model based on hateful words in online social media environments such as Google News. Several steps are involved, including data acquisition.
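A minimal sketch of the word-cloud step, assuming the third-party Python `wordcloud` package; the posts and the flagged-term lexicon below are placeholders, not data from the study.

```python
from collections import Counter
from wordcloud import WordCloud

# Placeholder feed and hypothetical hate lexicon -- not from the study.
posts = ["this insult and that slur were posted",
         "another insult appeared today"]
flagged_terms = {"insult", "slur", "threat"}

# Count only the flagged words across the feed.
counts = Counter(word for post in posts
                 for word in post.lower().split()
                 if word in flagged_terms)

# Render the most frequent flagged words, sized by frequency.
wc = WordCloud(width=800, height=400, background_color="white")
wc.generate_from_frequencies(counts)
wc.to_file("hate_word_cloud.png")
```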
Governmental establishments maintain historical data on job applicants for future predictive analysis and for improving the benefits, profits, and development of organizations and institutions. In e-government, a decision can be made about job seekers by mining their information, which leads to beneficial insight. This paper proposes the development and implementation of a system that predicts the job appropriate to an applicant's skills using web content classification algorithms (LogitBoost, J48, PART, Hoeffding Tree, Naive Bayes). Furthermore, the results of the classification algorithms are compared on a data set called the "job classification" data set. Experimental results indicate
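The named classifiers are Weka implementations; as a rough, hypothetical stand-in, the sketch below compares a few scikit-learn analogues with 10-fold cross-validation on synthetic placeholder data rather than the paper's "job classification" data set.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Synthetic placeholder standing in for the "job classification" data set.
X, y = make_classification(n_samples=500, n_features=20, n_classes=3,
                           n_informative=8, random_state=0)

models = {
    "Naive Bayes": GaussianNB(),
    "J48-like tree": DecisionTreeClassifier(),            # rough C4.5 analogue
    "Boosting (LogitBoost-like)": GradientBoostingClassifier(),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=10)          # 10-fold CV
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```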
This paper proposes a new encryption method that combines two cipher algorithms, DES and AES, to generate hybrid keys. This combination strengthens the proposed W-method by generating highly randomized keys. The reliability of any encryption technique can be represented by two points. The first is key generation: our approach merges 64 bits of DES with 64 bits of AES to produce 128 bits as a root key for the 15 remaining keys. This complexity increases the level of the ciphering process; moreover, the operation shifts only one bit to the right. The second is the nature of the encryption process: it includes two keys and mixes one round of DES with one round of AES to reduce the processing time. The W-method deals with
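A minimal sketch of the key schedule as described: 64 DES-derived bits and 64 AES-derived bits are concatenated into a 128-bit root key, and the 15 remaining keys are obtained by one-bit right rotation. How the two 64-bit halves are produced is not specified in this excerpt, so both values below are hypothetical.

```python
# Sketch of the W-method key schedule as the abstract describes it;
# the sources of the two 64-bit halves are assumptions.
def rotate_right_128(key: int, bits: int = 1) -> int:
    """Rotate a 128-bit integer right by `bits`."""
    mask = (1 << 128) - 1
    return ((key >> bits) | (key << (128 - bits))) & mask

des_half = 0x0123456789ABCDEF          # hypothetical 64 bits from DES
aes_half = 0xFEDCBA9876543210          # hypothetical 64 bits from AES

root_key = (des_half << 64) | aes_half # 128-bit root key
round_keys = [root_key]
for _ in range(15):                    # 15 further keys, 16 in total
    round_keys.append(rotate_right_128(round_keys[-1]))

print(hex(round_keys[0]), hex(round_keys[1]))
```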
The aim of this paper is to present a new methodology for finding the private key of RSA. A new initial value, generated from a new equation, is selected to speed up the process; after this value is found, a brute-force attack is used to discover the private key. In addition, for the proposed equation, the multiplier of the Euler totient function used to find both the public key and the private key is assigned the value 1. This implies that an equation estimating a new initial value is suitable for the small multiplier. The experimental results show that if all prime factors of the modulus are larger than 3 and the multiplier is 1, the distance between the initial value and the private key
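A toy sketch of the brute-force stage on a textbook RSA instance; the paper's equation for the initial value is not given in this excerpt, so `initial_value` is a placeholder guess, and candidate keys are verified by test decryption of a few known messages.

```python
def is_private_key(d: int, e: int, n: int, trials: int = 5) -> bool:
    """Check that decryption with d inverts encryption with e."""
    return all(pow(pow(m, e, n), d, n) == m for m in range(2, 2 + trials))

def brute_force_private_key(e: int, n: int, initial_value: int) -> int:
    """Scan upward from the initial value until a working d is found."""
    d = initial_value
    while not is_private_key(d, e, n):
        d += 1
    return d

# Textbook instance: p = 61, q = 53, n = 3233, phi = 3120, e = 17, d = 2753.
print(brute_force_private_key(e=17, n=3233, initial_value=2700))  # 2753
```

The closer the initial value is to the true private exponent, the fewer candidates the loop must test, which is precisely why the paper's initial-value equation matters.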
In data mining and machine learning methods, it is traditionally assumed that the training data, the test data, and the data to be processed in the future share the same feature-space distribution, a condition that rarely holds in the real world. Domain adaptation-based methods are used to overcome this challenge. One of the existing challenges in such methods is selecting the most efficient features so that they also remain the most efficient in the destination domain. In this paper, a new feature selection method based on deep reinforcement learning is proposed. In the proposed method, in order to select the best and most appropriate features, the essential policies
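As a generic illustration only (not the paper's algorithm), the sketch below frames feature selection as a reward-driven loop in which each action adds one feature and the reward is cross-validated accuracy; the data and model are placeholders.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Placeholder data; a real setting would use source- and target-domain sets.
X, y = make_classification(n_samples=300, n_features=15,
                           n_informative=5, random_state=0)

def reward(features: list) -> float:
    """Validation accuracy of a model restricted to `features`."""
    model = LogisticRegression(max_iter=500)
    return cross_val_score(model, X[:, features], y, cv=3).mean()

selected, best = [], 0.0
for f in np.random.default_rng(0).permutation(X.shape[1]):
    r = reward(selected + [int(f)])
    if r > best:            # keep the action only if the reward improved
        selected, best = selected + [int(f)], r
print("selected:", selected, "accuracy:", round(best, 3))
```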
Recommender systems are tools for making sense of the huge amount of data available on the internet. Collaborative filtering (CF) is one of the knowledge-discovery methods used most successfully in recommendation systems. Memory-based collaborative filtering emphasizes using facts about present users to predict new items for the target user. Similarity measures are the core operations in collaborative filtering, and prediction accuracy depends mostly on the similarity calculations. In this study, a combination of weighted parameters and traditional similarity measures is used to calculate the relationships among users over the MovieLens data set rating matrix. The advantages and disadvantages of each measure are spotted. From the study, a n
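A minimal sketch of two traditional similarity measures, cosine and Pearson, on a tiny placeholder rating matrix; the weighted-parameter combinations studied in the paper are not specified in this excerpt.

```python
import numpy as np

# Rows = users, columns = items; 0 denotes an unrated item.
ratings = np.array([[5, 3, 0, 1],
                    [4, 0, 0, 1],
                    [1, 1, 0, 5]], dtype=float)

def cosine_sim(u: np.ndarray, v: np.ndarray) -> float:
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

def pearson_sim(u: np.ndarray, v: np.ndarray) -> float:
    # Correlate only over items both users have rated.
    mask = (u > 0) & (v > 0)
    if mask.sum() < 2:
        return 0.0
    return np.corrcoef(u[mask], v[mask])[0, 1]

print(cosine_sim(ratings[0], ratings[1]))   # high: similar tastes
print(pearson_sim(ratings[0], ratings[2]))  # negative: dissimilar tastes
```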
Traffic management at road intersections is a complex requirement that has been an important topic of research and discussion. Solutions have primarily focused on vehicular ad hoc networks (VANETs). Key issues in VANETs are high mobility, the restrictions of the road layout, frequent topology variations, failed network links, and the timely communication of data, all of which make routing packets to a particular destination problematic. To address these issues, a new dependable routing algorithm is proposed that utilizes a wireless communication system between vehicles in urban vehicular networks. This routing is position-based and is known as the maximum distance on-demand routing algorithm (MDORA). It aims to find an optimal route on a hop-by-hop basis
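A sketch of a greedy, position-based next-hop rule in the spirit suggested by the algorithm's name: among neighbours within radio range, forward to the one making the maximum progress toward the destination. The coordinates, radio range, and selection details below are assumptions, not MDORA's exact specification.

```python
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def next_hop(current, neighbours, destination, radio_range=250.0):
    """Pick the reachable neighbour that maximizes forward progress."""
    reachable = [n for n in neighbours if dist(current, n) <= radio_range]
    progress = lambda n: dist(current, destination) - dist(n, destination)
    candidates = [n for n in reachable if progress(n) > 0]
    return max(candidates, key=progress, default=None)  # None = no route

current, destination = (0, 0), (1000, 0)
neighbours = [(200, 50), (240, -10), (120, 0)]
print(next_hop(current, neighbours, destination))  # (240, -10)
```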
In this work, a chemical optical fiber sensor based on Surface Plasmon Resonance (SPR) was designed and implemented using plastic optical fiber. The sensor is used for estimating the refractive indices and concentrations of various chemical materials (methanol, distilled water, ethanol, kerosene) and for evaluating performance parameters such as sensitivity, signal-to-noise ratio (SNR), resolution, and figure of merit of the fabricated sensor. It was found that the sensitivity of the optical fiber-based SPR sensor, with a 40 nm thick and 10 mm long Au metal film on the exposed sensing region, was 3 μm/RIU, while the SNR was 0.24, the figure of merit was 20, and the resolution was 0.00066. The type of optical fiber utilized is plastic optical fiber
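For reference, these are the standard definitions commonly used for the quoted performance parameters; the excerpt does not show the paper's exact formulas.

```latex
% Standard SPR fiber-sensor performance definitions (assumed, not quoted):
S = \frac{\Delta\lambda_{\mathrm{res}}}{\Delta n_s}\ \left[\mu\mathrm{m/RIU}\right],
\qquad
\mathrm{SNR} = \frac{\Delta\lambda_{\mathrm{res}}}{\delta\lambda_{0.5}},
\qquad
\mathrm{FOM} = \frac{S}{\delta\lambda_{0.5}},
\qquad
\mathrm{Resolution} = \frac{\Delta\lambda_{\mathrm{DR}}}{S}
```

Here Δλ_res is the resonance-wavelength shift for a sensed-index change Δn_s, δλ_0.5 is the FWHM of the SPR dip, and Δλ_DR is the spectral resolution of the detection system.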
Drought is a complex phenomenon that has severe impacts on the environment. Vegetation and its condition are very sensitive to drought effects. This study aimed to monitor and assess drought severity and its relationship to some ecological variables in ten districts of Erbil Governorate (Kurdistan Region), Iraq, over 20 years (1998-2017). The results revealed that droughts frequently hit Erbil throughout the study period. The Landsat time-series-based Vegetation Condition Index (VCI) correlated significantly with precipitation, the Digital Elevation Model (DEM), and latitude. Extreme VCI-based drought area percentages were recorded in 1999, 2000, 2008, and 2011 at 43.4%, 67.9%, 43.3%, and 40.0%, respectively.
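For reference, the standard (Kogan) definition of the VCI for a pixel in period i, with the NDVI extremes taken per pixel over the multi-year Landsat record:

```latex
\mathrm{VCI}_{i} = 100 \times
\frac{\mathrm{NDVI}_{i} - \mathrm{NDVI}_{\min}}{\mathrm{NDVI}_{\max} - \mathrm{NDVI}_{\min}}
```

Values near 0 indicate vegetation at its historical worst (severe drought stress), while values near 100 indicate optimal condition.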
In computer vision, visual object tracking is a significant task for monitoring applications. Tracking an object is essentially a matching problem: one main difficulty is selecting features and building models that are suitable for distinguishing and tracing the target. The suggested system for continuous feature description and matching in video has three steps. First, a wavelet transform is applied to the image using the Haar filter. Second, interest points are detected in the wavelet image using the Features from Accelerated Segment Test (FAST) corner detector. Third, those points are described using Speeded Up Robust Features (SURF). The SURF algorithm has been employed and implemented
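A minimal sketch of the three-step pipeline (Haar wavelet, FAST detection, SURF description), assuming `opencv-contrib-python` built with the non-free SURF module and PyWavelets for the Haar transform; the file name and thresholds below are placeholders.

```python
import cv2
import numpy as np
import pywt

frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical frame

# Step 1: one-level 2-D Haar wavelet transform; keep the approximation band.
approx, _ = pywt.dwt2(frame.astype(np.float32), "haar")
wavelet_img = cv2.normalize(approx, None, 0, 255,
                            cv2.NORM_MINMAX).astype(np.uint8)

# Step 2: FAST corner detection on the wavelet image.
fast = cv2.FastFeatureDetector_create(threshold=25)
keypoints = fast.detect(wavelet_img, None)

# Step 3: describe the FAST keypoints with SURF (non-free contrib module).
surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
keypoints, descriptors = surf.compute(wavelet_img, keypoints)

print(len(keypoints), "keypoints,", descriptors.shape, "descriptor matrix")
```

Matching the resulting descriptors between consecutive frames (e.g., with a brute-force or FLANN matcher) then yields the frame-to-frame correspondences used for tracking.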