Social media platforms are often viewed as sensors that reflect users' activities in the real world. However, the huge, unfiltered feed of messages posted on social media raises social concerns, particularly when these messages contain hate speech directed at a specific individual or community. The negative effect of such messages on individuals, and on society at large, is of great concern to governments and non-governmental organizations. Word clouds provide a simple and efficient means of visually conveying the most common words in text documents. This research aims to develop a word cloud model based on hateful words in online media sources such as Google News. Several steps are involved, including data acquisition and pre-processing, feature extraction, model development, and visualization of the word cloud result. The result is an image built from text that highlights the top words. This model can be considered a simple way to exchange high-level information without overloading the user with details.
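The core of the pipeline described above, tokenizing posts and ranking term frequencies before rendering the cloud, can be sketched in plain Python. The posts and stop-word list below are hypothetical placeholders, not data from the study; a real pipeline would feed in a scraped corpus and use a full stop-word list.

```python
import re
from collections import Counter

# Hypothetical mini-corpus standing in for a scraped feed of posts.
posts = [
    "Hate speech spreads fast on social media",
    "Report hate speech and abusive messages",
    "Social platforms must filter abusive hate content",
]

# Tiny illustrative stop-word list; a real pipeline would use a full one.
STOP_WORDS = {"on", "and", "must", "a", "the"}

def top_words(texts, n=5):
    """Tokenize, drop stop words, and rank terms by frequency."""
    tokens = []
    for text in texts:
        tokens += [w for w in re.findall(r"[a-z']+", text.lower())
                   if w not in STOP_WORDS]
    return Counter(tokens).most_common(n)

print(top_words(posts))  # most frequent terms drive the largest cloud fonts
```

The ranked frequencies are exactly what a word-cloud renderer maps to font sizes: the top-ranked term is drawn largest.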
Random matrix theory is used to study the chaotic properties of the nuclear energy spectrum of the 24Mg nucleus. The excitation energies (the main object of this study) are obtained by performing shell model calculations with the OXBASH computer code, using the effective interaction of Wildenthal (W) in the isospin formalism. The 24Mg nucleus is assumed to have an inert 16O core, with 8 nucleons (4 protons and 4 neutrons) moving in the 1d5/2, 2s1/2 and 1d3/2 orbitals. The spectral fluctuations are studied by two statistical measures: the nearest neighbor
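The nearest-neighbor spacing analysis can be sketched numerically. The levels below are randomly generated placeholders, not the OXBASH 24Mg spectrum, and the GOE Wigner surmise is included as the standard chaotic reference curve:

```python
import numpy as np

# Hypothetical unfolded energy levels (arbitrary units); in the actual study
# these would be the shell-model excitation energies after unfolding.
rng = np.random.default_rng(0)
levels = np.sort(rng.uniform(0.0, 100.0, size=200))

def spacing_distribution(levels):
    """Nearest-neighbor spacings, normalized to unit mean spacing."""
    s = np.diff(levels)
    return s / s.mean()

def wigner_surmise(s):
    """GOE Wigner surmise P(s) = (pi/2) s exp(-pi s^2 / 4)."""
    return (np.pi / 2.0) * s * np.exp(-np.pi * s**2 / 4.0)

s = spacing_distribution(levels)
print(f"mean spacing = {s.mean():.3f}")  # unity by construction
```

A histogram of `s` compared against `wigner_surmise` (chaotic limit) or the Poisson curve exp(-s) (regular limit) is the usual diagnostic.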
Abstract The wavelet shrinkage estimator is an attractive technique for estimating nonparametric regression functions, but it is very sensitive to correlation in the errors. In this research, a low-degree polynomial model was used to address the boundary problem in wavelet shrinkage, together with level-dependent threshold values for the case of correlated errors, since these treat the coefficients at each level separately, unlike global threshold values that treat all levels simultaneously, such as the VisuShrink, False Discovery Rate (FDR), Improvement Thresholding, and SureShrink methods. The study was conducted on real monthly data representing the rates of theft crimes
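A minimal sketch of level-dependent shrinkage, assuming soft thresholding with a universal-style threshold and a MAD noise-scale estimate computed separately at each level. This construction is illustrative of the level-by-level idea, not the exact threshold rules compared in the study:

```python
import numpy as np

def soft_threshold(coeffs, t):
    """Soft-threshold wavelet detail coefficients at threshold t."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)

def level_dependent_shrink(detail_levels):
    """Shrink each resolution level with its own threshold, so levels with
    different noise scales (as under correlated errors) are treated separately."""
    shrunk = []
    for d in detail_levels:
        sigma = np.median(np.abs(d)) / 0.6745       # robust MAD noise scale
        t = sigma * np.sqrt(2.0 * np.log(len(d)))   # level-wise threshold
        shrunk.append(soft_threshold(d, t))
    return shrunk
```

A global rule would instead compute one `t` from all coefficients and apply it everywhere, which is exactly what breaks down when the error correlation makes noise levels differ across scales.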
In information security, fingerprint verification is one of the most common approaches for verifying human identity through a distinctive pattern. The verification process works by comparing a pair of fingerprint templates and measuring the similarity between them. Several research studies have applied different techniques to the matching process, such as fuzzy vault and image filtering approaches. Yet these approaches still suffer from imprecise articulation of the interesting patterns in the biometrics. Deep learning architectures such as the Convolutional Neural Network (CNN) have been used extensively for image processing and object detection tasks and have shown outstanding performance
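The template-matching step, scoring the similarity of two fixed-length representations and accepting the pair above a threshold, can be sketched as follows. The embedding vectors here are hypothetical, standing in for the features a trained CNN would produce:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two fingerprint template vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(template_a, template_b, threshold=0.9):
    """Accept the pair as the same identity if similarity exceeds the threshold."""
    return cosine_similarity(template_a, template_b) >= threshold

# Hypothetical embeddings; near-duplicate templates should verify.
enrolled = np.array([0.2, 0.9, 0.4])
probe = np.array([0.19, 0.88, 0.41])
print(verify(enrolled, probe))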
Abstract
The grey system model GM(1,1) is a time-series prediction model and the basis of grey theory. This research presents methods for estimating the parameters of the grey model GM(1,1): the accumulative method (ACC), the exponential method (EXP), the modified exponential method (Mod EXP), and the Particle Swarm Optimization method (PSO). These methods were compared using the mean square error (MSE) and the mean absolute percentage error (MAPE) as comparison criteria, and simulation was used to select the best of the four methods; the best method was then applied to real data. These data represent the consumption rate of two types of oils
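The GM(1,1) construction itself can be sketched with an ordinary least-squares parameter estimate, one common baseline; the ACC, EXP, Mod EXP, and PSO methods compared in the study are alternative ways of estimating this same pair of parameters:

```python
import numpy as np

def gm11_fit(x0):
    """Least-squares estimate of the GM(1,1) parameters (a, b) for series x0."""
    x1 = np.cumsum(x0)                          # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])               # background (mean-generated) values
    B = np.column_stack([-z1, np.ones_like(z1)])
    Y = x0[1:]                                  # grey equation: x0(k) + a*z1(k) = b
    (a, b), *_ = np.linalg.lstsq(B, Y, rcond=None)
    return a, b

def gm11_predict(x0, a, b, steps):
    """Solve the whitening equation and difference back to the original scale."""
    k = np.arange(steps + 1)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    return np.diff(np.concatenate([[0.0], x1_hat]))
```

For a roughly exponential series (the setting GM(1,1) targets), the fitted model reproduces the data closely and extrapolates by extending `steps`.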
The research aims to measure the relationship between knowledge management processes and the performance of the insurance service, and the impact of the former on the latter, as well as to analyze the reality of the National Insurance Company and identify its level of overall performance. To achieve this goal, knowledge management processes were selected according to a survey prepared as a supplement to the study of (Qubaisi, 2002), comprising four processes (knowledge generation, knowledge storage, knowledge distribution, and knowledge application), which represented the independent variable. Performance was measured using quantitative and qualitative measures (sales growth, customer satisfaction), which represented the dependent variable.
The research aims to determine the optimal production mix when several conflicting objectives must be achieved at the same time. The discussion therefore addresses the concept of goal programming and approaches to solving it, presents the general formulation of the goal programming model, and finally determines the optimal production mix by applying a goal programming model to a hypothetical case.
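A toy weighted goal-programming instance illustrates the core idea of minimizing deviations from conflicting goals. The two products, their coefficients, and the goal values below are all hypothetical, not taken from the study, and the solver is a brute-force enumeration rather than the linear-programming formulation a real model would use:

```python
# Hypothetical two-product mix with two conflicting goals.
PROFIT = (3.0, 5.0)                      # unit profit of products A and B
HOURS = (2.0, 4.0)                       # unit labor hours of products A and B
GOALS = {"profit": 60.0, "hours": 40.0}  # target profit and target labor hours
WEIGHTS = {"profit": 1.0, "hours": 1.0}  # relative importance of each goal

def weighted_deviation(xa, xb):
    """Total weighted absolute deviation of a mix (xa, xb) from both goals."""
    profit = PROFIT[0] * xa + PROFIT[1] * xb
    hours = HOURS[0] * xa + HOURS[1] * xb
    return (WEIGHTS["profit"] * abs(GOALS["profit"] - profit)
            + WEIGHTS["hours"] * abs(GOALS["hours"] - hours))

def best_mix(max_units=20):
    """Enumerate integer mixes and pick the one minimizing total deviation."""
    return min(((xa, xb) for xa in range(max_units + 1)
                for xb in range(max_units + 1)),
               key=lambda m: weighted_deviation(*m))
```

In a full goal-programming model, each absolute deviation is split into over- and under-achievement variables and the weighted sum is minimized with a linear-programming solver; the enumeration above just makes the objective concrete.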