Most companies use social media data for business purposes, and sentiment analysis automatically gathers, analyzes, and summarizes this type of data. Managing unstructured social media data is difficult, and noisy data poses a challenge to sentiment analysis. Since data pre-processing accounts for more than 50% of the sentiment analysis process, processing big social media data is also challenging. If pre-processing is carried out correctly, data accuracy may improve, and the sentiment analysis workflow depends heavily on it. Because no pre-processing technique works well in every situation or with every data source, choosing the most important techniques is crucial, and prioritization is an effective way to do so. As one of many Multi-Criteria Decision Making (MCDM) methods, the Analytic Hierarchy Process (AHP) is preferred for handling complicated decision-making problems involving several criteria. Consistency Ratio (CR) scores were used to examine the pair-wise comparisons and evaluate the AHP. This study used two judgment scales, first the Saaty scale (SS) and then the Generalized Balanced Scale (GBS), to obtain the most consistent judgments and to investigate whether the choice of AHP judgment scale affects decision-making. The main criteria for prioritizing pre-processing techniques in sentiment analysis are Punctuation, Spelling, Number, and Context, and each of these criteria contains sub-criteria. Pair-wise comparisons under GBS come closer to the required CR value than those under SS, while reducing the ratios between the alternatives' weights. This paper explains how AHP supports logical decision-making, and prioritizing pre-processing techniques with AHP can serve as a model for other sentiment analysis stages. In short, this paper adds another contribution to the Big Data Analytics domain.
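As an illustration of the consistency check described above, the following minimal sketch computes AHP priority weights from a pair-wise comparison matrix and derives the Consistency Ratio. The four criteria names come from the study, but the judgment values in the matrix below are placeholders, not the study's actual data.

```python
import numpy as np

# Illustrative 4x4 pairwise comparison matrix for the four main criteria
# (Punctuation, Spelling, Number, Context). The judgment values are made-up
# placeholders on the Saaty 1-9 scale, not the study's reported data.
A = np.array([
    [1.0, 3.0, 5.0, 1/3],
    [1/3, 1.0, 3.0, 1/5],
    [1/5, 1/3, 1.0, 1/7],
    [3.0, 5.0, 7.0, 1.0],
])

# Priority weights: principal eigenvector of A, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
idx = np.argmax(eigvals.real)
weights = eigvecs[:, idx].real
weights = weights / weights.sum()

# Consistency Index (CI) and Consistency Ratio (CR = CI / RI),
# using Saaty's Random Index for n = 4.
n = A.shape[0]
lambda_max = eigvals.real[idx]
ci = (lambda_max - n) / (n - 1)
ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]
cr = ci / ri

print("weights:", np.round(weights, 3))
print("CR:", round(cr, 3))  # judgments are usually accepted when CR <= 0.10
```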
Crime is a threat to any nation's security administration and jurisdiction. Crime analysis therefore becomes increasingly important, because it identifies the time and place of crimes from the collected spatial and temporal data. However, older techniques, such as paperwork, investigative judges, and statistical analysis, are not efficient enough to predict accurately when and where a crime will take place. When machine learning and data mining methods were deployed in crime analysis, analysis and prediction accuracy increased dramatically. In this study, various types of crime analysis and prediction using several machine learning and data mining techniques, based o…
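For readers unfamiliar with the kind of supervised setup such studies evaluate, the following minimal sketch trains a generic classifier on hypothetical spatio-temporal features (latitude, longitude, hour, weekday). The data and feature choices are illustrative assumptions, not taken from any of the surveyed papers.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical spatio-temporal features and synthetic crime-category labels (0-2).
rng = np.random.default_rng(0)
X = np.column_stack([
    rng.uniform(33.2, 33.4, 1000),   # latitude
    rng.uniform(44.3, 44.5, 1000),   # longitude
    rng.integers(0, 24, 1000),       # hour of day
    rng.integers(0, 7, 1000),        # day of week
])
y = rng.integers(0, 3, 1000)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```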
Steganography can be defined as the art and science of hiding information inside computer-readable data so that the stego-cover cannot be distinguished from the original, whether by eye or by statistical inspection on a computer. This paper presents a new method to hide text within text characters. The systematic method uses invisible characters to hide and extract secret texts. The creation of the secret message comprises the following main stages: using the letters of the original message, selecting a suitable cover text, dividing the cover text into blocks, hiding the secret text using the invisible character, and comparing the cover text with the stego-object. This study uses an invisible character (white space…
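As a rough illustration of invisible-character text steganography, the sketch below encodes a secret message as zero-width characters appended to the cover text. The specific bit-to-character mapping and the appending of the payload (rather than block-wise distribution) are simplifying assumptions, not the paper's exact scheme.

```python
# Zero-width characters used to encode bits invisibly inside cover text.
ZW0 = "\u200b"  # zero-width space      -> bit 0
ZW1 = "\u200c"  # zero-width non-joiner -> bit 1

def hide(cover: str, secret: str) -> str:
    """Append the secret message, encoded as invisible characters, to the cover text."""
    bits = "".join(f"{ord(c):08b}" for c in secret)
    payload = "".join(ZW1 if b == "1" else ZW0 for b in bits)
    return cover + payload

def extract(stego: str) -> str:
    """Recover the secret message from the invisible characters in the stego text."""
    bits = "".join("1" if c == ZW1 else "0" for c in stego if c in (ZW0, ZW1))
    return "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))

stego = hide("An ordinary looking sentence.", "hi")
assert stego.startswith("An ordinary")  # the visible cover text is unchanged
print(extract(stego))                   # -> "hi"
```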
Game theory has been applied to all situations where agents' (people or companies) actions are utility-maximizing, and its cooperative branch has proven to be a robust tool for creating effective collaboration strategies in a broad range of applications. In this paper, we first employ Banzhaf values to show the potential cost to waste producers in the case of cooperation and to reduce the overall cost of processing non-recyclable waste when producers cooperate. Secondly, we apply the methodology to a case study of waste management by five waste producers at the Al-Mahmudiya factory, with the aim of displaying the potential cost to waste producers in the case of cooperatio…
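A minimal sketch of how Banzhaf values can be computed for a cooperative cost game is given below. The five-producer cost function and its numbers are hypothetical placeholders, not the Al-Mahmudiya factory data.

```python
from itertools import combinations

def banzhaf_values(players, v):
    """Banzhaf value of each player: average marginal contribution over the
    2^(n-1) coalitions that do not contain that player."""
    n = len(players)
    values = {}
    for i in players:
        others = [p for p in players if p != i]
        total = 0.0
        for r in range(len(others) + 1):
            for coalition in combinations(others, r):
                s = frozenset(coalition)
                total += v(s | {i}) - v(s)
        values[i] = total / 2 ** (n - 1)
    return values

# Hypothetical cost game for five waste producers: stand-alone costs with a
# 10% saving per extra coalition member (placeholder numbers only).
stand_alone = {1: 100.0, 2: 80.0, 3: 120.0, 4: 60.0, 5: 90.0}

def cost(coalition):
    if not coalition:
        return 0.0
    discount = 1.0 - 0.1 * (len(coalition) - 1)
    return discount * sum(stand_alone[p] for p in coalition)

print(banzhaf_values(list(stand_alone), cost))
```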
Negative behavior has been studied as a social and psychological phenomenon that affects the performance and lives of workers inside and outside the organization. This phenomenon is examined here in terms of the role of the organization's internal environment in addressing such behavior, since these variables belong to the field of organizational behavior and their effects need to be observed in Iraqi organizations, whose particularities differ from those of other Arab and foreign environments. Therefore, this study focused on testing the relationship between the organization's internal environment and its role in addressing the negative behavior of workers.
Due to the developments taking place in the fields of communications, information systems, and knowledge management in the current century, and the obligations and burdens imposed on business organizations to keep pace with these developments, traditional methods of administrative decision-making are no longer adequate. Recent trends in management therefore emphasize the need to rely on quantitative methods such as operations research. The latter is one of the outcomes of World War II, where it first appeared in Britain for managing war operations, and the first method used in this field was linear programming. The use of operations research has grown greatly in recent years, and the methods of analysis in…
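As a minimal illustration of the linear programming method mentioned above, the sketch below solves a toy resource-allocation problem with scipy.optimize.linprog; the objective and constraint coefficients are made up for illustration.

```python
from scipy.optimize import linprog

# Toy resource-allocation LP (made-up coefficients): maximize 3x + 5y
# subject to  x + 2y <= 14,  3x - y >= 0,  x - y <= 2,  x, y >= 0.
# linprog minimizes, so the objective is negated.
c = [-3, -5]
A_ub = [[1, 2],    # x + 2y <= 14
        [-3, 1],   # 3x - y >= 0  ->  -3x + y <= 0
        [1, -1]]   # x - y <= 2
b_ub = [14, 0, 2]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)], method="highs")
print("optimal x, y:", res.x, "objective:", -res.fun)
```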
This research studies empowerment as an independent variable, with the dimensions of training and improvement, incentives, information sharing, trust, and delegation, and the performance of the service organization as a dependent variable, with the dimensions of improving work efficiency, building core competencies, focusing on the service beneficiary, increasing employee satisfaction, and commitment to organizational support. The research is based on the opinions of a sample of 75 service officers of the Ministry of Interior who work at the General Directorate of Traffic. The research problem has been identified by t…
Simulation experiments are a means of solving problems in many fields. Simulation is the process of designing a model of a real system in order to follow it and identify its behavior through certain models and formulas, implemented in software and repeated for a number of iterations. The aim of this study is to build a model that deals with behavior suffering from heteroskedasticity by studying the APGARCH and NAGARCH models, using Gaussian and non-Gaussian distributions for different sample sizes (500, 1000, 1500, 2000), through the stages of time series analysis (identification, estimation, diagnostic checking, and prediction). The data were generated using the parameter estimates resulting f…
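For a concrete picture of the volatility models studied, the following sketch simulates an asymmetric power GARCH (APGARCH(1,1)) series with Gaussian innovations in plain NumPy. The parameter values are illustrative, not the study's estimates, and a non-Gaussian variant could be obtained by swapping the innovation distribution.

```python
import numpy as np

def simulate_apgarch(n, omega=0.05, alpha=0.1, gamma=0.3, beta=0.85, delta=1.5, seed=0):
    """Simulate an APGARCH(1,1) series with Gaussian innovations:
       sigma_t**delta = omega + alpha * (|eps_{t-1}| - gamma * eps_{t-1})**delta
                              + beta * sigma_{t-1}**delta
    Parameter values here are illustrative only."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n)          # swap for a t-distribution for a non-Gaussian variant
    sigma_d = np.empty(n)               # holds sigma_t ** delta
    eps = np.empty(n)
    sigma_d[0] = omega / (1 - beta)     # crude unconditional starting value
    eps[0] = sigma_d[0] ** (1 / delta) * z[0]
    for t in range(1, n):
        sigma_d[t] = (omega
                      + alpha * (abs(eps[t - 1]) - gamma * eps[t - 1]) ** delta
                      + beta * sigma_d[t - 1])
        eps[t] = sigma_d[t] ** (1 / delta) * z[t]
    return eps

series = simulate_apgarch(1000)
print(series[:5])
```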