Today, with the increasing use of social media, many researchers have become interested in topic extraction from Twitter. Tweets are short, unstructured, and noisy, which makes deriving topics from them difficult. Topic modeling algorithms such as Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) were originally designed to derive topics from large documents such as articles and books, and they are often less effective when applied to short texts like tweets. Fortunately, Twitter offers many features that capture interaction between users, and tweets carry rich user-generated hashtags that serve as keywords. In this paper, we exploit hashtags to improve the topics learned from Twitter content without modifying the underlying LSA and LDA models, on the assumption that users who share the same hashtag mostly discuss the same topic. We compare the performance of the two methods (LSA and LDA) using topic coherence, with and without hashtags. Experiments on the Twitter dataset show that LSA achieves a better coherence score with hashtags than without them, whereas LDA achieves a better coherence score without incorporating hashtags. Overall, LDA has a better coherence score than LSA: the best coherence score obtained with LDA was 0.6047 versus 0.4744 for LSA, although LDA required a larger number of topics than LSA, so LDA may assign tweets discussing the same subject to different clusters.
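A minimal sketch of the hashtag-pooling idea using gensim is shown below. The toy tweets, preprocessing, and pooling rule are illustrative assumptions rather than the paper's exact pipeline, but the c_v coherence comparison mirrors the evaluation described above.

```python
# Sketch: pool tweets sharing a hashtag into pseudo-documents, then compare
# LSA and LDA with the same coherence measure. Data here is hypothetical.
from collections import defaultdict
from gensim.corpora import Dictionary
from gensim.models import LdaModel, LsiModel, CoherenceModel

tweets = [  # hypothetical pre-tokenized tweets with their hashtags
    (["vaccine", "rollout", "delayed"], ["covid"]),
    (["new", "variant", "detected"], ["covid"]),
    (["stocks", "rally", "tech"], ["markets"]),
]

# Pool tweets that share a hashtag into one pseudo-document.
pools = defaultdict(list)
for tokens, tags in tweets:
    for tag in tags:
        pools[tag].extend(tokens)
texts = list(pools.values())

dictionary = Dictionary(texts)
corpus = [dictionary.doc2bow(t) for t in texts]

lda = LdaModel(corpus, num_topics=2, id2word=dictionary, random_state=0)
lsa = LsiModel(corpus, num_topics=2, id2word=dictionary)

# Evaluate both models with c_v topic coherence.
for name, model in [("LDA", lda), ("LSA", lsa)]:
    cm = CoherenceModel(model=model, texts=texts,
                        dictionary=dictionary, coherence="c_v")
    print(name, cm.get_coherence())
```

On a real corpus the same loop would run over many hashtag pools; with only a few toy documents the coherence values are not meaningful, but the pooling step is the part the paper's approach adds.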
This study analyzes the Twitter messages of a number of global news outlets in order to clarify their reporting tactics, subjects, and focus during the crisis caused by the spread of the Covid-19 virus. The study sample was chosen purposively to provide descriptive results: three news sites were selected, two of the most followed, professional, and famous international news sites, the New York Times and the Guardian, and one Arab news site, the Al-Arabiya channel.
A total of 18,085 tweets from the three accounts were analyzed over the period from 1/3/2020 to 8/4/2020, and a content analysis form was used to analyze the content of the news coverage. The results indicate an increase in th…
In a drilling fluid program, the first objective is to select the drilling fluid that will reduce lost time and be economical regardless of its cost. The amount and type of solids in the drilling fluid are the primary controls on its rheological and filtration properties. Palygorskite clay (attapulgite) is an active solid that can react with its environment and form a gel structure within a fluid, and because it remains stable in the presence of brines and electrolytes, this type of clay is preferred. The aim of this study is to improve the properties of Iraqi palygorskite (PAL) by adding different chemical additives, such as caustic soda (NaOH) and soda ash (Na2CO3), at different con…
The primary issue addressed in this research is identifying the interactive elements provided by the Twitter platform and understanding how digital newspapers with official accounts, which broadcast content in line with their editorial policies, make use of them. The study falls within descriptive research and employed a survey method with a content analysis tool; the methodology relies on the "how was it said?" approach to categorize the analysis. The research yielded the following results:
The newspapers utilized numerous interactive elements in disseminating tweets, including "text, branching links, hashtags, digital images, digital videos, digital audio, and digital polls." However, thes…
In this paper, we propose two new predictor-corrector methods for solving Kepler's equation in the hyperbolic case using a quadrature formula, which plays an important role in the evaluation of the integrals. The two procedures solve the hyperbolic orbit equation in two or three iterations, in a very efficient manner and to an accuracy that proves to be consistently better than 10⁻¹⁵. The solution is examined over a range of hyperbolic mean anomalies and grid sizes, using first guesses for the hyperbolic eccentric anomaly expressed in terms of e and M, where e is the eccentricity and M is the hyperbolic mean anomaly.
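For context, a minimal sketch of a predictor-corrector scheme for the hyperbolic Kepler equation M = e·sinh(H) − H is given below. The logarithmic starter and plain Newton corrector are common choices from the literature and stand in for the paper's quadrature-based procedures, which are not reproduced here.

```python
# Sketch (not the paper's exact scheme): predictor = rough starter,
# corrector = Newton iterations on f(H) = e*sinh(H) - H - M.
import math

def solve_hyperbolic_kepler(M, e, tol=1e-15, max_iter=10):
    # Predictor: a common logarithmic initial guess for H.
    H = math.log(2.0 * abs(M) / e + 1.8)
    if M < 0:
        H = -H
    # Corrector: Newton iterations until the update is below tolerance.
    for _ in range(max_iter):
        f = e * math.sinh(H) - H - M
        fp = e * math.cosh(H) - 1.0
        dH = f / fp
        H -= dH
        if abs(dH) < tol:
            break
    return H

H = solve_hyperbolic_kepler(M=5.0, e=1.5)
print(H, 1.5 * math.sinh(H) - H)  # residual check: second value should be ~5.0
```

A well-chosen predictor is what keeps the iteration count at two or three: the closer the starter is to the root, the faster the quadratic convergence of the corrector reaches machine precision.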
The evolution of cryptography has been crucial to preserving sensitive information in the digital age. From the early cipher algorithms of ancient societies to modern cryptographic methods, cryptography has developed alongside advances in computing. Growing cyber threats and the expansion of digital communications have highlighted the importance of selecting effective and robust cryptographic techniques. This article reviews various cryptographic algorithms, covering symmetric-key and asymmetric-key cryptography, and evaluates them according to security strength, complexity, and execution speed. The main outcomes demonstrate the growing reliance on elliptic curve cryptography owing to its capabi…
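To illustrate the elliptic-curve techniques the review highlights, here is a minimal ECDH key-exchange sketch using the Python `cryptography` package; the curve (SECP256R1) and the HKDF parameters are assumptions chosen for the example, not recommendations from the article.

```python
# Sketch: two parties derive a shared symmetric key via elliptic-curve
# Diffie-Hellman, then stretch it with HKDF. Parameters are illustrative.
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

# Each party generates a key pair on the same curve.
alice_private = ec.generate_private_key(ec.SECP256R1())
bob_private = ec.generate_private_key(ec.SECP256R1())

# Each side combines its private key with the peer's public key.
alice_shared = alice_private.exchange(ec.ECDH(), bob_private.public_key())
bob_shared = bob_private.exchange(ec.ECDH(), alice_private.public_key())
assert alice_shared == bob_shared  # both sides derive the same secret

# Derive a fixed-length symmetric key from the shared secret.
key = HKDF(algorithm=hashes.SHA256(), length=32,
           salt=None, info=b"demo").derive(alice_shared)
print(key.hex())
```

A 256-bit elliptic-curve key offers security comparable to an RSA modulus of roughly 3072 bits, which is the efficiency argument behind the growing reliance on ECC noted above.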
When optimizing the performance of neural network-based chatbots, the choice of optimizer is one of the most important decisions. Optimizers control the adjustment of model parameters such as weights and biases to minimize a loss function during training. Adaptive optimizers such as ADAM have become a standard choice and are widely used because their parameter update magnitudes are invariant to gradient scale variations, but they often pose generalization problems. Alternatively, Stochastic Gradient Descent (SGD) with momentum, and ADAMW, an extension of ADAM, offer several advantages. This study aims to compare and examine the effects of these optimizers on the chatbot CST dataset. The effectiveness of each optimizer is evaluat…
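A minimal sketch of swapping between the three optimizers in PyTorch is shown below; the toy model, learning rates, and weight-decay value are illustrative assumptions, and the CST chatbot dataset is replaced by dummy tensors.

```python
# Sketch: one training step under each optimizer compared in the study.
import torch
import torch.nn as nn

def make_model():
    return nn.Linear(16, 4)  # stand-in for a chatbot classification head

configs = {
    "SGD+Momentum": lambda p: torch.optim.SGD(p, lr=0.01, momentum=0.9),
    "ADAM":         lambda p: torch.optim.Adam(p, lr=0.001),
    "ADAMW":        lambda p: torch.optim.AdamW(p, lr=0.001, weight_decay=0.01),
}

loss_fn = nn.CrossEntropyLoss()
x = torch.randn(8, 16)         # dummy batch of features
y = torch.randint(0, 4, (8,))  # dummy labels

for name, make_opt in configs.items():
    model = make_model()                 # fresh parameters per optimizer
    opt = make_opt(model.parameters())
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()                           # each optimizer's own update rule
    print(name, float(loss))
```

Note that ADAMW differs from ADAM by decoupling weight decay from the gradient-based update, which is one reason it is often preferred for generalization.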
This study investigates the impact of spatial resolution enhancement on supervised classification accuracy using Landsat 9 satellite imagery, achieved through pan-sharpening techniques that leverage Sentinel-2 data. Various methods were employed to synthesize a panchromatic (PAN) band from Sentinel-2 data, including dimension-reduction algorithms and weighted averages based on correlation coefficients and standard deviation. Three pan-sharpening algorithms (Gram-Schmidt, Principal Components Analysis, and Nearest Neighbour Diffusion) were applied, and their efficacy was assessed using seven fidelity criteria. Classification tasks were performed using Support Vector Machine and Maximum Likelihood algorithms. The results reveal that specifi…
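A minimal sketch of one synthesis strategy mentioned above, a correlation-weighted average of Sentinel-2 bands, is given below; the random band arrays and the exact weighting rule are assumptions for illustration, not the study's precise formulation.

```python
# Sketch: synthesize a PAN band as a weighted average of Sentinel-2 bands,
# weighting each band by its correlation with a reference intensity image.
import numpy as np

rng = np.random.default_rng(0)
bands = rng.random((4, 256, 256))  # stand-ins for Sentinel-2 B2, B3, B4, B8

reference = bands.mean(axis=0)     # proxy intensity image
flat_ref = reference.ravel()

# Weight each band by its correlation with the reference intensity.
weights = np.array([np.corrcoef(b.ravel(), flat_ref)[0, 1] for b in bands])
weights /= weights.sum()           # normalize weights to sum to 1

pan = np.tensordot(weights, bands, axes=1)  # synthesized PAN band
print(pan.shape, weights.round(3))
```

The synthesized PAN band would then feed a pan-sharpening algorithm such as Gram-Schmidt, with fidelity criteria comparing the sharpened product against the original multispectral bands.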