Articles
Twitter Location-Based Data: Evaluating the Methods of Data Collection Provided by the Twitter API

Twitter data analysis is an emerging field of research that uses data collected from Twitter to address issues such as disaster response, sentiment analysis, and demographic studies. Successful analysis relies on collecting accurate data that is representative of the studied group or phenomenon. Many Twitter analysis applications depend on the locations of the users sending the tweets, but this information is not always available. There have been several attempts at estimating the location-based aspects of a tweet, but little work on investigating the location-focused data collection methods themselves. In this paper, we investigate the two methods the Twitter API provides for obtaining location-based data: Twitter places and the geocode parameter. We study both methods to determine their accuracy and their suitability for research. The study concludes that the places method is more accurate but excludes much of the data, while the geocode method provides more data but requires special attention to outliers.
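As a rough illustration of the two collection methods compared above, the sketch below queries the v1.1 Twitter search endpoint once with the geocode parameter and once with the place: search operator. The bearer token, coordinates, and place ID are placeholders, and the place: operator's availability depends on the API tier; this is a minimal sketch, not the paper's collection pipeline.

```python
import requests

BEARER_TOKEN = "YOUR_BEARER_TOKEN"  # hypothetical credential placeholder
SEARCH_URL = "https://api.twitter.com/1.1/search/tweets.json"
HEADERS = {"Authorization": f"Bearer {BEARER_TOKEN}"}

def search_by_geocode(lat, lon, radius_km, query=""):
    """Geocode method: a circle 'lat,lon,radius'. When a tweet carries no GPS
    coordinate, Twitter falls back to the user's profile location, which is
    the source of the outliers discussed above."""
    params = {"q": query, "geocode": f"{lat},{lon},{radius_km}km", "count": 100}
    resp = requests.get(SEARCH_URL, params=params, headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return resp.json()["statuses"]

def search_by_place(place_id, query=""):
    """Places method: only tweets explicitly tagged with a Twitter place.
    More accurate, but most tweets carry no place tag and are excluded."""
    params = {"q": f"{query} place:{place_id}".strip(), "count": 100}
    resp = requests.get(SEARCH_URL, params=params, headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return resp.json()["statuses"]

# Hypothetical usage: tweets within 15 km of central Baghdad
# tweets = search_by_geocode(33.3152, 44.3661, 15)
```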

Publication Date
Thu Mar 29 2018
Journal Name
Construction Research Congress 2018
Validation of Time-Safety Influence Curve Using Empirical Safety and Injury Data—Poisson Regression

Publication Date
Wed Jan 31 2024
Journal Name
Iraqi Geological Journal
Estimation of Rock Mechanical Properties of the Hartha Formation and their Relationship to Porosity Using Well-Log Data

The physical and elastic characteristics of rocks largely determine rock strength. Rock strength is frequently assessed using porosity well logs such as neutron and sonic logs. The essential parameters for estimating rock mechanics in petroleum engineering research are uniaxial compressive strength and elastic modulus, and indirect estimation from well-log data is necessary to measure them. This study attempts to create a single regression model that can accurately forecast rock mechanical properties for the Hartha carbonate formation in the Fauqi oil field. According to the findings of this study, petrophysical parameters are reliable indexes for determining rock mechanical properties, with good performance…
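A minimal sketch of the kind of single regression model described above, fitting uniaxial compressive strength (UCS) to sonic transit time and neutron porosity by ordinary least squares. All numbers are invented for illustration; the paper's actual model and coefficients are not reproduced here.

```python
import numpy as np

# Hypothetical log readings: sonic transit time (us/ft) and neutron porosity (fraction)
dt = np.array([55.0, 60.0, 65.0, 72.0, 80.0])
phi = np.array([0.08, 0.12, 0.15, 0.20, 0.26])
ucs = np.array([95.0, 80.0, 68.0, 52.0, 38.0])  # assumed lab-measured UCS, MPa

# Single multiple-linear-regression model: UCS = a*dt + b*phi + c
X = np.column_stack([dt, phi, np.ones_like(dt)])
coef, *_ = np.linalg.lstsq(X, ucs, rcond=None)
print("coefficients (a, b, c):", coef)
print("predicted UCS (MPa):", X @ coef)
```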

Publication Date
Tue Mar 08 2022
Journal Name
Multimedia Tools And Applications
Comparison study on the performance of the multi classifiers with hybrid optimal features selection method for medical data diagnosis

Publication Date
Fri Jul 12 2024
Journal Name
World Water Policy
The effect of natural factors on changing soil uses in the marshes: An experimental study using Landsat satellite data

The study analyzed the effect of meteorological factors (rainfall rate and temperature) on the change in land use in the marshes of the Al‐Majar Al‐Kabir region in southern Iraq. Satellite images from Landsat 7 for 2012 and Landsat 8 for 2022 were used to monitor changes in land cover; the images were taken by the Enhanced Thematic Mapper Plus (ETM+) and Operational Land Imager (OLI) sensors of the Landsat satellites. Geometric correction in ArcMap 10.5 was used to give the images precise geographic coordinates. The maximum likelihood classification method, a supervised approach, was used to classify the satellite image data, and the results were analyzed statistically. We obtained clear images of the area…
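For concreteness, the sketch below implements a generic Gaussian maximum-likelihood classifier, the supervised method named above, on made-up two-band pixel values. The class statistics, band values, and class labels are assumptions; a real workflow would train on Landsat bands and training polygons inside GIS tooling.

```python
import numpy as np

def ml_classify(pixels, class_means, class_covs):
    """Assign each pixel to the class with the highest Gaussian log-likelihood."""
    scores = []
    for mu, cov in zip(class_means, class_covs):
        inv = np.linalg.inv(cov)
        _, logdet = np.linalg.slogdet(cov)
        d = pixels - mu
        # log N(x | mu, cov) up to an additive constant
        scores.append(-0.5 * (np.einsum("ij,jk,ik->i", d, inv, d) + logdet))
    return np.argmax(np.stack(scores), axis=0)

# Hypothetical training statistics for classes 0 = 'water', 1 = 'vegetation'
means = [np.array([0.10, 0.05]), np.array([0.30, 0.55])]
covs = [np.eye(2) * 0.01, np.eye(2) * 0.02]
pixels = np.array([[0.12, 0.07], [0.28, 0.50], [0.09, 0.04]])
print(ml_classify(pixels, means, covs))  # -> [0 1 0]
```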

Publication Date
Thu Sep 01 2011
Journal Name
Journal Of Economics And Administrative Sciences
Analysis of the Indicators of the Educational Process and Scientific Level Using the Analysis of Variance of Ordered Data in Repeated Measurements

In this research we analyze some indicators, and their classifications, related to the teaching process and the scientific level of graduate studies at the university, using analysis of variance for ranked data with repeated measurements instead of the ordinary analysis of variance. We reach several conclusions about the important classifications of each indicator that affect the teaching process.
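One widely used rank-based analogue of repeated-measures analysis of variance is the Friedman test; the sketch below applies it to invented scores for one indicator measured over three periods. It illustrates the general idea only, not the paper's exact procedure.

```python
from scipy.stats import friedmanchisquare

# Hypothetical: five departments rated on the same indicator in three academic years
year1 = [4, 3, 5, 2, 4]
year2 = [3, 3, 4, 2, 5]
year3 = [5, 4, 5, 3, 4]

# Friedman test: rank-based ANOVA for repeated measurements on the same subjects
stat, p = friedmanchisquare(year1, year2, year3)
print(f"Friedman chi-square = {stat:.3f}, p = {p:.3f}")
```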

Publication Date
Fri Jul 21 2023
Journal Name
Journal Of Engineering
A Modified 2D-Checksum Error Detecting Method for Data Transmission in Noisy Media

In data transmission, a change in a single bit of the received data may lead to misunderstanding or even disaster. Every bit of the transmitted information has high priority, especially fields such as the receiver's address, so detecting every single-bit change is a key issue in the data transmission field.
The ordinary single-parity detection method can detect an odd number of errors efficiently but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, show better results but still fail to cope with increasing numbers of errors.
Two novel methods are suggested to detect bit-change errors when transmitting data over noisy media: the 2D-checksum method…
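The baseline the modified method builds on is classic two-dimensional parity. Below is a minimal sketch with a made-up 3×4 bit block showing how row and column parities localize a single-bit error; the paper's modified 2D-checksum itself is not reproduced here.

```python
def parity_2d(block):
    """block: list of equal-length rows of bits. Returns (row parities, column parities)."""
    rows = [sum(r) % 2 for r in block]
    cols = [sum(c) % 2 for c in zip(*block)]
    return rows, cols

sent = [[1, 0, 1, 1],
        [0, 1, 1, 0],
        [1, 1, 0, 0]]
received = [[1, 0, 1, 1],
            [0, 1, 0, 0],   # single-bit error at row 1, column 2
            [1, 1, 0, 0]]

if parity_2d(sent) != parity_2d(received):
    r_s, c_s = parity_2d(sent)
    r_r, c_r = parity_2d(received)
    row = [i for i, (a, b) in enumerate(zip(r_s, r_r)) if a != b]
    col = [j for j, (a, b) in enumerate(zip(c_s, c_r)) if a != b]
    print(f"single-bit error detected at row {row}, column {col}")
```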

Publication Date
Fri Jan 01 2021
Journal Name
Journal Of Intelligent Systems
Void-hole aware and reliable data forwarding strategy for underwater wireless sensor networks
Reliable data transfer and energy efficiency are the essential considerations for network performance in resource-constrained underwater environments. One of the efficient approaches for data routing in underwater wireless sensor networks (UWSNs) is clustering, in which data packets are transferred from sensor nodes to the cluster head (CH). Data packets are then forwarded to a sink node in a single-hop or multi-hop manner, which can increase energy depletion of the CH compared to other nodes. While several mechanisms have been proposed for cluster formation and CH selection to ensure efficient delivery of data packets, less attention has been given to massive data co…
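To make the clustering idea concrete, here is a toy cluster-head election that scores nodes by residual energy and depth. The scoring weights and node values are assumptions for illustration, not the paper's forwarding strategy.

```python
from dataclasses import dataclass

@dataclass
class Node:
    node_id: int
    residual_energy: float  # joules remaining
    depth: float            # metres below surface (the sink sits at the surface)

def select_cluster_head(cluster, w_energy=0.7, w_depth=0.3):
    """Prefer nodes with more residual energy and smaller depth (closer to the sink)."""
    max_e = max(n.residual_energy for n in cluster)
    max_d = max(n.depth for n in cluster)
    def score(n):
        return w_energy * n.residual_energy / max_e + w_depth * (1 - n.depth / max_d)
    return max(cluster, key=score)

cluster = [Node(1, 80.0, 300.0), Node(2, 95.0, 450.0), Node(3, 60.0, 150.0)]
print("elected CH:", select_cluster_head(cluster).node_id)
```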
Publication Date
Mon May 15 2017
Journal Name
Journal Of Theoretical And Applied Information Technology
Anomaly Detection in Text Data Represented as a Graph Using the DBSCAN Algorithm

Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all data to the graph concept frame (CFG). As is well known, DBSCAN groups data points of the same kind into clusters, while points that fall outside the behavior of any cluster are treated as noise or anomalies: the algorithm can detect abnormal points that lie beyond a certain set threshold (extreme values). However, not all anomalies are of this kind, unusual or far from a specific group; there is a type of data point that does not occur repeatedly yet is considered abnormal with respect to the known group. The analysis showed that DBSCAN using the…
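For reference, the sketch below shows the standard DBSCAN convention of reading noise points (label -1) as anomalies, using scikit-learn on synthetic points; the paper's graph (CFG) representation of text is not reproduced here.

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
dense = rng.normal(loc=0.0, scale=0.3, size=(50, 2))  # one dense cluster
outliers = np.array([[3.0, 3.0], [-3.0, 2.5]])        # far-away anomalies
X = np.vstack([dense, outliers])

# Points not density-reachable from any cluster get label -1 (noise = anomaly)
labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(X)
print("anomaly indices:", np.where(labels == -1)[0])  # -> [50 51]
```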

Publication Date
Sun Jan 01 2017
Journal Name
Iraqi Journal Of Science
Strong Triple Data Encryption Standard Algorithm using Nth Degree Truncated Polynomial Ring Unit

Cryptography is the process of transforming a message to prevent unauthorized access to data. One of the main problems, and an important part of cryptography with secret-key algorithms, is the key: it plays the central role in achieving a higher level of secure communication, and both parties must hold a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weak due to its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong, since the encryption key determines Triple DES security. This paper proposes a combination of two efficient encryption algorithms to…
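For context, here is a minimal sketch of plain Triple DES encryption using PyCryptodome, i.e., the baseline whose key generation the paper strengthens; the NTRU-based key derivation itself is not shown.

```python
from Crypto.Cipher import DES3
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

# 24 random bytes = three independent 8-byte DES keys (parity bits fixed up)
key = DES3.adjust_key_parity(get_random_bytes(24))
iv = get_random_bytes(8)  # DES3 block size is 8 bytes

cipher = DES3.new(key, DES3.MODE_CBC, iv)
ciphertext = cipher.encrypt(pad(b"secret message", DES3.block_size))

decipher = DES3.new(key, DES3.MODE_CBC, iv)
print(unpad(decipher.decrypt(ciphertext), DES3.block_size))  # b'secret message'
```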

Publication Date
Wed Feb 06 2013
Journal Name
Eng. & Tech. Journal
A proposal to detect computer worms (malicious codes) using data mining classification algorithms

Malicious software (malware) performs malicious functions that compromise a computer system's security. Many methods have been developed to improve the security of computer system resources, among them firewalls, encryption, and Intrusion Detection Systems (IDS). An IDS can detect previously unrecognized attack attempts and raise an early alarm to inform the system about the suspicious intrusion attempt. This paper proposes a hybrid IDS for intrusion detection, especially of malware, that considers both network packet and host features. The hybrid IDS is designed using Data Mining (DM) classification methods, chosen for their ability to detect new, previously unseen intrusions accurately and automatically. It uses both anomaly and misuse dete…
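As a generic illustration of the classification step in such a hybrid IDS, the sketch below trains a decision tree on made-up packet/host features; the feature names and values are assumptions, not the paper's dataset.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Hypothetical feature columns: packets/sec, distinct destination ports, new processes spawned
X = np.array([[10, 2, 1], [12, 3, 0], [400, 90, 25], [350, 75, 30], [15, 1, 2]])
y = np.array([0, 0, 1, 1, 0])  # 0 = benign, 1 = worm-like behaviour

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(clf.predict([[380, 80, 20]]))  # -> [1], flagged as suspicious
```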
