Twitter Location-Based Data: Evaluating the Methods of Data Collection Provided by the Twitter API

Twitter data analysis is an emerging field of research that uses data collected from Twitter to address issues such as disaster response, sentiment analysis, and demographic studies. The success of the analysis relies on collecting accurate data that are representative of the studied group or phenomenon. Many Twitter analysis applications rely on collecting the locations of the users sending the tweets, but this information is not always available. Several studies have attempted to estimate location-based aspects of a tweet, yet little work has investigated the location-focused data collection methods themselves. In this paper, we investigate the two methods for obtaining location-based data provided by the Twitter API: Twitter places and the geocode parameter. We studied these methods to determine their accuracy and their suitability for research. The study concludes that the places method is more accurate but excludes much of the data, while the geocode method provides more data but requires special attention to outliers. Copyright © Research Institute for Intelligent Computer Systems, 2018. All rights reserved.
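
For illustration, the sketch below queries both collection routes through Twitter's v1.1 standard search endpoint, assumed here to be the API generation the paper evaluates; the bearer token, search term, coordinates, and place ID are placeholders rather than values from the study.

import requests

BEARER_TOKEN = "..."  # hypothetical credential, not part of the study
URL = "https://api.twitter.com/1.1/search/tweets.json"
HEADERS = {"Authorization": f"Bearer {BEARER_TOKEN}"}

# Route 1: the geocode parameter -- a "latitude,longitude,radius" circle.
geocode_params = {"q": "flood", "geocode": "33.3152,44.3661,50km", "count": 100}

# Route 2: the place operator -- tweets tagged with a specific Twitter place ID.
place_params = {"q": "flood place:00aabbccddeeff00", "count": 100}  # hypothetical ID

for label, params in [("geocode", geocode_params), ("place", place_params)]:
    resp = requests.get(URL, headers=HEADERS, params=params, timeout=30)
    tweets = resp.json().get("statuses", [])
    print(label, "->", len(tweets), "tweets returned")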

Publication Date: Tue May 30 2023
Journal Name: Iraqi Journal Of Science
A Review of Assured Data Deletion Security Techniques in Cloud Storage

Cloud computing is an attractive technology that offers customers convenient, on-demand network access based on their needs, with minimal maintenance and minimal interaction with cloud providers. Security has arisen as a serious concern, particularly in cloud computing, where data are stored in and accessed via the Internet from a third-party storage system. It is critical to ensure that data are accessible only to the appropriate individuals and that they are not retained in third-party locations. Because third-party services frequently make backup copies of uploaded data for security reasons, deleting the data the owner submitted does not guarantee its removal from the cloud. Cloud data storage …
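
One widely cited assured-deletion technique (not necessarily among the schemes surveyed here) is cryptographic erasure: the owner encrypts data before upload, so destroying the locally held key renders every cloud copy, including backups, unreadable. A minimal sketch using the Python cryptography package, with placeholder content:

from cryptography.fernet import Fernet

key = Fernet.generate_key()                  # kept by the data owner, never uploaded
ciphertext = Fernet(key).encrypt(b"patient records ...")  # placeholder payload

# upload_to_cloud(ciphertext)                # hypothetical upload step

# "Assured deletion": discard the key; any backup copies of the ciphertext
# held by the provider become permanently unreadable.
key = None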

Publication Date: Fri Apr 01 2022
Journal Name: Baghdad Science Journal
Improved Firefly Algorithm with Variable Neighborhood Search for Data Clustering

Among metaheuristic algorithms, population-based algorithms are explorative search algorithms superior to local search algorithms in exploring the search space to find globally optimal solutions. However, the primary downside of such algorithms is their low exploitative capability, which prevents expanding the neighborhood of the search space toward more optimal solutions. The firefly algorithm (FA) is a population-based algorithm that has been widely used in clustering problems. However, FA suffers from premature convergence when no neighborhood search strategies are employed to improve the quality of clustering solutions in the neighborhood region and to explore the global regions of the search space. …
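
For readers unfamiliar with FA, the sketch below shows only the standard firefly movement step, not the paper's full FA with variable neighborhood search; the parameter values are common illustrative defaults.

import numpy as np

def firefly_step(pop, fitness, alpha=0.2, beta0=1.0, gamma=1.0, rng=None):
    # One iteration of the canonical firefly movement (minimisation).
    rng = rng if rng is not None else np.random.default_rng()
    new_pop = pop.copy()
    for i in range(len(pop)):
        for j in range(len(pop)):
            if fitness[j] < fitness[i]:              # firefly j is brighter
                r2 = np.sum((pop[i] - pop[j]) ** 2)  # squared distance
                beta = beta0 * np.exp(-gamma * r2)   # attractiveness decays with distance
                step = alpha * (rng.random(pop.shape[1]) - 0.5)
                new_pop[i] = new_pop[i] + beta * (pop[j] - new_pop[i]) + step
    return new_pop

# In clustering, each firefly would encode a full set of cluster centres and
# fitness would be an intra-cluster distance measure.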

Publication Date: Wed May 25 2022
Journal Name: Iraqi Journal Of Science
Using Persistence Barcode to Show the Impact of Data Complexity on the Neural Network Architecture

It is well established that the initialization of architectural parameters has a great impact on the whole learning process, so knowing the mathematical properties of a dataset helps give a neural network architecture better expressivity and capacity. In this paper, five random samples of the Volve field dataset were taken. A training set was then specified, and the persistent homology of the dataset was calculated to show the impact of data complexity on the selection of the multilayer perceptron regressor (MLPR) architecture. The proposed method provides a well-rounded strategy for computing data complexity: it is a compound algorithm composed of the t-SNE method, the alpha-complexity algorithm, and a persistence barcode …
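
A rough sketch of the complexity-estimation pipeline outlined above (t-SNE embedding, alpha complex, persistence barcode), assuming the gudhi and scikit-learn libraries; the random array stands in for a Volve field sample and the bar-length threshold is arbitrary.

import numpy as np
import gudhi
from sklearn.manifold import TSNE

X = np.random.rand(200, 8)                     # placeholder for a Volve sample
emb = TSNE(n_components=2, random_state=0).fit_transform(X)

alpha = gudhi.AlphaComplex(points=emb)
simplex_tree = alpha.create_simplex_tree()
barcode = simplex_tree.persistence()           # list of (dimension, (birth, death))

# Long bars indicate persistent topological features, i.e. higher data complexity,
# which in turn motivates a larger MLPR architecture.
n_long = sum(1 for _, (b, d) in barcode if d != float("inf") and d - b > 0.1)
print("persistent features:", n_long)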

Publication Date: Wed Jun 26 2019
Journal Name: Iraqi Journal Of Science
Random Noise Attenuation Using the FDNAT Filter on Seismic Data in East Diwaniya, South-Eastern Iraq

The frequency-dependent noise attenuation (FDNAT) filter was applied to the 2D seismic line DE21 in East Diwaniya, south-eastern Iraq, to improve the signal-to-noise ratio. Applying FDNAT to the seismic data gave good results and removed a great deal of random noise. This processing helps enhance the picking of reflector signals, making later interpretation of the data easier. Quality control using spectrum analysis was used to verify the effectiveness of the FDNAT filter in removing random noise.
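
FDNAT itself is a commercial processing module, so the sketch below only illustrates the spectrum-analysis quality control mentioned above, applied to a synthetic trace passed through a generic band-pass filter; the sample interval and corner frequencies are assumed values.

import numpy as np
from scipy.signal import butter, filtfilt

dt = 0.002                                    # 2 ms sample interval (assumed)
t = np.arange(0, 4, dt)
trace = np.sin(2 * np.pi * 30 * t) + 0.5 * np.random.randn(t.size)  # signal + noise

b, a = butter(4, [10, 60], btype="band", fs=1 / dt)   # generic band-pass stand-in
filtered = filtfilt(b, a, trace)

# Amplitude spectra before and after filtering: the drop in energy outside the
# pass band is the quality-control evidence that random noise has been attenuated.
freqs = np.fft.rfftfreq(t.size, dt)
spec_before = np.abs(np.fft.rfft(trace))
spec_after = np.abs(np.fft.rfft(filtered))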

Publication Date: Sun Jul 31 2022
Journal Name: Iraqi Journal Of Science
A Review of Data Mining and Knowledge Discovery Approaches for Bioinformatics

This review explores the Knowledge Discovery in Databases (KDD) approach, which helps the bioinformatics domain progress efficiently, and illustrates its relationship with data mining. It is therefore important to highlight the advantages of Data Mining (DM) strategy management, such as its role in cost control (the principle of competitive intelligence), its role in information management, and its ability to discover hidden knowledge. However, there are many challenges, such as inaccurate, hand-written data and the analysis of large amounts of varied information to extract useful knowledge using DM strategies. These strategies have been successfully applied in several applications such as data …

Publication Date: Fri Nov 11 2022
Journal Name: Journal Of Accounting And Financial Studies (JAFS)
The initiative of the Central Bank of Iraq and its impact on some banking activities provided by specialized banks

Specialized banks provide banking activities to their customers at interest rates determined according to the approved bank policy, which is broadly similar across most banks. These services satisfy customers' financial needs and are, at the same time, a source of the banks' profits. However, these banks were also introduced to new services funded by the Central Bank initiative launched at the beginning of 2016 to address the economic stagnation that befell the country due to the financial and security crisis it faced in 2014. The initiatives amounted to nearly 15 trillion dinars, put forward through private commercial and Islamic banks and specialized …

Publication Date: Fri Apr 30 2021
Journal Name: Iraqi Journal Of Science
Iris Identification Based on the Fusion of Multiple Methods

Iris recognition occupies an important rank among biometric approaches as a result of its accuracy and efficiency. The aim of this paper is to propose a developed system for iris identification based on the fusion of the scale-invariant feature transform (SIFT) and local binary patterns (LBP) for feature extraction. Several steps were applied. Firstly, the input image was converted to grayscale. Secondly, the iris was localized using the circular Hough transform. Thirdly, the iris region was normalized into polar coordinates using Daugman's rubber-sheet model, followed by histogram equalization to enhance the iris region. Finally, the features were extracted by utilizing the scale-invariant feature transform …
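
A condensed sketch of the stages listed above (grayscale conversion, circular Hough localisation, enhancement, SIFT and LBP feature extraction), assuming OpenCV and scikit-image; "iris.png" is a placeholder path, the Hough and LBP parameters are illustrative, and the Daugman rubber-sheet normalisation is only indicated by a comment.

import cv2
from skimage.feature import local_binary_pattern

img = cv2.imread("iris.png")                  # placeholder input image
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# Circular Hough transform to localise the pupil/iris boundaries.
circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1, minDist=50,
                           param1=100, param2=30, minRadius=20, maxRadius=120)

# (Daugman rubber-sheet normalisation would map the iris annulus to polar
#  coordinates here.)

enhanced = cv2.equalizeHist(gray)             # histogram equalisation

sift = cv2.SIFT_create()
keypoints, sift_desc = sift.detectAndCompute(enhanced, None)
lbp = local_binary_pattern(enhanced, P=8, R=1, method="uniform")

# The paper fuses the SIFT and LBP descriptors before matching.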

Publication Date: Fri Jan 01 2021
Journal Name: International Journal Of Agricultural And Statistical Sciences
Dynamic Modeling for Discrete Survival Data by Using Artificial Neural Networks and Iteratively Weighted Kalman Filter Smoothing with Comparison

Survival analysis is widely applied to data describing the lifetime of an item until the occurrence of an event of interest, such as death or another event under study. The purpose of this paper is to use a dynamic approach within the deep-learning neural network method, in which a dynamic neural network is built to suit the nature of discrete survival data and time-varying effects. This network is trained with the Levenberg-Marquardt (L-M) algorithm, and the method is called the Proposed Dynamic Artificial Neural Network (PDANN). A comparison was then made with another method that depends entirely on the Bayesian methodology, called the Maximum A Posteriori (MAP) method, which was carried out using numerical algorithms …
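
The PDANN itself (trained with Levenberg-Marquardt) is not reproduced here; the sketch below only illustrates the generic discrete-time survival setup such a network consumes, expanding each subject into person-period rows and modelling the discrete hazard with a small scikit-learn network. The toy data are invented.

import numpy as np
from sklearn.neural_network import MLPClassifier

# Toy data: observed time, event indicator (1 = event, 0 = censored), covariate.
times = np.array([3, 5, 2, 4])
events = np.array([1, 0, 1, 1])
x = np.array([0.2, 1.5, -0.7, 0.9])

rows, labels = [], []
for t_i, e_i, x_i in zip(times, events, x):
    for t in range(1, t_i + 1):               # one row per period the subject is at risk
        rows.append([t, x_i])
        labels.append(1 if (e_i == 1 and t == t_i) else 0)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(np.array(rows), np.array(labels))     # predicted probability = discrete hazard h(t | x)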

Publication Date: Fri Jan 01 2021
Journal Name: Review Of International Geographical Education
Evaluating the performance of project management using network diagrams methods: A case study in the Ramadi Municipality

This study was motivated by the fact that some project administrations still do not follow the appropriate scientific methods that would enable them to perform their work in a manner that achieves the goals for which their projects were established, in addition to exceeding planned times and costs. The study therefore aims to apply network diagram methods to the planning, scheduling, and monitoring of the Alzeuot intersection bridge construction project in the city of Ramadi, taken as the research sample; it is one of the strategic projects being implemented in Ramadi and one that faced several problems during its implementation. The project's problems were studied according to scientific methods through the application …
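
As a minimal illustration of the network-diagram (critical path) calculation referred to above, the sketch below performs a CPM forward pass over a toy activity-on-node graph using networkx; the activities and durations are invented, not the bridge project's actual data.

import networkx as nx

durations = {"excavation": 10, "foundations": 20, "piers": 25, "deck": 30, "paving": 7}
G = nx.DiGraph()
G.add_edges_from([("excavation", "foundations"), ("foundations", "piers"),
                  ("piers", "deck"), ("deck", "paving")])

# CPM forward pass: earliest finish time of each activity.
earliest_finish = {}
for node in nx.topological_sort(G):
    earliest_start = max((earliest_finish[p] for p in G.predecessors(node)), default=0)
    earliest_finish[node] = earliest_start + durations[node]

project_duration = max(earliest_finish.values())
print("planned project duration:", project_duration, "days")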

Publication Date: Sat Sep 30 2023
Journal Name: Iraqi Journal Of Science
Review of Challenges and Solutions for Genomic Data Privacy-Preserving

The dramatic decrease in the cost of genome sequencing over the last two decades has led to an abundance of genomic data. These data have been used in research related to the discovery of genetic diseases and the development of medicines. At the same time, the large storage requirement of a genome (2–3 GB) has led to its being considered one of the most important sources of big data, which has prompted research centers concerned with genetic research to take advantage of the cloud and its services for storing and managing these data. The cloud is a shared storage environment, which makes data stored in it vulnerable to unwanted tampering or disclosure. This leads to serious concerns about securing such data from tampering and unauthorized …
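
As a baseline illustration of the concern raised above (and not one of the schemes the review surveys), authenticated encryption of a genomic file before upload prevents both disclosure and silent tampering of the cloud copy; a minimal sketch with the Python cryptography package, where the file name and contents are placeholders.

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)     # remains with the data owner
nonce = os.urandom(12)
plaintext = b"...VCF records..."              # placeholder for genomic file contents

# The associated data binds the ciphertext to its file name; any modification of
# the stored copy makes decryption fail, so tampering is detected, and without
# the key the copy discloses nothing.
ciphertext = AESGCM(key).encrypt(nonce, plaintext, b"sample.vcf")
recovered = AESGCM(key).decrypt(nonce, ciphertext, b"sample.vcf")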
