Data Analytics and Blockchain: A Review

Blockchain technology relies on cryptographic techniques that provide various advantages, such as trustworthiness, collaboration, organization, identification, integrity, and transparency. Meanwhile, data analytics refers to the process of using techniques to analyze big data and understand the relationships between data points in order to draw meaningful conclusions. The field of data analytics in Blockchain is relatively new, and few studies have examined the challenges involved in Blockchain data analytics. This article presents a systematic analysis of how data analytics affects Blockchain performance, with the aim of investigating the current state of Blockchain-based data analytics techniques across research fields and exploring how specific features of this new technology may transform traditional business methods. The primary objectives of this study are to summarize the significant Blockchain techniques used thus far, identify current challenges and barriers in this field, determine the limitations of each paper that could guide future development, and assess the extent to which Blockchain and data analytics have been effectively combined to evaluate performance objectively. Moreover, through our review we aim to identify potential future research paths and suggest new criteria in this burgeoning discipline.

Index Terms: Blockchain, Distributed Database, Distributed Consensus, Data Analytics, Public Ledger.

Publication Date: Sun Sep 24 2023
Journal: Journal of Al-Qadisiyah for Computer Science and Mathematics
Iris Data Compression Based on Hexa-Data Coding

Iris research focuses on developing techniques for identifying and locating relevant biometric features, with accurate segmentation and efficient computation, while lending themselves to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics, which in turn reduces the effectiveness of the system when used as a real-time system. This paper introduces a novel parameterized technique for iris segmentation. The method is based on a number of steps, starting from converting the grayscale eye image to a bit-plane representation, followed by selection of the most significant bit planes and a parameterization of the iris location, resulting in an accurate segmentation of the iris from the origin…
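The bit-plane step described above can be sketched as follows. This is an illustrative reconstruction, not the paper's code: an 8-bit grayscale image is split into 8 binary planes, and only the most significant planes (which carry the coarse iris/pupil structure) are retained.

```python
import numpy as np

def bit_planes(gray):
    """Return the 8 bit planes of an 8-bit grayscale image, MSB first."""
    planes = [(gray >> b) & 1 for b in range(7, -1, -1)]
    return np.stack(planes)

def keep_significant(gray, n_planes=2):
    """Reconstruct the image from its n most significant bit planes."""
    mask = ~np.uint8((1 << (8 - n_planes)) - 1)  # e.g. n=2 -> 0b11000000
    return gray & mask

# tiny 2x2 example image
img = np.array([[200, 37], [129, 64]], dtype=np.uint8)
print(bit_planes(img)[0])        # MSB plane: 1 where pixel >= 128
print(keep_significant(img, 2))  # coarse approximation from 2 planes
```

Keeping only the top planes quantizes the image to a few gray levels, which is what makes the subsequent parameterization of the iris boundary cheap enough for real-time use.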

Publication Date: Thu Dec 01 2022
Journal: Iraqi Journal of Statistical Sciences
Use The Coiflets and Daubechies Wavelet Transform To Reduce Data Noise For a Simple Experiment

In this research, a simple experiment in the field of agriculture was studied in terms of the effect of out-of-control noise, which arises for several reasons, including the effect of environmental conditions on the observations of agricultural experiments. The discrete wavelet transform was used, specifically the Coiflets transform of order 1 to 2 and the Daubechies transform of order 2 to 3, at two transform levels, (J-4) and (J-5), applying the hard, soft, and non-negative threshold rules. The wavelet transform methods were compared using real data from an experiment with 26 observations, implemented in a MATLAB program. The researcher concluded that…
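The three threshold rules compared in the abstract act on the wavelet detail coefficients. A minimal NumPy sketch of the rules themselves (not the paper's MATLAB code, and with an arbitrary coefficient vector and threshold):

```python
import numpy as np

def hard_threshold(c, t):
    """Keep coefficients with |c| > t, zero the rest."""
    return np.where(np.abs(c) > t, c, 0.0)

def soft_threshold(c, t):
    """Shrink surviving coefficients toward zero by t."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

def nonneg_garrote(c, t):
    """Non-negative garrote: shrink large coefficients by t**2 / c."""
    out = np.zeros_like(c, dtype=float)
    big = np.abs(c) > t
    out[big] = c[big] - t**2 / c[big]
    return out

coeffs = np.array([-3.0, -0.5, 0.2, 1.0, 4.0])
t = 1.5
print(hard_threshold(coeffs, t))  # [-3.  0.  0.  0.  4.]
print(soft_threshold(coeffs, t))  # [-1.5  0.   0.   0.   2.5]
print(nonneg_garrote(coeffs, t))  # [-2.25  0.  0.  0.  3.4375]
```

Hard thresholding preserves large coefficients exactly but can leave spurious spikes; soft thresholding biases them downward; the non-negative garrote is a compromise whose shrinkage vanishes as the coefficient grows.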

Publication Date: Fri Jan 01 2021
Journal: International Journal of Agricultural and Statistical Sciences
A novel SVR estimation of FIGARCH model and forecasting for white oil data in Iraq

The purpose of this paper is to model and forecast white oil prices during the period 2012-2019 using GARCH-class volatility models. After showing that the squared returns of white oil exhibit significant long memory in volatility, fractional GARCH models for the return series are estimated and forecasted, for both the mean and the volatility, by quasi-maximum likelihood (QML) as a traditional method, while the competing approach uses machine learning in the form of Support Vector Regression (SVR). The most appropriate model among the candidates for forecasting volatility was selected on the basis of the lowest Akaike and Schwarz information criterion values, with the requirement that the parameters be significant. In addition, the residuals…
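The long-memory diagnostic mentioned above rests on the slowly decaying autocorrelation of squared returns. A small illustrative sketch on synthetic prices (not the paper's oil series):

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic price path standing in for the white oil series
prices = np.cumprod(1 + 0.01 * rng.standard_normal(500)) * 100

returns = np.diff(np.log(prices))   # log returns
sq = returns**2                     # squared returns proxy for volatility

def acf(x, lag):
    """Sample autocorrelation at a given lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

for lag in (1, 5, 20):
    print(f"ACF of squared returns at lag {lag}: {acf(sq, lag):+.3f}")
```

For a series with long memory in volatility, these autocorrelations stay positive and decay hyperbolically rather than geometrically, which is what motivates the fractional (FIGARCH-type) specification.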

Publication Date: Fri Jan 01 2021
Journal: IEEE Access
Keratoconus Severity Detection From Elevation, Topography and Pachymetry Raw Data Using a Machine Learning Approach

Publication Date: Mon Jan 01 2024
Journal: AIP Conference Proceedings
A multivariate Bayesian model using Gibbs sampler with real data application

In many scientific fields, Bayesian models are commonly used in recent research. This research presents a new Bayesian model for estimating parameters and forecasting using the Gibbs sampler algorithm. Posterior distributions are generated using the inverse gamma distribution and the multivariate normal distribution as prior distributions. The new method was used to investigate and summarize the posterior distribution in Bayesian statistics. The theory and derivation of the posterior distribution are explained in detail in this paper. The proposed approach is applied to three simulated datasets of 100, 300, and 500 sample sizes, and the procedure was also extended to a real dataset, the rock intensity dataset. The actual dataset is collected…
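A minimal Gibbs-sampler sketch under the priors named above: a normal prior on the mean and an inverse-gamma prior on the variance of univariate normal data. (The paper's multivariate case follows the same alternating scheme with a multivariate normal full conditional; the data and hyperparameters here are arbitrary stand-ins.)

```python
import numpy as np

rng = np.random.default_rng(1)
y = rng.normal(5.0, 2.0, size=200)   # synthetic data
n, ybar = y.size, y.mean()

mu0, tau2 = 0.0, 100.0               # N(mu0, tau2) prior on the mean
a0, b0 = 2.0, 2.0                    # IG(a0, b0) prior on the variance

mu, sig2 = 0.0, 1.0
draws = []
for _ in range(2000):
    # full conditional of mu | sig2, y is normal
    var = 1.0 / (n / sig2 + 1.0 / tau2)
    mean = var * (n * ybar / sig2 + mu0 / tau2)
    mu = rng.normal(mean, np.sqrt(var))
    # full conditional of sig2 | mu, y is inverse gamma
    a = a0 + n / 2.0
    b = b0 + 0.5 * np.sum((y - mu) ** 2)
    sig2 = 1.0 / rng.gamma(a, 1.0 / b)
    draws.append((mu, sig2))

post = np.array(draws[500:])         # discard burn-in
print("posterior mean of mu   :", post[:, 0].mean())
print("posterior mean of sig2 :", post[:, 1].mean())
```

Because each full conditional is a standard distribution, the sampler needs no tuning; the retained draws approximate the joint posterior of the mean and variance.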

Publication Date: Tue Mar 01 2022
Journal: International Journal of Nonlinear Analysis and Applications
The suggested threshold to reduce data noise for a factorial experiment

In this research, a 4*4 factorial experiment applied in a completely randomized block design with a certain number of observations was studied. The design of experiments is used to study the effect of treatments on experimental units and thus obtain data representing the experiment's observations. Applying these treatments under different environmental and experimental conditions causes noise that affects the observation values and thus increases the mean square error of the experiment. To reduce this noise, multiple wavelet reduction was used as a filter for the observations, by suggesting an improved threshold that takes the different transform levels into account based on the logarithm of the b…
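A level-dependent threshold in the spirit described above can be sketched as follows. This is the standard log-based universal threshold rescaled per decomposition level, not the paper's exact proposal; the noise scale is estimated from the detail coefficients via the median absolute deviation.

```python
import numpy as np

def level_threshold(detail, n, level):
    """Universal threshold sigma*sqrt(2*log n), scaled down per level."""
    sigma = np.median(np.abs(detail)) / 0.6745   # robust noise estimate
    return sigma * np.sqrt(2.0 * np.log(n)) / np.sqrt(2.0**level)

rng = np.random.default_rng(2)
n = 64
detail = rng.normal(0.0, 1.0, n // 2)            # mock finest-level details
t1 = level_threshold(detail, n, level=1)
t2 = level_threshold(detail, n, level=2)
print(f"level 1 threshold: {t1:.3f}")
print(f"level 2 threshold: {t2:.3f}")            # smaller at the coarser level
```

Lowering the threshold at coarser levels protects the large-scale treatment effects while still suppressing fine-scale noise, which is the point of a level-dependent rule in a designed experiment.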

Publication Date: Sun Mar 15 2020
Journal: Journal of the College of Education for Women
Data-Driven Approach for Teaching Arabic as a Foreign Language: Egypt

Corpus linguistics is a methodology for studying language through corpus-based research. It differs from the traditional (prescriptive) approach to studying a language in its insistence on the systematic study of authentic examples of language in use (the descriptive approach). A "corpus" is a large body of machine-readable, structurally collected, naturally occurring linguistic data, either written texts or transcriptions of recorded speech, which can be used as a starting point for linguistic description or as a means of verifying hypotheses about a language. In the past decade, interest has grown tremendously in the use of language corpora for language education. The ways in which corpora have been employed in language pedagogy…

Publication Date: Mon Jun 19 2023
Journal: Journal of Engineering
A Multi-variables Multi-sites Model for Forecasting Hydrological Data Series

A multivariate, multisite hydrological data forecasting model was derived and checked using a case study. The philosophy is to use the cross-variable correlations, cross-site correlations, and time-lag correlations simultaneously. The case study involves two variables, monthly rainfall and evaporation, at three sites: Sulaimania, Dokan, and Darbandikhan. The model form is similar to the first-order autoregressive model, but in matrix form. A matrix for the different relative correlations mentioned above and another for their relative residuals were derived and used as the model parameters, and a mathematical filter was applied to both matrices to obtain their elements. The application of this model indicates…
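The matrix first-order autoregressive form described above can be sketched as follows: with 2 variables at 3 sites stacked into one vector z_t of length 6, the model is z_t = A z_{t-1} + B e_t, where A holds the lagged cross-variable and cross-site correlations and B the residual structure. A and B below are arbitrary stand-ins, not the paper's fitted matrices.

```python
import numpy as np

rng = np.random.default_rng(3)
k = 6                                   # 2 variables x 3 sites
A = 0.4 * np.eye(k) + 0.05 * rng.random((k, k))  # lag-1 parameter matrix
B = np.eye(k)                           # residual (noise) matrix

z = np.zeros(k)
series = []
for _ in range(100):
    z = A @ z + B @ rng.standard_normal(k)   # one simulated month
    series.append(z.copy())

series = np.array(series)
print("simulated shape:", series.shape)      # (100, 6)
print("lag-1 autocorrelation of the first series:",
      np.corrcoef(series[1:, 0], series[:-1, 0])[0, 1])
```

Stacking all variable-site pairs into one vector is what lets a single parameter matrix capture cross-variable, cross-site, and time-lag dependence at once.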

Publication Date: Tue Jun 01 2021
Journal: International Journal of Aquatic Science
Helminths and their fish hosts as bioindicators of heavy metal pollution: A review

Publication Date: Sun Oct 01 2023
Journal: Fuel
Matrix acidizing in carbonate rocks and the impact on geomechanical properties: A review

Acid treatment is a widely used stimulation technique in the petroleum industry. Matrix acidizing is regarded as an effective and efficient acidizing technique for carbonate formations, as it increases fracture propagation, repairs formation damage, and increases the permeability of carbonate rocks. Generally, the injected acid dissolves the rock minerals and generates wormholes that modify the rock structure and enhance hydrocarbon production. However, one of the key issues is the associated degradation in the mechanical properties of carbonate rocks caused by the generated wormholes, which may significantly reduce the elastic properties and hardness of the rocks. There have been several experimental and simulation studies regarding…
