Blockchain technology relies on cryptographic techniques that provide various advantages, such as trustworthiness, collaboration, organization, identification, integrity, and transparency. Meanwhile, data analytics refers to the process of utilizing techniques to analyze big data and comprehend the relationships between data points to draw meaningful conclusions. The field of data analytics in Blockchain is relatively new, and few studies have been conducted to examine the challenges involved in Blockchain data analytics. This article presents a systematic analysis of how data analytics affects Blockchain performance, with the aim of investigating the current state of Blockchain-based data analytics techniques in research fields and exploring how specific features of this new technology may transform traditional business methods. The primary objectives of this study are to summarize the significant Blockchain techniques used thus far, identify current challenges and barriers in this field, determine the limitations of each paper that could be used for future development, and assess the extent to which Blockchain and data analytics have been effectively used to evaluate performance objectively. Moreover, we aim to identify potential future research paths and suggest new criteria in this burgeoning discipline through our review.
Index Terms— Blockchain, Distributed Database, Distributed Consensus, Data Analytics, Public Ledger.
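The integrity property mentioned above comes from hash-linking each block to its predecessor, so that tampering with any block invalidates every later link. A minimal sketch of this idea (the block fields and helper names here are illustrative, not taken from any system discussed in the survey):

```python
import hashlib
import json

def block_hash(block):
    # Deterministic SHA-256 digest of the block's contents.
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def build_chain(records):
    # Link each block to the previous one by embedding its hash.
    chain, prev = [], "0" * 64
    for data in records:
        block = {"data": data, "prev_hash": prev}
        prev = block_hash(block)
        chain.append(block)
    return chain

def verify_chain(chain):
    # Recompute hashes; a tampered block breaks every later link.
    prev = "0" * 64
    for block in chain:
        if block["prev_hash"] != prev:
            return False
        prev = block_hash(block)
    return True

chain = build_chain(["tx1", "tx2", "tx3"])
print(verify_chain(chain))   # intact chain -> True
chain[1]["data"] = "tampered"
print(verify_chain(chain))   # tampering detected -> False
```

Real blockchains add consensus and signatures on top, but this hash chaining is what makes the ledger tamper-evident for analytics purposes.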
In the current study, 2D seismic data from west An-Najaf (line WN-36) were obtained from the Oil Exploration Company in 2018 after several processing steps. Surface Consistent Amplitude Compensation (SCAC) was applied to the seismic data. The processing sequence in our study started by sorting the data into common mid-point (CMP) gathers in order to apply velocity analysis using the Interactive Velocity Analysis application (INVA) within the Omega system. Velocity semblance was prepared to perform the normal move-out (NMO) correction versus time. An accurate root mean square velocity (VRMS) was selected, controlled by the flatness of the primary events. The resultant seismic velocity section for the study area shows that the veloci…
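The velocity picking described above rests on the hyperbolic moveout relation t(x) = sqrt(t0² + x²/v²): when the picked VRMS is correct, subtracting t0 flattens the reflection across offsets. A minimal sketch of that relation (the numeric values are illustrative, not from the WN-36 line):

```python
import math

def nmo_traveltime(t0, offset, v_rms):
    # Hyperbolic moveout: t(x) = sqrt(t0^2 + (x / v)^2)
    return math.sqrt(t0**2 + (offset / v_rms)**2)

def nmo_correction(t0, offset, v_rms):
    # Time shift that flattens the event when v_rms is picked correctly.
    return nmo_traveltime(t0, offset, v_rms) - t0

# Example: a reflector at t0 = 1.0 s with v_rms = 2000 m/s.
for x in (0.0, 500.0, 1000.0):
    print(round(nmo_correction(1.0, x, 2000.0), 4))
```

Semblance analysis amounts to scanning candidate velocities and keeping the one for which this correction best flattens the primary events.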
Background: Generally, genetic disorders are a leading cause of spontaneous abortion,
It is notable that the initialization of architectural parameters has a great impact on the whole learning process, so knowing the mathematical properties of a dataset gives a neural network architecture better expressivity and capacity. In this paper, five random samples of the Volve field dataset were taken. A training set was then specified, and the persistent homology of the dataset was calculated to show the impact of data complexity on the selection of a multilayer perceptron regressor (MLPR) architecture. The proposed method provides a well-rounded strategy for computing data complexity: it is a compound algorithm composed of the t-SNE method, an alpha-complexity algorithm, and a persistence barcod…
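Persistent homology in dimension 0 has a particularly simple form: every point starts as its own connected component, and a component "dies" at the scale where it merges with another. A minimal sketch of that H0 barcode computation with a Kruskal-style union-find (this is a generic illustration, not the paper's compound t-SNE/alpha-complexity pipeline):

```python
import math
from itertools import combinations

def h0_barcode(points):
    """0-dimensional persistence barcode of a Vietoris-Rips filtration:
    each component is born at scale 0 and dies when it merges."""
    parent = list(range(len(points)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    # Process pairwise distances in increasing order (Kruskal-style).
    edges = sorted(
        (math.dist(points[i], points[j]), i, j)
        for i, j in combinations(range(len(points)), 2)
    )
    deaths = []
    for d, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            deaths.append(d)  # one component dies at this scale
    # One bar per point; the final component never dies.
    return [(0.0, d) for d in deaths] + [(0.0, math.inf)]

# Two well-separated clusters -> one long finite bar before the merge.
pts = [(0, 0), (0, 1), (10, 0), (10, 1)]
print(h0_barcode(pts))
```

Long bars signal coarse cluster structure in the data; the paper's premise is that such complexity measures can guide the choice of MLPR depth and width.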
In our research, several different statics solutions were implemented in the processing of seismic data in the south of Iraq for the 2D seismic line (AK18) of the Abu-Khama project, 32.4 km in length, and their results were compared in order to find the optimum static solution. Static solutions based on the tomographic principle, or on combining the low-frequency components of field statics with the high-frequency components of refraction statics, can provide a reasonable static solution for seismic data in the south of Iraq. The data quality was poor and the seismic signal unclear, but after applying field statics the data quality was enhanced. Residual static correction further improved the quality of the seis…
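The combined solution mentioned above splits each statics profile into a smooth (low-frequency) and a residual (high-frequency) part and mixes them across methods. A toy sketch of that blending, using a moving average as the low-pass split (window size and the station values are illustrative, not from line AK18):

```python
def moving_average(values, window):
    # Simple low-pass filter standing in for the low-frequency component.
    half = window // 2
    out = []
    for i in range(len(values)):
        lo, hi = max(0, i - half), min(len(values), i + half + 1)
        out.append(sum(values[lo:hi]) / (hi - lo))
    return out

def blend_statics(field, refraction, window=5):
    """Low-frequency part of the field statics plus the high-frequency
    (residual) part of the refraction statics, station by station."""
    field_low = moving_average(field, window)
    refr_low = moving_average(refraction, window)
    return [fl + (r - rl) for fl, r, rl in zip(field_low, refraction, refr_low)]

field = [12.0, 13.0, 15.0, 14.0, 16.0, 18.0, 17.0]       # ms per station
refraction = [11.5, 14.0, 13.5, 15.5, 15.0, 18.5, 17.5]  # ms per station
print(blend_statics(field, refraction))
```

Production systems do this split with proper filters in the wavenumber domain, but the principle (long-wavelength trend from one method, short-wavelength detail from the other) is the same.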
The gravity and magnetic data of the Tikrit-Kirkuk area in central Iraq were used to study the tectonic situation of the area. Residual anomalies were separated from the regional field using the space-window method, with windows of about 24, 12, and 10 km, to delineate the source level of the residual anomalies. The Total Horizontal Derivative (THD) was used to identify fault trends in the basement and sedimentary rocks from the gravity and magnetic data. The identified faults in the study area show (NW-SE), less common (NE-SW), and rare (N-S) trends. Some of these faults extend from the basement to the uppermost layer of the sedimentary rocks. The depth of some gravity and magnetic sources ranges from 12 to 13 km, which confirms th…
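The THD used above is defined as sqrt((∂f/∂x)² + (∂f/∂y)²); its maxima align with the edges of a buried source, which is why it highlights fault traces. A minimal sketch with central finite differences (the grid values below are synthetic, not the Tikrit-Kirkuk data):

```python
def total_horizontal_derivative(grid, dx=1.0, dy=1.0):
    """THD = sqrt((df/dx)^2 + (df/dy)^2) via central differences;
    THD maxima line up over the edges (faults) of a source body."""
    ny, nx = len(grid), len(grid[0])
    thd = [[0.0] * nx for _ in range(ny)]
    for i in range(1, ny - 1):
        for j in range(1, nx - 1):
            dfdx = (grid[i][j + 1] - grid[i][j - 1]) / (2 * dx)
            dfdy = (grid[i + 1][j] - grid[i - 1][j]) / (2 * dy)
            thd[i][j] = (dfdx**2 + dfdy**2) ** 0.5
    return thd

# A sharp step in the field, as over a buried fault edge.
grid = [[0, 0, 0, 10, 10, 10] for _ in range(5)]
result = total_horizontal_derivative(grid)
print(result[2])  # THD peaks over the step between columns 2 and 3
```

On real grids the derivatives are usually computed spectrally, but the edge-locating behaviour is the same.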
In this paper, an ARIMA model was used for estimating missing data (air temperature, relative humidity, wind speed) for mean monthly variables in different time series at three stations (Sinjar, Baghdad, Al-Hai), which represent different parts of Iraq from north to south respectively.
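The core idea of model-based gap filling is to forecast the missing value from the fitted dynamics rather than simply interpolating. As a toy stand-in for the paper's ARIMA estimation, the sketch below fills gaps with a one-step AR(1) forecast (the series and the AR(1) simplification are illustrative; the paper's actual model and station data are not reproduced here):

```python
def ar1_fill(series):
    """Fill None gaps with an AR(1) forecast  x_t = mu + phi * (x_{t-1} - mu).
    A toy stand-in for full ARIMA estimation of missing observations."""
    obs = [x for x in series if x is not None]
    mu = sum(obs) / len(obs)
    # Estimate phi from consecutive observed pairs (lag-1 regression).
    pairs = [
        (series[i - 1] - mu, series[i] - mu)
        for i in range(1, len(series))
        if series[i - 1] is not None and series[i] is not None
    ]
    num = sum(a * b for a, b in pairs)
    den = sum(a * a for a, _ in pairs) or 1.0
    phi = num / den
    filled, prev = [], mu
    for x in series:
        if x is None:
            x = mu + phi * (prev - mu)  # one-step-ahead forecast
        filled.append(x)
        prev = x
    return filled

# Hypothetical mean monthly temperatures with two gaps.
temps = [22.0, 24.0, 27.0, None, 31.0, 30.0, None, 26.0]
print(ar1_fill(temps))
```

A full ARIMA(p, d, q) fit (e.g. via a statistics package) additionally handles differencing and moving-average terms, but the fill-by-forecast principle is the same.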
Arabic grammatical theory is characterized by features that distinguish it from other languages: taken as a whole, it is a homogeneous linguistic system that blends with the social nature of the Arab, his beliefs, and his culture.
This means that the theory was born naturally, after the labor of preserving an integrated inheritance, beginning with its foundational legal text (the Qur'an) and ending with its many distinctive features.
Sibawayh carried the founding crucible of that theory, taking it over from his teacher, al-Khalil, and building on what he had achieved. It is redundant to point to his standing and the status of his book.
Hence my research came to be titled: (c
Nowadays, ontology learning for describing heterogeneous systems is an influential approach to enhancing the effectiveness of such systems using Social Network representation and Analysis (SNA). This paper presents a novel scenario for constructing an adaptive architecture to develop community performance, with heterogeneous communities as a case study. Crawling semantic webs is a new approach to creating a huge data repository for classifying these communities. The architecture of the proposed system involves two cascading modules for obtaining the ontology data, which is represented in Resource Description Framework (RDF) format. The proposed system enhances these environments ach…
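RDF, mentioned above, represents all knowledge as (subject, predicate, object) triples that can be queried with wildcards. A minimal in-memory sketch of that model (class name, prefixes, and community data are hypothetical; real systems would use an RDF library and SPARQL):

```python
class TripleStore:
    """Toy (subject, predicate, object) store mimicking the RDF data model."""

    def __init__(self):
        self.triples = set()

    def add(self, subject, predicate, obj):
        self.triples.add((subject, predicate, obj))

    def query(self, subject=None, predicate=None, obj=None):
        # None acts as a wildcard, like a variable in a SPARQL pattern.
        return [
            t for t in self.triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)
        ]

store = TripleStore()
store.add("ex:alice", "ex:memberOf", "ex:communityA")
store.add("ex:bob", "ex:memberOf", "ex:communityA")
store.add("ex:communityA", "rdf:type", "ex:Community")
# All members of communityA:
print(store.query(predicate="ex:memberOf", obj="ex:communityA"))
```

Classifying crawled communities then reduces to pattern queries like the one above over the accumulated triple repository.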
3D models delivered by digital photogrammetric techniques have increased massively and developed to meet the requirements of many applications. The reliability of these models depends basically on the data-processing cycle and the adopted tool solution, in addition to data quality. Agisoft PhotoScan is professional image-based 3D modelling software that seeks to create orderly, precise 3D content from still images. It works with arbitrary images acquired under both controlled and uncontrolled conditions. Following the recommendations of many users around the globe, Agisoft PhotoScan has become an important source for generating precise 3D data for different applications. How reliable is this data for accurate 3D mo…
The study area lies in the northern part of Iraq. This study depends on one scene of Landsat Thematic Mapper (TM5) data, subset using a region of interest (ROI) file within the ERDAS 9.2 software. RS and GIS were used as tools for detecting desertification over the periods 1990-2000 and 2000-2009 using the Normalized Difference Vegetation Index (NDVI), Water Index (WI), and Barren Land Index (BLI). The indicators of desertification used in this study for the periods 1990-2000 and 2000-2009 are a decrease in vegetation cover and an increase in water bodies and barren land.
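The NDVI used above is computed per pixel as (NIR − Red) / (NIR + Red), ranging from −1 to 1, with high values indicating vegetation and negative values typically indicating water. A minimal sketch (the reflectance values and class thresholds below are illustrative, not the calibration used in the study):

```python
def ndvi(nir, red):
    # NDVI = (NIR - Red) / (NIR + Red), in [-1, 1]
    return (nir - red) / (nir + red) if (nir + red) else 0.0

def classify(value):
    # Illustrative thresholds only; operational studies calibrate these.
    if value > 0.3:
        return "vegetation"
    if value < 0.0:
        return "water"
    return "barren"

# Hypothetical (NIR, Red) reflectance pairs for three pixels.
pixels = [(0.6, 0.2), (0.1, 0.3), (0.25, 0.22)]
for nir, red in pixels:
    v = ndvi(nir, red)
    print(round(v, 3), classify(v))
```

Differencing such per-pixel classifications between the 1990, 2000, and 2009 scenes is what yields the change-detection indicators of desertification.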