Blockchain technology relies on cryptographic techniques that provide advantages such as trustworthiness, collaboration, organization, identification, integrity, and transparency. Meanwhile, data analytics refers to the process of applying techniques to analyze big data and comprehend the relationships between data points in order to draw meaningful conclusions. The field of data analytics in Blockchain is relatively new, and few studies have examined the challenges involved in Blockchain data analytics. This article presents a systematic analysis of how data analytics affects Blockchain performance, with the aim of investigating the current state of Blockchain-based data analytics techniques across research fields and exploring how specific features of this new technology may transform traditional business methods. The primary objectives of this study are to summarize the significant Blockchain techniques used thus far, identify current challenges and barriers in this field, determine the limitations of each paper that could inform future development, and assess the extent to which Blockchain and data analytics have been effectively used to evaluate performance objectively. Moreover, through our review we aim to identify potential future research paths and suggest new criteria for this burgeoning discipline.
Index Terms—Blockchain, Distributed Database, Distributed Consensus, Data Analytics, Public Ledger.
Among metaheuristic algorithms, population-based algorithms are explorative search algorithms superior to local search algorithms in exploring the search space for globally optimal solutions. However, the primary downside of such algorithms is their low exploitative capability, which prevents them from refining the neighborhood of promising solutions toward more optimal ones. The firefly algorithm (FA) is a population-based algorithm that has been widely used in clustering problems. However, FA suffers from premature convergence when no neighborhood search strategies are employed to improve the quality of clustering solutions in the neighborhood region and to explore the global regions of the search space.
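The core of FA is an attraction step in which dimmer fireflies move toward brighter (better) ones, plus a small random walk. Below is a minimal sketch of that standard update for a minimization problem; the parameter values (alpha, beta0, gamma) are illustrative assumptions, not the tuned settings of any particular clustering variant.

```python
import numpy as np

def firefly_step(pop, fitness, alpha=0.2, beta0=1.0, gamma=1.0, rng=None):
    """One iteration of the standard firefly update (minimization).

    pop     : (n, d) array of candidate solutions (fireflies)
    fitness : length-n array; lower is brighter (better)
    """
    rng = rng or np.random.default_rng()
    n, d = pop.shape
    new_pop = pop.copy()
    for i in range(n):
        for j in range(n):
            if fitness[j] < fitness[i]:              # j is brighter: i moves toward j
                r2 = np.sum((new_pop[i] - pop[j]) ** 2)
                beta = beta0 * np.exp(-gamma * r2)   # attractiveness decays with distance
                new_pop[i] += (beta * (pop[j] - new_pop[i])
                               + alpha * (rng.random(d) - 0.5))  # random exploration term
    return new_pop
```

The attraction term drives exploitation around good solutions, while the alpha-scaled random term keeps exploration alive; neighborhood search strategies of the kind discussed above are typically layered on top of this step.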
One wide-ranging category of open-source data is that published on geospatial information websites. Despite the advantages of such open-source data, including ease of access and freedom from cost, its quality is a potential issue. This article tests the horizontal positional accuracy, and the possible integration, of four web-derived geospatial datasets: OpenStreetMap (OSM), Google Maps, Google Earth, and Wikimapia. The evaluation was carried out by comparing the tested data with reference field-survey data for fifty road intersections in Baghdad, Iraq. The results indicate that the free geospatial data can be used to enhance authoritative maps, especially small-scale maps.
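A common way to score horizontal positional accuracy against a field survey is the root-mean-square error of the planimetric offsets. The sketch below shows that standard computation; the coordinate arrays are hypothetical examples, not the Baghdad dataset.

```python
import numpy as np

def horizontal_rmse(tested_xy, reference_xy):
    """Root-mean-square horizontal error between tested and reference points.

    tested_xy, reference_xy : (n, 2) arrays of projected (easting, northing) in metres.
    """
    d = tested_xy - reference_xy
    return float(np.sqrt(np.mean(np.sum(d ** 2, axis=1))))

# Hypothetical example: web-digitised intersections vs. field survey
osm = np.array([[445012.3, 3685120.9], [445530.1, 3685711.4]])
survey = np.array([[445010.0, 3685119.5], [445527.8, 3685712.0]])
print(f"horizontal RMSE = {horizontal_rmse(osm, survey):.2f} m")
```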
This work focuses on developing a robust and feasible method for estimating vehicle travel times on a highway from traffic information extracted from stored camera image sequences. The estimation of vehicle travel times relies on the identification of the traffic state. Individual vehicle velocities are obtained from detected vehicle positions in two consecutive images by computing the distance covered during the elapsed time, smoothing the extracted traffic flow data, and developing a scheme to accurately predict vehicle travel times. An Erbil road database is used to identify road regions around road segments, which are projected into the calibrated camera images.
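At its simplest, the per-vehicle speed estimate is distance over elapsed time between two detections that have already been projected from image pixels into road (metric) coordinates. A minimal sketch, assuming that projection has been done and using a hypothetical 0.5 s frame interval:

```python
import numpy as np

def vehicle_speed(p1, p2, dt):
    """Speed from two consecutive detections projected onto road coordinates.

    p1, p2 : (x, y) positions in metres at frames t and t + dt
    dt     : elapsed time between the two images, in seconds
    """
    dist = np.hypot(p2[0] - p1[0], p2[1] - p1[1])  # distance covered
    return dist / dt                                # metres per second

# Hypothetical example: two detections 0.5 s apart
v = vehicle_speed((12.0, 3.1), (22.5, 3.4), dt=0.5)
print(f"{v:.1f} m/s = {v * 3.6:.1f} km/h")
```

Travel times over a segment then follow by integrating such point speeds along the identified traffic state, which is the part the prediction scheme above addresses.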
Abstract
This research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, initial parameter values, and levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution in cases of both natural and contaminated data.
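The Downhill Simplex (Nelder-Mead) method fits a distribution by directly minimizing the negative log-likelihood without derivatives. Since the four-parameter compound pdf is not given here, the sketch below uses a plain two-parameter Weibull as a stand-in; the synthetic sample and starting point are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

def neg_log_lik(theta, x):
    """Negative log-likelihood of a two-parameter Weibull (shape k, scale lam)."""
    k, lam = theta
    if k <= 0 or lam <= 0:            # keep the simplex inside the valid region
        return np.inf
    return -np.sum(weibull_min.logpdf(x, c=k, scale=lam))

rng = np.random.default_rng(0)
x = weibull_min.rvs(c=1.5, scale=2.0, size=200, random_state=rng)  # synthetic sample

res = minimize(neg_log_lik, x0=[1.0, 1.0], args=(x,), method="Nelder-Mead")
print("estimated (shape, scale):", res.x)
```

The same pattern extends to the four-parameter compound likelihood: only `neg_log_lik` changes, which is part of the method's appeal under contamination, since no gradients of the likelihood are required.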
In many oil-recovery systems, relative permeabilities (kr) are essential flow factors that affect fluid dispersion and output from petroleum resources. Traditionally, obtaining these crucial reservoir properties requires taking rock samples from the reservoir and performing suitable laboratory studies. Although kr is a function of fluid saturation, it is now well established that pore shape and distribution, absolute permeability, wettability, interfacial tension (IFT), and saturation history all influence kr values. These rock/fluid characteristics vary greatly from one reservoir region to the next, and it would be impossible to make kr measurements in all of them. The unsteady-state approach was used to calculate the relative permeabilities.
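The unsteady-state laboratory calculation itself is not reproduced here; as a minimal illustration of kr as a function of fluid saturation, the sketch below evaluates a standard Corey-type parametric model, with hypothetical endpoints and exponents.

```python
import numpy as np

def corey_kr(sw, swc=0.2, sor=0.25, krw_max=0.4, kro_max=0.9, nw=2.0, no=2.0):
    """Corey-type water/oil relative permeabilities vs. water saturation.

    swc: connate water saturation, sor: residual oil saturation,
    nw/no: Corey exponents, krw_max/kro_max: endpoint permeabilities.
    """
    swn = np.clip((sw - swc) / (1.0 - swc - sor), 0.0, 1.0)  # normalized saturation
    krw = krw_max * swn ** nw
    kro = kro_max * (1.0 - swn) ** no
    return krw, kro

sw = np.linspace(0.2, 0.75, 6)
krw, kro = corey_kr(sw)
for s, w, o in zip(sw, krw, kro):
    print(f"Sw={s:.2f}  krw={w:.3f}  kro={o:.3f}")
```

In practice the exponents and endpoints are what the rock/fluid factors listed above (wettability, IFT, pore structure, saturation history) shift from one reservoir region to another.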
One of the most significant elements influencing weather, climate, and the environment is vegetation cover. The Normalized Difference Vegetation Index (NDVI) and the Normalized Difference Built-up Index (NDBI) were estimated for the years 2019-2022 from four Landsat 8 TIRS images covering Duhok City. Using the radiative transfer model, the city's land surface temperature (LST) over those four years was calculated. The aim of this study is to compute the LST for the years 2019-2022, to understand the link between LST, NDVI, and NDBI, and to assess the mapping capability of Landsat 8 TIRS. The findings revealed that the NDBI and the NDVI had the strongest correlation with the LST.
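NDVI and NDBI are standard band ratios: for Landsat 8, NDVI uses the NIR (band 5) and red (band 4) reflectances, and NDBI uses SWIR1 (band 6) and NIR. A minimal sketch on hypothetical reflectance tiles:

```python
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red); Landsat 8 bands 5 and 4."""
    return (nir - red) / (nir + red + 1e-10)   # small epsilon avoids divide-by-zero

def ndbi(swir1, nir):
    """NDBI = (SWIR1 - NIR) / (SWIR1 + NIR); Landsat 8 bands 6 and 5."""
    return (swir1 - nir) / (swir1 + nir + 1e-10)

# Hypothetical surface-reflectance tiles
red   = np.array([[0.08, 0.10], [0.12, 0.09]])
nir   = np.array([[0.40, 0.35], [0.20, 0.38]])
swir1 = np.array([[0.18, 0.20], [0.30, 0.17]])
print("NDVI:\n", ndvi(nir, red))
print("NDBI:\n", ndbi(swir1, nir))
```

High NDVI marks dense vegetation and high NDBI marks built-up surfaces, which is why the two indices move in opposite directions against LST.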
The Hartha Formation is an overburden horizon in the X-oilfield that generates substantial Non-Productive Time (NPT) associated with drilling mud losses. This study investigates the loss events in this formation and provides geological interpretations based on datasets from nine wells in the field of interest. The interpretation drew on several analyses, including wireline logs, cuttings descriptions, image logs, and analog data. Seismic and coherency data were also used to formulate the geological interpretations and calibrate them against the loss events of the Hartha Fm.
The results revealed that the upper part of the Hartha Fm. was identified as an interval capable of creating potential mud-loss zones.
Constitutional Review of 1992 and 1996 and the role of the Party of Progress and Socialism