Data Analytics and Blockchain: A Review

Blockchain technology relies on cryptographic techniques that provide various advantages, such as trustworthiness, collaboration, organization, identification, integrity, and transparency. Meanwhile, data analytics refers to the process of utilizing techniques to analyze big data and comprehend the relationships between data points to draw meaningful conclusions. The field of data analytics in Blockchain is relatively new, and few studies have been conducted to examine the challenges involved in Blockchain data analytics. This article presents a systematic analysis of how data analytics affects Blockchain performance, with the aim of investigating the current state of Blockchain-based data analytics techniques in research fields and exploring how specific features of this new technology may transform traditional business methods. The primary objectives of this study are to summarize the significant Blockchain techniques used thus far, identify current challenges and barriers in this field, determine the limitations of each paper that could be used for future development, and assess the extent to which Blockchain and data analytics have been effectively used to evaluate performance objectively. Moreover, we aim to identify potential future research paths and suggest new criteria in this burgeoning discipline through our review.
Index Terms: Blockchain, Distributed Database, Distributed Consensus, Data Analytics, Public Ledger.
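As a rough illustration of the hash-chaining that underlies the integrity and transparency properties the abstract lists (not from the paper; the block fields and transaction strings below are invented), a minimal Python sketch in which editing any block breaks verification:

```python
import hashlib
import json
import time

def block_hash(block: dict) -> str:
    """Deterministically hash a block's contents with SHA-256."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain: list, data: str) -> None:
    """Link a new block to the chain via the previous block's hash."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "timestamp": time.time(),
             "data": data, "prev_hash": prev}
    block["hash"] = block_hash(block)
    chain.append(block)

def verify(chain: list) -> bool:
    """Integrity check: any edited block breaks every later link."""
    for i, block in enumerate(chain):
        expected = chain[i - 1]["hash"] if i else "0" * 64
        body = {k: v for k, v in block.items() if k != "hash"}
        if block["prev_hash"] != expected or block_hash(body) != block["hash"]:
            return False
    return True

chain = []
append_block(chain, "tx: A->B 5")
append_block(chain, "tx: B->C 2")
assert verify(chain)
chain[0]["data"] = "tx: A->B 500"   # tampering...
assert not verify(chain)            # ...is detected
```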

Publication Date: Fri Nov 01 2019
Journal Name: Journal of Physics: Conference Series
Data Processing, Storage, and Analysis: Applying Computational Procedures to the Case of a Falling Weight Deflectometer (FWD)

In the field of civil engineering, the adoption and use of Falling Weight Deflectometers (FWDs) is seen as a response to the ever-changing and technology-driven world. Specifically, FWDs are devices that aid in evaluating the physical properties of a pavement. This paper assesses the concepts of data processing, storage, and analysis via FWDs. The device has been found to play an important role in enabling operators and field practitioners to understand vertical deflection responses when pavements are subjected to impulse loads. In turn, the resultant data and its analysis outcomes lead to the backcalculation of the state of stiffness, with initial analyses of the deflection bowl occurring in conjunction with the measured or assumed …
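As a hedged illustration of deflection-bowl processing (not the paper's procedure; the geophone offsets and readings below are synthetic, and the index definitions follow one common convention that varies by agency):

```python
# A minimal sketch of FWD deflection-bowl processing, assuming nine
# geophone offsets (mm) and measured peak deflections (microns).
offsets_mm = [0, 200, 300, 450, 600, 900, 1200, 1500, 1800]
deflection_um = [520.0, 410.0, 355.0, 290.0, 235.0, 160.0, 112.0, 84.0, 66.0]

bowl = dict(zip(offsets_mm, deflection_um))

d0 = bowl[0]                  # maximum (center) deflection
sci = bowl[0] - bowl[300]     # Surface Curvature Index: upper layers
bdi = bowl[300] - bowl[600]   # Base Damage Index: base layer
bci = bowl[600] - bowl[900]   # Base Curvature Index: subbase/subgrade

print(f"D0={d0:.0f} um, SCI={sci:.0f} um, BDI={bdi:.0f} um, BCI={bci:.0f} um")
```

Backcalculation then iterates layer moduli until a pavement model reproduces a bowl like this one.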

Publication Date: Fri Mar 01 2024
Journal Name: Baghdad Science Journal
A Comparison between Ericson's Formulae Results and Experimental Data Using New Formulae of Single Particle Level Density

The partial level density (PLD) of pre-equilibrium reactions described by Ericson's formula has been studied using different formulae of the single-particle level density g. The parameter g was taken from the equidistant spacing model (ESM) and the non-equidistant spacing model (non-ESM), and further formulae of g were derived from the relation between g and the level density parameter a. The formulae used to derive g are the Rohr formula, the Egidy formula, the Yukawa formula, and the Thomas-Fermi formula. The partial level density results that depend on g from the Thomas-Fermi formula show good agreement with the experimental data.
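For reference, the standard form of Ericson's partial level density and the conventional relation between g and the level density parameter a, as usually quoted in the exciton-model literature (the paper's exact expressions are not reproduced here):

```latex
% Ericson's partial level density for p particles and h holes
% (n = p + h excitons) at excitation energy E, with single-particle
% level density g:
\[
  \rho(p,h,E) = \frac{g\,(gE)^{\,n-1}}{p!\,h!\,(n-1)!},
  \qquad n = p + h .
\]
% Relation between g and the level density parameter a, used to derive
% the alternative formulae of g:
\[
  a = \frac{\pi^{2}}{6}\, g
  \qquad\Longleftrightarrow\qquad
  g = \frac{6a}{\pi^{2}} .
\]
```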

Publication Date: Mon Feb 14 2022
Journal Name: Journal of Educational and Psychological Researches
Comparison between Rasch Model Parameters for Completed and Lost Data by Different Methods of Processing Missing Data

The current study aims to compare the estimates of the Rasch model's parameters for the missing and completed data under various ways of processing missing data. To achieve this aim, the researcher followed these steps: administering the Philip Carter test of spatial ability, which consists of (20) items, to a group of (250) sixth scientific stage students in the directorates of Baghdad Education at Al-Rusafa (1st, 2nd and 3rd) for the academic year (2018-2019). The researcher then relied on the one-parameter (Rasch) model to analyze the data and used the Bilog-MG3 software to test the hypotheses and check the fit between the data and the model. In addition …
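The one-parameter model referred to here is the Rasch model, whose standard form for the probability of a correct response is:

```latex
% One-parameter logistic (Rasch) model: probability that an examinee of
% ability theta answers item i of difficulty b_i correctly.
\[
  P(X_{i} = 1 \mid \theta, b_{i})
  = \frac{e^{\theta - b_{i}}}{1 + e^{\theta - b_{i}}} .
\]
```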

Publication Date: Wed Dec 23 2020
Journal Name: Iraqi Journal for Electrical and Electronic Engineering
Heuristic and Meta-Heuristic Optimization Models for Task Scheduling in Cloud-Fog Systems: A Review

Nowadays, cloud computing has attracted the attention of large companies due to its high potential, flexibility, and profitability in providing multiple sources of hardware and software to serve connected users. Given the scale of modern data centers and the dynamic nature of their resource provisioning, effective scheduling techniques are needed to manage these resources while satisfying both the cloud providers' and the cloud users' goals. Task scheduling in cloud computing is an NP-hard problem that cannot be easily solved by classical optimization methods. Thus, both heuristic and meta-heuristic techniques have been utilized to provide optimal or near-optimal solutions within an acceptable time frame for such problems. In th…
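As a minimal sketch of the heuristic end of this spectrum (illustrative only; the task sizes, VM speeds, and the LPT-style greedy rule below are not any specific algorithm from the review):

```python
# Greedy task scheduling: assign each task to the VM that finishes it
# earliest. Task lengths (MI) and VM speeds (MIPS) are made-up values.

def greedy_schedule(task_lengths, vm_speeds):
    """Return (assignment, makespan) for an earliest-finish heuristic."""
    ready = [0.0] * len(vm_speeds)          # time each VM becomes free
    assignment = []
    # Longest tasks first tends to balance load better (LPT rule).
    for tid, length in sorted(enumerate(task_lengths), key=lambda t: -t[1]):
        finish = [ready[v] + length / vm_speeds[v] for v in range(len(vm_speeds))]
        best = min(range(len(vm_speeds)), key=lambda v: finish[v])
        ready[best] = finish[best]
        assignment.append((tid, best))
    return assignment, max(ready)

tasks = [400, 1200, 800, 150, 950, 600]     # task sizes in MI
vms = [1000, 500, 250]                      # VM speeds in MIPS
plan, makespan = greedy_schedule(tasks, vms)
print(plan)
print(f"makespan = {makespan:.2f} s")
```

Meta-heuristics (genetic algorithms, particle swarm, and the like) search over many such assignments instead of committing to one greedy pass.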

Publication Date: Wed May 04 2022
Journal Name: Int. J. Nonlinear Anal. Appl.
Knee Meniscus Segmentation and Tear Detection Based on Magnetic Resonance Images: A Review of Literature

The meniscus has a crucial function in human anatomy, and Magnetic Resonance Imaging (M.R.I.) plays an essential role in meniscus assessment. It is difficult to identify cartilage lesions using typical image processing approaches because M.R.I. data are so diverse. An M.R.I. data sequence comprises numerous images, and the attribute regions we are searching for may differ from image to image in the series. Feature extraction therefore becomes more complicated, and traditional image processing in particular becomes very complex. In traditional image processing, a human tells the computer what should be there, whereas a deep learning (D.L.) algorithm extracts the features of what is already there automatically. The surface changes become valuable when …
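The hand-crafted-versus-learned distinction can be made concrete with a small sketch (illustrative only, plain NumPy; the Sobel kernel stands in for traditional feature engineering, the random kernel for one that training would tune):

```python
import numpy as np

def conv2d(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Valid-mode 2D convolution, written out explicitly."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Traditional pipeline: a human chooses the feature detector up front,
# e.g. a Sobel kernel for vertical edges...
sobel_x = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)

# ...whereas in deep learning the same 3x3 kernel would start random and
# be learned from labelled M.R.I. slices by gradient descent.
learned_like = np.random.randn(3, 3)

slice_ = np.random.rand(8, 8)   # stand-in for one M.R.I. slice
print(conv2d(slice_, sobel_x).shape)       # (6, 6) feature map
print(conv2d(slice_, learned_like).shape)  # same machinery, learned weights
```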

Publication Date: Fri Sep 24 2021
Journal Name: Applied Sciences
The Impact of Calcitriol on Orthodontic Tooth Movement: A Cumulative Systematic Review and Meta-Analysis

A cumulative review with a systematic approach aimed to compare studies investigating the possible impact of the active form of vitamin D3, calcitriol (CTL), on orthodontic tooth movement (OTM) by evaluating the quality of evidence, collating current data from animal model studies, in vivo cell culture studies, and human clinical trials. Methods: A strict systematic review protocol was applied following registration with the International Prospective Register of Systematic Reviews (PROSPERO). A structured search strategy, including the main keywords, was applied across the electronic databases Medline/PubMed, EMBASE, Scopus, Web of Science, and …

Publication Date: Thu Jan 30 2020
Journal Name: Telecommunication Systems
Nature-inspired optimization algorithms for community detection in complex networks: a review and future trends

Publication Date: Fri Jul 21 2023
Journal Name: Journal of Engineering
A Modified 2D-Checksum Error Detecting Method for Data Transmission in Noisy Media

In data transmission, a change in a single bit of the received data may lead to misunderstanding or even a disaster. Every bit of the sent information has high priority, especially information such as the address of the receiver. Detecting every single-bit change is therefore a key issue in the data transmission field.
The ordinary single-parity detection method can detect an odd number of errors efficiently but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, showed better results yet still failed to cope with an increasing number of errors.
Two novel methods are suggested to detect binary bit-change errors when transmitting data in a noisy medium. Those methods are the 2D-Checksum method …
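Since the description of the two proposed methods is truncated, the sketch below shows only the classic two-dimensional parity baseline that a modified 2D-checksum builds on (illustrative Python, not the paper's method):

```python
# Classic 2D parity: data bits are arranged in a grid; one even-parity
# bit per row and per column lets the receiver detect (and, for a single
# flipped bit, locate) transmission errors.
import numpy as np

def add_2d_parity(bits: np.ndarray) -> np.ndarray:
    """Append even-parity row and column bits to a 2D bit array."""
    with_rows = np.hstack([bits, bits.sum(axis=1, keepdims=True) % 2])
    return np.vstack([with_rows, with_rows.sum(axis=0, keepdims=True) % 2])

def check_2d_parity(block: np.ndarray):
    """Return (ok, bad_rows, bad_cols) for a received block."""
    bad_rows = np.flatnonzero(block.sum(axis=1) % 2)
    bad_cols = np.flatnonzero(block.sum(axis=0) % 2)
    return len(bad_rows) == 0 and len(bad_cols) == 0, bad_rows, bad_cols

data = np.random.randint(0, 2, size=(4, 8))
sent = add_2d_parity(data)

received = sent.copy()
received[2, 5] ^= 1                 # flip a single bit in transit
ok, rows, cols = check_2d_parity(received)
print(ok)                           # False: the error is detected
print(rows, cols)                   # [2] [5]: and even located
```

Note the known weakness this scheme inherits: certain rectangular patterns of four flipped bits cancel out in every row and column and go undetected, which is the gap modified schemes aim to close.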

Publication Date: Thu Jun 01 2023
Journal Name: Bulletin of Electrical Engineering and Informatics
A missing data imputation method based on salp swarm algorithm for diabetes disease

Most medical datasets suffer from missing data, due to the expense of some tests or human error while recording them. This issue affects the performance of machine learning models because the values of some features will be missing. Therefore, there is a need for specific methods for imputing these missing data. In this research, the salp swarm algorithm (SSA) is used for generating and imputing the missing values in the Pima Indian diabetes disease (PIDD) dataset; the proposed algorithm is called ISSA. The obtained results showed that the classification performance of three different classifiers, which are support vector machine (SVM), K-nearest neighbour (KNN), and Naïve Bayes …
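As a hedged sketch of how SSA can impute missing entries (the fitness function, bounds, and data below are invented stand-ins; the paper's ISSA objective, e.g. classifier performance, is not reproduced):

```python
# Salp swarm algorithm (SSA) for imputation: each salp is a candidate
# vector of the missing entries; fitness here simply rewards values near
# the observed feature mean, as a stand-in objective.
import numpy as np

rng = np.random.default_rng(0)

def ssa_impute(fitness, dim, lb, ub, n_salps=20, iters=100):
    pop = rng.uniform(lb, ub, size=(n_salps, dim))
    scores = np.apply_along_axis(fitness, 1, pop)
    food = pop[np.argmin(scores)].copy()        # best solution so far
    for t in range(iters):
        c1 = 2 * np.exp(-(4 * t / iters) ** 2)  # exploration -> exploitation
        for i in range(n_salps):
            if i < n_salps // 2:                # leaders move around the food
                c2, c3 = rng.random(dim), rng.random(dim)
                step = c1 * ((ub - lb) * c2 + lb)
                pop[i] = np.where(c3 < 0.5, food + step, food - step)
            else:                               # followers chain behind
                pop[i] = (pop[i] + pop[i - 1]) / 2
            pop[i] = np.clip(pop[i], lb, ub)
        scores = np.apply_along_axis(fitness, 1, pop)
        if scores.min() < fitness(food):
            food = pop[np.argmin(scores)].copy()
    return food

# Toy stand-in for PIDD: impute two missing glucose readings so they fall
# near the observed mean (illustrative only).
observed = np.array([148., 85., 183., 89., 137., 116., 78., 115.])
target = observed.mean()
fit = lambda x: np.sum((x - target) ** 2)
print(ssa_impute(fit, dim=2, lb=40.0, ub=200.0))
```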

Publication Date: Mon May 15 2017
Journal Name: Journal of Theoretical and Applied Information Technology
Anomaly detection in text data represented as a graph using the DBSCAN algorithm

Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all data into a concept frame graph (CFG). As is well known, the DBSCAN method groups data points of the same kind into clusters, while points falling outside the behavior of any cluster are considered noise or anomalies. DBSCAN can thus detect abnormal points that lie beyond a certain set threshold (extreme values). However, anomalies are not only those cases that are unusual or far from a specific group; there is also a type of data that does not occur repeatedly, yet is considered abnormal with respect to the known group. The analysis showed DBSCAN using the …
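A minimal sketch of the core idea, using scikit-learn's DBSCAN: points assigned the noise label -1 are reported as anomalies (the 2-D vectors below are invented stand-ins for features derived from the graph representation; the paper's CFG construction is not reproduced):

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(1)
normal = rng.normal(loc=0.0, scale=0.3, size=(60, 2))    # one dense group
outliers = np.array([[3.0, 3.0], [-2.5, 2.0]])           # rare, isolated cases
X = np.vstack([normal, outliers])

# eps: neighbourhood radius; min_samples: density threshold. Both must be
# tuned to the data; these values are illustrative.
labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(X)

anomalies = X[labels == -1]
print(f"{len(anomalies)} anomalies out of {len(X)} points")
```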
