Efficient Bayesian modeling of large lattice data using spectral properties of Laplacian matrix

Spatial data observed on a group of areal units are common in scientific applications. The usual hierarchical approach for modeling this kind of dataset is to introduce a spatial random effect with an autoregressive prior. However, the usual Markov chain Monte Carlo scheme for this hierarchical framework requires the spatial effects to be sampled from their full conditional posteriors one by one, resulting in poor mixing. More importantly, it makes the model computationally inefficient for datasets with a large number of units. In this article, we propose a Bayesian approach that uses the spectral structure of the adjacency matrix to construct a low-rank expansion for modeling spatial dependence. We propose a pair of computationally efficient estimation schemes that select the basis functions most important for capturing the variation in the response. Through simulation studies, we validate the computational efficiency as well as the predictive accuracy of our method. Finally, we present an important real-world application of the proposed methodology to a massive plant abundance dataset from the Cape Floristic Region in South Africa.
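To make the construction concrete, here is a minimal sketch of the low-rank spectral idea: build a graph Laplacian from a toy areal adjacency matrix and keep its smoothest eigenvectors as basis functions for the spatial random effect. The ring-lattice adjacency, the rank, and all names (W, n_units, basis) are illustrative assumptions; the paper's Bayesian estimation and basis-selection schemes are not reproduced here.

```python
import numpy as np

# Illustrative sketch only: toy adjacency and names are assumptions,
# not taken from the paper.
rng = np.random.default_rng(0)

n_units = 100
W = np.zeros((n_units, n_units))           # toy areal adjacency matrix
for i in range(n_units):                   # a ring: each unit touches two
    W[i, (i - 1) % n_units] = 1.0
    W[i, (i + 1) % n_units] = 1.0

L = np.diag(W.sum(axis=1)) - W             # combinatorial graph Laplacian

# Eigenvectors with small eigenvalues are the smoothest functions on the
# lattice; keeping the leading few gives a low-rank spectral basis.
eigvals, eigvecs = np.linalg.eigh(L)
rank = 10
basis = eigvecs[:, :rank]

# Spatial random effect expanded in the spectral basis.
coeffs = rng.normal(size=rank)
spatial_effect = basis @ coeffs
print(spatial_effect.shape)                # one smooth effect per areal unit
```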

Publication Date
Mon Nov 01 2021
Journal Name
International Journal of Nonlinear Analysis and Applications
Solution of Riccati matrix differential equation using new approach of variational ‎iteration method

To obtain the approximate solution of Riccati matrix differential equations, a new variational iteration approach is proposed, which is intended to improve the accuracy and increase the convergence rate of the approximate solutions to the exact solution. This technique was found to give very accurate results in a small number of iterations. In this paper, the modified approaches are derived, and the convergence of the derived sequence of approximate solutions to the exact solution is stated and proved. Two examples were also solved, which shows the reliability and applicability of the proposed approach.
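As a rough illustration of the variational iteration idea, the sketch below applies the standard first-order correction functional (with multiplier λ = −1) to a matrix Riccati equation. The coefficient matrices and iteration count are illustrative assumptions, not taken from the paper's modified scheme.

```python
import sympy as sp

# Minimal variational-iteration sketch for the matrix Riccati equation
#   X'(t) = A + B*X + X*C + X*D*X,  X(0) = 0,
# using the correction functional with multiplier lambda = -1:
#   X_{k+1}(t) = X_k(t) - Integral_0^t ( X_k'(s) - F(X_k(s)) ) ds.
# Coefficient matrices below are illustrative, not from the paper.
t, s = sp.symbols('t s')

A = sp.eye(2)
B = sp.zeros(2, 2)
C = sp.zeros(2, 2)
D = -sp.eye(2)                   # with these choices X(t) -> tanh(t) * I

def F(X):
    """Right-hand side of the Riccati equation."""
    return A + B * X + X * C + X * D * X

X = sp.zeros(2, 2)               # initial approximation X_0 = X(0)
for _ in range(4):               # a few iterations already track tanh(t)
    residual = (X.diff(t) - F(X)).subs(t, s)
    correction = residual.applyfunc(lambda e: sp.integrate(e, (s, 0, t)))
    X = sp.expand(X - correction)

print(X[0, 0])   # t - t**3/3 + 2*t**5/15 - ..., matching tanh(t) to O(t**5)
```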

Publication Date
Sat Jan 01 2022
Journal Name
The 2nd Universitas Lampung International Conference on Science, Technology, and Environment (ULICOSTE) 2021
Investigation of the vibrational spectral and electronic properties for ZnxBe7-xO7 wurtzoid via DFT approximation

Publication Date
Sun Sep 04 2011
Journal Name
Baghdad Science Journal
An Embedded Data Using Slantlet Transform

Data hiding is the process of encoding extra information in an image by making small modifications to its pixels. To be practical, the hidden data must be perceptually invisible yet robust to common signal processing operations. This paper introduces a scheme for hiding a signature image that could be as much as 25% of the host image data and hence could be used both in digital watermarking and in image/data hiding. The proposed algorithm uses an orthogonal discrete wavelet transform with two zero moments and improved time localization, called the discrete slantlet transform, for both the host and signature images. A scaling factor in the frequency domain controls the quality of the watermarked images. Experimental results of the signature image …
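The sketch below illustrates wavelet-domain embedding of a signature image whose data is about 25% of the host. PyWavelets has no slantlet transform, so a standard orthogonal DWT stands in for it, and the scaling factor alpha plays the role of the paper's frequency-domain scaling factor; its value and all names are assumptions.

```python
import numpy as np
import pywt

# Assumption: a plain DWT (db2) stands in for the discrete slantlet
# transform, which is not available in PyWavelets. alpha is illustrative.
alpha = 0.1

rng = np.random.default_rng(3)
host = rng.uniform(0, 255, (128, 128))
signature = rng.uniform(0, 255, (64, 64))   # ~25% of the host image data

# Transform both images, then add the scaled signature detail
# coefficients into the host's detail coefficients.
h_approx, (h_h, h_v, h_d) = pywt.dwt2(host, 'db2')
s_approx, (s_h, s_v, s_d) = pywt.dwt2(signature, 'db2')

h_d[:s_d.shape[0], :s_d.shape[1]] += alpha * s_d   # embed in diagonal details

watermarked = pywt.idwt2((h_approx, (h_h, h_v, h_d)), 'db2')
print(np.abs(watermarked - host).max())  # small perturbation of the host
```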

Publication Date
Fri Mar 31 2017
Journal Name
Al-Khwarizmi Engineering Journal
Big-data Management using Map Reduce on Cloud: Case study, EEG Images' Data

A database is characterized as an arrangement of data that is organized and distributed in a way that allows the user to access the stored data simply and conveniently. However, in the era of big data, traditional methods of data analytics may not be able to manage and process such a large amount of data. In order to develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique to handle big data distributed on the cloud. This approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed clear enhancement in managing and processing the EEG big data, with an average reduction of 50% in response time. The obtained results provide EEG r…
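In the spirit of the approach, the sketch below runs a toy map-reduce over chunks of EEG samples: mappers produce partial per-channel sums, and a reducer merges them into global channel means. It uses Python multiprocessing rather than the Hadoop server of the study, and the data shapes and names are made up.

```python
from functools import reduce
from multiprocessing import Pool

import numpy as np

# Toy map-reduce stand-in: the study ran on Hadoop; this sketch only
# mirrors the map and reduce phases with illustrative data.
def mapper(chunk):
    """Map: reduce one chunk (samples x channels) to (sum, count)."""
    return chunk.sum(axis=0), chunk.shape[0]

def reducer(a, b):
    """Reduce: merge two partial (sum, count) results."""
    return a[0] + b[0], a[1] + b[1]

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    # Toy EEG recording: 8 chunks of 10_000 samples across 32 channels.
    chunks = [rng.normal(size=(10_000, 32)) for _ in range(8)]

    with Pool() as pool:
        partials = pool.map(mapper, chunks)       # map phase, in parallel

    total, count = reduce(reducer, partials)      # reduce phase
    print(total / count)                          # global per-channel mean
```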

Publication Date
Tue Dec 01 2020
Journal Name
Gulf Economist
The Bayesian Estimation in Competing Risks Analysis for Discrete Survival Data under Dynamic Methodology with Application to Dialysis Patients in Basra/Iraq

Survival analysis is a type of data analysis that describes the time period until the occurrence of an event of interest, such as death or another event that determines what happens to the phenomenon under study. There may be more than one endpoint for the event, in which case the setting is called competing risks. The purpose of this research is to apply the dynamic approach to the analysis of discrete survival time in order to estimate the effect of covariates over time, as well as to model the nonlinear relationship between the covariates and the discrete hazard function through the use of the multinomial logistic model and the multivariate Cox model. For the purpose of conducting the estimation process for both the discrete …
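A minimal sketch of a discrete-time competing-risks hazard model is shown below: the data are expanded to person-period form and a multinomial logit over {no event, cause 1, cause 2} gives cause-specific discrete hazards. The simulated data and names are assumptions; the study's dynamic Bayesian estimation is not reproduced.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical person-period sketch; simulated data, not the study's
# dialysis dataset.
rng = np.random.default_rng(1)
n = 500
age = rng.normal(60, 10, n)                   # one illustrative covariate
time = rng.integers(1, 6, n)                  # discrete survival time, 1..5
cause = rng.integers(0, 3, n)                 # 0 = censored, 1/2 = causes

# Expand to person-period format: one row per interval at risk.
rows, labels = [], []
for i in range(n):
    for t in range(1, time[i] + 1):
        rows.append([t, age[i]])
        # Event cause recorded only in the final interval; 0 otherwise.
        labels.append(cause[i] if t == time[i] else 0)

X, y = np.array(rows), np.array(labels)

# Multinomial logit over {no event, cause 1, cause 2} estimates the
# cause-specific discrete hazards h_j(t | x).
model = LogisticRegression(max_iter=1000).fit(X, y)
hazards = model.predict_proba([[3, 65.0]])    # hazards at t = 3, age 65
print(hazards)  # [P(no event), P(cause 1), P(cause 2)] in that interval
```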

Publication Date
Thu Mar 06 2025
Journal Name
AIP Conference Proceedings
Solving 5th order nonlinear 4D-PDEs using efficient design of neural network

Publication Date
Wed Mar 01 2017
Journal Name
Archive of Mechanical Engineering
Using the Lid-Driven Cavity Flow to Validate Moment-Based Boundary Conditions for the Lattice Boltzmann Equation
The accuracy of the Moment Method for imposing no-slip boundary conditions in the lattice Boltzmann algorithm is investigated numerically using lid-driven cavity flow. Boundary conditions are imposed directly upon the hydrodynamic moments of the lattice Boltzmann equations, rather than the distribution functions, to ensure the constraints are satisfied precisely at grid points. Both single and multiple relaxation time models are applied. The results are in excellent agreement with data obtained from state-of-the-art numerical methods and are shown to converge with second-order accuracy in grid spacing.
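For orientation, the sketch below is a minimal D2Q9 single-relaxation-time lattice Boltzmann loop for the lid-driven cavity. It computes the hydrodynamic moments that the paper constrains at boundaries, but the moment-based closure itself is not reproduced; a simple equilibrium reset at wall nodes stands in for it, and the grid size, relaxation time, and lid speed are illustrative.

```python
import numpy as np

# Minimal D2Q9 BGK cavity sketch. The paper's moment-based boundary
# conditions are NOT implemented; an equilibrium reset at wall nodes
# stands in. All parameter values are illustrative assumptions.
nx, ny, tau, u_lid = 64, 64, 0.6, 0.1

c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])     # lattice velocities
w = np.array([4/9] + [1/9]*4 + [1/36]*4)               # lattice weights

def equilibrium(rho, ux, uy):
    """Second-order truncated Maxwell-Boltzmann equilibrium for D2Q9."""
    cu = c[:, 0, None, None] * ux + c[:, 1, None, None] * uy
    return rho * w[:, None, None] * (1 + 3*cu + 4.5*cu**2
                                     - 1.5*(ux**2 + uy**2))

f = equilibrium(np.ones((nx, ny)), np.zeros((nx, ny)), np.zeros((nx, ny)))
wall = np.zeros((nx, ny), dtype=bool)
wall[[0, -1], :] = wall[:, [0, -1]] = True             # cavity boundary nodes

for _ in range(2000):
    for k in range(9):                                 # streaming step
        f[k] = np.roll(np.roll(f[k], c[k, 0], axis=0), c[k, 1], axis=1)
    rho = f.sum(axis=0)                                # zeroth moment
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho   # first moments
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    ux[wall], uy[wall] = 0.0, 0.0                      # stationary walls
    ux[:, -1] = u_lid                                  # moving top lid
    feq = equilibrium(rho, ux, uy)
    f[:, wall] = feq[:, wall]                          # boundary: reset to feq
    f -= (f - feq) / tau                               # BGK collision

print(ux[nx // 2, ny // 2])                            # centre velocity sample
```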
Publication Date
Thu Feb 01 2018
Journal Name
Journal of Engineering
Mathematical Modeling of Compaction Curve Using Normal Distribution Functions

Compaction curves are widely used in civil engineering, especially for road construction, embankments, etc. Obtaining the precise optimum moisture content (OMC) that gives the maximum dry unit weight (γd,max) is very important, since it allows the desired soil strength to be achieved while also satisfying economic considerations.

In this paper, three peak functions were used to obtain the OMC and γd,max through curve fitting to the values obtained from the Standard Proctor Test. A surface fitting was also used to model Ohio's compaction curves, which represent the very large variation of compacted soil types.

The results showed very good correlation between the values obtained from some publ…
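As a sketch of the peak-function idea, the snippet below fits a normal-shaped (Gaussian) peak to made-up Standard Proctor points with scipy's curve_fit and reads off the OMC and γd,max from the fitted parameters; the data values and parameter names are illustrative assumptions, not the paper's.

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative fit only: the Proctor points below are made up.
def gaussian_peak(w, gd_max, omc, spread):
    """Dry unit weight as a normal-shaped function of moisture content w."""
    return gd_max * np.exp(-0.5 * ((w - omc) / spread) ** 2)

# Hypothetical Standard Proctor results: water content (%) vs gamma_d (kN/m3).
w_obs = np.array([8.0, 10.0, 12.0, 14.0, 16.0, 18.0])
gd_obs = np.array([16.2, 17.1, 17.8, 17.6, 16.9, 15.8])

(gd_max, omc, spread), _ = curve_fit(
    gaussian_peak, w_obs, gd_obs, p0=[18.0, 13.0, 4.0])

print(f"OMC ~ {omc:.1f} %, gamma_d,max ~ {gd_max:.1f} kN/m^3")
```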

Publication Date
Tue Jun 30 2020
Journal Name
Journal of Economics and Administrative Sciences
Using the Maximum Likelihood and Bayesian Methods to Estimate the Time-Rate Function of Earthquake Phenomenon

In this research, we study the non-homogeneous Poisson process, one of the most important statistical models with a role in scientific development, since it concerns events that occur in reality and are modeled as Poisson processes whose occurrence depends on time, whether the rate changes over time or remains stable. Our research clarifies the non-homogeneous Poisson process and uses one of its models, an exponentiated-Weibull model with three parameters (α, β, σ), as the function to estimate the time rate of occurrence of earthquakes in Erbil Governorate, as the governorate is adjacent to two countr…
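A hedged sketch of maximum-likelihood estimation for such a process is given below, using the exponentiated-Weibull CDF with parameters (α, β, σ) as the shape of the mean-value function, plus an overall count scale θ that is this sketch's assumption, not the paper's. The simulated event times are not earthquake data.

```python
import numpy as np
from scipy.optimize import minimize

# NHPP MLE sketch: log-likelihood = sum log lam(t_i) - m(T).
# The extra scale theta and the simulated times are assumptions.
def ew_cdf(t, alpha, beta, sigma):
    return (1.0 - np.exp(-(t / sigma) ** beta)) ** alpha

def ew_pdf(t, alpha, beta, sigma):
    base = 1.0 - np.exp(-(t / sigma) ** beta)
    return (alpha * beta / sigma * (t / sigma) ** (beta - 1)
            * np.exp(-(t / sigma) ** beta) * base ** (alpha - 1))

def neg_loglik(log_params, times, T):
    theta, alpha, beta, sigma = np.exp(log_params)   # keep parameters positive
    lam = theta * ew_pdf(times, alpha, beta, sigma)  # event intensity lam(t)
    m_T = theta * ew_cdf(T, alpha, beta, sigma)      # expected count on [0, T]
    return m_T - np.sum(np.log(lam))

rng = np.random.default_rng(4)
T = 10.0
times = np.sort(rng.uniform(0, T, 60))               # toy event times

res = minimize(neg_loglik, x0=np.zeros(4), args=(times, T),
               method='Nelder-Mead')
print(np.exp(res.x))  # fitted (theta, alpha, beta, sigma)
```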

Publication Date
Fri Jan 01 2021
Journal Name
Journal of Engineering
A Computational Fluid Dynamics Investigation of using Large-Scale Geometric Roughness Elements in Open Channels

The hydraulic behavior of the flow can be changed by using large-scale geometric roughness elements in open channels. This change can help in controlling erosion and sedimentation along the mainstream of the channel. Roughness elements can be large stone or concrete blocks placed on the channel bed to impose more resistance at the bed. The geometry of the roughness elements, the number used, and their configuration are parameters that can affect the hydraulic characteristics of the flow. In this paper, the velocity distribution along the flume was theoretically investigated using a series of tests of T-shaped roughness elements of fixed height, arranged in three different configurations differing in the number of lines of roughness element…
