Spatial data observed on a group of areal units is common in scientific applications. The usual hierarchical approach for modeling this kind of dataset is to introduce a spatial random effect with an autoregressive prior. However, the standard Markov chain Monte Carlo scheme for this hierarchical framework requires the spatial effects to be sampled from their full conditional posteriors one by one, resulting in poor mixing. More importantly, it makes the model computationally inefficient for datasets with a large number of units. In this article, we propose a Bayesian approach that uses the spectral structure of the adjacency matrix to construct a low-rank expansion for modeling spatial dependence. We propose a pair of computationally efficient estimation schemes that select the functions most important for capturing the variation in the response. Through simulation studies, we validate the computational efficiency as well as the predictive accuracy of our method. Finally, we present an important real-world application of the proposed methodology to a massive plant abundance dataset from the Cape Floristic Region in South Africa.
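A minimal sketch of the general idea, not the authors' exact construction: building a low-rank spatial basis from the spectral decomposition of an areal adjacency matrix, as in Moran-basis / eigenvector spatial filtering approaches. The function name and the choice of the Moran operator are illustrative assumptions.

```python
import numpy as np

def low_rank_spatial_basis(W, q):
    """Return q eigenvectors of the centered adjacency (the Moran operator)
    with the largest eigenvalues, i.e. the smoothest spatial patterns."""
    n = W.shape[0]
    # Center so the basis is orthogonal to the intercept
    P = np.eye(n) - np.ones((n, n)) / n
    M = P @ W @ P
    vals, vecs = np.linalg.eigh(M)
    idx = np.argsort(vals)[::-1][:q]   # keep the q most positively autocorrelated patterns
    return vecs[:, idx]

# Usage sketch: spatial dependence enters the model through the basis E.
# W = ...  # n x n 0/1 adjacency of the areal units
# E = low_rank_spatial_basis(W, q=50)
# design = np.hstack([X, E])   # covariates X plus low-rank spatial columns
```

The payoff is dimension reduction: q basis coefficients replace n correlated spatial effects, which is what avoids the one-by-one full-conditional updates.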
To obtain approximate solutions to Riccati matrix differential equations, a new variational iteration approach is proposed, designed to improve the accuracy and increase the convergence rate of the approximate solutions toward the exact solution. The technique was found to give very accurate results within a few iterations. In this paper, modified approaches are derived to improve the proposed solutions, and the convergence of the resulting sequence of approximate solutions to the exact solution is stated and proved. Two examples are also solved, demonstrating the reliability and applicability of the proposed approach.
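For reference, the standard variational iteration correction functional for a first-order matrix system; the paper's modified iterations refine this basic scheme, and the Riccati right-hand side below (with coefficient matrices A, B, C, S) is one common form, assumed here for illustration.

```latex
% VIM correction functional for X'(t) = F(t, X(t)); for a first-order
% system the Lagrange multiplier is \lambda(s) = -1.
X_{n+1}(t) = X_n(t) - \int_0^t \Big[ X_n'(s) - F\big(s, X_n(s)\big) \Big]\, ds,
\qquad
F(t, X) = A(t) + B(t)\,X + X\,C(t) - X\,S(t)\,X .
```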
Data hiding is the process of encoding extra information in an image by making small modifications to its pixels. To be practical, the hidden data must be perceptually invisible yet robust to common signal processing operations. This paper introduces a scheme for hiding a signature image that can be as much as 25% of the host image data, and hence can be used both in digital watermarking and in image/data hiding. The proposed algorithm applies an orthogonal discrete wavelet transform with two zero moments and improved time localization, the discrete slantlet transform, to both the host and the signature image. A scaling factor in the frequency domain controls the quality of the watermarked images. Experimental results of signature image …
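A minimal sketch of transform-domain embedding, not the paper's slantlet implementation: PyWavelets has no slantlet transform, so a Haar DWT stands in to illustrate the embed/extract idea, and `alpha` plays the role of the paper's frequency-domain scaling factor.

```python
import numpy as np
import pywt

def embed(host, signature, alpha=0.1):
    """Hide `signature` in the host's diagonal detail coefficients.
    `signature` must match the detail-band shape (half the host size)."""
    cA, (cH, cV, cD) = pywt.dwt2(host.astype(float), 'haar')
    cD_marked = cD + alpha * signature
    return pywt.idwt2((cA, (cH, cV, cD_marked)), 'haar')

def extract(marked, host, alpha=0.1):
    """Non-blind recovery: subtract the host's detail band and rescale."""
    _, (_, _, cD_m) = pywt.dwt2(marked.astype(float), 'haar')
    _, (_, _, cD_h) = pywt.dwt2(host.astype(float), 'haar')
    return (cD_m - cD_h) / alpha
```

Larger `alpha` makes the hidden signature more robust but degrades the watermarked image quality, which is the trade-off the scaling factor controls.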
A database is an organized, distributed collection of data that allows users to access stored information in a simple and convenient way. However, in the era of big data, traditional data analytics methods may not be able to manage and process such large volumes of data. To develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique to handle big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r…
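A minimal sketch of the Map-Reduce pattern on a hypothetical per-channel EEG layout; a real deployment would run the same two functions as Hadoop streaming mapper/reducer scripts, while plain Python stands in here to show the data flow.

```python
from collections import defaultdict

def mapper(record):
    """record: (channel_id, samples) -> emit (key, partial sums)."""
    channel, samples = record
    yield channel, (sum(samples), len(samples))

def reducer(key, partials):
    """Combine partial (sum, count) pairs into a per-channel mean."""
    total = sum(s for s, _ in partials)
    count = sum(c for _, c in partials)
    return key, total / count

records = [("Fp1", [1.0, 2.0]), ("Fp1", [3.0]), ("O2", [0.5, 1.5])]
grouped = defaultdict(list)            # the shuffle phase: group by key
for rec in records:
    for k, v in mapper(rec):
        grouped[k].append(v)
means = dict(reducer(k, vs) for k, vs in grouped.items())
print(means)   # {'Fp1': 2.0, 'O2': 1.0}
```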
Survival analysis is a type of data analysis that describes the time period until the occurrence of an event of interest, such as death, or other events important in determining what will happen to the phenomenon under study. There may be more than one possible endpoint for the event, in which case the setting is called competing risks. The purpose of this research is to apply a dynamic approach to the analysis of discrete survival time in order to estimate the effect of covariates over time, as well as to model the nonlinear relationship between the covariates and the discrete hazard function through the use of the multinomial logistic model and the multivariate Cox model. For the purpose of conducting the estimation process for both the discrete …
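A minimal sketch on synthetic data of the standard discrete-time competing-risks setup: each subject is expanded into one row per interval survived (person-period data), and the cause-specific hazard is fitted as a multinomial logistic model. This illustrates the general technique, not the paper's dynamic estimator.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def person_period(times, events, X):
    """Expand (time, event, covariates) into person-period rows."""
    rows, ys = [], []
    for t, e, x in zip(times, events, X):
        for s in range(1, t + 1):
            y = e if s == t else 0         # event only in the final interval
            rows.append(np.append(x, s))   # time s enters as a covariate
            ys.append(y)
    return np.array(rows), np.array(ys)

# Illustrative data: 0 = censored, 1/2 = competing causes of the event.
times  = np.array([3, 1, 2, 4])
events = np.array([1, 2, 0, 1])
X      = np.array([[0.2], [1.5], [-0.3], [0.7]])

Xpp, ypp = person_period(times, events, X)
hazard_model = LogisticRegression(max_iter=1000).fit(Xpp, ypp)
print(hazard_model.predict_proba(Xpp[:3]))  # per-interval cause-specific hazards
```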
The accuracy of the Moment Method for imposing no-slip boundary conditions in the lattice Boltzmann algorithm is investigated numerically using lid-driven cavity flow. Boundary conditions are imposed directly upon the hydrodynamic moments of the lattice Boltzmann equations, rather than the distribution functions, to ensure the constraints are satisfied precisely at grid points. Both single and multiple relaxation time models are applied. The results are in excellent agreement with data obtained from state-of-the-art numerical methods and are shown to converge with second-order accuracy in grid spacing.
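A minimal sketch, not the paper's solver: the D2Q9 hydrodynamic moments and BGK equilibrium that the moment method constrains at boundary nodes. The point of the method is that at a wall node the unknown post-streaming populations are chosen so that the moments take their prescribed wall values, rather than applying bounce-back to the populations themselves.

```python
import numpy as np

# D2Q9 velocity set and weights
c = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],[1,1],[-1,1],[-1,-1],[1,-1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

def moments(f):
    """Density, momentum and momentum-flux moments of the populations f[9]."""
    rho = f.sum()
    mom = f @ c                                # (rho*ux, rho*uy)
    Pi  = np.einsum('i,ij,ik->jk', f, c, c)    # second-order moment tensor
    return rho, mom, Pi

def feq(rho, u):
    """BGK equilibrium populations for density rho and velocity u."""
    cu = c @ u
    return w * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*(u @ u))

# At a no-slip lid moving with velocity u_w, the moment method prescribes
# rho, rho*u = rho*u_w and the tangential momentum flux at the boundary node,
# then solves three linear equations for the three unknown incoming populations.
```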
Compaction curves are widely used in civil engineering, especially for road construction, embankments, etc. Obtaining a precise value of the Optimum Moisture Content (OMC) that gives the maximum dry unit weight γd,max is very important, since it allows the desired soil strength to be achieved while also satisfying economic considerations.

In this paper, three peak functions were used to obtain the OMC and γd,max through curve fitting to the values obtained from the Standard Proctor Test. A surface fit was also used to model the Ohio compaction curves, which represent the very large variation of compacted soil types.

The results showed very good correlation between the values obtained from some publ…
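A minimal sketch with one candidate peak function (a Gaussian, not necessarily one of the three used in the paper) and illustrative Proctor points: the fitted peak location gives the OMC and the peak height gives γd,max.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian_peak(w, gd_max, omc, spread):
    """Dry unit weight as a peak function of water content w."""
    return gd_max * np.exp(-((w - omc) ** 2) / (2 * spread ** 2))

# Illustrative water content (%) and dry unit weight (kN/m^3) pairs
w_pct   = np.array([8.0, 10.0, 12.0, 14.0, 16.0, 18.0])
gamma_d = np.array([16.2, 17.1, 17.8, 17.9, 17.3, 16.5])

popt, _ = curve_fit(gaussian_peak, w_pct, gamma_d, p0=[18.0, 13.0, 4.0])
gd_max, omc, _ = popt
print(f"OMC ≈ {omc:.1f} %, γd,max ≈ {gd_max:.1f} kN/m³")
```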
In this research, we study the Non-Homogeneous Poisson process, one of the most important statistical topics with a role in scientific development, as it relates to incidents that occur in reality and are modeled as Poisson processes because their occurrence is tied to time, whether the rate changes with time or remains stable. The research clarifies the Non-Homogeneous Poisson process and uses one of its models, the exponentiated-Weibull model with three parameters (α, β, σ), as a function to estimate the time rate of occurrence of earthquakes in Erbil Governorate, as the governorate is adjacent to two countr…
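A minimal sketch with an assumed form of the rate: taking the intensity λ(t) of the non-homogeneous Poisson process to be the hazard of the exponentiated-Weibull distribution with parameters (α, β, σ), and simulating event times by the standard thinning algorithm.

```python
import numpy as np

def ew_cdf(t, alpha, beta, sigma):
    return (1 - np.exp(-(t / sigma) ** beta)) ** alpha

def ew_pdf(t, alpha, beta, sigma):
    inner = 1 - np.exp(-(t / sigma) ** beta)
    return (alpha * beta / sigma) * (t / sigma) ** (beta - 1) \
           * np.exp(-(t / sigma) ** beta) * inner ** (alpha - 1)

def intensity(t, alpha, beta, sigma):
    """Hazard-type rate lambda(t) = f(t) / (1 - F(t))."""
    return ew_pdf(t, alpha, beta, sigma) / (1 - ew_cdf(t, alpha, beta, sigma))

def thinning(T, lam_max, alpha, beta, sigma, seed=0):
    """Simulate NHPP event times on [0, T] by thinning; lam_max must
    bound lambda(t) on the whole interval."""
    rng = np.random.default_rng(seed)
    t, events = 0.0, []
    while True:
        t += rng.exponential(1 / lam_max)   # propose at the homogeneous rate
        if t > T:
            return np.array(events)
        if rng.random() < intensity(t, alpha, beta, sigma) / lam_max:
            events.append(t)                # accept with prob lambda(t)/lam_max
```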
The hydraulic behavior of the flow in open channels can be changed by using large-scale geometric roughness elements. This change can help control erosion and sedimentation along the mainstream of the channel. Roughness elements can be large stone or concrete blocks placed on the channel bed to impose additional resistance at the bed. The geometry of the roughness elements, the number used, and their configuration are parameters that can affect the hydraulic characteristics of the flow. In this paper, the velocity distribution along the flume was theoretically investigated using a series of tests of T-shape roughness elements of fixed height, arranged in three different configurations differing in the number of lines of roughness element…