Spatial data observed on a group of areal units are common in scientific applications. The usual hierarchical approach to modeling this kind of dataset is to introduce a spatial random effect with an autoregressive prior. However, the usual Markov chain Monte Carlo scheme for this hierarchical framework requires the spatial effects to be sampled from their full conditional posteriors one by one, resulting in poor mixing. More importantly, it makes the model computationally inefficient for datasets with a large number of units. In this article, we propose a Bayesian approach that uses the spectral structure of the adjacency matrix to construct a low-rank expansion for modeling spatial dependence. We propose a pair of computationally efficient estimation schemes that select the functions most important for capturing the variation in the response. Through simulation studies, we validate the computational efficiency as well as the predictive accuracy of our method. Finally, we present an important real-world application of the proposed methodology to a massive plant-abundance dataset from the Cape Floristic Region in South Africa. © 2019 Elsevier B.V.
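The core idea above, a low-rank spatial basis built from the spectral structure of the adjacency matrix, can be sketched as follows. This is an illustration, not the paper's estimation scheme: the grid, the rank `q`, and all variable names are our own assumptions.

```python
import numpy as np

def grid_adjacency(n):
    """Binary adjacency matrix for an n x n lattice of areal units."""
    A = np.zeros((n * n, n * n))
    for i in range(n):
        for j in range(n):
            k = i * n + j
            if i + 1 < n:                 # neighbor below
                A[k, k + n] = A[k + n, k] = 1
            if j + 1 < n:                 # neighbor to the right
                A[k, k + 1] = A[k + 1, k] = 1
    return A

A = grid_adjacency(4)                     # 16 areal units
eigvals, eigvecs = np.linalg.eigh(A)

# Keep the q eigenvectors with the largest eigenvalues: these are the
# smoothest spatial patterns, and serve as a low-rank basis for the
# spatial random effect (q = 5 is an arbitrary illustrative choice).
q = 5
basis = eigvecs[:, np.argsort(eigvals)[::-1][:q]]   # shape (16, q)
```

Because `eigh` returns orthonormal eigenvectors, the resulting basis columns are orthonormal, which keeps the low-rank regression well conditioned.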
In this research, the spectroscopic and thermodynamic properties of the PO2 molecule were studied, including a calculation of the potential energy. From the total-energy curve of the molecule at the equilibrium distance of the P–O bond, the bond energy was found to be 4.332 eV. The vibrational modes of the PO2 molecule and their frequencies were found to be active in the IR spectra because of the change in polarization and dipole moment of the molecule. Some thermodynamic parameters of PO2 were also calculated: the heat of formation, enthalpy, heat capacity, entropy, and Gibbs free energy were −54.16 kcal/mol, 2366.45 kcal/mol, 10.06 kcal/K/mol, 59.52 k
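The thermodynamic quantities listed above are linked by the standard relation G = H − T·S. A minimal sketch, using invented placeholder values rather than the PO2 results reported in the abstract:

```python
# Illustrative only: Gibbs free energy from enthalpy and entropy.
# The numbers below are hypothetical, not the PO2 values above.
T = 298.15    # temperature, K
H = 100.0     # enthalpy, kcal/mol (hypothetical)
S = 0.050     # entropy, kcal/(K*mol) (hypothetical)

G = H - T * S # Gibbs free energy, kcal/mol
```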
The fluorescence emission of Rhodamine 6G (R6G) and Acriflavine dyes in PMMA polymer has been studied by varying the irradiation and exposure time of laser light to determine the effect of this parameter. It was found that the fluorescence intensity of the dye-doped polymer samples decreases as the exposure time increases and then stabilizes at long times; this behavior, called photobleaching, was observed to be less in the liquid phase than in the solid phase. Using the second harmonic of the laser, at a wavelength of 530 nm, the photobleaching effect was studied in the two dye-doped polymers prepared with different solvents but the same dyes. It was observed, for the different solutions and for samples prepared by dip and spin coating, that the photobleaching appears in the liquid phase more
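Photobleaching of this kind is commonly modeled as an exponential decay of fluorescence intensity with exposure time, I(t) = I0·exp(−k·t). A hedged sketch of recovering the bleaching rate k from a log-linear least-squares fit; the data are synthetic, not from the experiment above:

```python
import numpy as np

t = np.array([0.0, 10.0, 20.0, 30.0, 40.0])   # exposure time, s (synthetic)
I0, k = 100.0, 0.05                            # "true" parameters (assumed)
I = I0 * np.exp(-k * t)                        # synthetic intensities

# A log transform makes the decay linear, so a degree-1 polyfit
# recovers the bleaching rate as the negative slope.
slope, intercept = np.polyfit(t, np.log(I), 1)
k_est = -slope
I0_est = np.exp(intercept)
```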
Research on the automated extraction of essential data from an electrocardiography (ECG) recording has long been a significant topic. The main focus of digital processing is to measure the fiducial points that mark the beginning and end of the P, QRS, and T waves, based on their waveform properties. The presence of unavoidable noise during ECG data collection and inherent physiological differences among individuals make it challenging to identify these reference points accurately, resulting in suboptimal performance. This is done through several primary stages that rely on preliminary processing of the ECG electrical signal through a set of steps (preparing the raw data and converting them into files tha
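The fiducial-point idea can be illustrated in miniature: locate R-peak-like local maxima in a signal by thresholding. This is a toy sketch on a synthetic signal, not the paper's processing pipeline; the sampling rate, peak shape, and threshold are all our assumptions.

```python
import numpy as np

fs = 250                                  # assumed sampling rate, Hz
n = 4 * fs                                # 4 seconds of samples
idx = np.arange(n)

# Synthetic "ECG": narrow Gaussian bumps standing in for R peaks.
beat_samples = [200, 450, 700, 950]
ecg = np.zeros(n)
for b in beat_samples:
    ecg += np.exp(-((idx - b) ** 2) / 50.0)

# Fiducial detection sketch: local maxima above a fixed threshold.
peaks = [i for i in range(1, n - 1)
         if ecg[i] > 0.5 and ecg[i] > ecg[i - 1] and ecg[i] >= ecg[i + 1]]
```

Real detectors replace the fixed threshold with adaptive ones and add band-pass filtering to cope with the noise the abstract describes.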
Quantitative real-time polymerase chain reaction (RT-qPCR) has become a valuable molecular technique in biomedical research. The selection of suitable endogenous reference genes is necessary for the normalization of target-gene expression in RT-qPCR experiments. The aim of this study was to determine the suitability of 18S rRNA and ACTB as internal control genes for the normalization of RT-qPCR data in some human cell lines transfected with small interfering RNA (siRNA). Four cancer cell lines, including MCF-7, T47D, MDA-MB-231, and HeLa cells, along with HEK293, representing an embryonic cell line, were depleted of E2F6 using siRNA specific for E2F6, compared to negative-control cells, which were transfected with siRNA not specific for any gene. Us
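Normalization to a reference gene is typically done with the 2^(−ΔΔCt) method, which the study's comparison of 18S rRNA and ACTB feeds into. A minimal sketch with invented Ct values (the gene roles mirror the abstract, the numbers do not):

```python
# 2^-(delta-delta Ct): target gene (e.g. E2F6) normalized to a
# reference gene (e.g. ACTB), knockdown vs. negative control.
# All Ct values below are hypothetical.
def relative_expression(ct_target_kd, ct_ref_kd,
                        ct_target_ctrl, ct_ref_ctrl):
    delta_kd = ct_target_kd - ct_ref_kd        # normalize knockdown sample
    delta_ctrl = ct_target_ctrl - ct_ref_ctrl  # normalize control sample
    return 2.0 ** -(delta_kd - delta_ctrl)     # fold change vs. control

fold = relative_expression(28.0, 18.0, 25.0, 18.0)
```

A fold change below 1 (here 2^−3 = 0.125) indicates successful depletion of the target relative to the negative control, assuming the reference gene is stable, which is exactly what the study evaluates.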
Purpose – Cloud computing (CC) and its services have enabled the information centers of organizations to adapt their informatic and technological infrastructure, making it more appropriate for developing flexible information systems that respond to the informational and knowledge needs of their users. In this context, cloud-data governance has become more complex and dynamic, requiring an in-depth understanding of the data management strategy at these centers in terms of organizational structure and regulations, people, technology, processes, and roles and responsibilities. Therefore, our paper discusses these dimensions as challenges facing information centers with respect to their data governance and the impa
In data transmission, a change in a single bit of the received data may lead to a misunderstanding or a disaster. Each bit in the sent information has high priority, especially with information such as the address of the receiver. The importance of detecting an error on each single-bit change is a key issue in the field of data transmission.
The ordinary single-parity detection method can detect an odd number of errors efficiently but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, showed better results but still failed to cope with an increasing number of errors.
Two novel methods were suggested to detect binary bit-change errors when transmitting data over a noisy medium. Those methods were: the 2D-Checksum me
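The two-dimensional parity scheme mentioned above can be sketched briefly: data bits are arranged in a grid, a parity bit is computed per row and per column, and a single flipped bit is localized by the row and column whose parity changes. This is the standard 2D-parity idea, not the paper's proposed 2D-Checksum method; the grid size and data are illustrative.

```python
def parity_2d(bits, width):
    """Row and column parities of a bit list arranged as a grid."""
    rows = [bits[i:i + width] for i in range(0, len(bits), width)]
    row_par = [sum(r) % 2 for r in rows]
    col_par = [sum(c) % 2 for c in zip(*rows)]
    return row_par, col_par

data = [1, 0, 1, 1,
        0, 1, 0, 0,
        1, 1, 1, 0]
rp, cp = parity_2d(data, 4)

corrupted = data[:]
corrupted[6] ^= 1                      # flip one bit (row 1, column 2)
rp2, cp2 = parity_2d(corrupted, 4)

# The single error is located where both parities disagree.
bad_row = [i for i, (a, b) in enumerate(zip(rp, rp2)) if a != b]
bad_col = [i for i, (a, b) in enumerate(zip(cp, cp2)) if a != b]
```

Note the limitation the abstract targets: some multi-bit error patterns (e.g. four flips at the corners of a rectangle) leave every row and column parity unchanged and go undetected.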
Today, the role of cloud computing in our day-to-day lives is very prominent. The cloud-computing paradigm makes it possible to provide demand-based resources. Cloud computing has changed the way organizations manage resources due to its robustness, low cost, and pervasive nature. Data security is usually realized using different methods, such as encryption. However, the privacy of data is another important challenge that should be considered when transporting, storing, and analyzing data in the public cloud. In this paper, a new method is proposed to track malicious users who use their private key to decrypt data in a system and share it with others, causing system information leakage. Security policies are also considered to be int