The paired sample t-test is a classical test statistic used to test the difference between two means in paired data, but it is not robust against violations of the normality assumption. In this paper, alternative robust tests are suggested by combining Jackknife resampling with the Wilcoxon signed-rank test for small sample sizes and with the normal-approximation form of the Wilcoxon signed-rank test for large sample sizes. Monte Carlo simulation experiments were employed to study the performance of each of these tests in terms of type I error rates and power rates. All of the tests were applied to different sample sizes generated from three distributions: the bivariate normal distribution, the contaminated bivariate normal distribution, and the bivariate exponential distribution.
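The abstract does not spell out exactly how the jackknife and the signed-rank statistic are combined, so the following is only a minimal sketch of one plausible reading: jackknife the W+ statistic over leave-one-out samples of the paired differences and form a normal-approximation z-test. All data and parameter values below are made up for illustration.

```python
import numpy as np
from scipy.stats import rankdata, norm

def signed_rank_stat(d):
    """Wilcoxon signed-rank statistic W+ (sum of ranks of positive differences)."""
    ranks = rankdata(np.abs(d))
    return ranks[d > 0].sum()

def jackknife_wilcoxon(x, y):
    """One plausible jackknife/Wilcoxon combination (not necessarily the paper's)."""
    d = np.asarray(x) - np.asarray(y)
    n = len(d)
    # leave-one-out replicates of the signed-rank statistic
    t = np.array([signed_rank_stat(np.delete(d, i)) for i in range(n)])
    t_bar = t.mean()
    var_jack = (n - 1) / n * ((t - t_bar) ** 2).sum()  # jackknife variance
    m = n - 1                                          # size of each replicate
    mu0 = m * (m + 1) / 4                              # null mean of W+
    z = (t_bar - mu0) / np.sqrt(var_jack)
    return z, 2 * norm.sf(abs(z))                      # two-sided normal approximation

rng = np.random.default_rng(0)
x = rng.normal(0, 1, 30)
y = x - 0.5 + rng.normal(0, 1, 30)                     # paired data with a shift
print(jackknife_wilcoxon(x, y))
```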
Natural voids and fractures (weak zones) are present in the subsurface gypsiferous soil and gypsum within the University of Al-Anbar, western Iraq, and they pose a serious problem for civil engineering projects. The electrical resistivity technique was applied as an economical solution for investigating underground weak zones. The inverse models of the dipole-dipole and pole-dipole arrays, with a spacing of 2 m and an n-factor of 6, clearly show the resistivity contrast between the anomalous part of the weak zone and the background. The maximum thickness and shape are well defined from 2D imaging with the dipole-dipole array; the maximum thickness ranges between 9.5 and 11.5 m. It is concluded that the 2D imaging survey is a useful technique and more …
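As a small illustration of the array geometry mentioned above, the sketch below computes apparent resistivity for a dipole-dipole array using the standard geometric factor K = πn(n+1)(n+2)a. Only the 2 m spacing and n up to 6 come from the abstract; the voltage and current readings are hypothetical.

```python
import math

def apparent_resistivity_dipole_dipole(a, n, delta_v, current):
    """Apparent resistivity (ohm*m) for a dipole-dipole array:
    geometric factor K = pi * n * (n+1) * (n+2) * a."""
    k = math.pi * n * (n + 1) * (n + 2) * a
    return k * delta_v / current

# Survey geometry from the abstract: 2 m electrode spacing, n-factor up to 6.
# delta_v and current are invented example readings.
for n in range(1, 7):
    rho_a = apparent_resistivity_dipole_dipole(a=2.0, n=n, delta_v=0.05, current=0.1)
    print(f"n={n}: rho_a = {rho_a:.1f} ohm*m")
```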
The two-neutron halo nuclei (17B, 11Li, 8He) were investigated using a two-body nucleon density distribution (2BNDD) with the two-frequency shell model (TFSM). The structure of the valence two neutrons is taken as a pure (1d5/2) state for the 17B nucleus and as a pure (1p1/2) state for the 11Li and 8He nuclei. For each tested nucleus, an efficient 2BNDD operator for a point-nucleon system, folded with two-body correlation operator functions, was used to investigate the nuclear matter density distributions, root-mean-square (rms) radii, and elastic electron scattering form factors. In the nucleon-nucleon forces the correlation took account of …
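A hedged numerical sketch of one quantity named above, the rms radius: for a spherically symmetric density, <r²> = ∫r⁴ρ(r)dr / ∫r²ρ(r)dr. The Gaussian density and size parameter b below are simple stand-ins, not the paper's 2BNDD.

```python
import numpy as np
from scipy.integrate import simpson

def rms_radius(rho, r):
    """rms radius of a spherically symmetric density:
    <r^2> = integral(r^4 rho dr) / integral(r^2 rho dr)."""
    num = simpson(r**4 * rho, x=r)
    den = simpson(r**2 * rho, x=r)
    return np.sqrt(num / den)

# Stand-in Gaussian (harmonic-oscillator-like) density; b in fm is illustrative
b = 1.9
r = np.linspace(0.0, 15.0, 2000)
rho = np.exp(-(r / b) ** 2)
print(f"rms radius = {rms_radius(rho, r):.3f} fm")  # analytic value: b*sqrt(3/2)
```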
In this paper, the theoretical cross section of a pre-equilibrium nuclear reaction has been studied at an energy of 22.4 MeV. Ericson's formula for the partial level density (PLD) and its corrections (William's correction and the spin correction) have been substituted into the theoretical cross section and compared with the experimental data for the nucleus. It has been found that the theoretical cross section with the one-component PLD from Ericson's formula does not agree with the experimental value; there is little agreement only at the high end of the energy range when compared with the experimental cross section. The theoretical cross section that depends on the one-component William's formula and the one-component formula corrected for spin …
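For reference, the one-component Ericson partial level density has the closed form ρ(p,h,E) = g·(gE)^(n-1) / (p!·h!·(n-1)!), with n = p + h. The sketch below evaluates it at the abstract's 22.4 MeV; the single-particle level density g is a hypothetical value, not taken from the paper.

```python
from math import factorial

def ericson_pld(p, h, E, g):
    """One-component Ericson partial level density (MeV^-1):
    rho(p,h,E) = g * (g*E)**(n-1) / (p! * h! * (n-1)!),  n = p + h."""
    n = p + h
    return g * (g * E) ** (n - 1) / (factorial(p) * factorial(h) * factorial(n - 1))

# E = 22.4 MeV from the abstract; g is an illustrative assumption
g, E = 3.5, 22.4
for p, h in [(1, 0), (2, 1), (3, 2)]:
    print(f"p={p}, h={h}: rho = {ericson_pld(p, h, E, g):.3e} MeV^-1")
```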
In this study, mobile phone traces are used to examine an ephemeral event that draws large densities of people. The research aims to study the city's pulse and the evolution of human mobility during a specific event (the Armada festival) by modelling and simulating human mobility in the observed region, based on CDR (Call Detail Records) data. The most pivotal questions of this research are: Why study human mobility? What are the human life patterns in the observed region inside Rouen city during the Armada festival? How can life patterns and individuals' mobility be extracted for this region from the mobile database (CDRs)? The radius-of-gyration parameter has been applied to elaborate human life patterns with regard to working and off days for …
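The radius of gyration mentioned above is the rms distance of a user's visited locations from their centre of mass. A minimal sketch, using hypothetical antenna positions already projected to planar kilometre coordinates:

```python
import numpy as np

def radius_of_gyration(points):
    """Radius of gyration of a user's visited locations:
    sqrt(mean squared distance from the centre of mass)."""
    pts = np.asarray(points, dtype=float)  # projected (x, y) in km, not raw lat/lon
    centre = pts.mean(axis=0)
    return np.sqrt(((pts - centre) ** 2).sum(axis=1).mean())

# Hypothetical positions (km) visited by one user during the festival
visits = [(0.0, 0.0), (1.2, 0.4), (0.8, 1.1), (5.0, 4.2)]
print(f"r_g = {radius_of_gyration(visits):.2f} km")
```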
Blockchain technology relies on cryptographic techniques that provide various advantages, such as trustworthiness, collaboration, organization, identification, integrity, and transparency. Meanwhile, data analytics refers to the process of analyzing big data and understanding the relationships between data points in order to draw meaningful conclusions. The field of data analytics in Blockchain is relatively new, and few studies have examined the challenges involved in Blockchain data analytics. This article presents a systematic analysis of how data analytics affects Blockchain performance, with the aim of investigating the current state of Blockchain-based data analytics techniques in research fields and …
In this study, we briefly review the ARIMA(p, d, q), EWMA, and DLM (dynamic linear modelling) procedures in order to accommodate the autocorrelation structure of the data. We consider recursive estimation and prediction algorithms based on Bayes and Kalman filtering (KF) techniques for correlated observations. We investigate the effect on the MSE of these procedures and compare them using generated data.
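As an illustration of the recursive estimation and prediction step, here is a minimal Kalman filter for the simplest DLM, the local-level model; the noise parameters and simulated series are invented, and the paper's actual model specification may differ.

```python
import numpy as np

def kalman_local_level(y, sigma_obs, sigma_state, m0=0.0, c0=1e6):
    """One-step Kalman recursions for the local-level DLM:
    state  mu_t = mu_{t-1} + w_t,   observation  y_t = mu_t + v_t."""
    m, c = m0, c0
    filtered, forecasts = [], []
    for yt in y:
        a, r = m, c + sigma_state**2   # predict step (prior mean and variance)
        forecasts.append(a)            # one-step-ahead forecast of y_t
        k = r / (r + sigma_obs**2)     # Kalman gain
        m = a + k * (yt - a)           # update step: posterior mean
        c = (1 - k) * r                # posterior variance
        filtered.append(m)
    return np.array(filtered), np.array(forecasts)

rng = np.random.default_rng(1)
level = np.cumsum(rng.normal(0, 0.3, 100))  # autocorrelated latent level
y = level + rng.normal(0, 1.0, 100)
m, f = kalman_local_level(y, sigma_obs=1.0, sigma_state=0.3)
print(f"one-step forecast MSE: {np.mean((y - f) ** 2):.3f}")
```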
In recent years, due to the economic benefits and technical advances of cloud computing, huge amounts of data have been outsourced to the cloud. To protect the privacy of their sensitive data, data owners have to encrypt their data prior to outsourcing it to the untrusted cloud servers. To facilitate searching over encrypted data, several approaches have been proposed. However, the majority of these approaches handle Boolean search but not ranked search, a widely accepted technique in current information retrieval (IR) systems for retrieving only the top-k relevant files. In this paper, we propose a distributed secure ranked search scheme over encrypted cloud servers. Such a scheme allows an authorized user to …
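To make the ranked-search idea concrete, here is a toy plaintext sketch of server-side top-k retrieval over a TF-IDF-scored inverted index. In an actual secure scheme the keywords and scores would be protected (for example with order-preserving encryption), which this sketch deliberately omits; it is not the paper's scheme.

```python
import heapq
import math
from collections import Counter

def build_index(docs):
    """Toy inverted index mapping keyword -> [(doc_id, TF-IDF score), ...]."""
    n = len(docs)
    df = Counter(w for text in docs.values() for w in set(text.split()))
    index = {}
    for doc_id, text in docs.items():
        tf = Counter(text.split())
        for w, f in tf.items():
            score = (1 + math.log(f)) * math.log(1 + n / df[w])
            index.setdefault(w, []).append((doc_id, score))
    return index

def top_k(index, keyword, k=2):
    """Server-side ranked retrieval: return only the k most relevant files."""
    return heapq.nlargest(k, index.get(keyword, []), key=lambda t: t[1])

docs = {"f1": "cloud data cloud", "f2": "secure cloud search", "f3": "ranked search"}
print(top_k(build_index(docs), "cloud", k=2))
```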
In the present work, different remote sensing techniques have been used to analyze remote sensing data spectrally using the ENVI software. The majority of the algorithms used in spectral processing can be organized into target detection, change detection, and classification. In this paper, several target detection methods have been studied, such as the matched filter and constrained energy minimization.
Water body maps have been obtained, and the results show changes in the study area over the period 1995-2000. The results obtained from applying constrained energy minimization were also more accurate than those of the other method when compared with the real situation.
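The constrained energy minimization detector has the standard closed form w = R⁻¹d / (dᵀR⁻¹d), where R is the sample correlation matrix of the pixel spectra and d is the target signature. A minimal sketch on a synthetic cube (not the study's imagery):

```python
import numpy as np

def cem_detector(cube, target):
    """Constrained energy minimization: w = R^-1 d / (d^T R^-1 d),
    applied pixel-wise to produce a detection map."""
    X = cube.reshape(-1, cube.shape[-1])  # (pixels, bands)
    R = X.T @ X / X.shape[0]              # sample correlation matrix
    Rinv_d = np.linalg.solve(R, target)   # R^-1 d without explicit inversion
    w = Rinv_d / (target @ Rinv_d)        # CEM filter vector
    return (X @ w).reshape(cube.shape[:-1])

rng = np.random.default_rng(2)
cube = rng.random((50, 50, 8))            # hypothetical 8-band image
d = rng.random(8)                         # hypothetical target spectrum
scores = cem_detector(cube, d)
print(scores.shape, scores.max())
```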
Data compression offers an attractive approach to reducing communication costs by using the available bandwidth effectively, which makes it worthwhile to research algorithms that use the network most efficiently. It is also important to consider security, since the data being transmitted is vulnerable to attacks. The basic aim of this work is to develop a module that combines compression and encryption of the same set of data, performing the two operations simultaneously. This is achieved by embedding encryption into compression algorithms, since cryptographic ciphers and entropy coders bear a certain resemblance in the sense of secrecy. First, in the secure compression module, the given text is p…
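The work embeds encryption inside the entropy coder itself; as a simpler stand-in for that combined goal, the sketch below chains zlib compression with Fernet (AES-based) encryption from the `cryptography` package. It illustrates only the sequential compress-then-encrypt pipeline, not the paper's integrated coder.

```python
import zlib
from cryptography.fernet import Fernet

def compress_then_encrypt(text, key):
    """Generic compress-then-encrypt pipeline; compressing first is essential,
    since ciphertext has no redundancy left for the compressor to exploit."""
    return Fernet(key).encrypt(zlib.compress(text.encode()))

def decrypt_then_decompress(token, key):
    return zlib.decompress(Fernet(key).decrypt(token)).decode()

key = Fernet.generate_key()
token = compress_then_encrypt("some repetitive text " * 20, key)
print(len(token), decrypt_then_decompress(token, key)[:20])
```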