The science of information security has become a concern of many researchers, who seek solutions and technologies that allow information to be transferred more securely over networks, especially the Internet, without that information being compromised, given the risk of sending digital data between two parties over an insecure channel. This paper presents two data-protection techniques. The first is encryption using the Menezes-Vanstone elliptic curve cryptosystem, which is based on public-key technology. The encrypted data are then embedded at random positions in the frame, determined by the seed used. The experimental results, with an average PSNR of 65 and an average MSE of 85, indicate that the proposed method can embed data efficiently.
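The PSNR and MSE quality measures cited above have standard definitions; a minimal sketch, assuming 8-bit grayscale pixels and flattened pixel sequences (the embedding scheme itself is not reproduced here):

```python
import math

def mse(original, stego):
    """Mean squared error between two equal-length pixel sequences."""
    n = len(original)
    return sum((a - b) ** 2 for a, b in zip(original, stego)) / n

def psnr(original, stego, max_pixel=255):
    """Peak signal-to-noise ratio in dB; higher means less visible distortion."""
    e = mse(original, stego)
    if e == 0:
        return float("inf")  # identical images
    return 10 * math.log10(max_pixel ** 2 / e)
```

For example, changing one of four pixels by one gray level gives an MSE of 0.25 and a PSNR of roughly 54 dB; a stego image identical to the cover gives infinite PSNR.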
In this article, the probability density function of the Rayleigh distribution is derived and fitted using the ordinary least squares estimator method and the ranked set estimator method. A confidence interval for the scale parameter of the Rayleigh distribution is then constructed. A new method is used for the fuzzy scale parameter. Finally, the survival and hazard functions are constructed for two ranking functions to determine which one is best.
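For reference, the Rayleigh density and its closed-form maximum likelihood scale estimate can be sketched as follows; note this is the textbook MLE, not the OLS or ranked-set estimators developed in the paper:

```python
import math

def rayleigh_pdf(x, sigma):
    """Rayleigh density f(x; sigma) = (x / sigma^2) * exp(-x^2 / (2 sigma^2)), x >= 0."""
    if x < 0:
        return 0.0
    return (x / sigma ** 2) * math.exp(-x ** 2 / (2 * sigma ** 2))

def rayleigh_mle_scale(sample):
    """Closed-form MLE of the scale parameter: sigma_hat = sqrt(sum(x_i^2) / (2n))."""
    return math.sqrt(sum(x * x for x in sample) / (2 * len(sample)))
```

The density peaks at x = sigma, where it evaluates to exp(-1/2) / sigma, which gives a quick sanity check of the implementation.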
The present paper is concerned with the minimax shrinkage estimator technique for estimating the shape parameter of the Burr X distribution, when prior information about the true shape parameter is available as an initial estimate and the scale parameter is known. Expressions for the bias ratio, mean squared error, and relative efficiency are derived. Numerical results and conclusions for these expressions are presented, and the proposed estimator is compared with the most recent related work.
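The general shape of such an estimator can be sketched as a linear shrinkage of the sample MLE toward the prior guess; the specific minimax weight derived in the paper is not reproduced here, and the weight k below is a free parameter for illustration:

```python
import math

def burr_x_mle_shape(sample):
    """MLE of the Burr X shape theta with unit (known) scale:
    theta_hat = -n / sum(ln(1 - exp(-x_i^2))), using CDF F(x) = (1 - exp(-x^2))^theta."""
    s = sum(math.log(1 - math.exp(-x * x)) for x in sample)
    return -len(sample) / s

def shrinkage_estimate(theta_hat, theta_prior, k):
    """Linear shrinkage toward the prior estimate: k*theta_hat + (1-k)*theta_prior, 0 <= k <= 1."""
    return k * theta_hat + (1 - k) * theta_prior
```

With k = 1 the estimator reduces to the MLE, and with k = 0 it returns the prior guess; the minimax analysis chooses k to bound the worst-case risk.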
In this research, the parameters of the Gumbel (Type 1, maximum values) distribution are estimated using two methods: the method of moments (MoM) and the modified method of moments (MM). Simulation was used to compare the two estimation methods and identify the better one: random data following the Gumbel distribution were generated for three models of true parameter values and different sample sizes, with R = 500 replicates per sample. The results were tabulated for comparison, which was based on the mean squared error (MSE).
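The classical method-of-moments estimates for the Gumbel (maxima) distribution follow from inverting its mean mu + gamma*beta and variance pi^2 * beta^2 / 6; a minimal sketch (the paper's modified-moments variant is not reproduced):

```python
import math

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def gumbel_mom(sample):
    """Method-of-moments estimates (mu, beta) for the Gumbel Type 1 (maxima) distribution.
    Inverts mean = mu + gamma*beta and var = pi^2 * beta^2 / 6."""
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)  # sample variance
    beta = math.sqrt(6 * var) / math.pi
    mu = mean - EULER_GAMMA * beta
    return mu, beta
```

In a simulation study like the one described, these estimates would be computed for each of the R = 500 replicates and their MSE against the true parameter values averaged.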
The gravity anomalies of the Jurassic and deeper structures were obtained by stripping the gravity effect of the Cretaceous and Tertiary formations from the available Bouguer gravity map of central and southern Iraq. The gravity effect of the stripped layers was determined from the density log, or from the density obtained from the sonic log. The density-velocity relation of Gardner et al. (1974) was used to obtain density from sonic logs where no density log was available. The average densities of the Cretaceous and Tertiary formations were determined, and the density contrasts of these formations were then obtained. The density contrast and thickness of all stratigraphic formations in the area between the sea level to t
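The Gardner et al. (1974) relation referenced above is the standard empirical power law rho = 0.23 * V^0.25 (V in ft/s, rho in g/cc; the equivalent metric form uses a coefficient of about 0.31 with V in m/s). A minimal sketch:

```python
def gardner_density(velocity_fts):
    """Gardner et al. (1974) density-velocity relation:
    rho = 0.23 * V^0.25, with V in ft/s and rho in g/cc."""
    return 0.23 * velocity_fts ** 0.25

def density_contrast(layer_density, reference_density):
    """Density contrast (g/cc) used when stripping a layer's gravity effect."""
    return layer_density - reference_density
```

For example, a sonic velocity of 10,000 ft/s maps to a density of 2.3 g/cc, a typical sedimentary value.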
Groundwater is considered one of the most important sources of fresh water, on which many regions around the world depend, especially arid and semi-arid regions. Protecting and maintaining groundwater is a difficult process, but it is very important for preserving this vital water source. The current study aims to assess the susceptibility of groundwater to pollution using the DRASTIC model along with GIS environments and their toolboxes. A vulnerability map was created from data collected from 55 wells surveyed by the researchers, as well as archived records from governmental institutions and some international organizations. The results indicate that the region falls into three vulnerability zones, namely
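The DRASTIC index is a weighted sum of ratings for seven hydrogeological parameters (Depth to water, net Recharge, Aquifer media, Soil media, Topography, Impact of vadose zone, hydraulic Conductivity); a minimal sketch using the conventional DRASTIC weights, with ratings as assumed inputs per mapped cell:

```python
# Conventional DRASTIC parameter weights (Aller et al., 1987)
DRASTIC_WEIGHTS = {"D": 5, "R": 4, "A": 3, "S": 2, "T": 1, "I": 5, "C": 3}

def drastic_index(ratings):
    """DRASTIC vulnerability index: sum of weight * rating over the seven parameters.
    `ratings` maps each parameter letter to its site rating (typically 1-10)."""
    return sum(DRASTIC_WEIGHTS[p] * r for p, r in ratings.items())
```

In a GIS workflow, this index is computed cell by cell from the seven rated raster layers and then classified into vulnerability zones.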
The paired-sample t-test for the difference between two means in paired data is not robust against violation of the normality assumption. In this paper, alternative robust tests are suggested using the bootstrap method, as well as the bootstrap method combined with the W.M test. Monte Carlo simulation experiments were employed to study the performance of the test statistics of each of these three tests in terms of type I error rates and power. The three tests were applied to different sample sizes generated from three distributions: the bivariate normal, bivariate contaminated normal, and bivariate exponential distributions.
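A standard bootstrap test for paired data resamples the centered differences and compares the resampled means to the observed mean difference; a minimal sketch of this idea (the paper's specific test statistics and the W.M combination are not reproduced):

```python
import random

def bootstrap_paired_test(x, y, n_boot=5000, seed=0):
    """Bootstrap test of H0: mean difference = 0 for paired samples.
    Resamples the centered differences and returns a two-sided p-value."""
    rng = random.Random(seed)
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    obs = sum(d) / n
    centered = [v - obs for v in d]  # impose H0 by centering
    hits = 0
    for _ in range(n_boot):
        boot = [rng.choice(centered) for _ in range(n)]
        if abs(sum(boot) / n) >= abs(obs):
            hits += 1
    return hits / n_boot
```

Because no normality assumption enters anywhere, the test remains valid under contaminated or skewed difference distributions, which is the motivation for the robust alternatives studied.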
This paper proposes two hybrid feature subset selection approaches based on combining (by union or intersection) supervised and unsupervised filter approaches before applying a wrapper, aiming to obtain low-dimensional features with high accuracy, good interpretability, and low time consumption. Experiments with the proposed hybrid approaches were conducted on seven high-dimensional feature datasets. The classifiers adopted are support vector machine (SVM), linear discriminant analysis (LDA), and K-nearest neighbour (KNN). Experimental results demonstrate the advantages and usefulness of the proposed methods for feature subset selection in high-dimensional spaces in terms of the number of selected features and time spent.
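The combination step can be sketched as a set operation on the feature indices returned by the two filters; which filters and wrapper are used is left abstract here, as the paper's specific choices are not reproduced:

```python
def combine_selections(supervised_idx, unsupervised_idx, mode="union"):
    """Combine two filter-stage selections (feature index lists) before the wrapper stage.
    Union keeps features flagged by either filter; intersection keeps only features
    flagged by both, giving a smaller candidate set for the wrapper search."""
    a, b = set(supervised_idx), set(unsupervised_idx)
    return sorted(a | b) if mode == "union" else sorted(a & b)
```

The intersection variant trades recall for a cheaper wrapper search, while the union variant is safer when the two filters capture complementary relevance criteria.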