A database is an organized, distributed collection of data that allows users to access the stored information in a simple and convenient way. However, in the era of big data, traditional data-analytics methods may be unable to manage and process such large amounts of data. To develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique to handle big data distributed on the cloud. The approach was evaluated on a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG researchers and specialists with an easy and fast method of handling EEG big data.
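The abstract does not include code, but the Map-Reduce pattern it relies on can be sketched briefly. The following Python sketch simulates the map, shuffle, and reduce phases over EEG-like records; the record layout, channel names, and the per-channel mean aggregate are illustrative assumptions rather than the pipeline actually run on the Hadoop server.

```python
# Minimal sketch of the Map-Reduce pattern for EEG-style records.
# Record layout (channel, amplitude) is an illustrative assumption,
# not the pipeline used in the study.
from collections import defaultdict

def map_phase(record):
    """Emit (channel, amplitude) key-value pairs from one raw record."""
    channel, amplitude = record
    yield channel, amplitude

def reduce_phase(channel, amplitudes):
    """Aggregate all amplitudes for one channel into a mean."""
    return channel, sum(amplitudes) / len(amplitudes)

def run_mapreduce(records):
    # Shuffle step: group mapped values by key, as Hadoop would.
    groups = defaultdict(list)
    for record in records:
        for key, value in map_phase(record):
            groups[key].append(value)
    return dict(reduce_phase(k, v) for k, v in groups.items())

if __name__ == "__main__":
    eeg = [("Fp1", 12.0), ("Fp2", 9.5), ("Fp1", 14.0)]
    print(run_mapreduce(eeg))  # {'Fp1': 13.0, 'Fp2': 9.5}
```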
The negotiation phase is gaining great importance in investment loans because of the magnitude of the amounts awarded, the need to obtain adequate lending guarantees, and the development of better conditions. This requires a negotiating policy with fixed principles that balances meeting customers' demands with maintaining market share, achieving profits, and avoiding defaults and losses for the bank. The study therefore addresses the concept of investment loans, the procedures for granting them, and the concept of the negotiation process for granting loans. For the practical side, an examination form was prepared, consisting of questions consistent with the methodology developed by the researchers, and directed to the officials…
Objective: The study aims to test the effect of using an appropriate quantitative demand-forecasting method to improve the performance of the supply chain for the aviation-fuel product (the study sample), one of the products of the Doura refinery (the study site), by testing a set of quantitative demand-forecasting methods using forecasting-error measurements, choosing the least faulty, most accurate, and most reliable method, and adopting it in building the chain. The study problem is approached by starting with the following…
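The method selection described above hinges on forecasting-error measurements. As a hedged sketch, three standard error measures (MAE, RMSE, and MAPE) are computed below for two hypothetical forecasting methods; the abstract does not name the measures or methods actually used, so all names and numbers here are illustrative.

```python
import math

def mae(actual, forecast):
    """Mean absolute error."""
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)

def rmse(actual, forecast):
    """Root mean squared error."""
    return math.sqrt(sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual))

def mape(actual, forecast):
    """Mean absolute percentage error (actual values must be non-zero)."""
    return 100 * sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / len(actual)

# Invented monthly demand vs. two hypothetical candidate methods.
demand   = [120, 135, 128, 150]
method_a = [118, 130, 131, 149]   # e.g. a moving average
method_b = [110, 140, 120, 160]   # e.g. a naive forecast

for name, f in [("A", method_a), ("B", method_b)]:
    print(name, mae(demand, f), round(rmse(demand, f), 2), round(mape(demand, f), 2))
```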
Nitrogen (N) is a key growth- and yield-limiting factor in cultivated rice areas. This study was conducted to evaluate the effects of different N application regimes on rice yield and yield components (Shiroudi cultivar) in Babol (Mazandaran, Iran) during the 2015-2016 season. A factorial experiment was executed in a Randomized Complete Block Design (RCBD) with three replications. The first factor comprised four N amounts (50, 90, 130, and 170 kg N ha-1), while the second factor consisted of four fertilizer-splitting methods: T1: 70% at the basal stage + 30% at the maximum tillering stage; T2: 1/3 at the basal stage + 1/3 at the maximum tillering stage…
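A 4 x 4 factorial in an RCBD of this kind is typically analysed with a two-way ANOVA that includes a block term. The sketch below shows one such analysis in Python with statsmodels; the synthetic yield values and the model formula are assumptions for illustration, not the study's data or analysis code.

```python
# Hedged sketch: two-way ANOVA for a 4x4 factorial in an RCBD.
# Yield numbers are synthetic, not the study's data.
import itertools, random
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

random.seed(1)
rows = []
for n, split, block in itertools.product([50, 90, 130, 170],
                                         ["T1", "T2", "T3", "T4"],
                                         [1, 2, 3]):
    rows.append({"N": n, "split": split, "block": block,
                 "yield_kg": 4000 + 8 * n + random.gauss(0, 150)})
df = pd.DataFrame(rows)

# Block enters as a fixed effect, as is usual for an RCBD analysis.
model = ols("yield_kg ~ C(N) * C(split) + C(block)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```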
Implementation of the TSFS (Transposition, Substitution, Folding, and Shifting) algorithm as an encryption algorithm for database security had limitations in its character set and in the number of keys used. The proposed cryptosystem is based on enhancing the phases of the TSFS encryption algorithm by computing the determinant of the key matrices, which affects how the algorithm's phases are carried out. These changes gave the database high security against different types of attacks by achieving both confusion and diffusion.
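The determinant-based enhancement resembles the invertibility check used in matrix ciphers, where a key matrix is usable only if its determinant is coprime with the working modulus. A minimal sketch of that check follows; the 3 x 3 key and the modulus of 256 are assumptions for illustration, since the abstract does not give the exact key format.

```python
# Hedged sketch: validating a key matrix by its determinant, as in
# matrix-based ciphers. The 3x3 key and modulus 256 are illustrative.
import math
import numpy as np

def key_is_valid(key, modulus=256):
    """A key matrix is invertible mod `modulus` iff gcd(det, modulus) == 1."""
    det = int(round(np.linalg.det(key)))
    return math.gcd(det % modulus, modulus) == 1

key = np.array([[2, 4, 5],
                [9, 2, 1],
                [3, 17, 7]])
print(key_is_valid(key))  # True: det = 489, which is coprime with 256
```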
Spatial data observed on a group of areal units are common in scientific applications. The usual hierarchical approach to modeling this kind of dataset is to introduce a spatial random effect with an autoregressive prior. However, the usual Markov chain Monte Carlo scheme for this hierarchical framework requires the spatial effects to be sampled from their full conditional posteriors one by one, resulting in poor mixing. More importantly, it makes the model computationally inefficient for datasets with a large number of units. In this article, we propose a Bayesian approach that uses the spectral structure of the adjacency matrix to construct a low-rank expansion for modeling spatial dependence. We propose a pair of computationally efficient estimati…
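A minimal sketch of the core idea, building a low-rank spatial basis from the leading eigenvectors of the adjacency matrix, is given below; the lattice geometry, the rank q, and the use of a dense eigendecomposition are illustrative assumptions, as the abstract is cut off before the estimation details.

```python
# Hedged sketch: low-rank spatial basis from the adjacency spectrum.
# The 10x10 lattice and rank q=15 are illustrative assumptions.
import numpy as np

def lattice_adjacency(nrow, ncol):
    """Binary adjacency matrix of an nrow x ncol rook-neighbour lattice."""
    n = nrow * ncol
    A = np.zeros((n, n))
    for r in range(nrow):
        for c in range(ncol):
            i = r * ncol + c
            if c + 1 < ncol:
                A[i, i + 1] = A[i + 1, i] = 1
            if r + 1 < nrow:
                A[i, i + ncol] = A[i + ncol, i] = 1
    return A

A = lattice_adjacency(10, 10)
eigvals, eigvecs = np.linalg.eigh(A)   # symmetric, so eigh
order = np.argsort(eigvals)[::-1]      # largest eigenvalues first
q = 15
M = eigvecs[:, order[:q]]              # n x q low-rank spatial basis

# The spatial random effect is modelled as M @ delta with delta of length q,
# so MCMC updates q coefficients jointly instead of n effects one by one.
delta = np.random.default_rng(0).normal(size=q)
phi = M @ delta                        # low-rank spatial effect, length 100
print(A.shape, M.shape, phi.shape)
```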
Amplitude variation with offset (AVO) analysis is an efficient tool for hydrocarbon detection and for identifying elastic rock properties and fluid types. It was applied in the present study using reprocessed pre-stack 2D seismic data (1992, Caulerpa) from the north-west of the Bonaparte Basin, Australia. The AVO response along the 2D pre-stack seismic data in the Laminaria High, NW shelf of Australia, was also investigated. Three hypotheses were proposed to investigate the AVO behaviour of the amplitude anomalies, in which three different factors (fluid substitution, porosity, and thickness (wedge model)) were tested. The AVO models with the synthetic gathers were analysed using log information to find which of these…
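AVO modelling of this kind is commonly based on the two-term Shuey approximation, R(theta) ≈ A + B sin²(theta), which gives the intercept and gradient used to classify anomalies. The sketch below computes A and B for a single interface; the rock properties are invented numbers, not values from the Laminaria High, and the use of the Shuey form is an assumption since the abstract does not name the approximation applied.

```python
# Hedged sketch: two-term Shuey approximation for AVO intercept (A)
# and gradient (B). The interface velocities/densities are invented.
import numpy as np

def shuey_two_term(vp1, vs1, rho1, vp2, vs2, rho2):
    """Return intercept A and gradient B for one interface, Shuey (1985)."""
    vp, vs, rho = (vp1 + vp2) / 2, (vs1 + vs2) / 2, (rho1 + rho2) / 2
    dvp, dvs, drho = vp2 - vp1, vs2 - vs1, rho2 - rho1
    A = 0.5 * (dvp / vp + drho / rho)
    B = 0.5 * dvp / vp - 2 * (vs / vp) ** 2 * (drho / rho + 2 * dvs / vs)
    return A, B

# Invented shale-over-gas-sand interface.
A, B = shuey_two_term(2800, 1400, 2.35, 2600, 1600, 2.10)
theta = np.radians(np.arange(0, 31, 5))
reflectivity = A + B * np.sin(theta) ** 2
print(f"A={A:.3f}, B={B:.3f}", np.round(reflectivity, 3))
```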
The parameters of the two-parameter Gamma distribution in the case of missing data were estimated using two important methods: the maximum likelihood method and the shrinkage method. The former comprises three methods of solving the non-linear MLE equation to obtain the maximum likelihood estimators: the Newton-Raphson, Thom, and Sinha methods. The Thom and Sinha methods were developed by the researcher to be suitable for the case of missing data. Furthermore, the Bowman, Shenton, and Lam method, which depends on the three-parameter Gamma distribution to obtain the maximum likelihood estimators, was also developed. A comparison was made between the methods experimentally to find the best meth…
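The Newton-Raphson route to the gamma MLE mentioned above solves the non-linear likelihood equation log(a) - digamma(a) = log(mean(x)) - mean(log(x)) for the shape a, after which the scale follows as mean(x)/a. A sketch of the complete-data case is shown below; the missing-data adaptations of Thom and Sinha developed in the paper are not reproduced here.

```python
# Hedged sketch: Newton-Raphson solution of the gamma MLE equation
# log(a) - digamma(a) = s, with s = log(mean(x)) - mean(log(x)).
# Complete-data textbook case, not the authors' missing-data variants.
import numpy as np
from scipy.special import digamma, polygamma

def gamma_mle(x, tol=1e-10, max_iter=50):
    x = np.asarray(x, dtype=float)
    s = np.log(x.mean()) - np.log(x).mean()
    a = (3 - s + np.sqrt((s - 3) ** 2 + 24 * s)) / (12 * s)  # standard start
    for _ in range(max_iter):
        f = np.log(a) - digamma(a) - s
        fprime = 1 / a - polygamma(1, a)   # derivative: 1/a - trigamma(a)
        step = f / fprime
        a -= step
        if abs(step) < tol:
            break
    return a, x.mean() / a                 # shape, scale

rng = np.random.default_rng(0)
sample = rng.gamma(shape=2.5, scale=1.8, size=5000)
print(gamma_mle(sample))   # should be near (2.5, 1.8)
```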