A database is an organized and distributed collection of data that allows users to access the stored information in a simple and convenient way. However, in the era of big data, traditional data analytics methods may not be able to manage and process such large volumes of data. To develop an efficient way of handling big data, this work studies the use of the MapReduce technique to handle big data distributed on the cloud. The approach was evaluated on a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG researchers and specialists with an easy and fast method of handling EEG big data.
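As a rough illustration of the MapReduce pattern referred to above (not the authors' actual Hadoop implementation), the sketch below simulates a map and a reduce phase locally in Python; the record layout, one "channel<TAB>amplitude" sample per line, is a hypothetical stand-in for the EEG data.

```python
# Minimal Hadoop-Streaming-style MapReduce sketch (hypothetical record layout:
# one EEG sample per line as "channel_id<TAB>amplitude"). Illustrative only.
import sys
from collections import defaultdict

def mapper(lines):
    """Emit (channel, amplitude) pairs from raw EEG lines."""
    for line in lines:
        channel, value = line.rstrip("\n").split("\t")
        yield channel, float(value)

def reducer(pairs):
    """Aggregate amplitudes per channel and emit the mean."""
    sums, counts = defaultdict(float), defaultdict(int)
    for channel, value in pairs:
        sums[channel] += value
        counts[channel] += 1
    for channel in sums:
        yield channel, sums[channel] / counts[channel]

if __name__ == "__main__":
    # Local simulation of the map and reduce phases on stdin; in a real
    # Hadoop Streaming job the two functions would be separate scripts.
    for channel, mean in reducer(mapper(sys.stdin)):
        print(f"{channel}\t{mean:.4f}")
```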
Cointegration is one of the important concepts in macroeconomic applications. The idea of cointegration is due to Granger (1981), and it was explained in detail by Engle and Granger in Econometrica (1987). The introduction of cointegration analysis into econometrics in the mid-1980s is one of the most important developments in empirical modeling; its advantage is that it is simple to compute and apply, requiring only familiarity with ordinary least squares.
Cointegration describes long-run equilibrium relations among time series, even if each of the series contains a t…
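As a minimal sketch of the two-step Engle-Granger idea summarized above, using only ordinary least squares and a residual unit-root test on simulated data (not the paper's data set):

```python
# Engle-Granger two-step cointegration check on simulated series.
import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(0)
n = 500
x = np.cumsum(rng.normal(size=n))          # I(1) random walk
y = 2.0 + 0.5 * x + rng.normal(size=n)     # cointegrated with x by construction

# Step 1: estimate the long-run relation y_t = a + b*x_t + u_t by OLS.
ols = sm.OLS(y, sm.add_constant(x)).fit()
residuals = ols.resid

# Step 2: test the residuals for stationarity (ADF test); stationary residuals
# indicate that y and x are cointegrated.
adf_stat, p_value, *_ = adfuller(residuals)
print(f"ADF statistic on residuals: {adf_stat:.3f}, p-value: {p_value:.3f}")
```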
An oil spill is a leakage from pipelines, vessels, oil rigs, or tankers that releases petroleum products into the marine environment or onto land, occurring naturally or through human action and resulting in severe damage and financial loss. Satellite imagery is one of the powerful tools currently used for capturing and extracting vital information from the Earth's surface, but the complexity and the vast amount of data make it challenging and time-consuming for humans to process. With the advancement of deep learning techniques, these processes can now be automated to extract vital information from real-time satellite images. This paper applied three deep-learning algorithms to satellite image classification…
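The paper's three algorithms are not reproduced here; as a generic, hypothetical sketch of deep-learning-based satellite image classification, a small Keras CNN might look like the following (the tile size and class set are assumptions, not the paper's configuration):

```python
# Generic CNN image-classifier sketch for satellite tiles (illustrative only).
import tensorflow as tf
from tensorflow.keras import layers, models

def build_classifier(num_classes: int = 3) -> tf.keras.Model:
    """Small CNN for 128x128 RGB tiles with num_classes output classes."""
    model = models.Sequential([
        layers.Input(shape=(128, 128, 3)),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Hypothetical classes such as {"oil_spill", "look_alike", "clean_sea"}.
model = build_classifier(num_classes=3)
model.summary()
```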
Endoscopy is a rapidly growing field of neurosurgery. It is defined as the application of the endoscope to treat different conditions of brain pathology within the cerebral ventricular system and beyond. Endoscopic procedures are performed using different equipment and recording systems to provide better visualization, enhancing the surgeon's view by increasing illumination and magnification to look around corners, and to capture images in video or digital format for later study.
A Photonic Crystal Fiber (PCF) sensor based on the Surface Plasmon Resonance (SPR) effect has been proposed to detect polluted water samples. The sensing characteristics are illustrated using the finite element method. The right air hole on the right side of the PCF core has been coated with chemically stable gold to achieve a practical sensing approach. The performance of the proposed sensor is investigated in terms of wavelength sensitivity, amplitude sensitivity, sensor resolution, and linearity of the resonant wavelength with the variation of the refractive index of the analyte. In the sensing range of 1.33 to 1.3624, maximum sensitivities of 1360.2 nm/RIU and 184 RIU⁻¹ are achieved with high sensor resolutions of 7…
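The performance metrics named above have standard definitions; the snippet below encodes the usual formulas for wavelength sensitivity and sensor resolution with placeholder numbers (the assumed 0.1 nm interrogator resolution and the example resonance shift are not the paper's values):

```python
# Standard SPR sensor performance metrics (placeholder inputs, not paper data).
def wavelength_sensitivity(delta_lambda_peak_nm: float, delta_n: float) -> float:
    """S_lambda = delta(lambda_peak) / delta(n_analyte), in nm/RIU."""
    return delta_lambda_peak_nm / delta_n

def sensor_resolution(delta_n: float, delta_lambda_peak_nm: float,
                      delta_lambda_min_nm: float = 0.1) -> float:
    """R = delta_n * delta_lambda_min / delta_lambda_peak, in RIU
    (0.1 nm is a commonly assumed spectral resolution of the interrogator)."""
    return delta_n * delta_lambda_min_nm / delta_lambda_peak_nm

# Example: a 44 nm resonance shift for a 0.01 RIU change in analyte index.
print(wavelength_sensitivity(44.0, 0.01))   # 4400 nm/RIU
print(sensor_resolution(0.01, 44.0))        # ~2.3e-5 RIU
```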
Recurrent strokes can be devastating, often resulting in severe disability or death. However, nearly 90% of the causes of recurrent stroke are modifiable, which means recurrent strokes can be averted by controlling risk factors that are mainly behavioral and metabolic in nature. Previous works therefore suggest that a recurrent stroke prediction model could help minimize the possibility of a recurrent stroke. Previous works have shown promising results in predicting first-time stroke cases with machine learning approaches, but there are limited works on recurrent stroke prediction using machine learning methods. Hence, this work proposes to perform an empirical analysis and to investigate machine learning al…
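As a hypothetical sketch only, a machine-learning pipeline for tabular risk-factor data of the kind described above could be set up as follows (the features, synthetic labels, and the random-forest choice are all assumptions, not the paper's design):

```python
# Generic tabular risk-factor classifier sketch on synthetic data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(42)
# Synthetic stand-ins for behavioral/metabolic risk factors and a binary label.
X = np.column_stack([
    rng.normal(65, 10, 1000),    # age
    rng.normal(27, 4, 1000),     # BMI
    rng.normal(140, 20, 1000),   # systolic blood pressure
    rng.integers(0, 2, 1000),    # smoker flag
])
y = (X[:, 2] + 5 * X[:, 3] + rng.normal(0, 10, 1000) > 150).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```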
The development that solar energy will undergo in the coming years needs a reliable estimation of available solar energy resources. Several empirical models have been developed to calculate global solar radiation using various parameters such as extraterrestrial radiation, sunshine hours, albedo, maximum temperature, mean temperature, soil temperature, relative humidity, cloudiness, evaporation, total precipitable water, number of rainy days, altitude, and latitude. In the present work, the first part calculates solar radiation from the daily values of sunshine duration using the Angstrom model over Iraq for July 2017. The second part maps the distribution of so…
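The Angstrom (Angstrom-Prescott) model mentioned above relates global solar radiation to sunshine duration as H = H0 (a + b n/N); a small sketch with the commonly cited default coefficients, not the values calibrated in this work:

```python
# Angstrom-Prescott relation for estimating global solar radiation from
# sunshine duration; a = 0.25 and b = 0.50 are common defaults, not the
# coefficients fitted in the paper.
def angstrom_global_radiation(H0: float, n: float, N: float,
                              a: float = 0.25, b: float = 0.50) -> float:
    """H = H0 * (a + b * n / N)
    H0 : extraterrestrial radiation (MJ m^-2 day^-1)
    n  : measured sunshine hours, N : maximum possible sunshine hours."""
    return H0 * (a + b * n / N)

# Example: H0 = 40 MJ/m^2/day, 11 h of sunshine out of a possible 13.5 h.
print(angstrom_global_radiation(40.0, 11.0, 13.5))   # ~26.3 MJ/m^2/day
```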
In this paper, a procedure for establishing the different performance measures in terms of crisp values is proposed for two classes of arrivals and multiple-channel queueing models, where both the arrival and service rates are fuzzy numbers. The main idea is to convert the fuzzy arrival and service rates into crisp values using the graded mean integration approach, which can be represented as a median-rule number, and then use the crisp values to establish the performance measures of conventional multiple-channel queueing models. This procedure has shown its effectiveness when incorporated with many types of membership functions in solving queueing problems. Two numerical illustrations are presented to determine the validity of the…
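A minimal sketch of the defuzzify-then-solve idea: the graded mean integration representation reduces a triangular fuzzy rate to a crisp value, which is then fed into standard M/M/c formulas (the fuzzy numbers and channel count below are illustrative, not the paper's examples):

```python
# Graded mean integration of triangular fuzzy rates, then crisp M/M/c measures.
from math import factorial

def graded_mean(a1: float, a2: float, a3: float) -> float:
    """Graded mean integration representation of a triangular fuzzy number."""
    return (a1 + 4.0 * a2 + a3) / 6.0

def mmc_measures(lam: float, mu: float, c: int):
    """Return (Lq, Wq) for an M/M/c queue with crisp arrival/service rates."""
    rho = lam / (c * mu)                     # server utilization
    a = lam / mu                             # offered load
    p0 = 1.0 / (sum(a**k / factorial(k) for k in range(c))
                + a**c / (factorial(c) * (1 - rho)))
    lq = p0 * a**c * rho / (factorial(c) * (1 - rho) ** 2)
    return lq, lq / lam

lam = graded_mean(3.0, 4.0, 5.0)   # fuzzy arrival rate -> crisp
mu = graded_mean(5.0, 6.0, 7.0)    # fuzzy service rate -> crisp
print(mmc_measures(lam, mu, c=2))
```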
The textile industries play a prominent role in reviving the national economy, but they currently suffer from several problems, including the high costs of their activities and the low quality of their production processes. Accordingly, the hexagonal diffraction approach was introduced to help analyze production activities and determine which of them are the most expensive and which provide no benefit, or whose cost exceeds their benefit, as a result of the waste and losses that accompany their implementation. Applied to the Iraqi mechanical carpet factory, the research reached several conclusions, the most important of which is the presence of several sources of waste and loss, such as activities and operations that do not add value, whi…
In this study, we investigate the behavior of the estimated spectral density function of a stationary time series in the presence of missing values, where the series is generated by a second-order autoregressive (AR(2)) model and the error term of the AR(2) model follows various continuous distributions. The classical and Lomb periodograms are used to study the behavior of the estimated spectral density function through simulation.
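A simulation sketch of the setting described above, assuming SciPy's Lomb-Scargle implementation and illustrative AR(2) parameters (not the study's actual design):

```python
# AR(2) series with randomly missing values, analyzed with the Lomb periodogram.
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(1)
n = 512
phi1, phi2 = 0.5, -0.3                       # AR(2) coefficients (stationary)
eps = rng.normal(size=n)                     # Gaussian errors; other continuous
x = np.zeros(n)                              # distributions can be substituted
for t in range(2, n):
    x[t] = phi1 * x[t - 1] + phi2 * x[t - 2] + eps[t]

# Drop 20% of the observations at random to mimic missing values.
keep = np.sort(rng.choice(n, size=int(0.8 * n), replace=False))
times = keep.astype(float)
values = x[keep] - x[keep].mean()

# Lomb periodogram evaluated on a grid of angular frequencies in (0, pi].
freqs = np.linspace(0.01, np.pi, 256)
power = lombscargle(times, values, freqs)
print(power[:5])
```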