A database is an organized collection of data, structured and distributed so that users can access the stored information simply and conveniently. However, in the era of big data, traditional data analytics methods may be unable to manage and process such large volumes of data. To develop an efficient way of handling big data, this work studies the use of the MapReduce technique on big data distributed in the cloud. The approach was evaluated on a Hadoop server and applied to EEG big data as a case study. It showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG researchers and specialists with an easy and fast method of handling EEG big data.
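The MapReduce pattern described above can be sketched in plain Python. The EEG-specific keys and the `eeg_records` input below are hypothetical, chosen only to illustrate the map and reduce phases that a Hadoop cluster would distribute over many nodes:

```python
from collections import defaultdict

# Hypothetical EEG records: (channel, amplitude) pairs standing in for
# the distributed input splits a Hadoop job would read from HDFS.
eeg_records = [("C3", 1.2), ("C4", 0.8), ("C3", 0.6), ("Cz", 1.0), ("C4", 1.4)]

def map_phase(records):
    # Map: emit (key, value) pairs, here one pair per EEG sample.
    for channel, amplitude in records:
        yield channel, amplitude

def reduce_phase(pairs):
    # Shuffle: group values by key, as the framework does between phases.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    # Reduce: aggregate each group, here the mean amplitude per channel.
    return {k: sum(v) / len(v) for k, v in grouped.items()}

result = reduce_phase(map_phase(eeg_records))
```

On Hadoop, the map and reduce functions run in parallel across the cluster, which is the source of the response-time reduction the study reports.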
Activity recognition (AR) is a new, interesting, and challenging research area with many applications (e.g., healthcare, security, and event detection). Essentially, activity recognition, i.e., identifying a user's physical activity, is best treated as a classification problem. In this paper, a combination of seven classification methods is employed, experimented on accelerometer data collected via smartphones, and compared for best performance. The dataset was collected from 59 individuals who performed six different activities (walking, jogging, sitting, standing, and going up and down stairs). The total number of dataset instances is 5418, with 46 labeled features. The results show that the proposed method of ensemble boost-based classif
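A minimal sketch of the majority-vote idea behind combining several classifiers, using plain Python rather than the paper's seven methods; the `per_instance` predictions are hypothetical toy data:

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-classifier predictions for one instance by majority vote."""
    return Counter(predictions).most_common(1)[0][0]

# Hypothetical outputs of three base classifiers for four accelerometer windows.
per_instance = [
    ["walk", "walk", "jog"],
    ["sit", "stand", "sit"],
    ["upstairs", "upstairs", "upstairs"],
    ["jog", "walk", "jog"],
]
ensemble = [majority_vote(p) for p in per_instance]
```

Boosting-based ensembles go further by weighting the base classifiers, but the combination step follows the same voting principle.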
This research deals with a shrinkage method for principal components similar to the one used in multiple regression, the Least Absolute Shrinkage and Selection Operator (LASSO). The goal is to form uncorrelated linear combinations from only a subset of the explanatory variables, which may suffer from a multicollinearity problem, instead of taking the whole number, say (K), of them. The shrinkage forces some coefficients to equal zero after restricting them with a tuning parameter, say (t), which balances the amounts of bias and variance on one side and does not exceed the acceptable percentage of explained variance of these components. This is shown by the MSE criterion in the regression case and the percent explained
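A minimal sketch of the soft-thresholding operation that underlies LASSO-type shrinkage; the coefficient values are hypothetical, and the paper's component-selection details are not reproduced here:

```python
def soft_threshold(coefficients, t):
    """LASSO-style shrinkage: pull each coefficient toward zero by t,
    forcing coefficients smaller than t in magnitude to exactly zero."""
    return [max(abs(c) - t, 0.0) * (1 if c > 0 else -1) for c in coefficients]

# Coefficients below the tuning parameter t = 0.5 are zeroed out.
shrunk = soft_threshold([2.5, -0.3, 0.8, -1.9], 0.5)
```

Increasing t zeroes more coefficients (more selection, more bias); decreasing it keeps more of them (less bias, more variance), which is the trade-off the abstract describes.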
There are many methods of searching a large amount of data to find one particular piece of information, such as finding a person's name in a mobile phone's contacts. Certain ways of organizing data make the search process more efficient; the objective of these methods is to find the element at the least cost (least time). The binary search algorithm is faster than sequential search and other commonly used search algorithms. This research extends the binary search algorithm using a new structure called Triple, in which data are represented as triples consisting of three locations (1-Top, 2-Left, and 3-Right). The binary search algorithm divides the search interval in half; this process makes the maximum number of comparisons (Average case com
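For reference, a standard array-based binary search (not the paper's Triple structure, which is only outlined above) halves the interval on each step, so a sorted array of n elements needs at most about log2(n) comparisons:

```python
def binary_search(sorted_items, target):
    """Return (index, comparisons) for target, or (-1, comparisons) if absent."""
    low, high = 0, len(sorted_items) - 1
    comparisons = 0
    while low <= high:
        mid = (low + high) // 2  # halve the search interval
        comparisons += 1
        if sorted_items[mid] == target:
            return mid, comparisons
        if sorted_items[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1, comparisons

# 1024 sorted elements: at most ~11 halving steps are ever needed.
idx, comps = binary_search(list(range(1, 1025)), 1000)
```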
The Schiff base ligand [(E)-3-(2-hydroxy-5-methylbenzylideneamino)-1-phenyl-1H-pyrazol-5(4H)-one] and its complexes with the metal ions Mn(II), Co(II), Ni(II), Cu(II), Cd(II), and Hg(II) were prepared and characterized on the basis of the mass spectrum of L, elemental analyses, FTIR, electronic spectra, magnetic susceptibility, molar conductivity measurements, and thermodynamic functions (∆H°, ∆S°, and ∆G°). The conductivity results indicated that all complexes are non-electrolytes. Spectroscopic and other analytical studies reveal a distorted octahedral geometry for all complexes. The antibacterial activity of the ligand and the prepared metal complexes was also studied against gram-positive and gram-negative bacteria.
Information pollution is regarded as a major problem facing journalists working in the editing section, as journalistic materials are exposed to such pollution on their way through the editing pyramid. This research is an attempt to define the concept of journalistic information pollution and to identify the causes and sources of this pollution. The research applied the descriptive method to achieve its objectives, and a questionnaire was used to collect data. The findings indicate that journalists are aware of the existence of information pollution in journalism and that this pollution has identifiable causes and sources.
Breast cancer is the second-deadliest disease affecting women worldwide. For this reason, early detection is one of the most essential steps to overcome it, depending on automatic devices based on artificial intelligence. Medical applications of machine learning algorithms are mostly based on their ability to handle classification problems, including classifying illnesses or estimating prognosis. Before machine learning is applied for diagnosis, it must first be trained. The research methodology evaluates different machine learning algorithms, such as Random Tree, ID3, CART, SMO, C4.5, and Naive Bayes, to find the best-performing training algorithm. The contribution of this research is testing the data set with mis
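As an illustration of the training step the abstract refers to, here is a minimal categorical Naive Bayes (one of the algorithms listed) on a tiny, hypothetical dataset; a real experiment would use a full toolkit and the actual clinical features:

```python
from collections import Counter, defaultdict

def train_naive_bayes(rows, labels):
    """Estimate P(label) and per-feature value counts from categorical data."""
    label_counts = Counter(labels)
    feat_counts = defaultdict(Counter)  # (feature_index, label) -> value counts
    for row, label in zip(rows, labels):
        for i, value in enumerate(row):
            feat_counts[(i, label)][value] += 1
    return label_counts, feat_counts

def predict(model, row):
    label_counts, feat_counts = model
    total = sum(label_counts.values())
    best, best_p = None, -1.0
    for label, count in label_counts.items():
        p = count / total
        for i, value in enumerate(row):
            c = feat_counts[(i, label)]
            # Laplace smoothing keeps unseen values from zeroing the product.
            p *= (c[value] + 1) / (sum(c.values()) + len(c) + 1)
        if p > best_p:
            best, best_p = label, p
    return best

# Hypothetical two-feature training set: (finding, marker level) -> diagnosis.
rows = [("lump", "high"), ("lump", "low"), ("none", "low"), ("none", "high")]
labels = ["malignant", "malignant", "benign", "benign"]
model = train_naive_bayes(rows, labels)
```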
With the rapid progress of information technology and computer networks, it has become very easy to reproduce and share geospatial data because of its digital form. Consequently, the use of geospatial data suffers from various problems, such as data authentication, proof of ownership, and illegal copying. These problems pose a major challenge to future uses of geospatial data. This paper introduces a new watermarking scheme to ensure the copyright protection of digital vector maps. The main idea of the proposed scheme is to transform the digital map into the frequency domain using Singular Value Decomposition (SVD) in order to determine suitable areas in which to embed the watermark data.
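The SVD step underlying such a scheme can be sketched with NumPy. The tiny coordinate block below is hypothetical, and a real scheme would additionally select embedding regions and strengths as the paper describes:

```python
import numpy as np

# Hypothetical block of vector-map coordinates (rows = vertices, cols = x, y).
block = np.array([[3.0, 1.0],
                  [2.0, 4.0],
                  [5.0, 0.5]])

# Decompose: block = U @ diag(S) @ Vt. The singular values S capture the
# block's dominant structure, a common place to hide watermark information.
U, S, Vt = np.linalg.svd(block, full_matrices=False)

# Embed a small perturbation in the largest singular value, then reconstruct.
alpha, bit = 0.01, 1
S_marked = S.copy()
S_marked[0] += alpha * bit
marked = U @ np.diag(S_marked) @ Vt
```

Because the perturbation is small relative to the singular values, the marked coordinates stay close to the originals while the embedded change survives reconstruction.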
Iraq is facing water shortage problems due to various factors: global (global warming), regional (the GAP project), and local (improper water resources management). This study examines the influence of global warming on the annual mean temperature, and in turn on the annual mean evapotranspiration, over more than three decades. The climate of Iraq is influenced by its location between the subtropical aridity of the Arabian desert areas and the subtropical humidity of the Arabian Gulf. The relative rise in temperature over recent decades was the main factor in the decrease of relative humidity, which increased the evapotranspiration values; hence utilizing a temperature-based method as i
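The abstract does not name its temperature-based method, so as an illustration only, here is a sketch of one widely used option, the Thornthwaite equation, which estimates potential evapotranspiration from monthly mean temperatures (the day-length correction factor is omitted for brevity):

```python
def thornthwaite_pet(monthly_temps_c):
    """Uncorrected monthly potential evapotranspiration (mm), Thornthwaite.

    Assumes 12 monthly mean temperatures in deg C; months at or below
    0 deg C contribute nothing, as in the standard formulation.
    """
    heat_index = sum((max(t, 0.0) / 5.0) ** 1.514 for t in monthly_temps_c)
    a = (6.75e-7 * heat_index ** 3 - 7.71e-5 * heat_index ** 2
         + 1.792e-2 * heat_index + 0.49239)
    return [16.0 * (10.0 * max(t, 0.0) / heat_index) ** a
            for t in monthly_temps_c]
```

The key property for this study is that the estimate grows with temperature, which is why a warming trend translates directly into higher evapotranspiration values.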
The analysis of the hyperlink structure of the web has led to significant improvements in web information retrieval. This survey evaluates and analyzes relevant research publications on link analysis in web information retrieval that employ diverse methods, considering the research year, the aims of each article, the algorithms used, and the findings obtained after applying those algorithms. The findings reveal that PageRank, Weighted PageRank, and Weighted Page Content Rank are the algorithms most widely employed by researchers to analyze hyperlinks in web information retrieval. Finally, this paper analyzes the previous studies.
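The PageRank algorithm the survey highlights can be sketched as a power iteration over a small, hypothetical link graph:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Power-iteration PageRank over a dict {page: [pages it links to]}."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                # Each page shares its rank equally among its out-links.
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:
                # Dangling page: spread its rank over all pages.
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

# Hypothetical three-page web: A and C link to B, B links back to C.
ranks = pagerank({"A": ["B"], "B": ["C"], "C": ["B"]})
```

Weighted variants such as Weighted PageRank modify the share each out-link receives, but keep this same iterative structure.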
In this paper, membrane-based computing image segmentation, both region-based and edge-based, is proposed for medical images that involve two types of neighborhood relations between pixels. These neighborhood relations—namely, 4-adjacency and 8-adjacency of a membrane computing approach—construct a family of tissue-like P systems for segmenting actual 2D medical images in a constant number of steps; the two types of adjacency were compared using different hardware platforms. The process involves the generation of membrane-based segmentation rules for 2D medical images. The rules are written in the P-Lingua format and appended to the input image for visualization. The findings show that the neighborhood relations between pixels o
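The two neighborhood relations compared above can be made concrete; this sketch simply enumerates the in-bounds neighbors of a pixel under each adjacency:

```python
def neighbors(pixel, width, height, adjacency=4):
    """In-bounds neighbors of (x, y) under 4- or 8-adjacency."""
    x, y = pixel
    if adjacency == 4:
        # 4-adjacency: horizontal and vertical neighbors only.
        offsets = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    else:
        # 8-adjacency adds the four diagonal neighbors.
        offsets = [(dx, dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                   if (dx, dy) != (0, 0)]
    return [(x + dx, y + dy) for dx, dy in offsets
            if 0 <= x + dx < width and 0 <= y + dy < height]

inner4 = neighbors((1, 1), 3, 3, adjacency=4)   # interior pixel of a 3x3 image
inner8 = neighbors((1, 1), 3, 3, adjacency=8)
corner8 = neighbors((0, 0), 3, 3, adjacency=8)  # corners have fewer neighbors
```

In the tissue-like P systems above, these adjacency sets determine which neighboring membranes a segmentation rule may fire across.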