A database is an organized collection of data, structured and distributed so that users can access the stored information easily and conveniently. In the era of big data, however, traditional data-analytics methods may be unable to manage and process such large volumes of data. To develop an efficient way of handling big data, this work studies the use of the MapReduce technique on big data distributed in the cloud. The approach was evaluated on a Hadoop server with EEG big data as a case study. It showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. These results give EEG researchers and specialists an easy and fast method of handling EEG big data.
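The abstract does not show the authors' actual job code, but the MapReduce pattern it relies on can be sketched in a few lines. The sketch below is a minimal, hypothetical example: it computes per-channel averages over EEG samples, with an in-memory list standing in for a Hadoop input split, and the channel names and values are invented for illustration.

```python
from functools import reduce
from itertools import groupby
from operator import itemgetter

# Hypothetical EEG records: (channel_id, sample_value) pairs. In a real
# Hadoop deployment these would come from HDFS splits; here a small
# in-memory list stands in for one input split.
records = [("C3", 0.12), ("C4", -0.05), ("C3", 0.30), ("Cz", 0.07), ("C4", 0.11)]

def mapper(record):
    """Emit (channel, (sum, count)) so averages can be combined associatively."""
    channel, value = record
    return (channel, (value, 1))

def reducer(a, b):
    """Combine two partial (sum, count) aggregates for the same channel."""
    return (a[0] + b[0], a[1] + b[1])

# Shuffle phase: group mapper output by key (channel), then reduce per key.
mapped = sorted(map(mapper, records), key=itemgetter(0))
averages = {}
for channel, group in groupby(mapped, key=itemgetter(0)):
    total, count = reduce(reducer, (kv[1] for kv in group))
    averages[channel] = total / count
```

Because the per-key combine step is associative, Hadoop can run the same reducer on partial results from many nodes, which is what makes the approach scale to distributed EEG data.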
This research evaluates the strategic performance of the State Board for Antiquities and Heritage (SBAH) using a balanced scorecard with four criteria (Financial, Customers, Internal Processes, and Learning and Growth). The main challenge was that the State Board uses traditional evaluation to measure the performance of employees, activities, and projects. A case-study and field-interview methodology was adopted, with a sample consisting of the Chairman of the State Board, 6 general managers, and 7 department managers who are involved in evaluating the strategic performance and deciding the suitable answers on the checklists to analyze it ac
The purpose of this study is to improve the competitive position of the products of economic units by using the target-cost technique and the reverse-engineering method, applying them in one of the public-sector companies (the General Company for Vegetable Oils). These tools help detect the prices accepted in the market for items similar to the company's products and address the problem of high cost, drawing the attention of managerial and technical leadership to the weaknesses that need to be improved through new, innovative solutions that make appropriate changes to satisfy consumers' needs more cheaply and so influence customers' decisions to buy, especially of purchase private economic units to
A Multiple System Biometric System Based on ECG Data
Support vector machines (SVMs) are supervised learning models that analyze data for classification or regression. For classification, an SVM selects an optimal hyperplane that separates two classes. SVMs have very good accuracy and are extremely robust compared with some other classification methods such as logistic regression, random forest, k-nearest neighbors, and the naïve Bayes model. However, working with large datasets causes problems such as long training times and inefficient results. In this paper, the SVM is modified by using a stochastic gradient descent process. The modified method, stochastic gradient descent SVM (SGD-SVM), was checked using two simulated datasets. Since the classification of different ca
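The abstract does not give the authors' exact formulation, but the core idea of an SGD-trained linear SVM can be sketched as minimizing the regularized hinge loss one sample at a time. The example below is a minimal illustration under assumed hyperparameter names (`lr`, `lam`, `epochs`) and a synthetic two-blob dataset, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: two well-separated Gaussian blobs with
# labels in {-1, +1}.
X = np.vstack([rng.normal(-2.0, 0.5, (50, 2)), rng.normal(2.0, 0.5, (50, 2))])
y = np.hstack([-np.ones(50), np.ones(50)])

def sgd_svm(X, y, lr=0.01, lam=0.01, epochs=50):
    """Linear SVM trained by stochastic gradient descent on the
    regularized hinge loss: lam/2 * ||w||^2 + max(0, 1 - y*(w.x + b))."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        for i in rng.permutation(n):          # one sample per update
            margin = y[i] * (X[i] @ w + b)
            if margin < 1:                    # point inside margin: hinge is active
                w -= lr * (lam * w - y[i] * X[i])
                b += lr * y[i]
            else:                             # only the regularizer contributes
                w -= lr * lam * w
    return w, b

w, b = sgd_svm(X, y)
accuracy = np.mean(np.sign(X @ w + b) == y)
```

Each update touches a single sample, so the cost per step is independent of the dataset size; this is why SGD makes SVM training tractable on large datasets where solving the full quadratic program would be too slow.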
Wireless sensor applications are subject to energy constraints, and most of the energy is consumed in communication between wireless nodes. Clustering and data aggregation are the two most widely used strategies for reducing energy usage and increasing the lifetime of wireless sensor networks. In target-tracking applications, a large amount of redundant data is produced regularly, so deploying effective data-aggregation schemes is vital to eliminate this redundancy. This work conducts a comparative study of research approaches that employ clustering techniques for efficiently aggregating data in target-tracking applications, as the selection of an appropriate clustering algorithm may reflect positive results in the data aggregati
This paper presents a data-centric technique: data aggregation via a modified fuzzy clustering algorithm combined with a Voronoi diagram, called the modified Voronoi Fuzzy Clustering Algorithm (VFCA). In the modified algorithm, the sensed area is divided into a number of Voronoi cells by applying a Voronoi diagram, and these cells are clustered by the fuzzy C-means (FCM) method to reduce the transmission distance. An appropriate cluster head (CH) is then elected for each cluster using three parameters: the node's energy, the distance between the CH and its neighboring sensors, and the packet-loss value. Furthermore, data aggregation is employed at each CH to reduce the amount of data transmission, which le
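The FCM clustering and cluster-head election steps described above can be sketched as follows. This is a simplified illustration, not the paper's VFCA: the Voronoi partitioning step is omitted, the sensor positions, energies, and packet-loss rates are randomly generated, and the scoring weights in the election are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sensor field: 30 nodes with random positions, residual
# energies, and packet-loss rates (all values illustrative).
positions = rng.uniform(0, 100, (30, 2))
energy = rng.uniform(0.5, 1.0, 30)
packet_loss = rng.uniform(0.0, 0.2, 30)

def fuzzy_c_means(X, c=3, m=2.0, iters=100):
    """Standard fuzzy C-means: alternate fuzzy-membership and center updates."""
    U = rng.dirichlet(np.ones(c), size=len(X))          # random fuzzy memberships
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]  # weighted cluster centers
        d = np.linalg.norm(X[:, None] - centers[None], axis=2) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)))                  # closer center => higher membership
        U /= U.sum(axis=1, keepdims=True)
    return U.argmax(axis=1), centers

labels, centers = fuzzy_c_means(positions)

# Cluster-head election: favor high energy, closeness to the cluster
# center, and low packet loss (the weights are assumptions).
heads = {}
for k in range(3):
    idx = np.flatnonzero(labels == k)
    dist = np.linalg.norm(positions[idx] - centers[k], axis=1)
    score = energy[idx] - 0.01 * dist - packet_loss[idx]
    heads[k] = int(idx[score.argmax()])
```

Electing the head from the node with the best combined score, rather than purely the closest node, is what lets the scheme trade transmission distance against residual energy and link quality.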
Two laboratory-scale reactors were operated, one under aerobic and one under anaerobic conditions. Each reactor was packed with 8.5 kg of shredded synthetic solid waste (pieces smaller than 5 cm) prepared according to the average composition of domestic solid waste in the city of Kirkuk. Aerobic conditions were created in the aerobic reactor using an air compressor. This study shows that the aerobic reactor was more efficient: its COD and BOD5 removals were 97.88% and 91.25%, while those of the anaerobic reactor were 66.53% and 19.11%, respectively.
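The removal efficiencies quoted above follow the standard definition, which can be stated as a one-line formula. The influent/effluent concentrations below are hypothetical values chosen only so the result matches the reported aerobic COD figure; the abstract does not give the actual concentrations.

```python
def removal_efficiency(influent, effluent):
    """Percent removal: (C_in - C_out) / C_in * 100."""
    return (influent - effluent) / influent * 100.0

# Hypothetical COD concentrations in mg/L (not from the study),
# picked so the result reproduces the reported 97.88% removal.
aerobic_cod = removal_efficiency(1000.0, 21.2)
```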
Electronic University Library: Reality and Ambition Case Study Central Library of Baghdad University
Abstract
The aim of this study was to identify the impact of knowledge management processes on organizational creativity in the airline companies operating in Sudan. The main hypothesis was that there is a statistically significant positive relationship between knowledge management processes (diagnosis, acquisition, storage, distribution, and application) and organizational creativity. The measurement of the variables was adopted from previous studies. The study used a descriptive approach and the analytical statistical method to construct the model, with the SPSS program for data analysis. A purposive sampling procedure was chosen and a structured questionnaire was developed. Out of 215 q
In this research, several estimators of the hazard function are introduced using a nonparametric method, namely kernel functions for censored data with varying bandwidths and boundary kernels. Two types of bandwidth are used: local bandwidth and global bandwidth. Moreover, four types of boundary kernel are used, namely Rectangle, Epanechnikov, Biquadratic, and Triquadratic, and the proposed function was employed with all kernel functions. Two different simulation techniques were also used in two experiments to compare these estimators. In most of the cases, the results showed that the local bandwidth is the best for all the
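A basic kernel hazard estimator for right-censored data can be sketched by smoothing the Nelson-Aalen increments with a kernel. The example below is a simplified illustration under assumptions the abstract does not specify: it uses a single global bandwidth, the Epanechnikov kernel without the boundary correction the paper studies, and simulated exponential data whose true hazard is constant at 1.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical right-censored sample: Exp(1) event times censored by
# Uniform(0, 3) times. The true hazard of Exp(1) is constant at 1.
t_event = rng.exponential(1.0, 500)
t_cens = rng.uniform(0, 3.0, 500)
time = np.minimum(t_event, t_cens)
delta = (t_event <= t_cens).astype(float)     # 1 = event observed, 0 = censored

def epanechnikov(u):
    """Epanechnikov kernel, supported on [-1, 1]."""
    return np.where(np.abs(u) <= 1, 0.75 * (1 - u ** 2), 0.0)

def kernel_hazard(t, time, delta, h=0.4):
    """Kernel-smoothed hazard at t: convolve the Nelson-Aalen increments
    d_i / n_i at the observed event times with a kernel of bandwidth h
    (global bandwidth; no boundary correction)."""
    order = np.argsort(time)
    time, delta = time[order], delta[order]
    n = len(time)
    at_risk = n - np.arange(n)                # risk-set size just before each time
    increments = delta / at_risk              # Nelson-Aalen jumps
    return np.sum(epanechnikov((t - time) / h) / h * increments)

est = kernel_hazard(1.0, time, delta)         # should be near the true hazard, 1
```

Near the left boundary (t close to 0) this plain estimator is biased because part of the kernel's support falls outside the data range, which is exactly the problem the boundary kernels in the paper are designed to correct.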