A database is an organized, distributed collection of data that allows users to access the stored information easily and conveniently. In the era of big data, however, traditional data-analytics methods may be unable to manage and process such large volumes of data. To develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique for big data distributed on the cloud. The approach was evaluated on a Hadoop server and applied to EEG big data as a case study. It showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The results provide EEG researchers and specialists with an easy and fast method of handling EEG big data.
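The map and reduce phases named above can be sketched in miniature without a Hadoop cluster. The record format, channel names, and the per-channel mean statistic below are illustrative assumptions, not the paper's actual EEG pipeline:

```python
from collections import defaultdict

# Toy (channel, amplitude) records standing in for EEG samples.
records = [
    ("Fp1", 12.4), ("Fp1", 11.8), ("Cz", 3.2),
    ("Cz", 4.0), ("O1", 7.5), ("O1", 8.1),
]

def map_phase(record):
    # Map: emit a (channel, (amplitude, count)) pair per sample.
    channel, amplitude = record
    return channel, (amplitude, 1)

def reduce_phase(channel, values):
    # Reduce: aggregate all pairs for one channel into a mean amplitude.
    total = sum(v for v, _ in values)
    count = sum(c for _, c in values)
    return channel, total / count

# Shuffle: group mapped values by key, as the framework would.
shuffled = defaultdict(list)
for rec in records:
    key, value = map_phase(rec)
    shuffled[key].append(value)

means = dict(reduce_phase(k, vs) for k, vs in shuffled.items())
print(means)
```

On a real cluster the shuffle step is performed by the framework between distributed mappers and reducers; only the two user-defined functions change.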
The need to constantly and consistently improve the quality and quantity of the educational system is essential. E-learning has emerged from the rapid cycle of change and the expansion of new technologies. Advances in information technology have increased network bandwidth and data-access speed while reducing data-storage costs. In recent years, the implementation of cloud computing in educational settings has attracted the interest of major companies, leading to substantial investments in this area. Cloud computing improves engineering education by providing an environment that can be accessed from anywhere and by allowing access to educational resources on demand. Cloud computing is a term used to describe the provision of hosting services
The rapid and enormous growth of the Internet of Things, and its widespread adoption, has resulted in massive quantities of data that must be processed and sent to the cloud. Because of the delay in processing these data and the time needed to send them to the cloud, fog computing has emerged as a new generation of the cloud, in which the fog serves as an extension of cloud services at the edge of the network, reducing latency and traffic. Distributing computational resources to minimize makespan and running costs is one of the challenges of fog computing. This paper provides a new approach for improving the task scheduling problem in a Cloud-Fog environment
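The makespan objective mentioned above can be illustrated with a greedy earliest-finish-time baseline. The node speeds, task lengths, and longest-task-first ordering below are illustrative assumptions, not the paper's own scheduling algorithm:

```python
# Heterogeneous fog and cloud nodes with relative processing speeds.
nodes = {"fog-1": 2.0, "fog-2": 1.5, "cloud-1": 4.0}
tasks = [8.0, 6.0, 4.0, 10.0, 2.0]  # task lengths (instruction counts)

ready = {name: 0.0 for name in nodes}  # time at which each node is free
assignment = []
for length in sorted(tasks, reverse=True):  # longest-task-first
    # Assign the task to the node that would finish it earliest.
    best = min(nodes, key=lambda n: ready[n] + length / nodes[n])
    ready[best] += length / nodes[best]
    assignment.append((length, best))

makespan = max(ready.values())
print(assignment, makespan)
```

Even this simple heuristic shows the trade-off the abstract describes: fast cloud nodes lower makespan but, in a full cost model, would also incur transfer delay and running cost.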
The speech delivered by political blocs and parties and broadcast by satellite channels and social media reflects different ideologies and orientations: moderate speech calling for calm, or speech that raises crises. The latter is considered very challenging because of its local and international references. This paper aims at uncovering these challenges, especially during the political crisis witnessed in Iraq. It sheds light on the most important crises that spread in public opinion, broadcast by satellite channels and raised by politicians competing for authority, leading to a lack of people's confidence in them. This matter should not be neglected at all.
The paper deals with claims in construction projects in Iraq, studies their types, causes, impacts, and resolution methods, and then proposes a management system to control the impacts of claims. Two parts were carried out to achieve the research objective: a theoretical part and a practical part. The findings showed that the main types of claims are extra-work claims, differing-site-condition claims, and delay claims, and that the main causes of claims are variation orders, design errors and omissions, delays in payment by the owner, variations in quantities, and scheduling errors. Claims have adverse impacts on cost, increasing it by 10% to 25%, and on project duration, increasing it by 25% to 50%. Negotiation is the main
Praise be to Allah, Lord of the Worlds, and prayers and peace be upon His trustworthy Prophet Muhammad (peace be upon him), upon his pure and good family, and upon his noble companions:
The auditory image is a rhetorical concept found clearly and effectively in Arabic rhetoric, playing an essential role in conveying the idea the writer seeks to deliver to the recipient. The auditory image only becomes apparent when it is viewed in a literary state that stirs the poet's being.
Deep Learning Techniques for Skull Stripping of Brain MR Images
Contour extraction from two-dimensional echocardiographic images has been a challenge in digital image processing, essentially due to heavy noise, the poor quality of these images, and artifacts such as papillary muscles, intra-cavity structures such as chordae, and valves that can interfere with endocardial border tracking. In this paper, we present a technique to extract the contours of heart boundaries from a sequence of echocardiographic images, starting with pre-processing to reduce noise and produce better image quality. By pre-processing the images, unclear edges are avoided, and we can obtain an accurate detection of both the heart boundary and the movement of the heart valves.
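The pre-processing stage described above, noise reduction before border tracking, can be sketched as a 3x3 median filter followed by a simple gradient threshold. The toy image, the noise spike, and the threshold value are illustrative assumptions, not the paper's actual pipeline:

```python
# Toy intensity image: a bright region (80) beside a dark region (10),
# with one isolated speckle-noise spike (99).
image = [
    [10, 10, 10, 80, 80],
    [10, 99, 10, 80, 80],
    [10, 10, 10, 80, 80],
    [10, 10, 10, 80, 80],
]

def median3x3(img, r, c):
    # Median of the 3x3 neighbourhood around an interior pixel.
    window = sorted(img[i][j] for i in (r - 1, r, r + 1)
                               for j in (c - 1, c, c + 1))
    return window[4]

rows, cols = len(image), len(image[0])
denoised = [row[:] for row in image]
for r in range(1, rows - 1):
    for c in range(1, cols - 1):
        denoised[r][c] = median3x3(image, r, c)

# A horizontal intensity gradient above 30 marks candidate boundary pixels.
edges = [
    [1 if abs(denoised[r][c + 1] - denoised[r][c]) > 30 else 0
     for c in range(cols - 1)]
    for r in range(rows)
]
print(denoised[1][1], edges)
```

The median filter removes the isolated spike without blurring the step edge, which is why median-type filters are a common choice against speckle noise in ultrasound images.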
Today’s medical imaging research faces the challenge of detecting brain tumors through Magnetic Resonance Imaging (MRI). MRI images are normally used by experts to produce images of the soft tissue of the human body and to analyze human organs without surgery. For brain tumor detection, image segmentation is required: the brain is partitioned into two distinct regions. This is considered one of the most important but difficult parts of the process of detecting brain tumors. Hence, it is highly necessary that segmentation of the MRI images be done accurately before asking the computer to perform the exact diagnosis. Earlier, a variety of algorithms were developed for segmentation of MRI images
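The two-region partitioning described above can be sketched with a simple iterative (ISODATA-style) intensity threshold. The pixel intensities below are synthetic, and this is a generic baseline, not the segmentation algorithm of any particular paper:

```python
# Synthetic MRI-like intensities: a dark background class (~10-15)
# and a bright tissue class (~200).
pixels = [12, 15, 11, 14, 200, 210, 198, 205, 13, 202]

t = sum(pixels) / len(pixels)  # initial threshold: global mean
while True:
    low = [p for p in pixels if p <= t]
    high = [p for p in pixels if p > t]
    # New threshold: midpoint of the two class means.
    new_t = (sum(low) / len(low) + sum(high) / len(high)) / 2
    if abs(new_t - t) < 0.5:  # converged
        break
    t = new_t

# Binary mask: 1 for the bright region, 0 for the dark region.
mask = [1 if p > t else 0 for p in pixels]
print(t, mask)
```

Real MRI segmentation must also handle intensity inhomogeneity and partial-volume effects, which is why this global threshold is only a starting point.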
Rationing is a commonly used solution for shortages of resources and goods that are vital for the citizens of a country. This paper identifies some common approaches and policies used in rationing, as well as the risks associated with them, in order to suggest a system for rationing fuel that can work efficiently, and subsequently addresses all possible security risks and their solutions. The system should theoretically be applicable in emergency situations, requiring less than three months to implement, at a low cost and with minimal changes to infrastructure.
This research aims to review the importance of estimating the nonparametric regression function using the so-called canonical kernel, which depends on re-scaling the smoothing parameter; this parameter plays a large and important role in kernel estimation and yields the proper amount of smoothing.
We demonstrate the importance of this method by applying these concepts to real data on the international exchange rate of the U.S. dollar against the Japanese yen for the period from January 2007 to March 2010. The results showed a preference for the nonparametric estimator with the Gaussian kernel over the other nonparametric and parametric regression estimators.
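A minimal sketch of Gaussian-kernel nonparametric regression (the Nadaraya-Watson estimator) follows. The data points and the bandwidth h are synthetic stand-ins, not the USD/JPY series analysed here, and the canonical-kernel re-scaling of h is not reproduced:

```python
import math

# Synthetic (x, y) observations roughly on the line y = x.
x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [0.1, 0.9, 2.1, 2.9, 4.2]

def gaussian(u):
    # Standard Gaussian kernel.
    return math.exp(-0.5 * u * u) / math.sqrt(2 * math.pi)

def nw_estimate(x0, h):
    # Nadaraya-Watson: weighted average of y with kernel weights
    # decaying in the distance from x0, scaled by bandwidth h.
    w = [gaussian((x0 - xi) / h) for xi in x]
    return sum(wi * yi for wi, yi in zip(w, y)) / sum(w)

print(nw_estimate(2.0, 0.5))
```

The canonical-kernel idea the paper studies amounts to re-scaling h so that different kernel shapes (Gaussian, Epanechnikov, and so on) apply a comparable amount of smoothing.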