A database is an organized and distributed arrangement of data that allows users to access stored information in a simple and convenient way. However, in the era of big data, traditional data analytics methods may not be able to manage and process such large volumes of data. To develop an efficient way of handling big data, this work studies the use of the MapReduce technique to handle big data distributed on the cloud. The approach was evaluated on a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG researchers and specialists with an easy and fast method of handling EEG big data.
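The abstract above does not include an implementation, but a minimal Hadoop Streaming-style mapper/reducer pair can illustrate the MapReduce pattern it describes. The EEG record format assumed here (one "channel,sample" pair per line) and the per-channel mean it computes are placeholders for illustration, not the paper's actual job.

```python
#!/usr/bin/env python3
"""Minimal Hadoop Streaming-style mapper/reducer (hypothetical EEG record format).

Assumes each input line is "channel_id,sample_value"; the job computes the
mean amplitude per channel. Test locally with a pipe:
    cat eeg.csv | python3 eeg_mr.py map | sort | python3 eeg_mr.py reduce
or hand the two phases to hadoop-streaming as -mapper and -reducer scripts.
"""
import sys
from itertools import groupby


def mapper(stream):
    # Emit key/value pairs: channel_id <TAB> sample_value
    for line in stream:
        line = line.strip()
        if not line or "," not in line:
            continue
        channel, value = line.split(",", 1)
        print(f"{channel}\t{value}")


def reducer(stream):
    # Input arrives sorted by key (Hadoop's shuffle phase); average per channel.
    parsed = (line.rstrip("\n").split("\t", 1) for line in stream if line.strip())
    for channel, group in groupby(parsed, key=lambda kv: kv[0]):
        values = [float(v) for _, v in group]
        print(f"{channel}\t{sum(values) / len(values):.4f}")


if __name__ == "__main__":
    phase = sys.argv[1] if len(sys.argv) > 1 else "map"
    (mapper if phase == "map" else reducer)(sys.stdin)
```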
Cloud computing offers a new way of service provision by rearranging various resources over the Internet. The most important and popular cloud service is data storage. In order to preserve the privacy of data holders, data are often stored in the cloud in encrypted form. However, encrypted data introduce new challenges for cloud data deduplication, which has become crucial for big data storage and processing in the cloud. Traditional deduplication schemes cannot work on encrypted data. Among these data, digital videos are particularly large in terms of storage cost and size, and techniques that protect the legal interests of the video owner, such as copyright protection, while reducing cloud storage cost and size are always desired. This paper focuses on v
Image compression is very important in reducing the costs of data storage and transmission over relatively slow channels. The wavelet transform has received significant attention because its multiresolution decomposition allows efficient image analysis. This paper attempts to give an understanding of the wavelet transform using two of its most popular examples, the Haar and Daubechies techniques, and to compare their effects on image compression.
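As a concrete illustration of the comparison described above, the following sketch uses the PyWavelets and NumPy packages (an assumption, not tooling named in the paper) to keep only the largest 10% of wavelet coefficients and reconstruct the image with both a Haar and a Daubechies (db4) wavelet; the synthetic test image and the retention ratio are placeholders.

```python
# Threshold-based wavelet compression sketch, assuming PyWavelets and NumPy.
import numpy as np
import pywt


def compress(image, wavelet="haar", level=3, keep=0.10):
    """Zero out all but the largest `keep` fraction of wavelet coefficients."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)
    threshold = np.quantile(np.abs(arr), 1 - keep)        # magnitude cut-off
    arr_kept = np.where(np.abs(arr) >= threshold, arr, 0.0)
    coeffs_kept = pywt.array_to_coeffs(arr_kept, slices, output_format="wavedec2")
    return pywt.waverec2(coeffs_kept, wavelet)


if __name__ == "__main__":
    # Synthetic grayscale test image; replace with a real image array.
    image = np.random.rand(256, 256)
    for wavelet in ("haar", "db4"):                       # Haar vs. Daubechies-4
        restored = compress(image, wavelet=wavelet)
        mse = np.mean((image - restored[: image.shape[0], : image.shape[1]]) ** 2)
        print(f"{wavelet}: MSE after keeping 10% of coefficients = {mse:.5f}")
```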
This work studies the production of light fuel fractions (gasoline, kerosene, and gas oil) by treating the residual matter obtained as a by-product of the solvent extraction process used to improve the viscosity index of refined lubricating oil in a petroleum refinery. This by-product amounts to approximately 10% of the total feed (crude oil) in the petroleum refining process. The objective of this research is to study the effect of the residence time parameter on the thermal cracking of the by-product feed at a constant temperature of 400 °C. The first step of this treatment is the thermal cracking of the by-product material in a constructed batch reactor equipped with a control device at a selective range of re
The research aimed at designing a rehabilitation program using electrical stimulation to rehabilitate the muscles acting on the knee joint following an ACL tear, using an apparatus developed by the researchers that stimulates muscle vibration and activity and enables rehabilitation of the joint in shorter periods. In addition, it aimed at identifying the effect of this program on rehabilitating the knee joint working muscles. The researchers used the experimental method on Baghdad club players aged 19–24 years who suffer from a complete ACL tear of the knee joint. The results showed that the training program significantly developed the working muscles, achieving normal levels of activity.
The aim of this research is to apply the concept of total value management to improve the process design of toothpaste production in the Al Mammon factory, one of the factories of the General Company for Food Industry. The concept of total value management is concerned with achieving more than one value important to customers, as these values are related to customer satisfaction. The research problem is that the factory did not measure the effectiveness of its process design, as the company has a weakness in analyzing this effectiveness in synchrony with total value management. Moreover, the company did not give enough attention to product costs and selling prices within the value cost/profit relationship, which is one of the
This study was carried out to measure heavy metal pollution in the water of the Diyala River and the contamination levels of these elements in the leafy vegetables grown on both sides of the river and irrigated with the contaminated river water (celery, radish, Lepidium, green onions, Beta vulgaris subsp., and Malva). Laboratory analysis was performed to measure heavy element contamination (Pb, Fe, Ni, Cd, Zn, and Cr) using a flame atomic absorption spectrophotometer during the summer months of July and August 2017. The study showed that zinc, chromium, nickel, and cadmium were present at high concentrations and exceeded. The maximum concentration of these
The present study aimed to evaluate the therapeutic effect of chitosan extracted from mushroom fungus and of pure chitosan on glucose and the lipid profile in the blood of 35 male rabbits with hyperlipidemia induced experimentally by cholesterol. The tests included estimation of glucose levels, total cholesterol, triglycerides, high-density lipoproteins, low-density lipoproteins, and very low-density lipoproteins. Hyperlipidemia was induced in the male rabbits by oral administration of cholesterol at 150 mg/kg body weight for a week. The rabbits were divided into seven groups: control, cholesterol, pure chitosan, mushroom chitosan, cholesterol and pure chitosan, cholesterol and mushroom chitosan, and cholestero
Alpha shape theory for 3D visualization and volumetric measurement of brain tumor progression using magnetic resonance images
The objective of this paper is to improve the general quality of infrared (IR) images by proposing an algorithm for IR image enhancement. The algorithm is based on two methods: adaptive histogram equalization (AHE) and contrast-limited adaptive histogram equalization (CLAHE). The contribution of this paper is to assess how well the proposed contrast enhancement procedures perform on infrared images and to propose a strategy that may be most appropriate for incorporation into commercial infrared imaging applications.
The database for this paper consists of night-vision infrared images taken by a Zenmuse camera (FLIR Systems, Inc.) mounted on a MATRIC100 drone in Karbala city. The experimental tests showed sign
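For illustration only, a minimal OpenCV sketch of the two enhancement methods named in this abstract is given below. The file names, clip limits, and tile grid size are placeholders rather than the paper's parameters, and plain AHE is approximated with a very large clip limit because OpenCV only exposes the contrast-limited variant.

```python
# AHE vs. CLAHE sketch on an 8-bit grayscale IR frame, assuming OpenCV (cv2).
import cv2

# Load a grayscale IR frame (placeholder file name).
ir = cv2.imread("ir_frame.png", cv2.IMREAD_GRAYSCALE)

# AHE approximated with a very large clip limit; CLAHE with a modest one.
ahe = cv2.createCLAHE(clipLimit=40.0, tileGridSize=(8, 8)).apply(ir)
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8)).apply(ir)

cv2.imwrite("ir_ahe.png", ahe)
cv2.imwrite("ir_clahe.png", clahe)
```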
This paper proposes a new method for network self-fault management (NSFM) based on two technologies: intelligent agents to automate fault management tasks, and Windows Management Instrumentation (WMI) to identify faults faster when resources are independent (different types of devices). The proposed network self-fault management reduces the network traffic load by reducing the requests and responses between server and client, which achieves less downtime for each node when a fault occurs in the client. The performance of the proposed system is measured by three metrics: efficiency, availability, and reliability. A high average efficiency is obtained, depending on the faults that occurred in the system, which reaches to
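As an illustrative sketch only, not the NSFM agent described above, the following snippet shows the kind of WMI polling a client-side agent could perform using the third-party Python wmi package on a Windows node; the monitored service, polling interval, and restart action are hypothetical.

```python
# Hypothetical client-side polling agent, assuming the third-party `wmi`
# package (pip install wmi) on Windows; the service name and restart action
# are placeholders, not the paper's fault-management logic.
import time
import wmi


def watch_service(name="Spooler", interval=30):
    conn = wmi.WMI()  # connect to the local WMI provider
    while True:
        for svc in conn.Win32_Service(Name=name):
            if svc.State != "Running":
                print(f"Fault detected: service {name} is {svc.State}; restarting")
                svc.StartService()      # local self-healing, no server round trip
        time.sleep(interval)            # poll locally instead of querying a server


if __name__ == "__main__":
    watch_service()
```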