The Environmental Data Acquisition Telemetry System is a versatile, flexible, and economical means of accumulating data from multiple sensors at remote locations over an extended period; the data are normally transferred to a final destination and stored for further analysis.
This paper introduces the design and implementation of a simplified, economical, and practical telemetry system for collecting and transferring environmental parameters (humidity, temperature, pressure, etc.) from a remote rural location to a processing and display unit.
To obtain a flexible and practical system, three data-transfer methods (three systems), including their design and implementation, were proposed for rural-area services; the fi
SCADA is the technology that allows an operator to gather data from one or more facilities and to send control instructions to those facilities. This paper presents an adaptable, low-cost SCADA system for a particular sugar-manufacturing process, using a programmable logic controller (Siemens S7-1200, 1214C DC/DC/Rly). The system controls and monitors a laboratory production line chosen from the sugar industry. The project comprises two sections: the first is the hardware section, designed and built from components suitable for laboratory purposes; the second is the software section, covering PLC programming, HMI design, and the creation of alarm and trending systems. The system will ha
Haruki Murakami (1949-present) is a contemporary Japanese writer whose works have been translated into fifty languages and have earned him numerous Japanese and international awards. His short stories are carefully constructed in a strange, realistic manner mixed with elements of surrealism. His novels and short stories fall under the genre of magical realism. One of the major recurring themes Murakami writes about is the haunting feeling of emptiness and disconnectedness in a world that seems to care greatly for materialism and self-interest.
The paper explores two of Murakami’s short stories in his book After the Quake (2000) and the relevance of their themes and characters to Iraq after the q
Today, problems of spatial data integration have been further complicated by the rapid development of communication technologies and the increasing number of data sources available on the World Wide Web. Web-based geospatial data sources can be managed by different communities, and the data themselves can vary in quality, coverage, and purpose. Integrating such multiple geospatial datasets remains a challenge for geospatial data consumers. This paper concentrates on the integration of geometric and classification schemes for official data, such as Ordnance Survey (OS) national mapping data, with volunteered geographic information (VGI), such as data derived from the OpenStreetMap (OSM) project. Useful descriptions o
Long memory analysis is one of the most active areas in econometrics and time series, where various methods have been introduced to identify and estimate the long memory parameter in partially integrated time series. One of the most common models used to represent time series with long memory is ARFIMA (Autoregressive Fractionally Integrated Moving Average), in which the differencing order is a fractional number called the fractional parameter. To analyze and fit the ARFIMA model, this fractional parameter must be estimated. There are many methods for fractional parameter estimation. In this research, the estimation methods were divided into indirect methods, where the Hurst parameter is estimated fir
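One common indirect route from the Hurst parameter to the ARFIMA fractional parameter is the rescaled-range (R/S) method, using the relation d = H - 0.5. The sketch below is an illustrative, simplified R/S estimator, not the specific estimator evaluated in the paper:

```python
import math
import random
import statistics

def hurst_rs(x, min_chunk=8):
    """Estimate the Hurst exponent H of a 1-D series by the rescaled-range
    (R/S) method: average R/S over non-overlapping chunks of doubling size,
    then take the least-squares slope of log(R/S) against log(chunk size)."""
    n = len(x)
    sizes, rs_means = [], []
    size = min_chunk
    while size <= n // 2:
        rs = []
        for start in range(0, n - size + 1, size):
            chunk = x[start:start + size]
            mean = sum(chunk) / size
            cum, dev = 0.0, []
            for v in chunk:                     # cumulative deviations from mean
                cum += v - mean
                dev.append(cum)
            r = max(dev) - min(dev)             # range of cumulative deviations
            s = statistics.pstdev(chunk)        # chunk standard deviation
            if s > 0:
                rs.append(r / s)
        if rs:
            sizes.append(size)
            rs_means.append(sum(rs) / len(rs))
        size *= 2
    lx = [math.log(v) for v in sizes]
    ly = [math.log(v) for v in rs_means]
    mx, my = sum(lx) / len(lx), sum(ly) / len(ly)
    num = sum((a - mx) * (b - my) for a, b in zip(lx, ly))
    den = sum((a - mx) ** 2 for a in lx)
    return num / den

random.seed(0)
series = [random.gauss(0.0, 1.0) for _ in range(4096)]  # white noise: H near 0.5
h = hurst_rs(series)
d = h - 0.5  # ARFIMA fractional-differencing parameter implied by H
```

For a short-memory (white noise) series the estimate should sit near H = 0.5, giving d near 0; persistent long-memory series push H above 0.5 and d above 0.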
Land Use / Land Cover (LULC) classification is one of the basic tasks that decision makers and map makers rely on to evaluate infrastructure, using different types of satellite data, despite the large spectral differences or overlaps within the same land cover, in addition to the problems of aberration and image inclination that may negatively affect classification performance. The main objective of this study is to develop a working method for classifying land cover from high-resolution satellite images using an object-based method. Both pixel-based maximum-likelihood supervised classification and object-based approaches were examined on a QuickBird satellite image of Karbala, Iraq. This study illustrated that
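The pixel-based maximum-likelihood step mentioned above assigns each pixel to the class whose Gaussian model of training spectra gives it the highest likelihood. A minimal sketch, assuming diagonal covariance and hypothetical two-band training values (not the paper's QuickBird data):

```python
import math

def train_gaussian(classes):
    """classes: {label: list of feature vectors} -> per-class (means, variances),
    one mean/variance per spectral band (diagonal-covariance assumption)."""
    params = {}
    for label, samples in classes.items():
        d = len(samples[0])
        means = [sum(s[i] for s in samples) / len(samples) for i in range(d)]
        vars_ = [max(sum((s[i] - means[i]) ** 2 for s in samples) / len(samples), 1e-6)
                 for i in range(d)]
        params[label] = (means, vars_)
    return params

def classify(pixel, params):
    """Return the class with the highest Gaussian log-likelihood for the pixel."""
    best, best_ll = None, -math.inf
    for label, (means, vars_) in params.items():
        ll = -0.5 * sum(math.log(2 * math.pi * v) + (x - m) ** 2 / v
                        for x, m, v in zip(pixel, means, vars_))
        if ll > best_ll:
            best, best_ll = label, ll
    return best

# Toy training spectra in two bands (hypothetical digital numbers)
training = {
    "water":      [(10, 12), (11, 13), (9, 11)],
    "vegetation": [(60, 90), (62, 95), (58, 88)],
}
params = train_gaussian(training)
label = classify((61, 92), params)
```

Object-based approaches differ in that the same decision rule is applied to segment-level statistics (mean spectra per object) rather than to individual pixels.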
Multispectral satellite image data (Landsat-5 and Landsat-7) were used to monitor the agricultural state of the study area (extension and plant density), using the ArcGIS program and the Soil-Adjusted Vegetation Index (SAVI) method of analysis. The data cover a selected area west of Baghdad Governorate, with parts of the Anbar and Karbala Governorates. The satellite images were taken in 1990, 2001, and 2007. Each scene consists of seven spectral bands per satellite: Landsat-5 Thematic Mapper (TM) for 1990, and Landsat-7 Enhanced Thematic Mapper Plus (ETM+) for 2001 and 2007. The results showed that in the period from 1990 to 2001 the exposed (bare) land area decreased and increased
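The SAVI analysis named above rests on a standard formula that adds a soil-brightness correction term L to the NDVI ratio. A minimal sketch with hypothetical reflectance values (not the study's Landsat measurements):

```python
def savi(nir, red, L=0.5):
    """Soil-Adjusted Vegetation Index:
    SAVI = ((NIR - Red) / (NIR + Red + L)) * (1 + L),
    where L is the soil-brightness correction factor
    (0.5 is the common choice for intermediate vegetation cover)."""
    return (nir - red) / (nir + red + L) * (1 + L)

# Toy reflectances (hypothetical): dense vegetation reflects strongly in
# near-infrared and weakly in red; bare soil reflects similarly in both.
dense = savi(0.45, 0.08)
bare = savi(0.25, 0.22)
```

Comparing SAVI maps from the 1990, 2001, and 2007 scenes is what allows change in agricultural extension and plant density to be quantified, since higher SAVI indicates denser vegetation.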
Esterification is the most important reaction in biodiesel production. In this study, oleic acid was used as a suggested feedstock to study and simulate biodiesel production. Batch esterification of oleic acid was carried out over the following operating conditions: temperature from 40 to 70 °C, ethanol-to-oleic-acid molar ratio from 1:1 to 6:1, H2SO4 catalyst at 1 and 5 wt% of oleic acid, and reaction time up to 180 min. The optimum conditions for the esterification reaction were an ethanol/oleic acid molar ratio of 6:1, 5 wt% H2SO4 relative to oleic acid, 70 °C, 90 min, and an oleic acid conversion of 0.92. The activation energy of the suggested model was 26,625 J/mol for the forward reaction and 42,189 J/mol for the equilibrium constant. The obtained results s
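An activation energy such as the forward value of 26,625 J/mol translates into temperature sensitivity through the Arrhenius equation. The sketch below illustrates this relation for the study's temperature range; it is a generic Arrhenius calculation, not the paper's kinetic model:

```python
import math

R = 8.314  # universal gas constant, J/(mol*K)

def rate_ratio(Ea, T1, T2):
    """Arrhenius ratio of rate constants:
    k(T2)/k(T1) = exp(-Ea/R * (1/T2 - 1/T1)), temperatures in kelvin."""
    return math.exp(-Ea / R * (1.0 / T2 - 1.0 / T1))

# How much faster the forward esterification proceeds at 70 degC than at
# 40 degC, taking the forward activation energy Ea = 26,625 J/mol.
ratio = rate_ratio(26625.0, 40.0 + 273.15, 70.0 + 273.15)
```

A ratio of roughly 2.4 over this 30 °C span is consistent with the observation that the highest temperature studied (70 °C) is among the optimum conditions.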
Big data analysis is essential for modern applications in areas such as healthcare, assistive technology, intelligent transportation, and environment and climate monitoring. Traditional data mining and machine learning algorithms do not scale well with data size. Mining and learning from big data require time- and memory-efficient techniques, albeit at the cost of a possible loss in accuracy. We have developed a data aggregation structure to summarize data with a large number of instances and data generated from multiple sources. Data are aggregated at multiple resolutions, and the resolution provides a trade-off between efficiency and accuracy. The structure is built once, updated incrementally, and serves as a common data input for multiple mining an
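The idea of a once-built, incrementally updated summary at several resolutions can be sketched as follows. The class and its bucket statistics are a hypothetical illustration of the concept, not the paper's actual data structure:

```python
class MultiResolutionAggregator:
    """Keeps per-bucket (count, sum, min, max) over a numeric key at several
    resolutions at once; coarser buckets trade accuracy for efficiency."""

    def __init__(self, bucket_widths):
        # one dict of buckets per resolution level
        self.levels = {w: {} for w in bucket_widths}

    def add(self, key, value):
        """Incrementally fold one (key, value) record into every level."""
        for width, buckets in self.levels.items():
            b = int(key // width)
            c, s, lo, hi = buckets.get(b, (0, 0.0, float("inf"), float("-inf")))
            buckets[b] = (c + 1, s + value, min(lo, value), max(hi, value))

    def mean(self, width, bucket):
        c, s, _, _ = self.levels[width][bucket]
        return s / c

agg = MultiResolutionAggregator([1, 10])  # fine and coarse resolutions
for t, v in [(0, 2.0), (1, 4.0), (9, 6.0), (12, 8.0)]:
    agg.add(t, v)

coarse_mean = agg.mean(10, 0)  # mean of all values with keys 0..9
fine_mean = agg.mean(1, 12)    # exact value at key 12
```

Because each record updates the summary in one pass, the structure never needs rebuilding, and any downstream mining algorithm can read whichever resolution fits its accuracy budget.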