Energy saving is a central concern in IoT sensor networks because sensor nodes operate on their own limited batteries. Data transmission is very costly and consumes much of a node's energy, while the energy used for data processing is considerably lower. Several energy-saving strategies and principles are therefore dedicated to reducing data transmission: minimizing data transfers in IoT sensor networks can conserve a considerable amount of energy. In this research, a Compression-Based Data Reduction (CBDR) technique is proposed that works at the level of the IoT sensor nodes. CBDR comprises two compression stages: a lossy SAX quantization stage, which reduces the dynamic range of the sensor readings, followed by lossless LZW compression applied to the quantized output. Quantizing the sensor readings down to the SAX alphabet size lowers their dynamic range, which in turn yields greater compression on the LZW side. A further improvement to CBDR is also proposed, adding Dynamic Transmission (DT-CBDR) to decrease both the total amount of data sent to the gateway and the processing required. The OMNeT++ simulator, together with real sensory data gathered at Intel Lab, is used to evaluate the proposed technique. The simulation experiments illustrate that the proposed CBDR technique provides better performance than the other techniques in the literature.
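The two-stage pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the alphabet size, the equal-width binning (real SAX uses Gaussian breakpoints on normalized data), and the sample readings are all assumptions.

```python
# Sketch of the two-stage CBDR idea: lossy SAX-style quantization followed
# by lossless LZW. Alphabet size and binning are illustrative assumptions.

def sax_quantize(readings, alphabet="abcd"):
    """Map each reading to a symbol by equal-width binning (simplified SAX)."""
    lo, hi = min(readings), max(readings)
    width = (hi - lo) / len(alphabet) or 1.0
    return "".join(alphabet[min(int((r - lo) / width), len(alphabet) - 1)]
                   for r in readings)

def lzw_compress(text):
    """Classic LZW over the quantized symbol string; returns a code list."""
    dictionary = {chr(i): i for i in range(256)}
    w, out = "", []
    for c in text:
        wc = w + c
        if wc in dictionary:
            w = wc
        else:
            out.append(dictionary[w])
            dictionary[wc] = len(dictionary)
            w = c
    if w:
        out.append(dictionary[w])
    return out

readings = [21.1, 21.3, 21.2, 24.8, 24.9, 25.0, 21.2, 21.1]
symbols = sax_quantize(readings)   # "aaadddaa": 8 readings -> 8 symbols
codes = lzw_compress(symbols)      # 5 LZW codes: repeated runs compress well
print(symbols, len(codes))
```

Because quantization collapses nearby readings onto the same symbol, the symbol stream contains long repeats, which is exactly the redundancy LZW exploits; this is the interaction between the two stages that the abstract highlights.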
This study deals with the processing of field seismic data for seismic line 7Gn 21, located within the administrative boundaries of the Najaf and Muthanna governorates in southern Iraq, with a length of 54 km. The work was carried out in the Processing Department of the Oil Exploration Company using the Omega system, which contains a large number of processing programs; with these programs, predictive deconvolution of both the gap and spike types was applied, and a final section was produced for each type. Gap predictive deconvolution gave an improvement in the shallow reflectors but did not give a good improvement in the deep reflectors, thus giving a good continuity of the reflectors at
In the last decade, the web has rapidly become an attractive platform and an indispensable part of our lives. Unfortunately, as our dependency on the web increases, programmers focus more on functionality and appearance than on security, which has attracted attackers to exploit serious security problems targeting web applications and web-based information systems, e.g. through SQL injection attacks. SQL injection, in simple terms, is the process of passing SQL code into interactive web applications that employ database services. Such applications accept user input, such as forms, and then include this input in database requests, typically SQL statements, in a way that was not intende
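The mechanism the abstract describes can be shown in a few lines. This is a generic illustration using Python's standard `sqlite3` module with a made-up table; it is not code from the paper.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

user_input = "' OR '1'='1"   # classic injection payload

# VULNERABLE: user input is concatenated straight into the SQL string,
# so the payload rewrites the WHERE clause and matches every row.
unsafe = conn.execute(
    "SELECT name FROM users WHERE name = '" + user_input + "'").fetchall()

# SAFE: a parameterized query treats the input as a literal value only.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)).fetchall()

print(unsafe)  # → [('alice',)]  the injected condition leaks every row
print(safe)    # → []            no user is literally named "' OR '1'='1"
```

The contrast makes the abstract's point concrete: the vulnerability is not in SQL itself but in including user input in the statement "in a way that was not intended."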
Loanwords are words transferred from one language to another that become an essential part of the borrowing language. Loanwords pass from a source language to a recipient language for many reasons. Detecting them is a complicated task because there are no standard specifications for transferring words between languages, and accuracy is therefore low. This work tries to improve the accuracy of detecting loanwords between Turkish and Arabic as a case study. In this paper, the proposed system contributes by finding all possible loanwords using any set of characters, arranged either alphabetically or randomly. It then processes the distortion in pronunciation and solves the problem of the missing lette
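One common building block for this kind of detection is a normalized edit distance between candidate word pairs. The sketch below is a generic illustration, not the paper's system: the threshold, the romanized example pair, and the helper names are assumptions, and real Turkish–Arabic matching would also need a transliteration mapping between the two scripts.

```python
def edit_distance(a, b):
    """Standard Levenshtein distance via a one-row dynamic program."""
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1,        # delete from a
                                     dp[j - 1] + 1,    # insert into a
                                     prev + (ca != cb))  # substitute
    return dp[len(b)]

def is_loanword_candidate(word, source_word, max_ratio=0.34):
    """Flag a pair as a possible loanword when the normalized edit
    distance falls below a (hypothetical) threshold."""
    dist = edit_distance(word, source_word)
    return dist / max(len(word), len(source_word)) <= max_ratio

# Romanized toy example: Turkish "kitap" vs Arabic "kitab" (book).
print(is_loanword_candidate("kitap", "kitab"))  # → True  (distance 1)
print(is_loanword_candidate("kitap", "masa"))   # → False (unrelated word)
```

Normalizing by word length keeps the threshold meaningful for both short and long words, which matters when pronunciation distortion changes only one or two letters.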
The reduction to the pole of the aeromagnetic map of the western desert of Iraq has been used to outline the main basement structural features. Three selected magnetic anomalies are used to determine the depths of their magnetic sources. The estimated depths are obtained using the slope half-slope method and have been corrected through the application of a published nomogram. These depths are compared with previously published depth values, providing a new look at the basement of the western desert in addition to the thickness map of the Paleozoic formations. The results shed light on the importance of the great depths of the basement structures, and in turn of the sedimentary cover, to be considered for future hydrocarbon exploration.
Assessing the accuracy of classification algorithms is paramount, as it provides insight into their reliability and effectiveness in solving real-world problems. Accuracy examination is essential in any remote sensing-based classification practice, given that classification maps consistently include misclassified pixels and classification errors. In this study, images of Duhok province, Iraq, were captured by two satellites at regular intervals, and the photos were analyzed using spatial analysis tools to produce supervised classifications. Some processes, such as smoothing, were conducted to enhance the categorization. The classification results indicate that Duhok province is divided into four classes: vegetation cover, buildings,
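The standard tool behind such an accuracy examination is a confusion matrix over reference versus predicted pixels. The sketch below is illustrative only: the pixel labels are made up, and since the abstract's class list is truncated, the last two class names ("water", "bare_soil") are assumptions.

```python
# Toy accuracy assessment for a 4-class land-cover map. The first two class
# names follow the abstract; the rest, and all labels, are invented data.
CLASSES = ["vegetation", "buildings", "water", "bare_soil"]

reference = ["vegetation", "vegetation", "buildings", "water",
             "bare_soil", "water", "buildings", "vegetation"]
predicted = ["vegetation", "buildings", "buildings", "water",
             "bare_soil", "water", "buildings", "vegetation"]

# Confusion matrix: rows = reference class, columns = predicted class.
idx = {c: i for i, c in enumerate(CLASSES)}
matrix = [[0] * len(CLASSES) for _ in CLASSES]
for r, p in zip(reference, predicted):
    matrix[idx[r]][idx[p]] += 1

# Overall accuracy: correctly classified pixels (the diagonal) over total.
correct = sum(matrix[i][i] for i in range(len(CLASSES)))
overall_accuracy = correct / len(reference)
print(overall_accuracy)  # → 0.875 (7 of 8 pixels classified correctly)
```

Off-diagonal cells (here, one vegetation pixel labeled as buildings) are exactly the "misclassified pixels" the abstract refers to, and per-class producer's/user's accuracies can be read off the matrix's rows and columns.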
Semiconductor-based metal oxide gas detectors of five mixtures of zinc chloride (Z) and tin chloride (S) salts, with Z:S ratios of 0, 25, 50, 75 and 100%, were fabricated on glass substrates by a spray pyrolysis technique. The thicknesses were about 0.2 ± 0.05 μm, using water-soluble precursors at a glass substrate temperature of 500 ± 5 ºC and a concentration of 0.05 M. Their gas-sensing properties toward CH4, LPG and H2S gases at different concentrations (10, 100, 1000 ppm) in air, which are relevant to the petroleum refining industry, were investigated at room temperature.
Furthermore, the structural and morphological properties were scrutinized. The results show that the mixing ratio affects the composition of the formed oxides (ZnO, Zn2SnO4, Zn2SnO4+ZnSnO3, ZnSnO3, SnO2), ratios ment
Fuzzy measures are considered important tools for solving many environmental problems. Water pollution is one such problem, with negative effects on the health of consumers. In this paper, a mathematical model is proposed to evaluate water quality in the distribution networks of Baghdad city. Fuzzy logic and fuzzy measures have been applied to evaluate water quality with respect to chemical and microbiological contaminants. The results evaluate water pollution from chemical and microbiological contaminants that are difficult to assess through traditional methods.
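A minimal sketch of the fuzzy-evaluation idea is shown below. All of it is assumed for illustration: the trapezoidal membership shape, the breakpoints, the contaminant names, and the worst-case (minimum) aggregation are not taken from the paper's model.

```python
# Minimal fuzzy water-quality sketch: a trapezoidal membership function
# grades each contaminant reading, and the worst (minimum) grade scores
# the sample. Breakpoints and indicators are illustrative assumptions.

def trapezoid(x, a, b, c, d):
    """Membership rises over a→b, holds at 1 between b and c, falls c→d."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

def chlorine_ok(mg_per_l):
    """'Acceptable' membership for residual chlorine (hypothetical range)."""
    return trapezoid(mg_per_l, 0.1, 0.2, 0.5, 1.0)

def coliform_ok(count):
    """Crisp limit for a microbiological indicator: any detection fails."""
    return 1.0 if count == 0 else 0.0

def water_quality(chlorine, coliform):
    """Pessimistic aggregation: the sample is only as good as its worst grade."""
    return min(chlorine_ok(chlorine), coliform_ok(coliform))

print(water_quality(0.3, 0))   # → 1.0  fully acceptable sample
print(water_quality(0.75, 0))  # → 0.5  chlorine only partially acceptable
```

The graded output between 0 and 1 is what distinguishes this from a traditional pass/fail check: a reading near a limit degrades the score gradually instead of flipping it.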
This research aims to choose the appropriate probability distribution for the reliability analysis of an item, using collected data on the operating and stoppage times of the case study.
The appropriate probability distribution is the one for which the data lie on or close to the fitted line of the probability plot and pass a goodness-of-fit test.
Minitab 17 software was used for this purpose after arranging the collected data and entering it into the program.
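The goodness-of-fit idea behind the probability plot can be sketched without Minitab. The example below is illustrative: the operating-time data are simulated, the candidate distributions (exponential versus a normal with assumed parameters) are chosen for demonstration, and the Kolmogorov–Smirnov statistic stands in for the "distance from the fitting line".

```python
import math
import random

def ks_statistic(data, cdf):
    """Kolmogorov-Smirnov statistic: the maximum gap between the
    empirical CDF of the data and a candidate distribution's CDF."""
    xs = sorted(data)
    n = len(xs)
    return max(max(abs((i + 1) / n - cdf(x)), abs(i / n - cdf(x)))
               for i, x in enumerate(xs))

random.seed(1)
times = [random.expovariate(0.1) for _ in range(200)]  # mean ≈ 10 hours

rate = 1 / (sum(times) / len(times))            # exponential MLE from data
exp_cdf = lambda x: 1 - math.exp(-rate * x)
norm_cdf = lambda x: 0.5 * (1 + math.erf((x - 10) / (10 * math.sqrt(2))))

# The candidate with the smaller statistic hugs the fitting line better,
# so the exponential model would be selected for this data.
print(ks_statistic(times, exp_cdf) < ks_statistic(times, norm_cdf))
```

Choosing the distribution with the smallest such statistic mirrors what the probability-plot comparison in Minitab does visually.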
In this paper we present the theoretical foundation of forward error analysis of numerical algorithms under:
• approximations in "built-in" functions;
• rounding errors in arithmetic floating-point operations;
• perturbations of data.
The error analysis is based on the linearization method. The fundamental tools of forward error analysis are systems of linear absolute and relative a priori and a posteriori error equations and the associated condition numbers, constituting optimal bounds on the possible cumulative round-off errors. The condition numbers enable simple, general, quantitative definitions of numerical stability. The theoretical results have been applied to Gaussian elimination and have proved to be a very effective means of both a prior
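A tiny numerical illustration of the distinction this analysis formalizes (the function and tolerance are chosen for demonstration and are not from the paper): a problem can be well-conditioned while a particular algorithm for it is unstable under rounding, which is exactly why condition numbers and cumulative round-off bounds are treated separately.

```python
import math

# f(x) = 1 - cos(x) is well-conditioned near 0 (small input perturbations
# cause proportionally small output changes), yet the obvious evaluation
# is numerically unstable: for tiny x, cos(x) rounds to 1.0 and every
# significant digit cancels. An algebraically equal form avoids this.

def naive(x):
    return 1 - math.cos(x)          # catastrophic cancellation for small x

def stable(x):
    return 2 * math.sin(x / 2) ** 2  # same function, no cancellation

x = 1e-8
exact = x * x / 2                    # leading Taylor term, accurate here
print(naive(x))                      # → 0.0  (all digits cancelled)
print(abs(stable(x) - exact) / exact < 1e-12)  # → True (near full precision)
```

A forward error analysis of `naive` would expose the unbounded amplification of the rounding error in `cos`, while the same analysis certifies `stable` as backward-stable for this range.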
Due to the increase of information existing on the World Wide Web (WWW), the subject of how to extract new and useful knowledge from log files has gained great interest among researchers in data mining and knowledge discovery.
Web mining, a subset of data mining, is divided into three particular areas: web content mining, web structure mining, and web usage mining. This paper is interested in the server log file, which belongs to the third category (web usage mining). This file is analyzed according to the suggested algorithm to extract the behavior of the user; knowing the behavior comes from knowing the complete path taken by a specific user.
Extracting these types of knowledge requires many of the KDD
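The path-extraction step described above can be sketched in a few lines. The sample log lines below are invented and follow the Common Log Format as an assumed input layout; the paper's actual algorithm and log schema may differ.

```python
# Sketch: recover each user's navigation path from a server log in
# Common Log Format. Sample lines are made up for illustration.
from collections import defaultdict

log_lines = [
    '10.0.0.5 - - [05/Mar/2020:10:00:01] "GET /index.html HTTP/1.1" 200 512',
    '10.0.0.7 - - [05/Mar/2020:10:00:03] "GET /index.html HTTP/1.1" 200 512',
    '10.0.0.5 - - [05/Mar/2020:10:00:09] "GET /products.html HTTP/1.1" 200 734',
    '10.0.0.5 - - [05/Mar/2020:10:00:20] "GET /cart.html HTTP/1.1" 200 301',
]

paths = defaultdict(list)
for line in log_lines:                     # log lines are in time order
    host = line.split()[0]                 # first field: requesting host
    page = line.split('"')[1].split()[1]   # URL inside the quoted request
    paths[host].append(page)

print(paths["10.0.0.5"])  # → ['/index.html', '/products.html', '/cart.html']
```

Grouping requests by host (or, in practice, by session) and keeping their time order yields exactly the "complete path" per user that the abstract identifies as the basis for behavior extraction.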