Energy saving is a central concern in IoT sensor networks because sensor nodes operate on their own limited batteries. Data transmission is very costly and consumes much of a node's energy, while the energy used for data processing is considerably lower. Several energy-saving strategies and principles are therefore dedicated mainly to reducing data transmission: minimizing data transfers in IoT sensor networks can conserve a considerable amount of energy. In this research, a Compression-Based Data Reduction (CBDR) technique is suggested that works at the level of the IoT sensor nodes. CBDR includes two stages of compression: a lossy SAX quantization stage, which reduces the dynamic range of the sensor readings, followed by a lossless LZW stage, which compresses the quantized output. Quantizing the sensor readings down to the SAX alphabet size lowers the number of distinct symbols, which contributes to greater compression on the LZW side. A further improvement to CBDR, Dynamic Transmission (DT-CBDR), is also suggested to decrease both the total amount of data sent to the gateway and the processing required. The OMNeT++ simulator, together with real sensory data gathered at Intel Lab, is used to evaluate the proposed technique. The simulation experiments illustrate that the proposed CBDR technique outperforms the other techniques in the literature.
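To make the two-stage pipeline concrete, the sketch below chains a SAX-style quantizer into a classic LZW coder. It is a minimal illustration, not the authors' implementation: the alphabet size, breakpoint table, and example readings are all assumptions.

```python
import string
from statistics import mean, stdev

# Standard SAX breakpoints (Gaussian quantiles) for a few alphabet sizes.
BREAKPOINTS = {
    3: [-0.43, 0.43],
    4: [-0.67, 0.0, 0.67],
    5: [-0.84, -0.25, 0.25, 0.84],
}

def sax_quantize(readings, alphabet_size=4):
    """Lossy stage: z-normalize readings and map each one to a SAX symbol."""
    mu, sigma = mean(readings), stdev(readings)
    cuts = BREAKPOINTS[alphabet_size]
    symbols = []
    for x in readings:
        v = (x - mu) / sigma
        idx = sum(1 for c in cuts if v > c)  # interval that v falls into
        symbols.append(string.ascii_lowercase[idx])
    return "".join(symbols)

def lzw_compress(text):
    """Lossless stage: classic LZW over the SAX symbol string."""
    dictionary = {chr(i): i for i in range(256)}
    next_code = 256
    w, codes = "", []
    for ch in text:
        wc = w + ch
        if wc in dictionary:
            w = wc
        else:
            codes.append(dictionary[w])
            dictionary[wc] = next_code
            next_code += 1
            w = ch
    if w:
        codes.append(dictionary[w])
    return codes

readings = [21.1, 21.3, 21.2, 24.8, 24.9, 25.0, 21.0, 21.2]  # hypothetical sensor data
sax = sax_quantize(readings, alphabet_size=4)
print(sax, lzw_compress(sax))
```

Because quantization collapses similar readings into the same symbol, long runs and repeated patterns emerge in the symbol string, which is exactly what LZW's dictionary exploits.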
Intelligent home load prediction based on deep convolutional neural networks is a method of predicting the electricity load of a home using deep learning techniques. It uses convolutional neural networks to analyze data from various sources, such as weather, time of day, and other factors, to accurately predict the home's electricity load, with the aim of optimizing energy usage and reducing energy costs. The article proposes a deep learning-based approach for non-permanent residential electrical energy load forecasting that employs temporal convolutional networks (TCN) to model historical load series with time-series traits and to learn the highly dynamic patterns of variation among attribute parameters.
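The abstract names TCN as the modeling backbone. The following PyTorch sketch shows the defining ingredient, dilated causal convolutions whose receptive field grows exponentially with depth; the layer sizes, channel counts, and synthetic input are assumptions for illustration, not the paper's architecture.

```python
import torch
import torch.nn as nn

class CausalConv1d(nn.Module):
    """1-D convolution padded on the left only, so outputs never see the future."""
    def __init__(self, in_ch, out_ch, kernel_size, dilation):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(in_ch, out_ch, kernel_size, dilation=dilation)

    def forward(self, x):                      # x: (batch, channels, time)
        return self.conv(nn.functional.pad(x, (self.pad, 0)))

class TinyTCN(nn.Module):
    """Stack of dilated causal convolutions; dilation doubles at each level."""
    def __init__(self, in_ch=1, hidden=32, levels=4, kernel_size=3):
        super().__init__()
        layers, ch = [], in_ch
        for i in range(levels):
            layers += [CausalConv1d(ch, hidden, kernel_size, dilation=2 ** i),
                       nn.ReLU()]
            ch = hidden
        self.net = nn.Sequential(*layers)
        self.head = nn.Linear(hidden, 1)       # forecast the next load value

    def forward(self, x):
        h = self.net(x)                        # (batch, hidden, time)
        return self.head(h[:, :, -1])          # predict from the last time step

model = TinyTCN()
history = torch.randn(8, 1, 96)                # e.g. a window of 96 past load readings
print(model(history).shape)                    # torch.Size([8, 1])
```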
The research aims to demonstrate the dual use of analysis to predict financial failure according to the Altman model and of stress tests to achieve integration in banking risk management, and to assess the bank's ability to withstand crises, especially in light of its low rating according to the Altman model and the possibility of its failure in the future, thus confirming or refuting the research hypothesis. The research reached a set of conclusions, the most important of which is that the bank, according to the Altman model, is threatened with failure in the near future, as it falls within the red zone under the model's classification, and will incur losses if it is exposed to crises in the future according to the stress-test analysis.
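The abstract does not state which variant of the Altman model was applied; for reference, the classic Z-score combines five accounting ratios, and the "red zone" mentioned above corresponds to the distress region Z < 1.81:

```latex
% Classic Altman Z-score (1968); banks are often scored with a revised variant.
Z = 1.2\,X_1 + 1.4\,X_2 + 3.3\,X_3 + 0.6\,X_4 + 1.0\,X_5
% X_1 = working capital / total assets
% X_2 = retained earnings / total assets
% X_3 = earnings before interest and taxes / total assets
% X_4 = market value of equity / book value of total liabilities
% X_5 = sales / total assets
% Z < 1.81: distress ("red") zone;  1.81 <= Z <= 2.99: grey zone;  Z > 2.99: safe zone.
```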
This research addresses the recruitment and use of animatic technology in digital TV drama. This way of working is simple and effective at the level of producing images, especially in TV drama, so the entry of this technology has created new technological concepts that are simple and effective while carrying a high level of modernity and technical innovation. The research comprises three chapters that include two topics: animatic modes and the stages of production, and the aesthetic aspects provided by this technology, with the researcher reviewing the use of some TV drama production series mentioned through this research, after which the researcher analyzed a sample that was selected…
This research has concluded that the function-based responsibility accounting system has harmful side effects preventing it from achieving its controlling objective, that is, goal congruence. These side effects are due to its unintegrated measures, its focus on measuring easily measurable behaviors while neglecting behaviors that are hard to measure, and its dependence on standard operating procedures.
In addition, the system's hypotheses and measures are designed to fit the previous business environment, not the current one.
The research has also concluded that the suggested model, activity-based responsibility accounting, is designed to get rid of the harmful side effects of the function-based responsibility accounting system.
In this paper, the researcher suggests using the genetic algorithm method to estimate the parameters of the Wiener degradation process, building on the Wiener process in order to estimate the reliability of high-efficiency products, given the difficulty of estimating their reliability with traditional techniques that depend only on product failure times. Monte Carlo simulation has been applied to demonstrate the efficiency of the proposed method in estimating the parameters, and the method was compared with maximum likelihood estimation. The results show that the genetic algorithm method is the best according to the AMSE comparison criterion, then the reliability…
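As a rough illustration of the idea, the sketch below fits the drift and diffusion of a simulated Wiener degradation path by minimizing the Gaussian negative log-likelihood of its increments with a small real-coded genetic algorithm. Everything here, the GA operators, population settings, and simulated data, is an assumption, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a Wiener degradation path X(t) = mu*t + sigma*B(t); its increments
# over steps of length dt are independent N(mu*dt, sigma^2*dt).
true_mu, true_sigma, dt, n = 0.5, 0.2, 1.0, 200
increments = rng.normal(true_mu * dt, true_sigma * np.sqrt(dt), size=n)

def neg_log_likelihood(params):
    mu, sigma = params
    return -np.sum(-0.5 * np.log(2 * np.pi * sigma**2 * dt)
                   - (increments - mu * dt) ** 2 / (2 * sigma**2 * dt))

def genetic_search(fitness, bounds, pop_size=50, generations=100):
    """Minimal real-coded GA: tournament selection, blend crossover, mutation."""
    low, high = np.array(bounds).T
    pop = rng.uniform(low, high, size=(pop_size, len(bounds)))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        new_pop = [pop[scores.argmin()]]              # elitism: keep the best
        while len(new_pop) < pop_size:
            i, j = rng.integers(pop_size, size=2)     # tournament of two
            a = pop[i] if scores[i] < scores[j] else pop[j]
            i, j = rng.integers(pop_size, size=2)
            b = pop[i] if scores[i] < scores[j] else pop[j]
            w = rng.uniform(size=len(bounds))         # blend crossover
            child = w * a + (1 - w) * b
            child += rng.normal(0, 0.02, size=len(bounds))  # small mutation
            new_pop.append(np.clip(child, low, high))
        pop = np.array(new_pop)
    scores = np.array([fitness(ind) for ind in pop])
    return pop[scores.argmin()]

mu_hat, sigma_hat = genetic_search(neg_log_likelihood, bounds=[(0, 2), (0.01, 1)])
print(f"GA estimates: mu={mu_hat:.3f}, sigma={sigma_hat:.3f}")
```

Once the drift and diffusion are estimated, product reliability follows from the distribution of the first time the degradation path crosses a failure threshold, which is the usual Wiener-degradation route that avoids relying on observed failure times.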
Based on the diazotization-coupling reaction, a new, simple, and sensitive spectrophotometric method for determining trace amounts of bisphenol F (BPF) is presented in this paper. The diazotized metoclopramide reagent reacts with bisphenol F to produce an orange azo compound with maximum absorbance at 461 nm in alkaline solution. The experimental parameters, such as the type of alkaline medium, NaOH concentration, amount of diazotized metoclopramide, order of additions, reaction time, temperature, and the effect of organic solvents, were optimized to achieve the best performance of the proposed method. Under ideal conditions, the absorbance increased linearly with bisphenol F concentration in the range of 0.5-10 μg mL-1, with a correlation…
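The analytical use of such a method is a Beer's-law calibration: absorbance is regressed on concentration over the linear range, and unknowns are read off the fitted line. The numbers below are invented for illustration; only the 0.5-10 μg mL-1 range comes from the abstract.

```python
import numpy as np

# Hypothetical calibration points within the reported linear range (0.5-10 ug/mL);
# the absorbances are illustrative, not the paper's measurements.
conc = np.array([0.5, 2.0, 4.0, 6.0, 8.0, 10.0])             # ug/mL of BPF
absorbance = np.array([0.04, 0.16, 0.33, 0.49, 0.66, 0.82])  # at 461 nm

slope, intercept = np.polyfit(conc, absorbance, 1)  # Beer's-law line A = m*C + b
r = np.corrcoef(conc, absorbance)[0, 1]
print(f"A = {slope:.4f}*C + {intercept:.4f}, r = {r:.4f}")

# Quantify an unknown sample from its measured absorbance.
unknown_abs = 0.41
print(f"estimated BPF: {(unknown_abs - intercept) / slope:.2f} ug/mL")
```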
Most industrial-sector companies suffer from the large magnitude of indirect industrial costs, the lack of an equitable distribution of these costs over cost objects, increased competition, and the lack of planning that keeps pace with the changes faced by the industrial sector in general and the research sample in particular, as well as difficulty in redirecting efforts toward improving profitability, analyzing activities in depth, identifying activities with untapped resources, and then linking these activities to the final products. The research aims to apply program evaluation and review technique together with the activity-based costing (ABC) method through the stages of planning, scheduling, and control, and to make a comparison to arrive at the products of dev…
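As a minimal sketch of the ABC mechanics the abstract refers to, the example below pools overhead by activity, converts each pool into a driver rate, and traces it to products by driver consumption; all figures are invented.

```python
# Hypothetical activity-based costing allocation.
activity_cost = {"machine setups": 40_000, "quality inspection": 24_000}   # overhead pools
driver_volume = {"machine setups": 200,    "quality inspection": 800}      # total driver units
usage = {                                   # driver units consumed by each product
    "product A": {"machine setups": 150, "quality inspection": 300},
    "product B": {"machine setups": 50,  "quality inspection": 500},
}

# Driver rate = pooled activity cost / total driver volume.
rates = {a: activity_cost[a] / driver_volume[a] for a in activity_cost}
for product, consumed in usage.items():
    overhead = sum(rates[a] * units for a, units in consumed.items())
    print(f"{product}: allocated overhead = {overhead:,.0f}")
# product A: 150*200 + 300*30 = 39,000;  product B: 50*200 + 500*30 = 25,000
```

Linking costs to activities this way exposes which products actually consume the indirect resources, which is the profitability insight the research is after.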
This research basically gives an introduction to the multiple intelligences (MI) theory and its implications for the classroom. It presents a unit plan based upon the MI theory, followed by a report which explains the application of the plan by the researcher to first-class students of the computer department in the College of Sciences / University of Al-Mustansiryia, and the teacher's and the students' reactions to it.

The research starts with a short introduction to the MI theory, a theory that could help students learn better in a relaxed learning situation. It was first presented by Howard Gardner when he published his book "Frames of Mind" in 1983, in which he describes how the brain has multiple intelligences.
It has increasingly been recognised that future developments in geospatial data handling will centre on geospatial data on the web: Volunteered Geographic Information (VGI). The evaluation of VGI data quality, including positional and shape similarity, has become a recurrent subject in the scientific literature over the last ten years. The OpenStreetMap (OSM) project is the most popular of the leading VGI platforms: an online geospatial database that produces and supplies free, editable geospatial datasets worldwide. The goal of this paper is to present a comprehensive overview of the quality assurance of OSM data. In addition, the credibility of open-source geospatial data is discussed, highlighting…
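Positional accuracy, one of the quality measures mentioned above, is commonly assessed by comparing matched OSM features against an authoritative reference. The sketch below computes per-point offsets and an RMSE with the haversine formula; the coordinate pairs are invented placeholders.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    r = 6_371_000.0                          # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical matched pairs: (OSM point, authoritative reference point).
pairs = [
    ((51.50074, -0.12462), (51.50071, -0.12458)),
    ((51.50132, -0.12301), (51.50140, -0.12310)),
]
offsets = [haversine_m(*osm, *ref) for osm, ref in pairs]
rmse = math.sqrt(sum(d * d for d in offsets) / len(offsets))
print(f"offsets (m): {[round(d, 2) for d in offsets]}, RMSE = {rmse:.2f} m")
```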
Modern civilization increasingly relies on sustainable and eco-friendly data centers as the core hubs of intelligent computing. However, these data centers, while vital, also face heightened vulnerability to hacking due to their role as the convergence points of numerous network connection nodes. Recognizing and addressing this vulnerability, particularly within the confines of green data centers, is a pressing concern. This paper proposes a novel approach to mitigate this threat by leveraging swarm intelligence techniques to detect prospective and hidden compromised devices within the data center environment. The core objective is to ensure sustainable intelligent computing through a colony strategy. The research primarily focusses on the…
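The paper's specific colony strategy is not detailed in the abstract; the toy sketch below shows one way swarm intelligence can surface compromised devices, with ant-like agents depositing pheromone on devices whose traffic deviates from a normal profile while evaporation forgets noise. The device features, baseline profile, and parameters are all invented.

```python
import random

random.seed(42)

# Hypothetical per-device traffic features: (connection rate, failed-auth rate).
devices = {f"node{i}": (random.gauss(100, 10), random.gauss(1, 0.3)) for i in range(20)}
devices["node7"] = (340.0, 9.0)             # an implanted compromised device

def deviation(feat, baseline=(100.0, 1.0), scale=(10.0, 0.3)):
    """Distance of a device's traffic from the assumed normal profile."""
    return sum(abs(f - b) / s for f, b, s in zip(feat, baseline, scale))

pheromone = {d: 1.0 for d in devices}
ants, steps, rho = 10, 200, 0.02            # colony size, iterations, evaporation rate
for _ in range(steps):
    for _ in range(ants):                   # each ant inspects a random device
        d = random.choice(list(devices))
        pheromone[d] += deviation(devices[d])
    for d in pheromone:                     # evaporation damps spurious deposits
        pheromone[d] *= (1 - rho)

suspects = sorted(pheromone, key=pheromone.get, reverse=True)[:3]
print("most suspicious devices:", suspects)
```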