Intelligent buildings offer various incentives, yet they often achieve highly inefficient energy savings because of non-stationary building environments. In the presence of such dynamic excitation, with high levels of nonlinearity and the coupling effect of temperature and humidity, the HVAC system transitions between underdamped and overdamped indoor conditions, promoting highly inefficient energy use and fluctuating indoor thermal comfort. To address these concerns, this study develops a novel framework based on deep clustering of Lagrangian trajectories for multi-task learning (DCLTML) and adds a pre-cooling coil in the air handling unit (AHU) to alleviate the coupling issue. The proposed DCLTML exhibits strong overall control and is suitable for multi-objective optimisation based on cooperative multi-agent systems (CMAS). The DCLTML framework uses greedy iterative training to obtain an optimal set of weights, tabulated as a layer for each clustering structure; such layers can cope with the challenges of a large state space and its massive data. The layer weights of each cluster are then tuned by the Quasi-Newton (QN) algorithm to make the action sequence of the CMAS optimal. This CMAS policy effectively manipulates the inputs of the AHU: the agents activate natural ventilation and set the chillers into an idle state when the outdoor temperature crosses the recommended value. It is therefore reasonable to assess the potential of thermal mass and a hybrid ventilation strategy for reducing cooling energy; accordingly, the results of the proposed DCLTML show that its main cooling coil saves more than 40% energy compared with conventional benchmarks. Besides significant energy savings and improved environmental comfort, the DCLTML exhibits superior high-speed response and robustness and eliminates the fatigue and wear caused by chattering valves. The results show that the DCLTML algorithm is a promising new approach for controlling HVAC systems.
It is more robust to environmental variations than traditional controllers, and it can learn to control the HVAC system in a way that minimises energy consumption. The DCLTML algorithm is still under development, but it has the potential to revolutionise how HVAC systems are controlled.
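The quasi-Newton tuning step described above can be illustrated with a minimal sketch. The loss below (a tracking-error term plus an energy penalty) and the fixed damped step are assumptions for demonstration only, not the paper's actual objective or update rule; the BFGS inverse-Hessian update itself is the standard one.

```python
import numpy as np

# Hypothetical sketch: tuning one cluster's layer weights with a BFGS
# quasi-Newton update. The quadratic loss (comfort-setpoint tracking
# plus an energy-use penalty) is an illustrative stand-in.

TARGET = np.array([0.6, 0.3])   # assumed comfort setpoint in weight space
LAM = 0.1                       # assumed energy-penalty coefficient

def loss(w):
    return np.sum((w - TARGET) ** 2) + LAM * np.sum(w ** 2)

def grad(w):
    return 2 * (w - TARGET) + 2 * LAM * w

def bfgs_tune(w0, iters=50):
    w = w0.astype(float)
    H = np.eye(len(w))              # inverse-Hessian approximation
    g = grad(w)
    for _ in range(iters):
        p = -H @ g                  # quasi-Newton search direction
        s = 0.5 * p                 # fixed damped step for simplicity
        w_new = w + s
        g_new = grad(w_new)
        y = g_new - g
        sy = s @ y
        if abs(sy) > 1e-12:         # standard BFGS inverse-Hessian update
            rho = 1.0 / sy
            I = np.eye(len(w))
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        w, g = w_new, g_new
        if np.linalg.norm(g) < 1e-8:
            break
    return w

w_opt = bfgs_tune(np.zeros(2))
```

For this quadratic loss the analytic minimiser is `TARGET / (1 + LAM)`, which the iteration approaches geometrically.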
Wireless sensor networks (WSNs) are emerging in various applications such as military, area monitoring, health monitoring, industrial monitoring, and many more. A key challenge for successful WSN applications is energy consumption, since the small, portable batteries integrated into the sensor chips cannot be recharged easily from an economical point of view. This work focuses on prolonging the network lifetime of WSNs by reducing and balancing energy consumption during the routing process from a hop-count point of view. In this paper, a performance simulation compared two protocols, LEACH, which uses a single-hop path, and MODLEACH, which uses a multi-hop path, on an Intel Core i3 (2.13 GHz) laptop running MATLAB (R2014a). …
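The single-hop versus multi-hop trade-off this comparison rests on can be sketched with the first-order radio energy model commonly used in LEACH-style analyses. The constants are the usual textbook values, not taken from the paper's simulation.

```python
# Sketch of the first-order radio energy model for comparing
# single-hop (LEACH-style) and multi-hop (MODLEACH-style) routing.
# Constants are conventional illustrative values.

E_ELEC = 50e-9      # J/bit, transceiver electronics energy
EPS_FS = 10e-12     # J/bit/m^2, free-space amplifier energy

def tx_energy(bits, d):
    # transmit cost grows with the square of distance (free-space model)
    return bits * E_ELEC + bits * EPS_FS * d ** 2

def rx_energy(bits):
    return bits * E_ELEC

def single_hop(bits, d):
    return tx_energy(bits, d)

def multi_hop(bits, d, hops):
    # equal-length hops; each intermediate relay both receives and retransmits
    per_hop = d / hops
    return hops * tx_energy(bits, per_hop) + (hops - 1) * rx_energy(bits)
```

Over long distances the quadratic amplifier term dominates and multi-hop wins; over short distances the extra relay electronics make single-hop cheaper, which is why hop count matters for balancing energy consumption.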
A two-time-step stochastic multi-variable, multi-site hydrological data forecasting model was developed and verified using a case study. The philosophy of this model is to use the cross-variable correlations, cross-site correlations, and two-step time-lag correlations simultaneously to estimate the parameters of the model, which are then modified using the mutation process of a genetic algorithm optimization model. The objective function to be minimized is the Akaike test value. The case study covers four variables and three sites. The variables are the monthly air temperature, humidity, precipitation, and evaporation; the sites are Sulaimania, Chwarta, and Penjwin, which are located in northern Iraq. The model performance was …
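The criterion minimized above, the Akaike Information Criterion (AIC), trades goodness of fit against parameter count. A minimal sketch follows; the least-squares AIC form and the AR lag-fitting helper are standard, but the example series is an assumption for illustration, not the paper's hydrological data.

```python
import numpy as np

# Illustrative sketch of AIC-based model selection. The least-squares
# form used here is AIC = n*ln(RSS/n) + 2k, where k counts parameters.

def aic(rss, n, k):
    return n * np.log(rss / n) + 2 * k

def fit_ar(series, lags):
    # ordinary least-squares AR(lags) fit; returns residual sum of
    # squares and the number of fitted observations
    X = np.column_stack(
        [series[lags - 1 - i : len(series) - 1 - i] for i in range(lags)]
    )
    y = series[lags:]
    coef, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ coef) ** 2))
    return rss, len(y)
```

With equal fit quality, the `2k` term penalizes extra parameters, so adding lags (or cross-site terms) must buy a real reduction in residuals to lower the AIC.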
Human beings are greatly inspired by nature, which has the ability to solve very complex problems in its own distinctive way. The problems around us are becoming ever more complex, and at the same time nature guides us toward solving them: it offers logical and effective ways to find solutions and acts as an optimized source for tackling complex problems. Decomposition is a basic strategy in traditional multi-objective optimization. However, it has not yet been widely used in multi-objective evolutionary optimization.
Although computational strategies for handling Multi-objective Optimization Problems (MOPs) …
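The decomposition strategy mentioned above can be sketched concretely: a multi-objective problem is split into scalar subproblems via weight vectors, each solved independently, which is the core idea behind MOEA/D-style methods. The two toy quadratic objectives below are assumptions for illustration only.

```python
import numpy as np

# Minimal sketch of decomposition in multi-objective optimization:
# each weight vector defines one scalar subproblem whose minimizer
# is a point on (or near) the Pareto front.

def f1(x):
    return x ** 2            # toy objective 1: distance from 0

def f2(x):
    return (x - 2) ** 2      # toy objective 2: distance from 2

def solve_subproblem(w1, w2):
    # weighted-sum scalarization g(x) = w1*f1(x) + w2*f2(x),
    # minimized here by a simple grid search over [0, 2]
    xs = np.linspace(0, 2, 2001)
    g = w1 * f1(xs) + w2 * f2(xs)
    return xs[np.argmin(g)]

weights = [(w, 1 - w) for w in np.linspace(0.05, 0.95, 10)]
pareto_xs = [solve_subproblem(w1, w2) for w1, w2 in weights]
```

Sweeping the weights traces out the Pareto set (here the interval [0, 2]); an evolutionary variant would evolve the subproblem solutions jointly instead of solving each by grid search.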
This paper describes a research effort that aims at developing solar housing models suitable for the Arabian region, since the Arabian Peninsula enjoys very high levels of solar radiation.
The current paper focuses on achieving energy efficiency by utilizing solar energy and conserving energy. This task can be accomplished by implementing the major elements of energy efficiency in housing design, such as adopting an optimum photovoltaic system orientation to maximize captured solar energy and produce solar electricity. All precautions were taken to minimize the energy consumption required to provide suitable air conditioning to the inhabitants of the solar house, in addition to the use of energy-efficient …
Global services with an agent or a multi-agent system are a promising new research area. Several measures have been proposed to demonstrate the benefits of agent technology by supporting distributed services and applying smart agent technology to web dynamics. This paper builds a Semantic Web application on the World Wide Web (WWW) to enhance the productivity of managing electronic library applications, addressing a problem faced by researchers and students: the process of exchanging books between e-libraries, where the process is slow or the library requires large system data.
The conventional procedures of clustering algorithms are incapable of managing and analyzing the rapid growth of data generated from different sources. Parallel clustering is one of the robust solutions to this problem. The Apache Hadoop architecture is one of the ecosystems that provide the capability to store and process data in a distributed and parallel fashion. In this paper, a parallel model is designed to run the k-means clustering algorithm in the Apache Hadoop ecosystem by connecting three nodes: one server (name) node and two client (data) nodes. The aim is to speed up the processing of the massive-sc…
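The map-reduce pattern behind parallel k-means on Hadoop can be sketched without the cluster itself: each data node computes partial per-cluster sums over its partition (the map/combine step), and the name node merges the partials into new centroids (the reduce step). The data, node count, and initial centroids below are illustrative assumptions.

```python
import numpy as np

# Sketch of map-reduce-style parallel k-means: partitions stand in for
# the data nodes, and the reduce step stands in for the name node.

def map_partition(points, centroids):
    # per-node work: assign each local point to its nearest centroid
    # and accumulate per-cluster sums and counts
    sums = np.zeros_like(centroids)
    counts = np.zeros(len(centroids), dtype=int)
    for p in points:
        j = int(np.argmin(np.linalg.norm(centroids - p, axis=1)))
        sums[j] += p
        counts[j] += 1
    return sums, counts

def reduce_partials(partials, centroids):
    # name-node work: merge partial sums from all nodes into new centroids
    total_sums = sum(s for s, _ in partials)
    total_counts = sum(c for _, c in partials)
    new = centroids.copy()
    for j in range(len(centroids)):
        if total_counts[j] > 0:
            new[j] = total_sums[j] / total_counts[j]
    return new

def parallel_kmeans(partitions, centroids, iters=10):
    for _ in range(iters):
        partials = [map_partition(part, centroids) for part in partitions]
        centroids = reduce_partials(partials, centroids)
    return centroids
```

Because only the small (sums, counts) pairs cross the network rather than the raw points, the per-iteration communication cost is independent of the data size, which is what makes the Hadoop layout pay off at scale.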