Intelligent buildings offer various incentives for energy saving, yet non-stationary building environments make efficient operation difficult. Under such dynamic excitation, with strong nonlinearity and the coupled effects of temperature and humidity, the HVAC system drifts between underdamped and overdamped indoor conditions, leading to highly inefficient energy use and fluctuating indoor thermal comfort. To address these concerns, this study develops a novel framework based on deep clustering of Lagrangian trajectories for multi-task learning (DCLTML), and adds a pre-cooling coil to the air handling unit (AHU) to alleviate the coupling issue. The proposed DCLTML exhibits strong overall control and is suitable for multi-objective optimisation based on cooperative multi-agent systems (CMAS). The DCLTML framework uses greedy iterative training to obtain an optimal set of weights, tabulated as a layer for each clustering structure; such layers can cope with the challenges of a large state space and its massive data. The layer weights of each cluster are then tuned by the Quasi-Newton (QN) algorithm so that the action sequence of the CMAS is optimal. This CMAS policy effectively manipulates the inputs of the AHU: the AHU agents activate natural ventilation and set the chillers to an idle state when the outdoor temperature crosses the recommended value. It is therefore reasonable to assess the potential of thermal mass and a hybrid ventilation strategy for reducing cooling energy; accordingly, the results of the proposed DCLTML show that its main cooling coil saves more than 40% energy compared with conventional benchmarks. Besides significant energy savings and improved environmental comfort, the DCLTML exhibits a superior high-speed response, robust performance, and eliminates the fatigue and wear caused by chattering valves. The results show that the DCLTML algorithm is a promising new approach to controlling HVAC systems. It is more robust to environmental variations than traditional controllers, and it can learn to control the HVAC system in a way that minimises energy consumption. The DCLTML algorithm is still under development, but it has the potential to revolutionise how HVAC systems are controlled.
In this work, the surface of the telescope's mirror is cleaned using an atmospheric-pressure radio-frequency plasma jet (APRFPJ), generated in argon gas between two coaxial metal electrodes. The RF power supply is set to a frequency of 2 MHz at three power levels: 20, 50, and 80 W. Carbon that has adhered to the surface can be effectively removed by the plasma cleaning technique, which also modifies residual bonds. The cleaned surface was clearly characterised using optical emission spectroscopy (OES) and a water contact angle (WCA) analyser to assess the activation of the surface. The sample showed a superhydrophilic surface with a contact angle of 1° after 2.5 minutes of plasma treatment
Digital image manipulation has become increasingly prevalent due to the widespread availability of sophisticated image-editing tools. In copy-move forgery, a portion of an image is copied and pasted into another area of the same image. The proposed methodology begins by extracting features from the image with the Local Binary Pattern (LBP) algorithm. Two main statistical functions, Standard Deviation (STD) and Angular Second Moment (ASM), are computed for each LBP feature, capturing additional statistical information about the local textures. Next, multi-level LBP feature selection is applied to select the most relevant features. This process involves performing the LBP computation at multiple scales or levels, capturing textures at different
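As a hedged illustration of the texture statistics this abstract mentions, the sketch below computes a basic 8-neighbour LBP map and derives the Standard Deviation (STD) of the codes and the Angular Second Moment (ASM, i.e. histogram energy). The image, the single-scale 3x3 neighbourhood, and the exact LBP variant are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def lbp_image(img):
    """Basic 8-neighbour Local Binary Pattern: each pixel gets an 8-bit code
    encoding which neighbours are >= the centre (border pixels are skipped)."""
    h, w = img.shape
    centre = img[1:h - 1, 1:w - 1]
    out = np.zeros_like(centre, dtype=np.uint8)
    # neighbours clockwise from the top-left corner
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        out |= (neighbour >= centre).astype(np.uint8) << bit
    return out

def lbp_statistics(img):
    """Return (STD of the LBP codes, ASM of the normalised 256-bin histogram)."""
    codes = lbp_image(img)
    p = np.bincount(codes.ravel(), minlength=256) / codes.size
    return float(codes.std()), float((p ** 2).sum())

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(32, 32), dtype=np.uint8)  # toy image
std, asm = lbp_statistics(img)
```

ASM lies in (0, 1]: it approaches 1 for uniform texture (one dominant code) and is small for noisy texture, which is why it complements STD as a compact texture descriptor.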
Text categorization refers to the process of grouping texts or documents into classes or categories according to their content. The text categorization process consists of three phases: preprocessing, feature extraction, and classification. Compared with English, only a few studies have been conducted on categorizing and classifying the Arabic language. For a variety of applications, such as text classification and clustering, Arabic text representation is a difficult task because Arabic is noted for its richness, diversity, and complicated morphology. This paper presents a comprehensive analysis and a comparison of research from the last five years based on the dataset, year, algorithms, and the accuracy th
Breast cancer is a heterogeneous disease characterized by molecular complexity. This research utilized three genetic expression profiles (gene expression, deoxyribonucleic acid (DNA) methylation, and micro ribonucleic acid (miRNA) expression) to deepen the understanding of breast cancer biology and contribute to the development of a reliable survival rate prediction model. During the preprocessing phase, principal component analysis (PCA) was applied to reduce the dimensionality of each dataset before computing consensus features across the three omics datasets. Integrating these datasets with the consensus features significantly improved the model's ability to uncover deep connections within the data. The proposed multimodal deep
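A minimal sketch of the per-omics dimensionality-reduction step described above, assuming synthetic matrices in place of the real data: each modality is reduced by PCA separately, then the reduced representations are concatenated into one fused matrix. The sample counts, feature counts, and the number of retained components are hypothetical.

```python
import numpy as np

def pca_reduce(X, k):
    """Project samples (rows) onto the top-k principal components via SVD."""
    Xc = X - X.mean(axis=0)              # centre each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                 # scores in the top-k PC subspace

rng = np.random.default_rng(42)
n = 100  # hypothetical number of patients profiled in all three omics
gene  = rng.normal(size=(n, 500))        # gene expression
meth  = rng.normal(size=(n, 300))        # DNA methylation
mirna = rng.normal(size=(n, 120))        # miRNA expression

# reduce each modality separately, then concatenate into one fused matrix
fused = np.hstack([pca_reduce(gene, 10), pca_reduce(meth, 10), pca_reduce(mirna, 10)])
print(fused.shape)  # (100, 30)
```

Reducing each omics block before fusion keeps any one high-dimensional modality from dominating the concatenated representation.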
In this paper, we propose a new and efficient ferroelectric nanostructured metal oxide lithium niobate [(Li1.075Nb0.625Ti0.45O3), (LNTO)] solid film as a saturable absorber (SA) for modulating a passively Q-switched erbium-doped fiber laser (EDFL). The SA is fabricated as a nanocomposite solid film by a drop-casting process in which the LNTO is embedded within polyvinylidene fluoride-trifluoroethylene [P(VDF-TrFE)] as the host copolymer. The optical and physical characteristics of the solid film are established experimentally. The SA is incorporated within the cavity of the EDFL to examine its capability for producing a multi-wavelength laser. The experimental results prove that a multi-wavelength laser is produced, with four stable lines with central
Big data analysis has important applications in many areas, such as sensor networks and connected healthcare. The high volume and velocity of big data pose many challenges for data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms such a
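One concrete instance of the entropy discretization mentioned above, in a hedged sketch: a single binary cut point on a continuous attribute is chosen by minimising the weighted class entropy of the two resulting intervals. The toy values and labels are invented for illustration; the paper's multi-resolution summarization structure is not reproduced here.

```python
import math

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def best_entropy_split(values, labels):
    """Return the cut point on `values` that minimises the weighted class
    entropy of the two sides (the classic entropy-based discretization step)."""
    pairs = sorted(zip(values, labels))
    best_cut, best_h = None, float("inf")
    for i in range(1, len(pairs)):
        if pairs[i - 1][0] == pairs[i][0]:
            continue  # no valid cut between equal values
        left = [y for _, y in pairs[:i]]
        right = [y for _, y in pairs[i:]]
        h = (len(left) * entropy(left) + len(right) * entropy(right)) / len(pairs)
        if h < best_h:
            best_h = h
            best_cut = (pairs[i - 1][0] + pairs[i][0]) / 2  # midpoint cut
    return best_cut

values = [1.0, 1.2, 1.4, 5.0, 5.2, 5.5]
labels = ["a", "a", "a", "b", "b", "b"]
print(best_entropy_split(values, labels))  # 3.2
```

The cut lands between the two class clusters because splitting there makes both intervals pure (zero entropy); recursive application of this step yields a full discretization.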
Deep learning algorithms have recently achieved great success, especially in the field of computer vision. This research describes a classification method applied to a dataset of multiple image types (Synthetic Aperture Radar (SAR) images and non-SAR images). Transfer learning was used for this classification, followed by fine-tuning, with architectures pre-trained on the well-known ImageNet database. The VGG16 model was used as a feature extractor, and a new classifier was trained on the extracted features. The input data consist of five classes: the SAR image class (houses) and the non-SAR image classes (Cats, Dogs, Horses, and Humans). The Conv
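The frozen-feature-extractor-plus-new-classifier pattern described here can be sketched as follows. A fixed random projection stands in for the frozen VGG16 backbone so the example stays self-contained; the data, shapes, and training hyperparameters are all invented for illustration and assume nothing about the paper's actual setup.

```python
import numpy as np

rng = np.random.default_rng(7)

def frozen_backbone(images, W):
    """Stand-in for the frozen VGG16 conv layers: a fixed ReLU random
    projection whose weights W are never updated during training."""
    return np.maximum(images.reshape(len(images), -1) @ W, 0.0)

def train_softmax(feats, y, classes=5, lr=0.5, steps=2000):
    """Train only the new linear softmax head on the extracted features."""
    n, d = feats.shape
    Wc = np.zeros((d, classes))
    onehot = np.eye(classes)[y]
    for _ in range(steps):
        logits = feats @ Wc
        logits -= logits.max(axis=1, keepdims=True)   # numerical stability
        p = np.exp(logits)
        p /= p.sum(axis=1, keepdims=True)
        Wc -= lr * feats.T @ (p - onehot) / n         # cross-entropy gradient step
    return Wc

# Hypothetical data: 64 "images" (16x16), 5 classes, labels made linearly recoverable
X = rng.normal(size=(64, 16, 16))
W_frozen = rng.normal(size=(16 * 16, 32)) / 16.0      # backbone weights, never updated
feats = frozen_backbone(X, W_frozen)
feats = (feats - feats.mean(axis=0)) / (feats.std(axis=0) + 1e-9)  # standardise features
y = np.argmax(feats @ rng.normal(size=(32, 5)), axis=1)

Wc = train_softmax(feats, y)
train_acc = float((np.argmax(feats @ Wc, axis=1) == y).mean())
```

Only the small head is trained, which is the point of the transfer-learning setup: the expensive backbone is reused as-is, and the classifier adapts it to the new five-class task.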