Modern civilization increasingly relies on sustainable, eco-friendly data centers as the core hubs of intelligent computing. However, while vital, these data centers face heightened vulnerability to hacking because they serve as convergence points for numerous network connection nodes. Recognizing and addressing this vulnerability, particularly within green data centers, is a pressing concern. This paper proposes a novel approach to mitigating the threat by leveraging swarm intelligence techniques to detect prospective and hidden compromised devices within the data center environment. The core objective is to ensure sustainable intelligent computing through a colony strategy. The research focuses on applying sigmoid fish swarm optimization (SiFSO) for early detection of compromised devices and subsequently alerting other network nodes. Additionally, our data center implements an innovative ant skyscape architecture (ASA) cooling mechanism, departing from traditional, unsustainable cooling strategies that harm the environment. Extensive simulations were conducted to validate the effectiveness of these approaches. The evaluations centered on the fish colony's ability to detect compromised devices, focusing on source tracing, realistic modelling, and a 98% detection accuracy rate under the ASA cooling solution at 0.16 °C within 1,300 seconds. Compromised devices pose a substantial risk to green data centers, as attackers could manipulate and disrupt network equipment. Incorporating cyber enhancements into the green data center concept is therefore imperative to foster more adaptable and efficient smart networks.
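The abstract names sigmoid fish swarm optimization (SiFSO) but gives no update rule. The sketch below is a hypothetical, simplified fish-swarm-style search in which a sigmoid of the current fitness scales the exploration step; all names, parameters, and the objective function are illustrative assumptions, not the paper's actual algorithm.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sphere(x):
    # Toy objective: minimum 0 at the origin.
    return sum(v * v for v in x)

def sifso(objective, dim=2, n_fish=20, visual=1.0, iters=200, seed=1):
    """Simplified fish-swarm search (hypothetical SiFSO sketch).
    A sigmoid of the fitness scales each fish's random 'prey' step,
    so worse fish explore more widely."""
    rng = random.Random(seed)
    fish = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_fish)]
    best = min(fish, key=objective)
    for _ in range(iters):
        for i, f in enumerate(fish):
            # Prey behavior: try a random point within the visual range.
            step = visual * sigmoid(objective(f))  # larger error -> larger step
            cand = [v + rng.uniform(-step, step) for v in f]
            if objective(cand) < objective(f):
                fish[i] = cand
            else:
                # Follow behavior: drift toward the best fish found so far.
                fish[i] = [v + 0.5 * (b - v) for v, b in zip(f, best)]
            if objective(fish[i]) < objective(best):
                best = list(fish[i])
    return best

best = sifso(sphere)
print(sphere(best) < 0.5)  # the swarm converges near the origin
```

In an intrusion-detection setting, the objective would instead score how anomalous a device's traffic profile looks, with the swarm concentrating around suspect nodes.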
Abstract The aim of this research is to show the degree of implementation of ISO 26000 (the Social Responsibility Standard), specifically clause six (consumer issues). The study was conducted at the Market Research and Consumer Protection Center (MRCPC) / University of Baghdad. The seven consumer issues of ISO 26000 were analyzed to show the extent of their implementation at MRCPC, using a checklist as the principal instrument for collecting research data and information. The results were analyzed using percentages and mean averages. The research yielded several results, the most important being that the center's degree of implementation of the consumer issues given in the standard was medium.
Shadow removal is crucial for robot and machine vision, as the accuracy of object detection is greatly influenced by the uncertainty and ambiguity of the visual scene. In this paper, we introduce a new algorithm for shadow detection and removal based on Gaussian functions of different shapes, orientations, and spatial extents. The contrast information of the visual scene is utilized for shadow detection and removal through five consecutive processing stages. In the first stage, contrast filtering is performed to obtain the contrast information of the image. The second stage involves a normalization process that suppresses noise and generates a balanced intensity at a specific position compared to the neighboring intensities.
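The first two stages can be illustrated with a minimal sketch: a local-contrast filter (absolute difference from the neighborhood mean) followed by min–max normalization. This is an assumption-laden stand-in; the paper's actual filters are oriented Gaussians of varying spatial extent.

```python
def local_contrast(img, radius=1):
    """Stage 1 (sketch): contrast filtering - each pixel becomes the absolute
    difference between its intensity and the mean of its neighborhood.
    Illustrative only; the paper uses banks of oriented Gaussian filters."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[ny][nx]
                    for ny in range(max(0, y - radius), min(h, y + radius + 1))
                    for nx in range(max(0, x - radius), min(w, x + radius + 1))]
            out[y][x] = abs(img[y][x] - sum(vals) / len(vals))
    return out

def normalize(img):
    """Stage 2 (sketch): rescale to [0, 1] to balance intensities."""
    flat = [v for row in img for v in row]
    lo, hi = min(flat), max(flat)
    span = (hi - lo) or 1.0
    return [[(v - lo) / span for v in row] for row in img]

# A dark "shadow" region next to a bright one: contrast peaks at the boundary.
img = [[200, 200, 50, 50],
       [200, 200, 50, 50],
       [200, 200, 50, 50]]
c = normalize(local_contrast(img))
print(max(c[1]) == 1.0)  # strongest response on the shadow boundary
```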
This study investigated visceral leishmaniasis among 8 suspected infants and young children admitted to Al-Khadhimiya Pediatric Hospital in Baghdad between January 1, 2005 and August 31, 2005. For each patient, a medical history was obtained and a complete physical examination was performed by physicians. Sera of the suspected cases were primarily diagnosed using a new, simple diagnostic method based on detecting antibodies against the recombinant K39 antigen.
This research aims to analyze and simulate real biochemical test data to uncover the relationships among the tests and how each of them impacts the others. The data were acquired from an Iraqi private biochemical laboratory; however, they have many dimensions, a high rate of null values, and a large number of patients. Several experiments were applied to these data, beginning with unsupervised techniques such as hierarchical clustering and k-means, but the results were not clear. A preprocessing step was then performed to make the dataset analyzable by supervised techniques such as Linear Discriminant Analysis (LDA), Classification And Regression Tree (CART), Logistic Regression (LR), K-Nearest Neighbor (K-NN), and Naïve Bayes (NB).
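Of the supervised methods listed, K-Nearest Neighbor is the simplest to sketch: classify a new sample by majority vote among its k closest training samples. The feature names and data below are hypothetical, not the paper's dataset.

```python
import math
from collections import Counter

def knn_predict(train, labels, query, k=3):
    """Minimal K-Nearest Neighbor classifier: Euclidean distance,
    majority vote among the k closest training points.
    Generic sketch, not the paper's exact pipeline."""
    order = sorted(range(len(train)), key=lambda i: math.dist(train[i], query))
    votes = Counter(labels[i] for i in order[:k])
    return votes.most_common(1)[0][0]

# Hypothetical two-test feature vectors (e.g. glucose, urea), labelled by outcome:
X = [(90, 30), (95, 28), (100, 32), (180, 60), (190, 58), (175, 65)]
y = ["normal", "normal", "normal", "high", "high", "high"]
print(knn_predict(X, y, (98, 31)))   # -> normal
print(knn_predict(X, y, (185, 62)))  # -> high
```

Note that K-NN is sensitive to null values and feature scale, which is why the preprocessing step the abstract mentions (imputation, normalization) matters before such classifiers can be applied.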
Information hiding strategies have recently gained popularity in a variety of fields. Digital audio, video, and images are increasingly being labelled with distinct but undetectable marks that may contain a hidden copyright notice or serial number, or even directly help to prevent unauthorized duplication. This approach is extended to medical images by hiding secret information in them using the structure of a different file format. The hidden information may be related to the patient. In this paper, a method for hiding secret information in DICOM images is proposed based on the Discrete Wavelet Transform (DWT). First, all slices of the 3D image are segmented into blocks of a specific size, and the host image is collected depending on a generated key.
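The DWT step can be sketched with a one-level 1D Haar transform, the simplest wavelet: it splits a signal into approximation and detail coefficients, and secret bits are typically embedded in the low-energy detail band. This is a generic illustration under that assumption; the paper's block sizes, wavelet family, and key-driven slice selection are not specified here.

```python
def haar_dwt(signal):
    """One-level 1D Haar DWT: returns (approximation, detail) coefficients.
    Detail coefficients carry little perceptual energy, making them the
    usual embedding target. Sketch only; the paper's scheme is block-based."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse transform: perfectly reconstructs the original signal."""
    out = []
    for a, d in zip(approx, detail):
        out.extend([a + d, a - d])
    return out

row = [52, 54, 60, 58, 100, 104, 40, 42]  # one row of image pixel values
a, d = haar_dwt(row)
print(haar_idwt(a, d) == row)  # True: the transform is reversible
```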
In this study, the mobile phone traces concern an ephemeral event that draws large densities of people. This research aims to study the city pulse and the evolution of human mobility during a specific event (the Armada festival) by modelling and simulating human mobility in the observed region, based on CDR (Call Detail Record) data. The most pivotal questions of this research are: Why study human mobility? What are the human life patterns in the observed region inside Rouen city during the Armada festival? How can life patterns and individuals' mobility be extracted for this region from the mobile database (CDRs)? The radius of gyration parameter has been applied to elaborate human life patterns with regard to (work, off) days.
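The radius of gyration mentioned above is the RMS distance of a person's visited locations from their trajectory's center of mass. A minimal sketch, using planar coordinates for simplicity (real CDR studies use geographic, e.g. haversine, distance):

```python
import math

def radius_of_gyration(points):
    """Radius of gyration of a set of visited locations: the root-mean-square
    distance of each point from the trajectory's center of mass.
    Planar coordinates for simplicity; CDR work normally uses lat/lon."""
    n = len(points)
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    return math.sqrt(sum((x - cx) ** 2 + (y - cy) ** 2 for x, y in points) / n)

# A commuter observed equally often at home (0, 0) and work (4, 0):
print(radius_of_gyration([(0, 0), (4, 0)]))  # -> 2.0
```

A small radius on off days versus a larger one on work days is exactly the kind of (work, off)-day contrast the abstract uses to characterize life patterns.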
Data compression offers an attractive approach to reducing communication costs by using available bandwidth effectively, so it makes sense to pursue research on algorithms that use the network most effectively. It is also important to consider security, as the data being transmitted is vulnerable to attacks. The basic aim of this work is to develop a module that combines compression and encryption on the same set of data, performing the two operations simultaneously. This is achieved by embedding encryption into compression algorithms, since cryptographic ciphers and entropy coders bear a certain resemblance in the sense of secrecy. First, in the secure compression module, the given text is p
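The abstract's idea of embedding encryption inside the entropy coder is not reproducible from the text alone; the sketch below shows only the naive alternative it improves on, a compress-then-encrypt pipeline (compression must come first, because ciphertext is incompressible). The XOR keystream is a toy for illustration and is NOT cryptographically secure.

```python
import zlib
import random

def keystream(key, n):
    """Toy keystream from a seeded PRNG - illustration only, NOT secure."""
    rng = random.Random(key)
    return bytes(rng.randrange(256) for _ in range(n))

def compress_encrypt(data, key):
    """Compress first, then XOR-encrypt. A naive stand-in for the paper's
    scheme, which fuses encryption into the entropy coder itself."""
    comp = zlib.compress(data)
    ks = keystream(key, len(comp))
    return bytes(a ^ b for a, b in zip(comp, ks))

def decrypt_decompress(blob, key):
    ks = keystream(key, len(blob))
    return zlib.decompress(bytes(a ^ b for a, b in zip(blob, ks)))

msg = b"the same set of data, compressed and encrypted together " * 4
blob = compress_encrypt(msg, key=42)
print(decrypt_decompress(blob, key=42) == msg)  # round-trips losslessly
print(len(blob) < len(msg))                     # and is smaller on the wire
```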
A lossless (reversible) data hiding (embedding) method inside an image (the carrier medium) is presented in this work using the LSB (least significant bit) technique, which enables us to hide data in a host image, using a secret key, undetectably and without losing any data or changing the size or visible appearance of the image; the hidden data can then be extracted without loss by reversing the process.
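Basic LSB embedding replaces the least significant bit of each pixel with one secret bit, changing each pixel value by at most 1. A minimal sketch (the key-driven pixel selection and the bookkeeping that makes the paper's method fully reversible are omitted):

```python
def lsb_embed(pixels, bits):
    """Hide one bit per pixel in the least significant bit.
    For truly lossless hiding, the overwritten original LSBs would also
    have to be carried in the payload so the cover can be restored exactly."""
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit
    return out

def lsb_extract(pixels, n_bits):
    """Recover the hidden bits by reading each pixel's LSB."""
    return [p & 1 for p in pixels[:n_bits]]

cover = [120, 121, 122, 123, 124, 125, 126, 127]
secret = [1, 0, 1, 1, 0, 1, 0, 0]
stego = lsb_embed(cover, secret)
print(lsb_extract(stego, 8) == secret)                      # bits recovered
print(all(abs(a - b) <= 1 for a, b in zip(cover, stego)))   # change <= 1 level
```

A secret key, as the abstract describes, would typically permute which pixels carry which bits, so extraction without the key yields noise.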