This research aims to analyze and simulate real biochemical test data to uncover the relationships among the tests and how each of them affects the others. The data were acquired from an Iraqi private biochemical laboratory. However, these data have many dimensions, a high rate of null values, and a large number of patients. Several experiments were applied to these data, beginning with unsupervised techniques such as hierarchical clustering and k-means, but the results were not clear. A preprocessing step was then performed to make the dataset analyzable by supervised techniques, namely Linear Discriminant Analysis (LDA), Classification And Regression Tree (CART), Logistic Regression (LR), K-Nearest Neighbor (K-NN), Naïve Bayes (NB), and Support Vector Machine (SVM). CART gave the clearest results with the highest accuracy among the six supervised algorithms. It is worth noting that the preprocessing steps required remarkable effort to handle this type of data, since the raw dataset contained 94.8% null values, which were reduced to 0% after preprocessing. Then, in order to apply the CART algorithm, several selected tests were assumed as classes; the tests assumed as classes were chosen according to the accuracy they achieved. Consequently, physicians are enabled to trace and connect the test results with each other, which extends to their impact on patients’ health.
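A minimal sketch of the workflow described above, assuming scikit-learn and pandas: impute the null values, then compare the six named supervised classifiers by cross-validated accuracy, with one of the tests used as the class label. The file name, column name, and mean imputation are illustrative placeholders, not the paper’s actual preprocessing.

```python
# Hypothetical comparison of the six supervised techniques named in the abstract.
import pandas as pd
from sklearn.impute import SimpleImputer
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.tree import DecisionTreeClassifier          # CART
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

df = pd.read_csv("lab_tests.csv")            # hypothetical file of test results
y = df.pop("target_test_class")              # one test assumed as the class label
X = df

models = {
    "LDA":  LinearDiscriminantAnalysis(),
    "CART": DecisionTreeClassifier(),
    "LR":   LogisticRegression(max_iter=1000),
    "K-NN": KNeighborsClassifier(),
    "NB":   GaussianNB(),
    "SVM":  SVC(),
}

for name, model in models.items():
    # Mean imputation stands in for the paper's (unspecified) null-value handling.
    pipe = make_pipeline(SimpleImputer(strategy="mean"), model)
    scores = cross_val_score(pipe, X, y, cv=5, scoring="accuracy")
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```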
Modern civilization increasingly relies on sustainable and eco-friendly data centers as the core hubs of intelligent computing. However, these data centers, while vital, also face heightened vulnerability to hacking due to their role as the convergence points of numerous network connection nodes. Recognizing and addressing this vulnerability, particularly within the confines of green data centers, is a pressing concern. This paper proposes a novel approach to mitigate this threat by leveraging swarm intelligence techniques to detect prospective and hidden compromised devices within the data center environment. The core objective is to ensure sustainable intelligent computing through a colony strategy. The research primarily focuses on the …
Performance evaluation is of great importance in all countries of the world, because it has a prominent and effective role in determining the efficiency and effectiveness of the optimal use of available resources, which are scarce and essential to achieving the desired objectives. With the continued growth of public spending and the limited resources, the State seeks to achieve its objectives through its units with minimal expenditure, avoiding deficit, irrationality, and wastefulness in spending. In many countries, particularly developing countries, reforms are made in the public sector to achieve that goal through the adoption of IPSAS, which is reflected in the development …
The research aims to build a list of digital citizenship axes, standards, and indicators emanating from them, which should be included in the content of the computer textbook prescribed for second-grade intermediate students in Iraq, and to analyze the above-mentioned book according to the same list using the descriptive analytical method (content analysis). The research community and its sample consisted of the content of the computer textbook prescribed for second-year intermediate students for the academic year 2018-2019. The research tool was built in its initial form after reviewing a set of specialized literature and previous studies that dealt with topics related to digital citizenship, and the authenticity …
Gingival crevicular fluid (GCF) may reflect the events associated with orthodontic tooth movement. Attempts have been made to identify biomarkers reflecting optimum orthodontic force, unwanted sequelae (i.e., root resorption), and accelerated tooth movement. The aim of the present study is to establish a standardized method for GCF collection, storage, and total protein extraction from apparently healthy gingival sites under orthodontic treatment that is compatible with further high-throughput proteomics. Eighteen patients who required extraction of both maxillary first premolars were recruited in this study. These teeth were randomly assigned to either heavy (225 g) or light (25 g) force, and their site-specific GCF was collected at baseline and after …
One of the main causes for concern is the widespread presence of pharmaceuticals in the environment, which may be harmful to living things. They are often referred to as emerging chemical pollutants in water bodies because they are either still unregulated or undergoing regulation. Pharmaceutical pollution of the environment may have detrimental effects on ecosystem viability, human health, and water quality. In this study, the amount of residual pharmaceutical compounds in environmental waters was examined through a straightforward review. Pharmaceutical production and consumption have increased due to medical advancements, leading to concerns about their environmental impact and potential harm to living things due to their increased …
The goal of this work is to demonstrate, through the gradient observation of a linear system of ( -systems) type, the possibility of reducing the effect of any disturbance (pollution, radiation, infection, etc.) asymptotically by a suitable choice of related actuators for these systems. Thus, a class of ( -systems) was developed based on finite-time ( -systems). Furthermore, definitions and some properties of this ( -system) concept and of asymptotically gradient controllable systems ( -controllable) were stated and studied. More precisely, asymptotically gradient efficient actuators ensuring the weak asymptotic gradient compensation ( -system) of known or unknown disturbances are examined. Consequently, under convenient hypotheses …
The current research presents an overall comparative analysis of the estimation of Meixner process parameters via the wavelet packet transform. Of noteworthy relevance, it compares the moment method and the wavelet packet estimator for the four parameters of the Meixner process. In this paper, the research focuses on finding the best threshold value, using the square-root-log and modified square-root-log methods with wavelet packets in the presence of noise, to enhance the efficiency and effectiveness of the denoising process for the financial asset market signal. In this regard, a simulation study compares the performance of moment estimation and wavelet packets for different sample sizes. The results show that wavelet packets …
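A minimal sketch, assuming the PyWavelets library, of wavelet-packet denoising with a square-root-log ("universal") threshold of the kind discussed above. The test signal, wavelet, decomposition level, and soft-thresholding rule are illustrative assumptions, and the Meixner parameter estimation step itself is omitted.

```python
# Hypothetical wavelet-packet denoising with a sqrt(2*log n) threshold.
import numpy as np
import pywt

def wp_denoise(x, wavelet="db4", level=3):
    wp = pywt.WaveletPacket(data=x, wavelet=wavelet, mode="symmetric",
                            maxlevel=level)
    # Noise scale estimated from the first-level detail coefficients (MAD rule).
    sigma = np.median(np.abs(wp["d"].data)) / 0.6745
    # Square-root-log (universal) threshold: sigma * sqrt(2 * log n).
    thr = sigma * np.sqrt(2.0 * np.log(len(x)))

    # Rebuild the packet tree from soft-thresholded terminal nodes.
    out = pywt.WaveletPacket(data=None, wavelet=wavelet, mode="symmetric")
    for node in wp.get_level(level, order="natural"):
        out[node.path] = pywt.threshold(node.data, thr, mode="soft")
    return out.reconstruct(update=False)[: len(x)]

# Illustrative usage on a noisy synthetic signal.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 1024)
noisy = np.sin(8 * np.pi * t) + 0.3 * rng.standard_normal(t.size)
denoised = wp_denoise(noisy)
```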
The study aims to analyze the content of computer textbooks for the preparatory stage according to logical thinking. The researcher followed the descriptive analytical research approach (content analysis) and adopted the explicit idea during the analysis process. A content analysis tool, designed on the basis of the mental processes employed during logical thinking, was utilized to obtain the study results. The findings revealed that logical thinking skills constituted 52% of the fourth preparatory textbook and 47% of the fifth preparatory textbook.