In this research, two clustering algorithms are applied: the Fuzzy C-Means (FCM) algorithm and the Hard K-Means (HKM) algorithm. To determine which performs better, both are applied to a dataset collected from the Ministry of Planning on the water turbidity of five areas in Baghdad, in order to identify which of these areas has the least turbid (clearest) water, and which months of the year show the lowest turbidity in the specified area.
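The contrast between the two algorithms can be sketched on toy one-dimensional turbidity readings (hypothetical numbers, not the Ministry of Planning data): hard K-means assigns each point to exactly one cluster, while FCM gives each point a membership degree in every cluster.

```python
import random

def hard_kmeans(xs, k, iters=50, seed=0):
    """Classic (hard) K-means on 1-D data: each point belongs to exactly one cluster."""
    rng = random.Random(seed)
    centers = rng.sample(xs, k)
    for _ in range(iters):
        # Assignment step: each point goes to its nearest center.
        groups = [[] for _ in range(k)]
        for x in xs:
            i = min(range(k), key=lambda c: (x - centers[c]) ** 2)
            groups[i].append(x)
        # Update step: each center becomes the mean of its members.
        centers = [sum(g) / len(g) if g else centers[i] for i, g in enumerate(groups)]
    return centers

def fuzzy_cmeans(xs, k, m=2.0, iters=50, seed=0):
    """Fuzzy C-Means: each point gets a membership degree in every cluster."""
    rng = random.Random(seed)
    centers = rng.sample(xs, k)
    u = [[0.0] * k for _ in xs]
    for _ in range(iters):
        # Membership update: u_ij = 1 / sum_c (d_ij / d_cj)^(2/(m-1)).
        for j, x in enumerate(xs):
            d = [abs(x - c) or 1e-12 for c in centers]
            for i in range(k):
                u[j][i] = 1.0 / sum((d[i] / d[c]) ** (2.0 / (m - 1.0)) for c in range(k))
        # Center update: weighted mean of the data, with weights u^m.
        for i in range(k):
            w = [u[j][i] ** m for j in range(len(xs))]
            centers[i] = sum(wj * x for wj, x in zip(w, xs)) / sum(w)
    return centers, u

# Hypothetical monthly turbidity readings (NTU) for one area: two visible regimes.
turbidity = [2.1, 2.4, 2.0, 2.3, 9.8, 10.2, 9.5, 10.0, 2.2, 9.9, 2.5, 10.1]
print(sorted(hard_kmeans(turbidity, 2)))      # two crisp cluster centers
centers, memberships = fuzzy_cmeans(turbidity, 2)
print(sorted(centers))                        # fuzzy centers; memberships sum to 1 per point
```

The fuzzifier `m` controls how soft the FCM memberships are; as `m` approaches 1, FCM behaves like hard K-means.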
The complexity and variety of language used in policy and academic documents make the automatic classification of research papers against the United Nations Sustainable Development Goals (SDGs) difficult. Using both pre-trained and contextual word embeddings to increase semantic understanding, this study presents a complete deep learning pipeline combining Bidirectional Long Short-Term Memory (BiLSTM) and Convolutional Neural Network (CNN) architectures, aiming primarily to improve the accuracy and comprehensibility of SDG text classification and thereby enable more effective policy monitoring and research evaluation. Effective document representation via Global Vectors (GloVe), Bidirectional Encoder Representations from Transformers (BERT), …
Urinary stones are one of the most common painful disorders of the urinary system. Four new technologies have transformed the treatment of urinary stones: electrohydraulic lithotripsy, ultrasonic lithotripsy, extracorporeal shock wave lithotripsy, and laser lithotripsy. The purpose of this study is to determine whether pulsed holmium laser energy is an effective method for fragmenting urinary tract stones in vitro, and whether stone composition affects the efficacy of holmium laser lithotripsy. Human urinary stones of known composition and of different sizes, shapes, and colors were used for this study. The weight and size of each stone were measured. The surgical laser system used in our study is a Ho:YAG laser (2100 nm).
The most important device in a Wireless Sensor Network (WSN) is the Sink Node (SN), which is used to collect, store, and analyze data from every sensor node in the network. This central role makes the SN a prime target for traffic analysis attacks, so securing the SN's position is a substantial issue. This study presents Security for Mobile Sink Node location using a Dynamic Routing Protocol, called SMSNDRP, which increases the complexity faced by an adversary trying to discover the mobile SN's location and, in addition, minimizes network energy consumption. The proposed protocol is applied to a WSN framework consisting of 50 nodes with static and mobile SNs. The results have shown in each round a dynamic change in the route to reach the mobile SN …
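The core idea of route randomization can be illustrated with a minimal sketch on an assumed toy topology (the names, topology, and route-selection rule below are hypothetical, not the SMSNDRP specification): each round, a node picks a different loop-free route to the sink, so an adversary observing traffic cannot infer a fixed path.

```python
import random

# Hypothetical 6-node topology; "SINK" stands for the mobile sink node.
NEIGHBORS = {
    "A": ["B", "C"], "B": ["A", "D", "SINK"], "C": ["A", "D"],
    "D": ["B", "C", "E"], "E": ["D", "SINK"], "SINK": [],
}

def all_paths(src, dst, path=None):
    """Enumerate every loop-free route from src to dst by depth-first search."""
    path = (path or []) + [src]
    if src == dst:
        return [path]
    routes = []
    for n in NEIGHBORS[src]:
        if n not in path:
            routes += all_paths(n, dst, path)
    return routes

def route_for_round(src, rng):
    """Each round, pick a route at random: the path to the sink keeps changing,
    which raises the cost of traffic analysis against the sink's location."""
    return rng.choice(all_paths(src, "SINK"))

rng = random.Random(1)
for round_no in range(3):
    print(round_no, route_for_round("A", rng))
```

A real protocol would also weigh routes by residual energy rather than choosing uniformly; the sketch only shows the dynamic-routing mechanic.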
Malicious software (malware) performs malicious functions that compromise a computer system's security. Many methods have been developed to improve the security of computer system resources, among them firewalls, encryption, and Intrusion Detection Systems (IDS). An IDS can detect newly emerging, unrecognized attack attempts and raise an early alarm to inform the system about the suspicious intrusion attempt. This paper proposes a hybrid IDS for detecting intrusions, especially malware, that considers both network packet and host features. The hybrid IDS is designed using Data Mining (DM) classification methods, chosen for their ability to detect new, previously unseen intrusions accurately and automatically. It uses both anomaly and misuse detection …
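The anomaly-plus-misuse combination can be sketched as follows; this is a deliberately simplified illustration with toy packet records and a made-up signature list, not the paper's DM classifiers:

```python
# Toy signature database for misuse detection (hypothetical patterns).
KNOWN_BAD_PAYLOADS = {"cmd.exe", "/etc/passwd", "DROP TABLE"}

def misuse_detect(packet):
    """Misuse (signature) detection: flag payloads matching known attack patterns."""
    return any(sig in packet["payload"] for sig in KNOWN_BAD_PAYLOADS)

def anomaly_detect(packet, baseline_mean, baseline_std, z_threshold=3.0):
    """Anomaly detection: flag packets whose size deviates strongly from the
    learned normal-traffic baseline (simple z-score rule)."""
    z = abs(packet["size"] - baseline_mean) / baseline_std
    return z > z_threshold

def hybrid_ids(packet, baseline_mean, baseline_std):
    """Hybrid IDS: misuse detection catches known attacks, anomaly detection
    catches novel ones that match no stored signature."""
    if misuse_detect(packet):
        return "alert: known attack signature"
    if anomaly_detect(packet, baseline_mean, baseline_std):
        return "alert: anomalous traffic"
    return "ok"

# Baseline packet size learned from normal traffic (hypothetical numbers).
mean, std = 512.0, 64.0
print(hybrid_ids({"payload": "GET /index.html", "size": 500}, mean, std))
print(hybrid_ids({"payload": "GET /etc/passwd", "size": 500}, mean, std))
print(hybrid_ids({"payload": "normal data", "size": 4096}, mean, std))
```

In the paper's design the two detectors are learned DM classifiers over packet and host features; the sketch keeps only the decision structure.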
Both double-differenced and zero-differenced GNSS positioning strategies have been widely used by geodesists for geodetic applications that demand reliable and precise positions. On closer inspection of the requirements of these two techniques, zero-differenced positioning, known as Precise Point Positioning (PPP), has gained special importance for three main reasons. Firstly, the effective application of PPP for geodetic purposes and precise applications depends entirely on the availability of precise satellite products, which consist of precise satellite orbital elements, precise satellite clock corrections, and Earth orientation parameters. Secondly, …
A prosthesis is an artificial device that replaces a part of the human body that is absent because of disease, injury, or deformity. Current research activity in Iraq draws attention to the upper-limb discipline because of the growth in the number of amputees, so it has become necessary to expand research on this subject to help reduce patients' suffering. This paper describes the design and development of a wearable prosthesis for persons with hand amputations. The design consists of a hand with five fingers driven by a gearbox mechanism, giving the artificial hand 5 degrees of freedom. This artificial hand works based on the principle of …
Variable selection is an essential and necessary task in statistical modeling. Several studies have tried to develop and standardize the variable selection process, but it is difficult to do so. The first question a researcher needs to ask is: what are the most significant variables that should be used to describe a given dataset's response? In this paper, a new method for variable selection using Gibbs sampler techniques has been developed. First, the model is defined and the posterior distributions for all the parameters are derived. The new variable selection method is tested using four simulated datasets, and the new approach is compared with some existing techniques: Ordinary Least Squares (OLS), Least Absolute Shrinkage and Selection Operator (LASSO), …
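The paper's variable-selection sampler is not reproduced here, but the Gibbs-sampling mechanic it builds on can be shown on a toy target, a standard bivariate normal with correlation rho, where each parameter is drawn in turn from its full conditional distribution:

```python
import random, math

def gibbs_bivariate_normal(rho, n_iter=20000, burn=2000, seed=0):
    """Gibbs sampling from a standard bivariate normal with correlation rho.
    Full conditionals: x | y ~ N(rho*y, 1 - rho^2), and symmetrically for y."""
    rng = random.Random(seed)
    sd = math.sqrt(1.0 - rho * rho)
    x = y = 0.0
    draws = []
    for i in range(n_iter):
        x = rng.gauss(rho * y, sd)   # draw x from its full conditional given y
        y = rng.gauss(rho * x, sd)   # draw y from its full conditional given x
        if i >= burn:                # discard burn-in draws
            draws.append((x, y))
    return draws

draws = gibbs_bivariate_normal(rho=0.8)
n = len(draws)
mean_x = sum(x for x, _ in draws) / n
corr = sum(x * y for x, y in draws) / n   # both margins are ~N(0, 1), so E[xy] = rho
print(round(mean_x, 2), round(corr, 2))
```

In Bayesian variable selection the same alternating-draw scheme is applied to regression coefficients and per-variable inclusion indicators instead of (x, y).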
A database is an organized and distributed collection of data, arranged so that the user can access the stored data simply and conveniently. However, in the era of big data, traditional data analytics methods may not be able to manage and process such large amounts of data. To develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique to process big data distributed on the cloud. The approach was evaluated on a Hadoop server and applied to EEG big data as a case study. The proposed approach showed clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r…
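The Map-Reduce pattern itself can be sketched in a few lines; the records below are hypothetical EEG-like samples, and the single-process simulation only mirrors the map, shuffle, and reduce phases that a Hadoop cluster runs across workers:

```python
from collections import defaultdict
from functools import reduce

# Hypothetical EEG-like records (channel, voltage sample), pre-split into
# chunks the way a framework splits input across workers.
chunks = [
    [("C3", 1.0), ("C4", 2.0), ("C3", 3.0)],
    [("C4", 4.0), ("C3", 5.0), ("C4", 6.0)],
]

def map_phase(chunk):
    """Map: emit a (key, (sum, count)) partial for each record."""
    return [(ch, (v, 1)) for ch, v in chunk]

def shuffle(mapped_chunks):
    """Shuffle: group all partials by key, as the framework does between phases."""
    groups = defaultdict(list)
    for chunk in mapped_chunks:
        for key, val in chunk:
            groups[key].append(val)
    return groups

def reduce_phase(groups):
    """Reduce: combine the partials for each key into a per-channel mean."""
    out = {}
    for key, vals in groups.items():
        s, n = reduce(lambda a, b: (a[0] + b[0], a[1] + b[1]), vals)
        out[key] = s / n
    return out

means = reduce_phase(shuffle([map_phase(c) for c in chunks]))
print(means)  # per-channel mean voltage
```

Because the (sum, count) partials combine associatively, the map and reduce steps can run on different machines in any order, which is what lets Hadoop cut response time on large EEG datasets.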