An impressed current cathodic protection (ICCP) system requires measurement of extremely low-level electrical quantities. The present experimental work used the Adafruit INA219 sensor module to acquire the voltage, current, and power of a default load that consumes very little power and simulates an ICCP system. The main problem is adapting the INA219 sensor to the LabVIEW environment, since no LabVIEW library exists for this sensor. This work is devoted to adapting the Adafruit INA219 sensor module to the LabVIEW environment by creating, developing, and successfully testing a Sub VI so that it is ready for use in an ICCP system. The sensor output was monitored with an Arduino …
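The abstract does not include the acquisition code itself; as a rough illustration only, the following is a minimal sketch of reading bus voltage, current, and power from an INA219 breakout, assuming the Adafruit CircuitPython INA219 driver and the default I2C pins. It is not the paper's LabVIEW/Arduino implementation, and the helper name `read_measurements` is illustrative.

```python
# Minimal sketch: read voltage, current, and power from an Adafruit INA219
# breakout. Assumes the adafruit-circuitpython-ina219 driver and default I2C
# pins; illustrative only, not the paper's LabVIEW/Arduino code.
import time

import board
import busio
from adafruit_ina219 import INA219

i2c = busio.I2C(board.SCL, board.SDA)
sensor = INA219(i2c)

def read_measurements():
    """Return one (bus_voltage_V, current_mA, power_W) sample."""
    return sensor.bus_voltage, sensor.current, sensor.power

if __name__ == "__main__":
    while True:
        voltage, current, power = read_measurements()
        # The values could be written to a serial port here for LabVIEW to parse.
        print(f"V={voltage:.3f} V  I={current:.3f} mA  P={power:.3f} W")
        time.sleep(1.0)
```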
Energy efficiency is a significant aspect of designing robust routing protocols for wireless sensor networks (WSNs). A reliable routing protocol has to be energy efficient and adaptive to the network size. To achieve high energy conservation and data aggregation, there are two major techniques: clustering and chaining. In the clustering technique, the sensor network is divided into non-overlapping subsets called clusters. In the chain technique, each sensor node is connected to its two closest neighbors, starting from the node farthest from the base station and ending at the node closest to it. Each technique has its own advantages and disadvantages, which has motivated some researchers to come up with a hybrid routing algorithm …
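As a rough illustration of the chain technique described above (not code from the paper or its hybrid algorithm), the sketch below builds a greedy chain that starts at the node farthest from the base station and repeatedly links the nearest remaining node; the coordinates and base-station position are made up.

```python
# Hypothetical sketch of greedy chain construction (PEGASIS-style):
# start at the node farthest from the base station and repeatedly
# append the closest remaining node.
import math

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def build_chain(nodes, base_station):
    """Return node indices ordered along a greedy chain toward the base station."""
    remaining = set(range(len(nodes)))
    # Start with the node farthest from the base station.
    current = max(remaining, key=lambda i: distance(nodes[i], base_station))
    chain = [current]
    remaining.remove(current)
    while remaining:
        # Link the closest unvisited node to the end of the chain.
        nxt = min(remaining, key=lambda i: distance(nodes[current], nodes[i]))
        chain.append(nxt)
        remaining.remove(nxt)
        current = nxt
    return chain

if __name__ == "__main__":
    sensor_positions = [(10, 80), (35, 60), (60, 55), (20, 30), (70, 20)]
    print(build_chain(sensor_positions, base_station=(50, 0)))
```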
In general, the importance of cluster analysis is that elements can be evaluated by clustering large amounts of homogeneous data; the main objective of this analysis is to group the elements into separate, internally homogeneous divisions, depending on many variables. This method of analysis is used to reduce data, to generate and test hypotheses, and to predict and match models. The research aims to evaluate fuzzy cluster analysis, which is a special case of cluster analysis, and to compare the two methods, classical and fuzzy cluster analysis. The research topic is devoted to government and private hospitals. The sample comprised 288 patients treated in 10 hospitals. As t…
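As a rough numerical illustration of the fuzzy approach (fuzzy c-means, one common fuzzy clustering method; the paper's own formulation and hospital data are not reproduced here), the sketch below assigns each element a membership degree in every cluster, whereas the classical method would assign each element to exactly one cluster.

```python
# Minimal fuzzy c-means sketch (illustrative, not the paper's implementation).
# Each point receives a membership degree in every cluster instead of a hard label.
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    n = len(X)
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)          # memberships sum to 1 per point
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)))         # standard fuzzy membership update
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

if __name__ == "__main__":
    data = np.array([[1.0, 1.1], [0.9, 1.0], [5.0, 5.2], [5.1, 4.9]])
    centers, memberships = fuzzy_c_means(data, c=2)
    print(np.round(memberships, 2))            # soft assignments, unlike k-means
```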
The current research aims to analyze the questions of the literary criticism textbook for the preparatory stage according to Bloom's classification. The research population consists of (34) exercises and (45) questions. The researcher used the question-analysis method and prepared a preliminary list of criteria intended to measure the exercises, selected on the basis of Bloom's classification and the extant literature related to the topic. The scales were presented to a jury of experts and specialists in curricula and methods of teaching the Arabic language and obtained complete agreement; thus, the list was adapted to become a reliable instrument in this …
Among metaheuristic algorithms, population-based algorithms are explorative search algorithms superior to local search algorithms in terms of exploring the search space to find globally optimal solutions. However, the primary downside of such algorithms is their low exploitative capability, which prevents the search from expanding into the neighborhood of a solution to find more optimal solutions. The firefly algorithm (FA) is a population-based algorithm that has been widely used in clustering problems. However, FA suffers from premature convergence when no neighborhood search strategies are employed to improve the quality of clustering solutions in the neighborhood region and to explore the global regions of the search space. On the …
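As a rough sketch of the standard firefly algorithm update (not the clustering-specific variant studied in the paper), each firefly moves toward every brighter firefly with an attractiveness that decays with distance, plus a small random step; the objective and parameter values below are made up for the example.

```python
# Illustrative firefly algorithm sketch for continuous minimization
# (the standard FA update, not the paper's clustering variant).
import numpy as np

def firefly(objective, dim, n=20, iters=100, alpha=0.2, beta0=1.0, gamma=1.0, seed=0):
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-5, 5, (n, dim))
    light = np.array([objective(p) for p in pos])   # lower value = brighter here
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if light[j] < light[i]:             # move firefly i toward brighter j
                    r2 = np.sum((pos[i] - pos[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    pos[i] += beta * (pos[j] - pos[i]) + alpha * (rng.random(dim) - 0.5)
                    light[i] = objective(pos[i])
    best = np.argmin(light)
    return pos[best], light[best]

if __name__ == "__main__":
    sphere = lambda x: float(np.sum(x ** 2))
    print(firefly(sphere, dim=2))
```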
In recent years, the number of applications utilizing mobile wireless sensor networks (WSNs) has increased, with localization intended for monitoring and obtaining data from hazardous areas. The location of an event is critical in a WSN, as sensed data is almost meaningless without location information. In this paper, two Monte Carlo based localization schemes, termed MCL and MSL*, are studied. MCL obtains its location estimate through anchor nodes, whereas MSL* uses both anchor nodes and normal nodes. The use of normal nodes increases accuracy and reduces dependency on anchor nodes, but increases communication costs. For this reason, we introduce a new approach, called low-communication-cost schemes, to reduce communication …
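As a rough illustration of the Monte Carlo localization idea (a generic sequential sampling step, not the exact MCL or MSL* formulations), candidate positions are drawn around the previous samples and kept only if they are consistent with the anchors currently heard; all numbers below are hypothetical.

```python
# Simplified Monte Carlo localization step (illustrative only): sample
# candidates near the previous estimates and keep those consistent with
# the anchor nodes in radio range.
import math
import random

def mcl_step(samples, heard_anchors, v_max, radio_range, n_samples=50):
    new_samples = []
    while len(new_samples) < n_samples:
        x, y = random.choice(samples)
        # Prediction: the node moved at most v_max since the last step.
        ang, d = random.uniform(0, 2 * math.pi), random.uniform(0, v_max)
        cand = (x + d * math.cos(ang), y + d * math.sin(ang))
        # Filtering: every heard anchor must lie within radio range.
        if all(math.dist(cand, a) <= radio_range for a in heard_anchors):
            new_samples.append(cand)
    return new_samples

if __name__ == "__main__":
    prev = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(50)]
    anchors = [(40.0, 40.0), (60.0, 45.0)]
    est = mcl_step(prev, anchors, v_max=10.0, radio_range=30.0)
    cx = sum(p[0] for p in est) / len(est)
    cy = sum(p[1] for p in est) / len(est)
    print(f"estimated position ~ ({cx:.1f}, {cy:.1f})")
```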
Today, problems of spatial data integration have been further complicated by the rapid development of communication technologies and the increasing number of data sources available on the World Wide Web. Web-based geospatial data sources can be managed by different communities, and the data themselves can vary with respect to quality, coverage, and purpose. Integrating such multiple geospatial datasets remains a challenge for geospatial data consumers. This paper concentrates on the integration of geometric and classification schemes for official data, such as Ordnance Survey (OS) national mapping data, with volunteered geographic information (VGI), such as data derived from the OpenStreetMap (OSM) project. Useful descriptions of …
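As a rough sketch of one common geometric matching step in this kind of integration work (nearest-feature matching within a tolerance, not the paper's full method), the example below assumes both datasets have been reduced to labelled point coordinates in the same projection; all names and coordinates are hypothetical.

```python
# Illustrative nearest-feature matching between an authoritative dataset and
# a VGI dataset (hypothetical points, same coordinate system assumed).
import math

def match_features(official, vgi, tolerance):
    """Pair each official feature with the closest VGI feature within tolerance."""
    matches = {}
    for name, o_pt in official.items():
        best_name, best_pt = min(vgi.items(), key=lambda kv: math.dist(o_pt, kv[1]))
        if math.dist(o_pt, best_pt) <= tolerance:
            matches[name] = best_name
    return matches

if __name__ == "__main__":
    os_points = {"school": (10.0, 20.0), "church": (50.0, 52.0)}
    osm_points = {"node/1": (10.5, 19.8), "node/2": (80.0, 10.0)}
    print(match_features(os_points, osm_points, tolerance=2.0))
```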
Dense deployment of sensors is generally employed in wireless sensor networks (WSNs) to ensure energy-efficient coverage of a target area. Many sensor scheduling techniques have recently been proposed for designing such energy-efficient WSNs. In the literature, sensor scheduling has been modeled as a generalization of the minimum set covering problem (MSCP). The MSCP is a well-known NP-hard optimization problem used to model a large range of problems arising in scheduling, manufacturing, service planning, information retrieval, and other areas. In this paper, the MSCP is modeled to design an energy-efficient WSN that can reliably cover a target area. Unlike other attempts in the literature, which consider only a si…
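As a rough illustration of the MSCP formulation (the textbook greedy heuristic, not the algorithm proposed in the paper), each sensor is treated as the set of targets it covers, and sensors are activated until the whole target set is covered; the coverage sets below are made up.

```python
# Greedy heuristic for the minimum set covering problem (illustrative):
# repeatedly activate the sensor covering the most still-uncovered targets.
def greedy_set_cover(targets, coverage):
    uncovered = set(targets)
    active = []
    while uncovered:
        best = max(coverage, key=lambda s: len(coverage[s] & uncovered))
        if not coverage[best] & uncovered:
            break                      # remaining targets cannot be covered
        active.append(best)
        uncovered -= coverage[best]
    return active, uncovered

if __name__ == "__main__":
    # Hypothetical example: which targets each sensor can monitor.
    cover = {
        "s1": {1, 2, 3},
        "s2": {3, 4},
        "s3": {4, 5, 6},
        "s4": {1, 6},
    }
    print(greedy_set_cover(targets=range(1, 7), coverage=cover))
```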
Big data analysis is essential for modern applications in areas such as healthcare, assistive technology, intelligent transportation, and environment and climate monitoring. Traditional algorithms in data mining and machine learning do not scale well with data size. Mining and learning from big data require time- and memory-efficient techniques, albeit at the cost of a possible loss in accuracy. We have developed a data aggregation structure to summarize data with a large number of instances, as well as data generated from multiple sources. Data are aggregated at multiple resolutions, and the resolution provides a trade-off between efficiency and accuracy. The structure is built once, is updated incrementally, and serves as a common data input for multiple mining and …
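As a rough sketch of the multi-resolution aggregation idea (illustrative only; the structure developed in the paper is not described in this excerpt), incoming values are binned at several grid sizes, and each bin keeps only incrementally updatable sufficient statistics, so coarser resolutions are cheaper to store and query but carry less detail.

```python
# Illustrative multi-resolution aggregation: each level bins values on a grid
# and stores sufficient statistics (count, sum, sum of squares) that can be
# updated incrementally as new instances arrive.
from collections import defaultdict

class MultiResolutionAggregate:
    def __init__(self, bin_widths=(1.0, 5.0, 25.0)):
        self.bin_widths = bin_widths
        self.levels = [defaultdict(lambda: [0, 0.0, 0.0]) for _ in bin_widths]

    def add(self, x):
        """Incrementally fold one value into every resolution level."""
        for width, level in zip(self.bin_widths, self.levels):
            stats = level[int(x // width)]
            stats[0] += 1
            stats[1] += x
            stats[2] += x * x

    def bin_counts(self):
        """Number of occupied bins per level: coarser levels use fewer bins."""
        return [len(level) for level in self.levels]

if __name__ == "__main__":
    agg = MultiResolutionAggregate()
    for value in (3.2, 4.8, 27.1, 55.0, 12.6):
        agg.add(value)
    print(agg.bin_counts())   # e.g. [5, 4, 3]: efficiency vs. detail trade-off
```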
Today, large amounts of geospatial data are available on the web from services such as Google Map (GM), OpenStreetMap (OSM), the Flickr service, Wikimapia, and others. All of these services are referred to as open-source geospatial data. Geospatial data from different sources often have variable accuracy due to different data collection methods; therefore, data accuracy may not meet the user requirements of every organization. This paper aims to develop a tool to assess the quality of GM data by comparing it with formal data, such as spatial data from the Mayoralty of Baghdad (MB). The tool was developed in the Visual Basic language and validated on two different study areas in Baghdad, Iraq (Al-Karada and Al-Kadhumiyah). The positional accuracy was assess…
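As a rough sketch of a positional accuracy check of the kind described (the paper's tool is written in Visual Basic; this is an illustrative Python equivalent with made-up coordinates), the root mean square error is computed over corresponding points in the two datasets.

```python
# Illustrative RMSE-based positional accuracy check between corresponding
# points in a reference dataset (e.g. official MB data) and a test dataset
# (e.g. Google Map); the coordinates below are made up.
import math

def positional_rmse(reference, test):
    """RMSE of planar distances between corresponding reference/test points."""
    errors = [math.dist(r, t) ** 2 for r, t in zip(reference, test)]
    return math.sqrt(sum(errors) / len(errors))

if __name__ == "__main__":
    mb_points = [(438210.0, 3684520.0), (438500.0, 3684800.0)]   # reference
    gm_points = [(438212.5, 3684518.0), (438497.0, 3684803.5)]   # test
    print(f"RMSE = {positional_rmse(mb_points, gm_points):.2f} m")
```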