Establishing complete and reliable coverage over a long time span is a crucial issue in dense surveillance wireless sensor networks (WSNs). Many scheduling algorithms model the problem as a maximum disjoint set covers (DSC) problem. The goal of DSC-based algorithms is to schedule sensors into several disjoint subsets: one subset is assigned to be active, whereas all remaining subsets are put to sleep. An extension of the maximum disjoint set covers problem has also been addressed in the literature to allow more advanced sensors to adjust their sensing range; the problem is then extended to finding the maximum number of overlapped set covers. Unlike related works, which are concerned with the disc sensing model, the contribution of this paper is to reformulate the maximum overlapped set covers problem to handle the probabilistic sensing model. The problem is addressed as a multi-objective optimization (MOO) problem, and the well-known decomposition-based multi-objective evolutionary algorithm (MOEA/D) is adopted to solve it. A multi-layer MOEA/D is suggested, wherein each layer yields a distinct set cover. Performance evaluations in terms of the total number of set covers, total residual energy, and coverage reliability are reported through extensive simulations. The main finding is that the network's lifetime (i.e., the total number of set covers) can be extended by increasing the number of sensors. On the other hand, coverage reliability can be increased by enlarging sensing ranges, but at the expense of reducing the network's lifetime.
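As a hedged illustration of the probabilistic sensing model the reformulation targets, the sketch below scores a target's coverage reliability under one active set cover as the complement of the joint miss probability; the exponential decay form, the decay constant beta, and the hard cut-off at the sensing radius r_s are assumptions of the sketch, not the paper's exact model.

```python
import math

def detection_probability(sensor, target, r_s, beta=0.5):
    """Assumed probabilistic sensing model: the detection probability decays
    exponentially with distance and drops to zero beyond the sensing radius
    r_s. The decay constant beta is an illustrative assumption."""
    d = math.dist(sensor, target)
    return math.exp(-beta * d) if d <= r_s else 0.0

def coverage_reliability(active_sensors, target, r_s, beta=0.5):
    """Probability that at least one sensor in the active set cover detects
    the target (complement of the joint miss probability)."""
    miss = 1.0
    for s in active_sensors:
        miss *= 1.0 - detection_probability(s, target, r_s, beta)
    return 1.0 - miss

# Toy usage: two active sensors and one target position.
active = [(0.0, 0.0), (3.0, 4.0)]
print(coverage_reliability(active, (1.0, 1.0), r_s=5.0))
```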
This research reports an error analysis of close-range measurements from a Stonex X300 laser scanner in order to characterize range uncertainty behavior, based on indoor experiments under fixed environmental conditions. The analysis includes procedures for estimating the precision and accuracy of the observational errors of the Stonex X300, with observations conducted at intervals of 5 m within a range of 5 to 30 m. The 3D point cloud data of the individual scans are analyzed with a roughness analysis prior to the implementation of a Levenberg–Marquardt iterative closest point (LM-ICP) registration. This leads to identifying the level of roughness that was encountered due to the range-finder's limitations in close
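A minimal sketch of the precision/accuracy bookkeeping such an error analysis involves is given below, assuming repeated range observations grouped by nominal distance; the function names and the example numbers are illustrative and not taken from the experiment.

```python
import statistics

def range_error_stats(observations, reference_ranges):
    """For each nominal range (e.g. 5, 10, ..., 30 m), report accuracy as the
    mean deviation from the reference distance and precision as the standard
    deviation of the repeated measurements. Names are illustrative."""
    results = {}
    for nominal, measured in observations.items():
        ref = reference_ranges[nominal]
        residuals = [m - ref for m in measured]
        results[nominal] = {
            "accuracy_bias_m": statistics.fmean(residuals),
            "precision_std_m": statistics.stdev(measured),
        }
    return results

# Toy usage with made-up illustrative readings (not experimental data).
obs = {5: [5.003, 4.998, 5.001], 10: [10.006, 10.004, 10.009]}
ref = {5: 5.000, 10: 10.000}
print(range_error_stats(obs, ref))
```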
The sports sector in Iraq suffers from difficulty in achieving sporting success in individual and team games in various Asian and international competitions, for many reasons, including the failure to exploit modern, accurate, and flexible technologies and methods, especially in the field of information technology and, in particular, artificial neural network technology. The main goal of this study is to build an intelligent mathematical model to predict sports achievement in men's pole vaulting. The research methodology used five variables as inputs to the neural network: the average speed (m/sec) over the distance before the last 5 meters and over the last 5 meters, and the maximum speed achieved in t
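For illustration only, the sketch below shows a single-hidden-layer feedforward network with five inputs trained by plain backpropagation; the layer sizes, activation, and training loop are assumptions and do not reproduce the paper's model or data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: 5 kinematic inputs, one hidden layer, one output
# (the predicted result). The paper's actual architecture is not shown here.
n_in, n_hidden, n_out = 5, 8, 1
W1 = rng.normal(scale=0.1, size=(n_in, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.1, size=(n_hidden, n_out))
b2 = np.zeros(n_out)

def predict(x):
    """Forward pass: five input variables -> predicted achievement."""
    h = np.tanh(x @ W1 + b1)
    return h @ W2 + b2

def train_step(x, y, lr=0.01):
    """One gradient-descent step on the squared error (plain backprop)."""
    h = np.tanh(x @ W1 + b1)
    err = (h @ W2 + b2) - y
    dh = (W2 @ err) * (1 - h ** 2)
    for p, g in ((W2, np.outer(h, err)), (b2, err),
                 (W1, np.outer(x, dh)), (b1, dh)):
        p -= lr * g            # in-place parameter update
    return float(0.5 * err @ err)
```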
To ensure fault tolerance and distributed management, distributed protocols are employed as one of the major architectural concepts underlying the Internet. However, their inefficiency, instability, and fragility could potentially be overcome with the help of a novel networking architecture called software-defined networking (SDN). The main property of this architecture is the separation of the control and data planes. To reduce congestion and thus improve latency and throughput, the traffic load must be distributed homogeneously over the different network paths. This paper presents a smart flow steering agent (SFSA) for data flow routing based on current network conditions. To enhance throughput and minimize latency, the SFSA distrib
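As a hedged sketch of condition-aware flow steering (not the SFSA's actual policy, which is not detailed in the excerpt above), a controller can route each new flow onto the candidate path whose bottleneck link is currently least loaded:

```python
def steer_flow(flow_id, paths, link_load):
    """Pick the path whose most-loaded link is least loaded (min-max), a
    common heuristic for spreading traffic over multiple paths.
    `link_load` maps link name -> current utilization in [0, 1]."""
    def bottleneck(path):
        return max(link_load[link] for link in path)
    return min(paths, key=bottleneck)

# Toy topology: two candidate paths between the same pair of hosts.
paths = [("s1-s2", "s2-s4"), ("s1-s3", "s3-s4")]
load = {"s1-s2": 0.7, "s2-s4": 0.2, "s1-s3": 0.3, "s3-s4": 0.4}
print(steer_flow("flow-42", paths, load))  # -> ('s1-s3', 's3-s4')
```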
Because of the rapid growth of electrical instruments used in noxious gas detection, the importance of gas sensors has increased. X-ray diffraction (XRD) can be used to examine the crystal phase structure of sensing materials, which affects their gas-sensing properties. This contributes to the study of the effect of electrochemical synthesis of titanium dioxide (TiO2) materials with various crystal phases, such as rutile TiO2 (R-TiO2NTs) and anatase TiO2 (A-TiO2NTs). In this work, we have studied the effect of voltage on the preparation of TiO2 nanotube arrays via the anodization technique for gas sensor applications. The results acquired from XRD, energy dispersion spectro
The study of triads seeks to address the comprehensive nature of the Qur'anic texts, and the choice fell on the triad of great, painful, and humiliating torment in the Noble Qur'an as an objective study, which is the title of this research. In it, I tried to shed light on these terms and the nuances between them, particularly with respect to torment. The eschatological terminology varies and can be summed up in three terms: the great, the painful, and the humiliating. Among the types of torment, the painful is that which is described by the severity of its pain and its horror, while the humiliating punishment is that which humiliates the one upon whom it falls; the diversity of torment is due to the diversity of sins.
The transport layer is responsible for delivering data to the appropriate application process on the host computers. The two most popular transport-layer protocols are the Transmission Control Protocol (TCP) and the User Datagram Protocol (UDP). TCP is considered one of the most important protocols on the Internet, while UDP is a minimal message-oriented transport-layer protocol. In this paper we compare the performance of TCP and UDP on a wired network. Network Simulator 2 (NS2) is used for the performance comparison since it is preferred by the networking research community. Constant bit rate (CBR) traffic is used for both the TCP and UDP protocols.
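Performance metrics such as throughput are usually extracted from the simulator's trace files; the sketch below computes per-flow throughput under the assumption of the classic NS2 wired trace column layout, which may differ from the scenario's actual trace configuration.

```python
def flow_throughput(trace_path, flow_id, sink_node):
    """Average throughput (bit/s) of one flow from an NS2 wired trace,
    assuming the classic column layout:
    event time from to type size flags fid src dst seq pktid."""
    bits, first_t, last_t = 0, None, None
    with open(trace_path) as trace:
        for line in trace:
            fields = line.split()
            if len(fields) < 12:
                continue
            event, t, to_node = fields[0], float(fields[1]), fields[3]
            size, fid = int(fields[5]), fields[7]
            # Count only packets received ('r') at the sink for this flow id.
            if event == "r" and to_node == sink_node and fid == flow_id:
                bits += size * 8
                first_t = t if first_t is None else first_t
                last_t = t
    if first_t is None or last_t == first_t:
        return 0.0
    return bits / (last_t - first_t)
```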
In this paper, an algorithm is suggested to train a single-layer feedforward neural network to function as a heteroassociative memory. This algorithm enhances the ability of the memory to recall the stored patterns when partially described, noisy input patterns are presented. The algorithm relies on adapting the standard delta rule by introducing new terms, a first-order term and a second-order term, into it. Results show that the heteroassociative neural network trained with this algorithm perfectly recalls the desired stored pattern when 1.6% and 3.2% partially described noisy input patterns are presented.
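For context, the sketch below trains a single-layer heteroassociative memory with the standard delta rule only; the paper's added first- and second-order terms are not reproduced, and the toy bipolar patterns are illustrative.

```python
import numpy as np

def train_delta(X, Y, epochs=200, lr=0.1):
    """Standard delta-rule training of a single-layer heteroassociative
    memory mapping input patterns X (rows) to target patterns Y (rows).
    Only the baseline rule is shown, not the paper's extended update."""
    W = np.zeros((X.shape[1], Y.shape[1]))
    for _ in range(epochs):
        for x, y in zip(X, Y):
            W += lr * np.outer(x, y - x @ W)   # delta-rule update
    return W

def recall(W, x):
    """Bipolar thresholded recall of the associated output pattern."""
    return np.where(x @ W >= 0, 1, -1)

# Toy bipolar associations (orthogonal inputs).
X = np.array([[1, -1, 1, -1], [1, 1, -1, -1]])
Y = np.array([[1, 1, -1], [-1, -1, 1]])
W = train_delta(X, Y)
print(recall(W, np.array([1, -1, 1, 1])))  # noisy copy of X[0] -> recalls Y[0]
```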
Nowadays, datacenters have become more complicated and handle many more user requests. Custom protocols are increasingly in demand, and an advanced load balancer that distributes the requests among servers is essential to serve users quickly and efficiently. P4 introduced a new way to manipulate all packet headers; therefore, by making use of P4's ability to decapsulate the transport-layer header, a new load-balancing algorithm is proposed. The algorithm has three main parts. First, a TCP/UDP separation is used to separate the flows based on the network-layer information about the protocol used in the transport layer. Second, a flow size prediction technique is adopted, which re
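A hedged control-plane sketch of the first stage, TCP/UDP separation followed by flow-to-server mapping, is shown below in Python rather than P4; the server pools and the hashing choice are assumptions, and the flow-size-prediction stage is omitted.

```python
import hashlib

# Illustrative server pools; the paper's actual pool layout is not shown.
TCP_SERVERS = ["10.0.0.1", "10.0.0.2", "10.0.0.3"]
UDP_SERVERS = ["10.0.1.1", "10.0.1.2"]

def pick_server(src_ip, src_port, dst_ip, dst_port, protocol):
    """Separate flows by transport protocol, then map each flow to a server
    in that pool via a stable hash of the 5-tuple so all packets of one flow
    reach the same server. This is a sketch, not the paper's P4 pipeline."""
    pool = TCP_SERVERS if protocol.lower() == "tcp" else UDP_SERVERS
    key = f"{src_ip}|{src_port}|{dst_ip}|{dst_port}|{protocol}".encode()
    digest = hashlib.sha256(key).digest()
    return pool[int.from_bytes(digest[:4], "big") % len(pool)]

print(pick_server("10.1.1.5", 43210, "10.2.2.9", 80, "tcp"))
```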
In recent years, the migration of computational workloads to computational clouds has attracted intruders to target and exploit cloud networks both internally and externally. The investigation of such hazardous network attacks in the cloud network requires comprehensive network forensics methods (NFM) to identify the source of the attack. However, cloud computing lacks NFM that can identify network attacks which affect various cloud resources by spreading through cloud networks. This study is motivated by the need to determine the applicability of current network forensics methods (C-NFMs) to cloud networks in cloud computing. The applicability is evaluated based on strengths, weaknesses, opportunities, and threats (SWOT) to outlook the cloud network. T