Reliable data transfer and energy efficiency are essential considerations for network performance in resource-constrained underwater environments. One of the efficient approaches for data routing in underwater wireless sensor networks (UWSNs) is clustering, in which data packets are transferred from sensor nodes to a cluster head (CH). Data packets are then forwarded to a sink node in a single-hop or multi-hop manner, which can increase the energy depletion of the CH compared to other nodes. While several mechanisms have been proposed for cluster formation and CH selection to ensure efficient delivery of data packets, less attention has been given to the massive data communication process with the sink node. As such, failure of communicating nodes can lead to a significant void-hole problem in the network. Considering the limited energy resources of nodes in UWSNs along with the heavy load on CHs in the routing process, this paper proposes a void-hole-aware and reliable data forwarding strategy (VHARD-FS) that operates in a proactive mode to control data packet delivery from CH nodes to the sink in UWSNs. In the proposed strategy, each CH node is aware of its neighbors' performance ranking index and conducts reliable packet transmission to the sink via the most energy-efficient route. Extensive simulation results indicate that VHARD-FS outperforms existing routing approaches in terms of energy efficiency and network throughput. This study helps to effectively alleviate the resource limitations associated with UWSNs by extending network lifetime and increasing service availability even in a harsh underwater environment.
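The abstract does not reproduce the full VHARD-FS algorithm, so the following is only a minimal sketch of the core idea it describes: each CH keeps a performance ranking index for its neighboring CHs and forwards packets toward the sink through the highest-ranked neighbor, treating the absence of any forward-progress neighbor as a void hole. The field names, the residual-energy/progress metrics, and the weighting are assumptions for illustration, not the paper's actual index.

```python
from dataclasses import dataclass

@dataclass
class NeighborCH:
    node_id: int
    residual_energy: float   # joules remaining (assumed metric)
    depth: float             # distance to the sink; smaller = closer (assumed metric)

def ranking_index(neighbor: NeighborCH, my_depth: float,
                  w_energy: float = 0.6, w_progress: float = 0.4) -> float:
    """Hypothetical performance ranking index: favors neighbors with more
    residual energy and more progress toward the sink. Weights are assumed."""
    progress = my_depth - neighbor.depth          # positive = closer to the sink
    return w_energy * neighbor.residual_energy + w_progress * progress

def select_next_hop(my_depth: float, neighbors: list[NeighborCH]):
    """Pick the highest-ranked neighbor that actually advances toward the sink.
    Returning None models a void hole: no neighbor offers forward progress."""
    candidates = [n for n in neighbors if n.depth < my_depth and n.residual_energy > 0.0]
    if not candidates:
        return None                               # void hole detected; caller must recover
    return max(candidates, key=lambda n: ranking_index(n, my_depth))

# Example: a CH at depth 120 m choosing among three neighboring CHs.
neighbors = [NeighborCH(1, 4.2, 90.0), NeighborCH(2, 1.1, 60.0), NeighborCH(3, 3.8, 150.0)]
print(select_next_hop(120.0, neighbors))          # node 2 wins with these example weights
```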
Business organizations have faced many challenges in recent times, the most important of which is information technology, because it is widespread and easy to use. Its use has led to an increase in the amount of data that business organizations deal with in an unprecedented manner. The amount of data available through the internet is a problem that many parties seek solutions for: why is it available there in such huge, unorganized quantities? Forecasts indicated that by 2017 the number of devices connected to the internet would be roughly three times the population of the Earth, and that in 2015 more than one and a half billion gigabytes of data were transferred every minute globally. Thus, the so-called data mining emerged as a
Extracting knowledge from raw data has delivered beneficial information in several domains. The prevalent use of social media has produced extraordinary quantities of social data. Simply put, social media provides an accessible platform for users to share information. Data mining is able to reveal relevant patterns that can be useful for users, businesses, and customers. Social media data are noisy, massive, unstructured, and dynamic in nature, so new challenges arise. The purpose of this study is to investigate the data mining methods used on social networks, adopting an investigation plan based on defined criteria and selecting a number of papers to serve as the foundation for this article.
Zigbee is considered one of the wireless sensor network (WSN) technologies designed for short-range communication applications. It follows the IEEE 802.15.4 specifications, which aim at networks with the lowest possible cost and power consumption in addition to the minimum possible data rate. In this paper, a Zigbee transmitter system is designed based on the PHY-layer specifications of this standard. The modulation technique applied in this design is offset quadrature phase-shift keying (OQPSK) with half-sine pulse shaping, used to achieve the minimum possible number of phase transitions. In addition, the applied spreading technique is direct-sequence spread spectrum (DSSS), which has
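As a rough illustration of the modulation step described above, the NumPy sketch below maps a binary chip stream onto offset I/Q branches with half-sine pulse shaping, in the style of the IEEE 802.15.4 2.4 GHz PHY. The chip sequence used here is a random placeholder rather than the standard's 32-chip PN table, and the oversampling factor is an assumption.

```python
import numpy as np

def oqpsk_halfsine(chips, samples_per_chip: int = 8):
    """Map a binary chip stream to half-sine-shaped OQPSK I/Q waveforms.
    Even-indexed chips drive the I branch, odd-indexed chips the Q branch,
    and the Q branch is delayed by one chip period (the 'offset')."""
    chips = 2 * np.asarray(chips, dtype=float) - 1.0       # 0/1 -> -1/+1
    i_chips, q_chips = chips[0::2], chips[1::2]

    # Half-sine pulse spanning one chip pair (2 * samples_per_chip samples).
    t = np.arange(2 * samples_per_chip)
    pulse = np.sin(np.pi * t / (2 * samples_per_chip))

    def shape(branch):
        out = np.zeros((len(branch) + 1) * 2 * samples_per_chip)
        for k, c in enumerate(branch):
            start = k * 2 * samples_per_chip
            out[start:start + len(pulse)] += c * pulse
        return out

    i_wave = shape(i_chips)
    q_wave = np.roll(shape(q_chips), samples_per_chip)      # half-symbol offset
    return i_wave, q_wave

# Placeholder chip sequence (NOT the 802.15.4 32-chip PN table).
chips = np.random.randint(0, 2, 64)
i_wave, q_wave = oqpsk_halfsine(chips)
print(i_wave.shape, q_wave.shape)
```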
Steganography is a technique of concealing secret data within other ordinary files of the same or different types. Hiding data has been essential to digital information security. This work aims to design a stego method that can effectively hide a message inside the frames of a video file. In this work, a video steganography model is proposed by training a model to hide a video (or images) within another video using convolutional neural networks (CNNs). By using a CNN in this approach, two main goals of any steganographic method can be achieved: the first is increased security (difficulty of being observed and broken by steganalysis programs), which is achieved in this work because the weights and architecture are randomized. Thus,
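The excerpt does not give the network architecture, so the PyTorch sketch below only illustrates the general concat-and-convolve idea behind CNN-based hiding: a cover frame and a secret frame are stacked channel-wise and passed through a small convolutional stack that emits a stego frame. Layer sizes, the loss term, and the random input frames are assumptions, not the authors' design.

```python
import torch
import torch.nn as nn

class HidingNet(nn.Module):
    """Toy hiding network: cover frame + secret frame -> stego frame.
    The 6 input channels are the RGB cover and RGB secret stacked together."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(6, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, kernel_size=3, padding=1), nn.Sigmoid(),  # stego in [0, 1]
        )

    def forward(self, cover, secret):
        return self.body(torch.cat([cover, secret], dim=1))

# One forward/backward pass on random frames; a full method would also need
# a reveal network and a loss on the recovered secret.
net = HidingNet()
cover = torch.rand(4, 3, 64, 64)    # batch of 4 cover frames
secret = torch.rand(4, 3, 64, 64)   # batch of 4 secret frames
stego = net(cover, secret)
loss = nn.functional.mse_loss(stego, cover)   # keep the stego visually close to the cover
loss.backward()
print(stego.shape, float(loss))
```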
One of the most interesting problems that has recently attracted many research investigations in protein-protein interaction (PPI) networks is the complex detection problem. Detecting natural divisions in such complex networks is proven to be an extremely NP-hard problem, wherein the field of Evolutionary Algorithms (EAs) has recently shown positive results. The contribution of this work is to introduce a heuristic operator, called protein-complex attraction and repulsion, which is especially tailored for the complex detection problem and enables the EA to improve its detection ability. The proposed heuristic operator is designed to fine-grain the structure of a complex by dividing it into two more complexes, each being distinguished by a core protein
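The exact attraction-and-repulsion operator is not defined in the excerpt above, so the following is only an illustrative sketch of the splitting idea: two seed (core) proteins are chosen inside a detected complex, and every remaining protein joins the core it interacts with. The seeding rule (highest-degree members), the attraction measure, and the tie-breaking are assumptions for illustration, not the authors' operator.

```python
def split_complex(members, edges):
    """Illustrative split of one protein complex into two sub-complexes,
    each grown around a 'core' protein (here: the two highest-degree members).

    members: set of protein ids in the complex
    edges:   set of frozensets {u, v} for PPI interactions inside the complex
    """
    degree = {m: sum(1 for e in edges if m in e) for m in members}
    core_a, core_b = sorted(members, key=lambda m: (degree[m], m), reverse=True)[:2]

    def attraction(protein, core):
        # Attraction here is simply whether the protein interacts with the core.
        return 1 if frozenset({protein, core}) in edges else 0

    part_a, part_b = {core_a}, {core_b}
    for p in members - {core_a, core_b}:
        # Each protein joins the core it is more attracted to; ties go to core_a.
        (part_a if attraction(p, core_a) >= attraction(p, core_b) else part_b).add(p)
    return part_a, part_b

members = {"P1", "P2", "P3", "P4", "P5"}
edges = {frozenset(e) for e in [("P1", "P2"), ("P1", "P3"), ("P2", "P3"), ("P2", "P4"), ("P4", "P5")]}
print(split_complex(members, edges))   # e.g. ({'P2', 'P1', 'P3'}, {'P4', 'P5'})
```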
The e-health care system is one of the great technology enhancements, realized by using medical devices with sensors worn on or implanted in the patient's body. A wireless body area network (WBAN) offers remarkable help through wireless transmission of the patient's data over an agreed distance, keeping the patient's status under constant supervision by regularly transmitting vital-sign indications to the receiver. Security and privacy are major concerns for the data sent from the WBAN and its biological sensors. Several algorithms have been proposed, under various hypotheses, to find optimum solutions. In this paper, an encryption algorithm based on the hyper-chaotic Zhou system is proposed, as it provides high security, privacy, efficiency and
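The excerpt does not reproduce the hyper-chaotic Zhou system equations or the exact cipher construction, so the sketch below only shows the general pattern of chaos-based stream encryption for WBAN sensor readings: a chaotic map iterated from a secret initial state produces a keystream that is XOR-ed with the plaintext bytes. A simple logistic map is deliberately used here as a stand-in for the Zhou system, and the key-to-state mapping and quantization are assumptions.

```python
def chaotic_keystream(length: int, x0: float = 0.7131, r: float = 3.99) -> bytes:
    """Generate `length` keystream bytes from a logistic-map iteration.
    NOTE: the logistic map is only a stand-in for the hyper-chaotic Zhou
    system used in the paper; x0 and r play the role of the secret key."""
    x, out = x0, []
    for _ in range(length):
        x = r * x * (1.0 - x)                # chaotic iteration
        out.append(int(x * 256) % 256)       # quantize the state to one byte
    return bytes(out)

def xor_cipher(data: bytes, x0: float = 0.7131, r: float = 3.99) -> bytes:
    """Encrypt or decrypt by XOR-ing with the chaotic keystream (symmetric)."""
    ks = chaotic_keystream(len(data), x0, r)
    return bytes(d ^ k for d, k in zip(data, ks))

reading = b"HR=072;SPO2=98;TEMP=36.6"        # example vital-sign packet
ciphertext = xor_cipher(reading)
print(ciphertext)
print(xor_cipher(ciphertext))                # the same call decrypts back to the original
```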
In this paper, the botnet detection problem is defined as a feature selection problem, and the genetic algorithm (GA) is used to search for the best significant combination of features from the entire search space of feature subsets. Furthermore, the decision tree (DT) classifier is used as an objective function to direct the ability of the proposed GA to locate the combination of features that can correctly classify the activities into normal traffic and botnet attacks. Two datasets, namely UNSW-NB15 and the Canadian Institute for Cybersecurity Intrusion Detection System 2017 (CICIDS2017) dataset, are used for evaluation. The results reveal that the proposed DT-aware GA can effectively find the relevant features from
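The wrapper idea described above can be sketched compactly: a GA evolves binary masks over the feature set, and the fitness of each mask is the validation accuracy of a decision tree trained on the selected features only. In this minimal sketch the population size, crossover/mutation rates, and the synthetic data are assumptions for illustration; the paper's UNSW-NB15 and CICIDS2017 datasets are not loaded here.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.datasets import make_classification

# Synthetic stand-in data; the paper evaluates on UNSW-NB15 / CICIDS2017 instead.
X, y = make_classification(n_samples=600, n_features=30, n_informative=8, random_state=0)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, random_state=0)
rng = np.random.default_rng(0)

def fitness(mask: np.ndarray) -> float:
    """Decision-tree accuracy on the validation split using only the selected features."""
    if mask.sum() == 0:
        return 0.0
    clf = DecisionTreeClassifier(random_state=0).fit(X_tr[:, mask == 1], y_tr)
    return clf.score(X_va[:, mask == 1], y_va)

# Simple generational GA over binary feature masks (assumed hyper-parameters).
pop = rng.integers(0, 2, size=(20, X.shape[1]))
for generation in range(15):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-10:]]                 # truncation selection
    children = []
    while len(children) < len(pop):
        a, b = parents[rng.integers(10)], parents[rng.integers(10)]
        cut = rng.integers(1, X.shape[1])                   # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        flip = rng.random(X.shape[1]) < 0.02                # bit-flip mutation
        children.append(np.where(flip, 1 - child, child))
    pop = np.array(children)

best = max(pop, key=fitness)
print("selected features:", np.flatnonzero(best), "accuracy:", round(fitness(best), 3))
```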
The objective of this research is to employ special cases of the trapezoidal membership function in the composition of fuzzy sets for decision making within the framework of traditional game theory, in order to determine the best strategy for the mobile phone networks in the provinces of Baghdad and Basra. Different intervals of the membership functions were adopted to observe the changes occurring in the payoff matrix and the resulting impact on the strategies and decisions available to each player, as well as the impact on society
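For reference, the sketch below implements the standard trapezoidal membership function and its special cases (triangular when b = c, and a rectangular/crisp interval when a = b and c = d), which is the kind of function the study composes its fuzzy sets from. The fuzzified payoff value shown at the end is purely illustrative and does not use the study's data for the Baghdad and Basra networks.

```python
def trapezoid(x: float, a: float, b: float, c: float, d: float) -> float:
    """Trapezoidal membership function with support [a, d] and core [b, c].
    Special cases: b == c gives a triangular function; a == b and c == d
    gives a rectangular (crisp interval) function."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

# Illustrative fuzzification of one payoff entry (not the study's data):
# "about 5" modeled as the triangular special case of the trapezoid.
for x in (3.0, 4.5, 5.0, 6.0):
    print(x, trapezoid(x, a=3.0, b=5.0, c=5.0, d=7.0))
```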
Krawtchouk polynomials (KPs) and their moments are promising techniques for applications in information theory, coding theory, and signal processing. This is due to the special capabilities of KPs in feature extraction and classification processes. The main challenge in existing KP recurrence algorithms is numerical error, which occurs during the computation of the coefficients for large polynomial sizes, particularly when the KP parameter (p) deviates away from 0.5 toward 0 or 1. To this end, this paper proposes a new recurrence relation to compute the coefficients of KPs at high orders. In particular, this paper discusses the development of a new algorithm and presents a new mathematical model for computing the
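The paper's new recurrence relation is not reproduced in the excerpt, so the sketch below only evaluates the classical three-term recurrence in n for the Krawtchouk polynomials K_n(x; p, N). This is the textbook relation whose numerical behavior at high orders and at p far from 0.5 motivates the work, not the authors' new algorithm; the sizes used in the example are arbitrary.

```python
import numpy as np

def krawtchouk(n_max: int, p: float, N: int) -> np.ndarray:
    """Evaluate K_n(x; p, N) for n = 0..n_max and x = 0..N (assumes n_max < N)
    using the classical three-term recurrence, normalized so that K_0 = 1:

        p(N-n) K_{n+1}(x) = [p(N-n) + n(1-p) - x] K_n(x) - n(1-p) K_{n-1}(x)

    Returns an array K of shape (n_max+1, N+1) with K[n, x] = K_n(x; p, N).
    This plain recurrence accumulates numerical error for large n and for
    p far from 0.5, which is the problem the paper's new relation targets.
    """
    x = np.arange(N + 1, dtype=float)
    K = np.zeros((n_max + 1, N + 1))
    K[0] = 1.0
    if n_max >= 1:
        K[1] = 1.0 - x / (p * N)
    for n in range(1, n_max):
        K[n + 1] = ((p * (N - n) + n * (1 - p) - x) * K[n]
                    - n * (1 - p) * K[n - 1]) / (p * (N - n))
    return K

K = krawtchouk(n_max=8, p=0.5, N=16)
print(K[2, :5])   # first few values of K_2(x; 0.5, 16)
```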