In today's world, bioinformatics is developing rapidly, especially with regard to the analysis and study of biological networks. Scientists have used various nature-inspired algorithms to find protein complexes in protein-protein interaction (PPI) networks. These networks help scientists infer the molecular functions of unknown proteins and reveal how cells operate. In PPI networks it is very common for a protein to participate in multiple functions and belong to many complexes, so complexes may overlap. However, developing an efficient and reliable method for detecting overlapping protein complexes remains a challenge, since it is a hard optimization problem. One of the main difficulties in identifying overlapping protein complexes is the accuracy of the partitioning results. To accurately identify the overlapping structure of protein complexes, this paper proposes an overlapping complex detection algorithm termed OCDPSO-Net, which is based on PSO-Net (a well-known modified version of the particle swarm optimization algorithm). The OCDPSO-Net framework consists of three main steps: an initialization strategy, a movement strategy for each particle, and a search-enhancement step that expands the solution space. The proposed algorithm employs the partition density concept to measure the quality of a partitioning of the PPI network into complexes, and optimizes this quantity by working on the line graph of the original graph representing the protein interaction network. The OCDPSO-Net algorithm is applied to the Collins PPI network, and the obtained results are compared with different state-of-the-art algorithms in terms of precision, recall, and F-measure. Experimental results confirm that the proposed algorithm has good clustering performance and outperforms most recent overlapping-detection algorithms.
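As an illustration of the quantity being optimized, the following is a minimal sketch of partition density computed over edge (line-graph) communities, assuming the standard link-community formulation; the exact variant used by OCDPSO-Net may differ in its details.

```python
# Hedged sketch: partition density for a set of link (edge) communities,
# assuming the standard link-community formulation; not the paper's exact code.
import networkx as nx

def partition_density(G, edge_communities):
    """edge_communities: list of sets of edges (u, v) covering G's edges."""
    M = G.number_of_edges()
    total = 0.0
    for edges in edge_communities:
        m_c = len(edges)                              # links in the community
        n_c = len({u for e in edges for u in e})      # nodes induced by those links
        if n_c <= 2:
            continue                                  # contribution taken as 0 for trivial groups
        total += m_c * (m_c - (n_c - 1)) / ((n_c - 2) * (n_c - 1))
    return (2.0 / M) * total

# Toy example: the line graph maps each interaction to a node, so clustering
# line-graph nodes is equivalent to clustering the original PPI edges.
G = nx.karate_club_graph()
L = nx.line_graph(G)                                  # nodes of L are edges of G
print(partition_density(G, [set(G.edges())]))         # single cluster -> overall density
```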
In this paper, the botnet detection problem is formulated as a feature selection problem, and the genetic algorithm (GA) is used to search the entire feature space for the most significant combination of features. Furthermore, a Decision Tree (DT) classifier is used as the objective function to guide the proposed GA toward feature combinations that correctly classify activities into normal traffic and botnet attacks. Two datasets, namely UNSW-NB15 and the Canadian Institute for Cybersecurity Intrusion Detection System 2017 (CICIDS2017), are used as evaluation datasets. The results reveal that the proposed DT-aware GA can effectively find the relevant features from
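A minimal sketch of the DT-aware GA idea described above, assuming binary chromosomes, one-point crossover, bit-flip mutation, and scikit-learn's DecisionTreeClassifier as the fitness evaluator; the paper's actual operators and parameters may differ, and the data loader shown is hypothetical.

```python
# Hedged sketch of GA-based feature selection with a decision-tree fitness function.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)

def fitness(mask, X, y):
    """DT classification accuracy on the selected feature subset (the GA objective)."""
    if mask.sum() == 0:
        return 0.0
    clf = DecisionTreeClassifier(random_state=0)
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

def ga_feature_selection(X, y, pop_size=20, generations=30, p_mut=0.05):
    n = X.shape[1]
    pop = rng.integers(0, 2, size=(pop_size, n))            # binary chromosomes: 1 = keep feature
    for _ in range(generations):
        scores = np.array([fitness(ind, X, y) for ind in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]  # truncation selection
        children = []
        while len(children) < pop_size:
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n)                         # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            child[rng.random(n) < p_mut] ^= 1                # bit-flip mutation
            children.append(child)
        pop = np.array(children)
    scores = np.array([fitness(ind, X, y) for ind in pop])
    return pop[scores.argmax()], scores.max()

# Usage (placeholder standing in for the network-traffic features):
# X, y = load_unsw_nb15()   # hypothetical loader
# best_mask, best_acc = ga_feature_selection(X, y)
```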
Credit card fraud has become an increasing problem due to the growing reliance on electronic payment systems and technological advances that have also improved fraud techniques. Numerous financial institutions are looking for the best ways to leverage technological advancements to provide better services to their end users, and researchers have used various protection methods to provide security and privacy for credit cards. Therefore, it is necessary to identify the challenges and the solutions proposed to address them. This review provides an overview of the most recent research on the detection of fraudulent credit card transactions to protect those transactions from tampering or improper use, which includes imbalance classes, c
Cutaneous leishmaniasis is one of the endemic diseases in Iraq. It is considered a widespread health problem and remains an uncontrolled disease. The aim of the study is to identify the Leishmania species that cause skin lesions among patients in Thi-Qar Province, south of Iraq, and to detect some virulence factors of L. tropica. The study covered three local sites, Al-Hussein Teaching, Suq Al-Shyokh General, and Al-Shatrah General Hospitals in the province, for the period from the beginning of December 2018 to the end of September 2019. Samples were collected from 80 patients suffering from cutaneous leishmaniasis, of both genders, different ages, various places of residence, and with single or multiple lesions. The Nested-PCR technique was
In this paper, a Modified Weighted Low Energy Adaptive Clustering Hierarchy (MW-LEACH) protocol is implemented to improve the Quality of Service (QoS) in a Wireless Sensor Network (WSN) with a mobile sink node. The Quality of Service is measured in terms of Throughput Ratio (TR), Packet Loss Ratio (PLR), and Energy Consumption (EC). The protocol is implemented in a Python-based simulation. Simulation results show that the proposed protocol improves the Quality of Service by 63% in comparison with the Weighted Low Energy Adaptive Clustering Hierarchy (W-LEACH) protocol.
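The three QoS metrics can be computed as below; this is a minimal sketch using common formulations of TR, PLR, and EC, which may not match the exact definitions used in the paper's simulation.

```python
# Hedged sketch of the QoS metrics reported for MW-LEACH (illustrative definitions only).
def throughput_ratio(packets_received, packets_sent):
    """Fraction of sent packets that reach the sink (TR)."""
    return packets_received / packets_sent if packets_sent else 0.0

def packet_loss_ratio(packets_received, packets_sent):
    """Fraction of sent packets lost in transit (PLR = 1 - TR)."""
    return 1.0 - throughput_ratio(packets_received, packets_sent)

def energy_consumption(initial_energy, residual_energy):
    """Total energy (J) spent by all nodes during the simulation (EC)."""
    return sum(e0 - e for e0, e in zip(initial_energy, residual_energy))

# Example: 10 nodes starting with 0.5 J each, 940 of 1000 packets delivered.
print(throughput_ratio(940, 1000), packet_loss_ratio(940, 1000))
print(energy_consumption([0.5] * 10, [0.31, 0.40, 0.28, 0.35, 0.42, 0.30, 0.33, 0.38, 0.29, 0.36]))
```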
This article proposes a new strategy based on a hybrid method that combines the gravitational search algorithm (GSA) with the bat algorithm (BAT) to solve single-objective optimization problems. It first runs GSA, followed by BAT as the second step. The proposed approach relies on a parameter between 0 and 1 to address the problem of becoming trapped in local optima: without a local search mechanism, search intensification is weak while diversity remains high, and the search easily settles in a local optimum. The improvement retains the speed of the original BAT and accelerates convergence toward the best solution. All solutions in the population are updated before the proposed algorithm terminates. The diversification f
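A minimal sketch of the GSA-then-BAT control flow, assuming a switch parameter in [0, 1] that allocates iterations between the two phases; the update rules are simplified stand-ins for the full GSA and BAT equations rather than the paper's exact implementation.

```python
# Hedged sketch of a GSA phase followed by a BAT phase on a single-objective problem.
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):                          # example objective (minimization)
    return float(np.sum(x ** 2))

def hybrid_gsa_bat(f, dim=10, pop=30, iters=200, lam=0.5, bounds=(-5.0, 5.0)):
    lo, hi = bounds
    X = rng.uniform(lo, hi, (pop, dim))
    V = np.zeros((pop, dim))
    split = int(lam * iters)            # first `split` iterations: GSA; the rest: BAT
    best = X[np.argmin([f(x) for x in X])].copy()

    for t in range(iters):
        fit = np.array([f(x) for x in X])
        if fit.min() < f(best):
            best = X[fit.argmin()].copy()

        if t < split:
            # GSA-like step: masses from fitness, acceleration toward heavier agents.
            m = (fit.max() - fit) / (fit.max() - fit.min() + 1e-12)
            M = m / (m.sum() + 1e-12)
            G = 100 * np.exp(-20 * t / iters)        # decaying gravitational constant
            acc = np.zeros_like(X)
            for i in range(pop):
                diff = X - X[i]
                dist = np.linalg.norm(diff, axis=1, keepdims=True) + 1e-12
                acc[i] = np.sum(rng.random((pop, 1)) * G * M[:, None] * diff / dist, axis=0)
            V = rng.random((pop, dim)) * V + acc
        else:
            # BAT-like step: frequency-tuned velocities relative to the best solution.
            freq = rng.random((pop, 1))              # frequencies in [0, 1]
            V = V + (X - best) * freq
        X = np.clip(X + V, lo, hi)
    return best, f(best)

best, val = hybrid_gsa_bat(sphere)
print(val)
```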
Smart computing devices (sensors, actuators, Wi-Fi routers) nowadays generate huge and unpredictable volumes of data, and handling their computational processing in a real-time environment on a centralized cloud platform is difficult because of the cloud's limitations, issues, and challenges. To overcome these, Cisco introduced the fog computing paradigm as an alternative to cloud-based computing. This recent IT trend is taking the computing experience to the next level as an advantageous extension of centralized cloud computing technology. In this article, we highlight the various issues that cloud computing currently faces. Here
Most Internet of Vehicles (IoV) applications are delay-sensitive and require resources for data storage and task processing that are very difficult for vehicles to afford. Such tasks are often offloaded to more powerful entities, like cloud and fog servers. Fog computing is a decentralized infrastructure located between the data source and the cloud, and it supplies several benefits that make it a non-trivial extension of the cloud. The high volume of data generated by vehicles' sensors and the limited computation capabilities of vehicles have imposed several challenges on VANET systems. Therefore, VANETs are integrated with fog computing to form a paradigm, namely Vehicular Fog Computing (VFC), which provides low-latency services to mo
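Purely as an illustration of the offloading trade-off that motivates VFC, the following sketch picks between local, fog, and cloud execution by estimated completion time; all parameter values are hypothetical and not taken from the paper.

```python
# Hedged, illustrative sketch of a latency-driven offloading decision (local vs fog vs cloud).
def completion_time(task_cycles, task_bits, cpu_hz, link_bps=None, rtt_s=0.0):
    """Round-trip plus transmission delay (if offloaded) plus processing delay."""
    tx = task_bits / link_bps if link_bps else 0.0
    return rtt_s + tx + task_cycles / cpu_hz

task_cycles, task_bits = 2e9, 4e6          # a 2-gigacycle task with a 4-Mbit payload (hypothetical)
options = {
    "vehicle": completion_time(task_cycles, task_bits, cpu_hz=1e9),
    "fog":     completion_time(task_cycles, task_bits, cpu_hz=5e9,  link_bps=50e6, rtt_s=0.005),
    "cloud":   completion_time(task_cycles, task_bits, cpu_hz=20e9, link_bps=10e6, rtt_s=0.15),
}
print(min(options, key=options.get), options)   # the nearby fog node wins on latency here
```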