A missing data imputation method based on salp swarm algorithm for diabetes disease

Most medical datasets suffer from missing data, due to the expense of some tests or human error while recording them. This issue degrades the performance of machine learning models because the values of some features are missing. There is therefore a need for dedicated methods for imputing these missing data. In this research, the salp swarm algorithm (SSA) is used to generate and impute the missing values in the Pima Indian diabetes disease (PIDD) dataset; the proposed algorithm is called ISSA. The obtained results show that the classification performance of three different classifiers, support vector machine (SVM), K-nearest neighbour (KNN), and naïve Bayes classifier (NBC), is enhanced compared to the dataset before applying the proposed method. Moreover, the results indicate that ISSA performed better than statistical imputation techniques such as deleting the samples with missing values or replacing the missing values with zeros, the mean, or random values.
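The statistical baselines the abstract compares against (zero, mean, and random imputation) can be illustrated with a minimal sketch. This is not the SSA/ISSA method itself, and the toy column values are invented for illustration:

```python
# Hypothetical sketch of the statistical imputation baselines mentioned above:
# fill a missing feature value with zero, the column mean, or a random draw
# from the observed values. The SSA-based imputation itself is not shown.
import random

def impute(column, strategy="mean"):
    """Fill None entries in a numeric column using a simple baseline strategy."""
    observed = [v for v in column if v is not None]
    if strategy == "zero":
        fill = lambda: 0.0
    elif strategy == "mean":
        m = sum(observed) / len(observed)
        fill = lambda: m
    elif strategy == "random":
        fill = lambda: random.choice(observed)
    else:
        raise ValueError(strategy)
    return [v if v is not None else fill() for v in column]

glucose = [148.0, None, 183.0, None, 137.0]  # toy column with missing entries
print(impute(glucose, "zero"))   # missing entries become 0.0
print(impute(glucose, "mean"))   # missing entries become 156.0
```

A classifier such as SVM, KNN, or NBC can then be trained on each imputed variant to compare downstream accuracy, which is the evaluation protocol the abstract describes.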

Scopus Crossref
Publication Date
January 1, 2019
Journal Name
International Journal of Advanced Computer Science and Applications
Achieving Flatness: Honeywords Generation Method for Passwords Based on User Behaviours

Crossref (3)
Crossref
Publication Date
February 20, 2017
Journal Name
Ibn Al-Haitham Journal for Pure and Applied Sciences
A Hybrid Algorithm to Protect Computer Networks Based on Human Biometrics and Computer Attributes

The objective of this work is to combine human biometric characteristics with unique attributes of the computer in order to protect computer networks and resource environments through the development of authentication and authorization techniques. On the human biometric side, the best methods and algorithms were studied, and the conclusion is that the fingerprint is the best, although it has some flaws. The fingerprint algorithm was improved so that its performance enhances the clarity of the ridge and valley structures of fingerprint images, taking into account the estimation of the orientation and frequency of the neighbouring ridges. On the computer side, a computer and its components, like a human, have unique …
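The general idea of binding a biometric factor to machine attributes can be sketched as follows. This is a hedged illustration only: the paper's actual fingerprint enhancement and attribute set are not reproduced, and all names and values here are hypothetical:

```python
# Hypothetical sketch: hash a fingerprint template together with computer
# attributes so the resulting credential is valid only for one user on one
# machine. The template bytes and attribute keys below are invented.
import hashlib

def derive_credential(fingerprint_template: bytes, machine_attrs: dict) -> str:
    """Bind a biometric template to computer attributes via a SHA-256 digest."""
    # canonical ordering so the same inputs always yield the same credential
    attr_blob = "|".join(f"{k}={machine_attrs[k]}" for k in sorted(machine_attrs))
    h = hashlib.sha256()
    h.update(fingerprint_template)
    h.update(attr_blob.encode())
    return h.hexdigest()

cred = derive_credential(b"minutiae-bytes", {"mac": "AA:BB:CC", "cpu_id": "0xF00"})
print(cred[:16])  # stable prefix for this (user, machine) pair
```

Changing either the biometric template or any machine attribute yields a completely different credential, which is the coupling the abstract describes.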
Publication Date
November 16, 2016
Journal Name
EURASIP Journal on Wireless Communications and Networking
Evaluation of efficient vehicular ad hoc networks based on a maximum distance routing algorithm

Traffic management at road intersections is a complex requirement that has been an important topic of research and discussion. Solutions have been primarily focused on using vehicular ad hoc networks (VANETs). Key issues in VANETs are high mobility, restriction of road setup, frequent topology variations, failed network links, and timely communication of data, which make the routing of packets to a particular destination problematic. To address these issues, a new dependable routing algorithm is proposed, which utilizes a wireless communication system between vehicles in urban vehicular networks. This routing is position-based, known as the maximum distance on-demand routing algorithm (MDORA). It aims to find an optimal route on a hop-by-hop …
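The hop-by-hop, position-based selection described above can be sketched minimally: among neighbours within radio range that make progress toward the destination, pick the one at maximum distance from the sender. This is only an illustration of the selection rule; MDORA's full route discovery and reliability checks are not modelled, and the coordinates and range are invented:

```python
# Minimal sketch of maximum-distance next-hop selection (illustrative only).
import math

def next_hop(current, destination, neighbours, radio_range=250.0):
    """Pick the farthest in-range neighbour that moves toward the destination."""
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    candidates = [n for n in neighbours
                  if dist(current, n) <= radio_range
                  and dist(n, destination) < dist(current, destination)]
    # farthest reachable neighbour that still makes forward progress
    return max(candidates, key=lambda n: dist(current, n), default=None)

# (260, 0) is out of range and (-100, 0) moves away, so (240, 0) wins
print(next_hop((0, 0), (1000, 0), [(200, 10), (240, 0), (260, 0), (-100, 0)]))
# → (240, 0)
```

Maximizing per-hop distance reduces the number of hops to the destination, which is the intuition behind the algorithm's name.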
Scopus (26)
Crossref (18)
Scopus Clarivate Crossref
Publication Date
June 1, 2023
Journal Name
IAES International Journal of Artificial Intelligence (IJ-AI)
Innovations in t-way test creation based on a hybrid hill climbing-greedy algorithm

In combinatorial testing, the construction of covering arrays is the key challenge, owing to the multiple aspects that influence it. A wide range of combinatorial problems can be solved using metaheuristic and greedy techniques. Combining a greedy technique with a metaheuristic search technique such as hill climbing (HC) can produce feasible results for combinatorial tests. Metaheuristic-based methods are used to deal with tuples that may be left after redundancy elimination by greedy strategies; the result is then assured to be near-optimal by the metaheuristic algorithm. As a result, the use of both greedy and HC algorithms in a single test-generation system is a good candidate if constructed correctly. …
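The hybrid greedy-plus-hill-climbing idea can be sketched for 2-way (pairwise) coverage: each test row starts from a greedy random seed and is then improved by hill climbing on single-value mutations until no mutation covers more of the remaining pairs. This is a toy illustration under invented parameter sets, not the paper's actual construction:

```python
# Hedged sketch: greedy seeding plus hill climbing for a pairwise (t=2) suite.
import itertools
import random

params = [[0, 1], [0, 1], [0, 1, 2]]          # toy system: 3 parameters
all_pairs = {(i, vi, j, vj)
             for i, j in itertools.combinations(range(len(params)), 2)
             for vi in params[i] for vj in params[j]}

def covered(row):
    """2-way tuples exercised by one test row."""
    return {(i, row[i], j, row[j])
            for i, j in itertools.combinations(range(len(row)), 2)}

def gain(row, uncovered):
    return len(covered(row) & uncovered)

random.seed(1)
suite, uncovered = [], set(all_pairs)
while uncovered:
    row = tuple(random.choice(v) for v in params)        # greedy random seed
    improved = True
    while improved:                                      # hill-climbing step
        improved = False
        for i in range(len(params)):
            for v in params[i]:
                cand = row[:i] + (v,) + row[i + 1:]
                if gain(cand, uncovered) > gain(row, uncovered):
                    row, improved = cand, True
    if gain(row, uncovered) == 0:                        # safety: exhaustive pick
        row = max(itertools.product(*params), key=lambda r: gain(r, uncovered))
    suite.append(row)
    uncovered -= covered(row)

print(f"{len(suite)} tests cover all {len(all_pairs)} pairs")
```

The exhaustive fallback guarantees progress on this toy instance; real covering-array generators replace it with smarter tuple-driven construction.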
Scopus (3)
Crossref (4)
Scopus Crossref
Publication Date
January 1, 2020
Journal Name
Journal of Southwest Jiaotong University
Improved Structure of Data Encryption Standard Algorithm

The Internet provides vital communications between millions of individuals. It is also increasingly utilized as a commerce tool; thus, security is of high importance for securing communications and protecting vital information. Cryptography algorithms are essential in the field of security. Brute-force attacks are the major attacks on the Data Encryption Standard. This is the main reason that warranted the need for an improved structure of the Data Encryption Standard algorithm. This paper proposes a new, improved structure for the Data Encryption Standard to make it secure and immune to attacks. The improved structure was accomplished using the standard Data Encryption Standard with a new way of two-key generation …
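The brute-force concern is easy to quantify: single DES uses a 56-bit key, so the whole key space can be enumerated in modest time on modern hardware. The search rate below is a hypothetical figure for illustration, and the paper's two-key improvement is not reproduced here:

```python
# Back-of-envelope on why brute force threatens single DES: a 56-bit key
# space is small by modern standards. The rate is an illustrative assumption.
keyspace = 2 ** 56          # DES effective key size: 56 bits
rate = 10 ** 12             # hypothetical: one trillion keys tried per second
seconds = keyspace / rate
print(f"{keyspace:.2e} keys, ~{seconds / 3600:.1f} hours at 1e12 keys/s")
```

Any improvement that effectively enlarges the key space, such as deriving two independent keys, pushes this exhaustive search back out of practical reach.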
Publication Date
October 28, 2015
Journal Name
Journal of Mathematics and System Science
Simulating Particle Swarm Optimization Algorithm to Estimate Likelihood Function of ARMA(1, 1) Model

Crossref
Publication Date
December 31, 2022
Journal Name
International Journal of Intelligent Engineering and Systems
Using Three-Dimensional Logistic Equations and Glowworm Swarm Optimization Algorithm to Generate S-Box

Scopus (1)
Scopus Crossref
Publication Date
January 1, 2019
Journal Name
Advances on Computational Intelligence in Energy
A Theoretical Framework for Big Data Analytics Based on Computational Intelligent Algorithms with the Potential to Reduce Energy Consumption

Within the framework of big data, energy issues are highly significant. Despite the significance of energy, theoretical studies focusing primarily on the issue of energy within big data analytics in relation to computational intelligent algorithms are scarce. The purpose of this study is to explore the theoretical aspects of energy issues in big data analytics in relation to computational intelligent algorithms, since this is critical in exploring the empirical aspects of big data. In this chapter, we present a theoretical study of energy issues related to applications of computational intelligent algorithms in big data analytics. This work highlights that big data analytics using computational intelligent algorithms generates a very high amount …
Scopus (1)
Scopus Crossref
Publication Date
February 9, 2023
Journal Name
Artificial Intelligence Review
Community detection model for dynamic networks based on hidden Markov model and evolutionary algorithm

Finding communities of connected individuals in complex networks is challenging, yet crucial for understanding different real-world societies and their interactions. Recently, attention has turned to discovering the dynamics of such communities. However, detecting accurate community structures that evolve over time adds additional challenges. Almost all the state-of-the-art algorithms are designed on seemingly the same principle, treating the problem as a coupled optimization model that simultaneously identifies community structures and their evolution over time. Unlike all these studies, the current work aims to individually consider these three measures, i.e. intra-community score, inter-community score, and evolution of community over time …
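Two of the three measures named above can be sketched independently on a toy undirected graph: the intra-community score as the fraction of edges inside communities, and the inter-community score as the fraction crossing them. The edge list and partition are invented for illustration, and the evolutionary component over time is not modelled:

```python
# Hedged sketch of intra- and inter-community scores on a toy graph.
edges = [(0, 1), (1, 2), (0, 2),      # community A: {0, 1, 2}
         (3, 4), (4, 5), (3, 5),      # community B: {3, 4, 5}
         (2, 3)]                      # one bridge between A and B
community = {0: "A", 1: "A", 2: "A", 3: "B", 4: "B", 5: "B"}

# fraction of edges whose endpoints share a community, and its complement
intra = sum(community[u] == community[v] for u, v in edges) / len(edges)
inter = 1 - intra
print(f"intra={intra:.3f} inter={inter:.3f}")  # → intra=0.857 inter=0.143
```

A good partition drives the intra-community score up and the inter-community score down; treating the two separately, rather than as one coupled objective, is the distinction the abstract draws.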
Scopus (5)
Crossref (2)
Scopus Clarivate Crossref
Publication Date
June 12, 2011
Journal Name
Baghdad Science Journal
An algorithm for binary codebook design based on the average bitmap replacement error (ABPRE)

In this paper, an algorithm for binary codebook design is used in the vector quantization technique, which is employed to improve the acceptability of the absolute moment block truncation coding (AMBTC) method. The vector quantization (VQ) method is used to compress the bitmap (the output of the first method, AMBTC). The binary codebook can be generated for many images by randomly choosing the code vectors from a set of binary image vectors, and this codebook is then used to compress all bitmaps of these images. The bitmap of an image is chosen for compression with this codebook based on the criterion of the average bitmap replacement error (ABPRE). This approach is suitable for reducing bit rates …
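The core quantization step can be sketched minimally: each binary bitmap block is replaced by its nearest code vector under Hamming distance. The codebook below is hand-picked for illustration; the random codebook generation and the ABPRE selection criterion described above are not reproduced:

```python
# Hedged sketch of binary vector quantization of bitmap blocks (toy codebook).
def hamming(a, b):
    """Number of bit positions where two binary vectors differ."""
    return sum(x != y for x, y in zip(a, b))

codebook = [(0, 0, 0, 0), (1, 1, 1, 1), (1, 0, 1, 0)]   # toy binary code vectors

def quantize(block):
    """Replace a bitmap block with the closest codebook entry."""
    return min(codebook, key=lambda c: hamming(c, block))

bitmap_blocks = [(1, 1, 0, 1), (0, 0, 1, 0), (1, 0, 1, 1)]
print([quantize(b) for b in bitmap_blocks])
# → [(1, 1, 1, 1), (0, 0, 0, 0), (1, 1, 1, 1)]
```

Since only a codebook index needs to be stored per block, the bit rate drops from one bit per pixel to roughly log2(codebook size) bits per block, at the cost of the replacement error the ABPRE criterion measures.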
Crossref