This paper presents a hybrid approach for solving the null values problem; it hybridizes rough set theory with an intelligent swarm algorithm. The proposed approach is a supervised learning model. A large set of complete data, called learning data, is used to find the decision rule sets that are then used in solving the incomplete data problem. The intelligent swarm algorithm used for feature selection is the Bees algorithm, a heuristic search algorithm combined with rough set theory as the evaluation function. Another feature selection algorithm, ID3, is also presented; it works as a statistical algorithm rather than an intelligent one. The two approaches are compared on their performance in null value estimation when working with rough set theory. The results obtained from most code sets show that the Bees algorithm is better than ID3 at decreasing the number of extracted rules without affecting accuracy and at increasing the accuracy of null value estimation, especially as the number of null values increases.
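The abstract does not give the authors' implementation details; the following is a minimal, hypothetical Python sketch of the general idea only: a bees-algorithm search over feature subsets, scored by the rough-set dependency degree (the fraction of objects in the positive region). All function names and parameter values are illustrative assumptions.

import random

def dependency_degree(data, features, decision_idx):
    # Rough-set dependency: fraction of objects lying in the positive region,
    # i.e. whose equivalence class (w.r.t. the chosen features) has a single decision.
    if not features:
        return 0.0
    classes = {}  # feature-value tuple -> (object count, set of decisions)
    for row in data:
        key = tuple(row[i] for i in features)
        cnt, decs = classes.get(key, (0, set()))
        decs.add(row[decision_idx])
        classes[key] = (cnt + 1, decs)
    consistent = sum(cnt for cnt, decs in classes.values() if len(decs) == 1)
    return consistent / len(data)

def neighbour(subset, n_features):
    # Flip one randomly chosen feature in or out of the subset.
    s = set(subset)
    s.symmetric_difference_update({random.randrange(n_features)})
    return frozenset(s)

def bees_feature_selection(data, n_features, decision_idx,
                           n_scouts=20, n_best=5, n_recruits=10, iters=50):
    # Basic bees-algorithm loop: scouts explore random subsets, recruits search the
    # neighbourhood of the best sites; fitness rewards dependency and small subsets.
    fitness = lambda s: dependency_degree(data, sorted(s), decision_idx) - 0.01 * len(s)
    scouts = [frozenset(random.sample(range(n_features), random.randint(1, n_features)))
              for _ in range(n_scouts)]
    best = max(scouts, key=fitness)
    for _ in range(iters):
        sites = sorted(scouts, key=fitness, reverse=True)[:n_best]
        new_scouts = []
        for site in sites:
            candidates = [neighbour(site, n_features) for _ in range(n_recruits)] + [site]
            new_scouts.append(max(candidates, key=fitness))
        # Remaining scouts keep exploring randomly so the search stays diverse.
        while len(new_scouts) < n_scouts:
            new_scouts.append(frozenset(random.sample(range(n_features),
                                                      random.randint(1, n_features))))
        scouts = new_scouts
        best = max([best] + scouts, key=fitness)
    return sorted(best)

# Usage: data is a list of rows of attribute values, decision_idx indexes the decision
# attribute, e.g. bees_feature_selection(data, n_features=8, decision_idx=8).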
In this paper, game theory was applied to the transport sector in Iraq. This sector comprises two axes, public transport and private transport, and each axis includes several modes of transport (sea transport, air transport, land transport, rail transport, and port transport) as well as the travel and tourism sector, which public transport lacks. The competitive advantage matrix for the transport sector was formed, and after applying the MinMax-MaxMin principle to the matrix at all of its stages, an equilibrium point was found at every stage except the last, where no equilibrium point was available; therefore, the use of the linear programming method was…
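The payoff matrix itself is not reproduced in the abstract; as a generic illustration with a made-up 3x3 matrix, the sketch below checks for a MinMax-MaxMin equilibrium (saddle point) and, when none exists, solves the zero-sum game for the row player's mixed strategy by linear programming with SciPy.

import numpy as np
from scipy.optimize import linprog

A = np.array([[3, -1, 2],
              [1,  4, 0],
              [2,  2, 1]], dtype=float)  # hypothetical payoff matrix (row player's gains)

maximin = A.min(axis=1).max()  # best of the row minima  (MaxMin)
minimax = A.max(axis=0).min()  # best of the column maxima (MinMax)

if maximin == minimax:
    print("Saddle point found: game value =", maximin)
else:
    # No pure-strategy equilibrium: solve for a mixed strategy via LP.
    # Shift the matrix so the game value is positive, then use the standard
    # formulation: minimise sum(x) subject to B^T x >= 1, x >= 0; value = 1/sum(x).
    shift = 1 - A.min()
    B = A + shift
    m, n = B.shape
    res = linprog(c=np.ones(m), A_ub=-B.T, b_ub=-np.ones(n),
                  bounds=[(0, None)] * m, method="highs")
    value = 1 / res.x.sum() - shift
    strategy = res.x / res.x.sum()
    print("Game value:", value, "row player's mixed strategy:", strategy)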
The research aims to estimate missing values of the response (dependent) variable, which represents the main character under study, using the Coons method of covariance analysis in a type of multi-factor experimental design called the split block design (SBED), so as to increase the accuracy of the analysis results and of the statistical tests based on this type of design. The theoretical part covers the split block design and its statistical analysis, analysing the variation in an SBED experiment and applying the Coons covariance analysis in two ways to estimate the missing value. On the practical side, a field experiment on a wheat crop was implemented in…
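The Coons covariance formulas are not given in the abstract and are not reproduced here. As a rough, hypothetical illustration of the general principle of missing-plot estimation only (a generic substitute, not the Coons method), the sketch below chooses the value for the missing cell that minimizes the residual sum of squares of a simple additive two-factor model.

import numpy as np
from scipy.optimize import minimize_scalar

def residual_ss(y, rows, cols):
    # Residual SS of an additive two-factor model fitted by least squares.
    X = np.column_stack([np.eye(rows.max() + 1)[rows], np.eye(cols.max() + 1)[cols]])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return float(r @ r)

def estimate_missing(y, rows, cols, miss_idx):
    # Choose the value for the missing observation that minimises the residual SS.
    def f(v):
        z = y.copy()
        z[miss_idx] = v
        return residual_ss(z, rows, cols)
    return minimize_scalar(f).x

# Hypothetical 3x4 layout with one missing observation (index 5).
rows = np.repeat(np.arange(3), 4)
cols = np.tile(np.arange(4), 3)
y = np.array([12., 15., 14., 13., 11., 0., 13., 12., 14., 16., 15., 14.])
print("estimated missing value:", estimate_missing(y, rows, cols, miss_idx=5))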
In real situations, all observations and measurements are not exact numbers but are more or less non-exact, also called fuzzy. So, in this paper, we use approximate non-Bayesian computational methods to estimate the inverse Weibull parameters and reliability function from fuzzy data. The maximum likelihood and moment estimations are obtained as non-Bayesian estimations. The maximum likelihood estimators have been derived numerically using two iterative techniques, namely the Newton-Raphson and the Expectation-Maximization techniques. In addition, the obtained estimates of the parameters and reliability function are compared numerically through a Monte Carlo simulation study i…
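As a hedged illustration, the sketch below maximizes the crisp-data log-likelihood of the inverse Weibull distribution numerically (using a generic optimizer rather than the paper's Newton-Raphson or EM iterations) under the common parameterization F(x) = exp(-lam * x**(-beta)); the fuzzy-data likelihood, which weights each density term by the observation's membership function, is omitted. The sample and starting values are made up.

import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, x):
    # Inverse Weibull with F(x) = exp(-lam * x**(-beta)), x > 0.
    lam, beta = params
    if lam <= 0 or beta <= 0:
        return np.inf
    return -np.sum(np.log(lam) + np.log(beta) - (beta + 1) * np.log(x) - lam * x**(-beta))

def reliability(t, lam, beta):
    # R(t) = 1 - F(t) for the inverse Weibull distribution.
    return 1 - np.exp(-lam * t**(-beta))

# Hypothetical crisp sample drawn by inverse-CDF sampling with lam=2, beta=1.5;
# the fuzzy-data version would replace each density term by a membership-weighted integral.
rng = np.random.default_rng(0)
u = rng.uniform(size=200)
x = (2.0 / -np.log(u))**(1 / 1.5)
fit = minimize(neg_loglik, x0=[1.0, 1.0], args=(x,), method="Nelder-Mead")
lam_hat, beta_hat = fit.x
print("MLE:", lam_hat, beta_hat, " R(1.0) =", reliability(1.0, lam_hat, beta_hat))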
The artificial fish swarm algorithm (AFSA) is one of the important swarm intelligence algorithms. In this paper, the authors enhance AFSA via diversity operators (AFSA-DO). The diversity operators produce more diverse solutions for AFSA, allowing it to obtain better results. AFSA-DO has been used to solve flexible job shop scheduling problems (FJSSP). The FJSSP is a significant problem in the domain of optimization and operations research, and several research papers have dealt with methods of solving it, including swarm intelligence approaches. In this paper, a set of FJSSP target samples is tested using the improved algorithm to confirm its effectiveness and evaluate its ex…
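The paper's AFSA-DO operators and FJSSP encoding are not described in the abstract; the sketch below is only a simplified, generic AFSA loop on a toy continuous objective, with a basic mutation-style diversity operator added, to illustrate the kind of behaviours involved. All parameter values are assumptions.

import numpy as np

rng = np.random.default_rng(1)

def sphere(x):  # toy objective to minimise
    return float(np.sum(x**2))

def afsa(dim=5, n_fish=30, visual=1.0, step=0.3, try_number=5,
         crowd=0.8, iters=200, mutation_rate=0.1):
    fish = rng.uniform(-5, 5, size=(n_fish, dim))
    best = min(fish, key=sphere).copy()
    for _ in range(iters):
        new_fish = fish.copy()
        for i, xi in enumerate(fish):
            # Prey behaviour: try a few random points inside the visual range.
            for _ in range(try_number):
                xj = xi + visual * rng.uniform(-1, 1, dim)
                if sphere(xj) < sphere(xi):
                    new_fish[i] = xi + step * (xj - xi) / (np.linalg.norm(xj - xi) + 1e-12)
                    break
            else:
                # Swarm behaviour: move toward the centre of nearby fish if it is
                # better and not too crowded; otherwise take a random step.
                dists = np.linalg.norm(fish - xi, axis=1)
                mates = fish[(dists > 0) & (dists < visual)]
                if len(mates) and sphere(mates.mean(0)) < sphere(xi) and len(mates) / n_fish < crowd:
                    c = mates.mean(0)
                    new_fish[i] = xi + step * (c - xi) / (np.linalg.norm(c - xi) + 1e-12)
                else:
                    new_fish[i] = xi + step * rng.uniform(-1, 1, dim)
            # Diversity operator: occasionally re-randomise one coordinate of a fish.
            if rng.random() < mutation_rate:
                new_fish[i, rng.integers(dim)] = rng.uniform(-5, 5)
        fish = new_fish
        cand = min(fish, key=sphere)
        if sphere(cand) < sphere(best):
            best = cand.copy()
    return best, sphere(best)

print(afsa())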
Biometric-based key generation utilizes features extracted from human anatomical (physiological) traits, such as the fingerprint or retina, or behavioral traits, such as the signature. The retina biometric has inherent robustness and is therefore capable of generating random keys with a higher security level than the other biometric traits. In this paper, an effective system to generate secure, robust, and unique random keys based on retina features is proposed for cryptographic applications. The retina features are extracted using the glowworm swarm optimization (GSO) algorithm, which provides promising results in experiments using the standard retina databases. Additionally, in order t…
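As a generic illustration of glowworm swarm optimization only (not the paper's retina feature extraction or key derivation pipeline), the sketch below runs a basic GSO loop, with luciferin updates and movement toward brighter neighbours, on a toy two-peak objective. All constants are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(2)

def brightness(x):
    # Toy multimodal objective to be maximised (its peaks model the "bright" regions).
    return float(np.exp(-np.sum((x - 1.5)**2)) + 0.8 * np.exp(-np.sum((x + 1.5)**2)))

def gso(n=40, dim=2, iters=150, rho=0.4, gamma=0.6, beta=0.08,
        step=0.05, r_s=3.0, n_t=5, l0=5.0):
    pos = rng.uniform(-3, 3, size=(n, dim))
    luc = np.full(n, l0)   # luciferin levels
    r = np.full(n, r_s)    # local decision ranges
    for _ in range(iters):
        luc = (1 - rho) * luc + gamma * np.array([brightness(p) for p in pos])
        new_pos = pos.copy()
        for i in range(n):
            d = np.linalg.norm(pos - pos[i], axis=1)
            nbrs = np.where((d < r[i]) & (luc > luc[i]))[0]
            if len(nbrs):
                p = luc[nbrs] - luc[i]
                j = rng.choice(nbrs, p=p / p.sum())  # move toward a brighter neighbour
                direction = pos[j] - pos[i]
                new_pos[i] = pos[i] + step * direction / (np.linalg.norm(direction) + 1e-12)
            # Adapt the decision range toward the desired neighbourhood size n_t.
            r[i] = min(r_s, max(0.0, r[i] + beta * (n_t - len(nbrs))))
        pos = new_pos
    return pos

final = gso()
best = max(final, key=brightness)
print("brightest glowworm position:", np.round(best, 2))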
In this study, X-ray images of objects in the Flexible Image Transport System (FITS) format were analyzed. Energy is collected from a body by several sensors, each receiving energy within a specific range; when the energy from all sensors is collected, an image carrying information about that body is formed. Such images can be transferred and stored easily. The images were analyzed using the DS9 program to obtain a spectrum for each object, i.e., the energy corresponding to the photons collected per second. Images of two types of objects (globular and open clusters) were analyzed. The results showed that the five open star clusters contain roughly t…
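A minimal sketch of inspecting a FITS file programmatically, assuming the astropy library and a hypothetical local file name; the paper itself works with the DS9 viewer, so this only illustrates the data format, not the authors' workflow.

# "cluster.fits" is a hypothetical local file name used for illustration.
from astropy.io import fits
import numpy as np

with fits.open("cluster.fits") as hdul:
    hdul.info()                         # list the HDUs contained in the file
    data = hdul[0].data                 # primary image array (counts per pixel)
    header = hdul[0].header
    print("object:", header.get("OBJECT"), "exposure:", header.get("EXPTIME"))
    print("total counts:", np.nansum(data))
    # Counts per second, a rough proxy for the photons-per-second rate in the spectra.
    if header.get("EXPTIME"):
        print("count rate:", np.nansum(data) / header["EXPTIME"])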
Rumors are typically described as remarks whose truth value is unknown. A rumor on social media has the potential to spread erroneous information to a large group of individuals, and such false facts will influence decision-making in a variety of societies. In online social media, where enormous amounts of information are simply distributed over a large network of sources with unverified authority, detecting rumors is critical. This research proposes that rumor detection be done using Natural Language Processing (NLP) tools as well as six distinct Machine Learning (ML) methods: Naïve Bayes (NB), Random Forest (RF), K-Nearest Neighbor (KNN), Logistic Regression (LR), Stochastic Gradient Descent (SGD), and Decision Tree (DT)…
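A hedged sketch of the general pipeline such a comparison typically uses, with TF-IDF features and the six classifier families named above from scikit-learn; the corpus below is tiny and invented, and none of the paper's data or preprocessing is reproduced.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression, SGDClassifier
from sklearn.naive_bayes import MultinomialNB
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical tiny corpus: 1 = rumor, 0 = non-rumor.
texts = ["celebrity secretly arrested last night", "official report released today",
         "miracle cure spreads on social media", "city council approves new budget",
         "aliens spotted over the stadium", "university publishes exam schedule"] * 10
labels = [1, 0, 1, 0, 1, 0] * 10

X = TfidfVectorizer(stop_words="english").fit_transform(texts)
models = {"NB": MultinomialNB(), "RF": RandomForestClassifier(), "KNN": KNeighborsClassifier(),
          "LR": LogisticRegression(max_iter=1000), "SGD": SGDClassifier(),
          "DT": DecisionTreeClassifier()}
for name, model in models.items():
    scores = cross_val_score(model, X, labels, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.2f}")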
Determining the geographical coordinates of sites with high accuracy and in a short time is very important in many applications, including air and sea navigation and geodetic surveys. Today, the Global Positioning System (GPS) plays an important role in performing this task. The datum used for GPS positioning is called the World Geodetic System 1984 (WGS84). It consists of a three-dimensional Cartesian coordinate system and an associated ellipsoid, so that WGS84 positions describe coordinates as latitude, longitude, and ellipsoidal height (h) with respect to the center of mass of the Earth. This study develops a mathematical model for correcting geomatic measurements of ellipsoidal heights (h) between two different…
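The paper's correction model is not reproduced in the abstract. As background only, the sketch below converts WGS84 geodetic coordinates (latitude, longitude, ellipsoidal height h) to Earth-centred Cartesian coordinates using the standard WGS84 constants; the example point and height are assumptions.

import math

# WGS84 defining constants.
A = 6378137.0                 # semi-major axis (m)
F = 1 / 298.257223563         # flattening
E2 = F * (2 - F)              # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h):
    # Convert latitude/longitude (degrees) and ellipsoidal height h (m) to ECEF X, Y, Z (m).
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = A / math.sqrt(1 - E2 * math.sin(lat)**2)   # prime vertical radius of curvature
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1 - E2) + h) * math.sin(lat)
    return x, y, z

# Example: a point near Baghdad at an assumed ellipsoidal height of 50 m.
print(geodetic_to_ecef(33.3152, 44.3661, 50.0))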