This paper presents a hybrid approach to the null-values problem that combines rough set theory with an intelligent swarm algorithm. The proposed approach is a supervised learning model: a large set of complete data, called the learning data, is used to induce decision rule sets, which are then applied to the incomplete data. The swarm component performs feature selection, with the Bees Algorithm acting as the heuristic search and rough set theory as the evaluation function. A second feature selection algorithm, ID3, is also presented; it works as a statistical algorithm rather than an intelligent one. The two approaches are compared on their performance in null-value estimation through rough set theory. Results on most code sets show that the Bees Algorithm outperforms ID3: it reduces the number of extracted rules without harming accuracy, and it raises the accuracy of null-value estimation, especially as the number of null values increases.
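The rough-set evaluation function used to score candidate feature subsets is typically the dependency degree: the fraction of objects whose indiscernibility class (under the selected features) maps to a single decision value. A minimal sketch, with a hypothetical toy decision table and attribute names of my own invention:

```python
from collections import defaultdict

def dependency_degree(table, features, decision):
    """Rough-set dependency degree gamma(B): fraction of objects whose
    B-indiscernibility class is consistent with a single decision value."""
    # group objects by their values on the selected features
    classes = defaultdict(list)
    for row in table:
        key = tuple(row[f] for f in features)
        classes[key].append(row[decision])
    # positive region: classes whose members all share one decision value
    pos = sum(len(ds) for ds in classes.values() if len(set(ds)) == 1)
    return pos / len(table)

# hypothetical toy decision table: two condition features, one decision
table = [
    {"a": 0, "b": 0, "d": "no"},
    {"a": 0, "b": 1, "d": "yes"},
    {"a": 1, "b": 0, "d": "yes"},
    {"a": 1, "b": 1, "d": "yes"},
    {"a": 0, "b": 0, "d": "no"},
]

full = dependency_degree(table, ["a", "b"], "d")   # full feature set
sub = dependency_degree(table, ["a"], "d")        # candidate subset
```

A swarm search (Bees Algorithm, in the paper) would then maximize this score while minimizing the number of selected features.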
The importance of topology as a tool in preference theory motivates this study, in which we characterize topologies generated by digraphs. We generalize rough set concepts using two topological structures generated by the out-degree (resp. in-degree) sets of the vertices of a general digraph. New types of topological rough sets are introduced and studied using new types of topological sets, and several propositions establish properties of the topological rough approximations.
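The construction can be sketched concretely: take the out-neighborhood of each vertex as a subbase, then generate the topology by closing finite intersections under arbitrary unions. This is a plain illustrative sketch of one such structure, not the paper's exact definitions:

```python
from itertools import chain, combinations

def out_neighborhoods(vertices, edges):
    """Out-degree set N+(v) = {u : (v, u) is an arc} for each vertex."""
    return {v: frozenset(u for (a, u) in edges if a == v) for v in vertices}

def topology_from_subbase(universe, subbase):
    """Generate the topology with the given subbase: form all finite
    intersections (the base), then close the base under unions."""
    universe = frozenset(universe)
    subbase = list(subbase)
    base = {universe}
    for r in range(1, len(subbase) + 1):
        for combo in combinations(subbase, r):
            s = universe
            for c in combo:
                s = s & c
            base.add(frozenset(s))
    topo = {frozenset()}
    for r in range(1, len(base) + 1):
        for combo in combinations(base, r):
            topo.add(frozenset(chain.from_iterable(combo)))
    return topo

# a 3-cycle digraph: 1 -> 2 -> 3 -> 1
V = {1, 2, 3}
E = {(1, 2), (2, 3), (3, 1)}
tau = topology_from_subbase(V, out_neighborhoods(V, E).values())
```

On this 3-cycle the out-neighborhoods are singletons, so the generated topology is discrete; sparser digraphs yield coarser topologies whose interior and closure operators give the rough approximations.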
Honeywords are fake passwords that serve as an accompaniment to the real password, which is called a “sugarword.” The honeyword system is an effective password-cracking detection system designed to easily detect password cracking in order to improve the security of hashed passwords. For every user, the password file of the honeyword system holds one real hashed password accompanied by numerous fake hashed passwords. If an intruder steals the password file from the system, successfully cracks the passwords, and attempts to log in to users’ accounts, the honeyword system will detect this attempt through the honeychecker. A honeychecker is an auxiliary server that distinguishes the real password from the fake passwords and raises an alarm when a honeyword is submitted.
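The split of knowledge between the login server and the honeychecker can be sketched as follows; the user name, sweetwords, and storage layout here are illustrative assumptions, not the scheme from any particular paper:

```python
import hashlib
import hmac
import secrets

def hash_pw(pw, salt):
    """Slow salted hash for stored sweetwords (PBKDF2, illustrative settings)."""
    return hashlib.pbkdf2_hmac("sha256", pw.encode(), salt, 10_000)

# --- main server: stores hashes of all sweetwords (real password + honeywords)
salt = secrets.token_bytes(16)
sweetwords = ["blue42horse", "red17zebra", "green99mole"]  # index 0 is real here
password_file = {"alice": [hash_pw(w, salt) for w in sweetwords]}

# --- honeychecker: a separate server that knows ONLY the real index per user
real_index = {"alice": 0}

def login(user, attempt):
    h = hash_pw(attempt, salt)
    for i, stored in enumerate(password_file[user]):
        if hmac.compare_digest(h, stored):
            if i == real_index[user]:  # honeychecker confirms the real index
                return "login ok"
            # a honeyword matched: the password file was almost surely stolen
            return "ALARM: honeyword submitted"
    return "login failed"
```

Because the main server never stores which sweetword is real, an attacker who cracks the stolen file cannot tell the sugarword from the honeywords and risks tripping the alarm.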
With the development of communication technologies for mobile devices and electronic communications, the world has moved toward e-government, e-commerce, and e-banking. It has become necessary to protect these activities from intrusion and misuse, so it is important to design powerful and efficient systems for this purpose. This paper applies several varieties of the immune-inspired negative selection algorithm: real-valued negative selection, negative selection with fixed-radius detectors, and negative selection with variable-sized detectors, for misuse-type network intrusion detection, where the algorithm generates a set of detectors to distinguish anomalous samples from the self (normal) samples.
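The core censoring step shared by these variants can be sketched as real-valued negative selection with a fixed detector radius; the 2-D feature space, sample coordinates, and parameter values below are illustrative assumptions:

```python
import random

def dist(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def train_detectors(self_samples, n_detectors, radius, dim=2, seed=1):
    """Negative selection: generate random candidate detectors and keep
    only those lying farther than `radius` from every self sample."""
    rng = random.Random(seed)
    detectors = []
    while len(detectors) < n_detectors:
        cand = tuple(rng.random() for _ in range(dim))
        if all(dist(cand, s) > radius for s in self_samples):
            detectors.append(cand)
    return detectors

def is_anomalous(sample, detectors, radius):
    """A sample covered by any detector is classified as non-self (intrusion)."""
    return any(dist(sample, d) <= radius for d in detectors)

# self set: normal-traffic feature vectors clustered in one corner of [0,1]^2
self_samples = [(0.1, 0.1), (0.15, 0.2), (0.2, 0.1)]
detectors = train_detectors(self_samples, n_detectors=50, radius=0.2)
```

The fixed-radius and variable-sized variants differ only in how each detector's coverage radius is chosen; the censoring loop against the self set is the same.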
The purpose of the current study is to analyze the content of the computer textbooks for the intermediate stage in Iraq according to the theory of multiple intelligences, by answering the following question: what is the percentage of availability of multiple intelligences in the content of the computer textbooks for the intermediate stage (grades I and II) for the academic year 2017-2018? The researcher followed the descriptive analytical research approach (content analysis) and adopted the explicit idea as the unit of registration. The research tool was prepared according to Gardner's classification of multiple intelligences, and its validity and reliability were established. The study found the percentage of multiple intelligences present in the content of the computer textbooks for the intermediate stage.
Astronomical images are regarded as a main source of information for discovering outer space. To identify the basic content of a galaxy image (the Milky Way), the image was classified using the Variable Precision Rough Sets technique, which determines the different regions within the galaxy according to the different colors in the image. From the classified image we can determine the percentage of each class and what that percentage means. The technique produces a well-classified image and requires less time to complete the classification process.
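Variable Precision Rough Sets relax the classical approximations with a precision threshold beta: a color class enters the beta-lower approximation of a region when at least a beta fraction of its pixels belong to that region. A minimal sketch on a hypothetical toy partition of pixel ids (not the paper's image data):

```python
def vprs_approximations(partition, target, beta):
    """Variable Precision Rough Sets: a block joins the beta-lower
    approximation when its overlap ratio with the target is >= beta,
    and the beta-upper approximation when the ratio exceeds 1 - beta."""
    lower, upper = set(), set()
    for block in partition:
        ratio = len(block & target) / len(block)
        if ratio >= beta:
            lower |= block
        if ratio > 1 - beta:
            upper |= block
    return lower, upper

# toy "image": pixel ids grouped into classes of similar color
partition = [{1, 2, 3, 4}, {5, 6}, {7, 8, 9, 10}]
bright = {1, 2, 3, 5, 7}  # pixels believed to form the bright region
lower, upper = vprs_approximations(partition, bright, beta=0.75)
```

Lowering beta toward 0.5 admits noisier classes into the lower approximation, which is what makes VPRS tolerant of the pixel-level noise typical of astronomical images.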
The aim of robot path planning is to search for a safe path for the mobile robot. Even though various path planning algorithms exist for mobile robots, only a few are optimized. The optimized algorithms include Particle Swarm Optimization (PSO), which finds the optimal path while avoiding obstacles and ensuring safety. In PSO, sub-optimal solutions occur frequently while solving the optimal path problem. This paper proposes an enhanced PSO algorithm with an improved particle velocity update. Experimental results show that the proposed Enhanced PSO performs better than the standard PSO in terms of solution quality; hence, a mobile robot implementing the proposed algorithm can navigate more effectively.
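For reference, the standard PSO baseline that the enhanced velocity rule modifies can be sketched as below; the parameter values and the sphere test function are illustrative assumptions, not the paper's path-planning objective:

```python
import random

def pso(fitness, dim, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=3):
    """Standard PSO (minimization): each particle blends inertia, its
    personal best, and the global best into its velocity update."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [fitness(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            v = fitness(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i][:], v
    return gbest, gbest_val

sphere = lambda p: sum(x * x for x in p)  # toy objective standing in for path cost
best, val = pso(sphere, dim=2)
```

In path planning, the fitness would instead score a candidate waypoint sequence by path length plus obstacle-clearance penalties; the velocity update the paper enhances is the three-term expression above.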
Many users today need to search the web for information. They use search engines, entering a query or question and waiting for the answer or best search results. In response to a user query, search engines often return pages that are irrelevant or unrelated to the information need. This paper presents a model that provides the user with efficient and effective search-engine results, based on a modified chicken swarm algorithm and cosine similarity, to eliminate irrelevant pages (outliers) from the ranked result list and to improve the results of the user's query. The proposed model is applied to an Arabic dataset, the ZAD corpus.
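The cosine-similarity filtering step can be sketched as below over bag-of-words term-frequency vectors; the English example query, page snippets, and threshold are illustrative assumptions (the paper works on the Arabic ZAD corpus):

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two texts as term-frequency vectors."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[t] * vb[t] for t in va)
    na = math.sqrt(sum(c * c for c in va.values()))
    nb = math.sqrt(sum(c * c for c in vb.values()))
    return dot / (na * nb) if na and nb else 0.0

def filter_results(query, pages, threshold=0.2):
    """Drop ranked pages whose similarity to the query falls below threshold."""
    scored = [(cosine(query, p), p) for p in pages]
    return [p for s, p in sorted(scored, reverse=True) if s >= threshold]

query = "rough set feature selection"
pages = [
    "feature selection with rough set theory",
    "rough set based attribute reduction and selection",
    "cooking recipes for winter evenings",  # the outlier to be eliminated
]
result = filter_results(query, pages)
```

The swarm component would tune how the threshold and ranking interact; the filter itself simply removes low-similarity outliers from the ranked list.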
The basic concepts of some near open subgraphs and near rough, near exact, and near fuzzy graphs are introduced and sufficiently illustrated. The Gm-closure space induced by closure operators is used to generalize the basic rough graph concepts. We introduce near exactness and near roughness by applying the near concepts to improve the accuracy of graph definability. We give a new definition of a membership function to find near interior, near boundary, and near exterior vertices. Moreover, proofs, examples, and counterexamples are provided. The Gm-closure structure suggested in this paper opens the way for applying a rich body of topological facts and methods in the process of granular computing.
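The interior/boundary/exterior classification of vertices can be illustrated with a plain rough-graph sketch, assuming the closed neighborhood as the approximation relation (the paper's near concepts generalize this):

```python
def neighborhood(graph, v):
    """Closed neighborhood N[v]: v together with its adjacent vertices."""
    return {v} | set(graph[v])

def classify_vertices(graph, subset):
    """Rough classification of vertices with respect to a vertex set X:
    interior if N[v] lies inside X, exterior if N[v] misses X entirely,
    boundary otherwise."""
    interior, boundary, exterior = set(), set(), set()
    for v in graph:
        n = neighborhood(graph, v)
        if n <= subset:
            interior.add(v)
        elif n.isdisjoint(subset):
            exterior.add(v)
        else:
            boundary.add(v)
    return interior, boundary, exterior

# a path graph 1-2-3-4-5 and a target vertex set X = {1, 2, 3}
graph = {1: [2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4]}
interior, boundary, exterior = classify_vertices(graph, {1, 2, 3})
```

A set with an empty boundary is exact; a nonempty boundary is what the near-roughness notions refine with generalized closure operators.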