Honeywords are fake passwords that accompany the real password, which is called a “sugarword.” A honeyword system is a password-cracking detection mechanism designed to detect password cracking easily and thereby improve the security of hashed passwords. For every user, the password file of the honeyword system stores one real hashed password accompanied by numerous fake hashed passwords. If an intruder steals the password file, cracks the hashes, and attempts to log in to users’ accounts, the honeyword system detects the attempt through the honeychecker. A honeychecker is an auxiliary server that distinguishes the real password from the fake ones and triggers an alarm if an intruder signs in using a honeyword. Many honeyword generation approaches have been proposed in previous research, but all suffer from limitations in their generation processes, provide only some of the required honeyword features, and remain susceptible to known honeyword issues. This work presents a novel honeyword generation method based on a proposed discrete salp swarm algorithm. The salp swarm algorithm (SSA) is a bio-inspired metaheuristic optimization algorithm that imitates the swarming behavior of salps in their natural environment, and it has been used to solve a variety of optimization problems. The presented method improves the generation process, strengthens honeyword features, and overcomes the issues of previous techniques. This study reviews numerous previous honeyword generation strategies, describes the proposed methodology, examines the experimental results, and compares the new honeyword generation method with those proposed in previous research.
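For illustration only, here is a minimal sketch of the honeychecker role described above, assuming a simple index-based interface; the registration call, alarm handling, and example sweetwords are placeholders, not the paper's design:

```python
import secrets

class Honeychecker:
    """Auxiliary server: it stores only which position in each user's
    sweetword list holds the real password (the sugarword)."""

    def __init__(self):
        self._sugar_index = {}  # username -> index of the real password

    def register(self, username, sugar_index):
        self._sugar_index[username] = sugar_index

    def check(self, username, matched_index):
        # The login server sends only the index of the hash that matched;
        # any index other than the sugarword's is a honeyword hit.
        if matched_index == self._sugar_index.get(username):
            return True
        self.raise_alarm(username)
        return False

    def raise_alarm(self, username):
        print(f"ALARM: honeyword login attempt on account {username!r}")

# Example setup: the login server keeps the hashed sweetwords but never
# learns which one is real; only the honeychecker knows the index.
checker = Honeychecker()
sweetwords = ["winter2020", "blue42horse", "w1nter2021"]  # illustrative only
real_index = secrets.randbelow(len(sweetwords))
checker.register("alice", real_index)
checker.check("alice", real_index)            # True: legitimate login
checker.check("alice", (real_index + 1) % 3)  # False: alarm is raised
```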
Evolutionary algorithms (EAs), as global search methods, have proven to be more robust than their local heuristic counterparts for detecting protein complexes in protein-protein interaction (PPI) networks. Typically, the robustness of these EAs comes from their components and parameters. These components are solution representation, selection, crossover, and mutation. Unfortunately, almost all EA-based complex detection methods suggested in the literature were designed with only canonical or traditional components. Further, the topological structure of the protein network is the main information used in the design of almost all such components. The main contribution of this paper is to formulate a more robust EA …
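For reference, a minimal sketch of the canonical EA loop built from the four components named above (binary representation, tournament selection, one-point crossover, bit-flip mutation); the fitness function is a toy placeholder, not the paper's complex-detection objective:

```python
import random

def evolve(fitness, n_genes=20, pop_size=50, generations=100,
           crossover_rate=0.9, mutation_rate=0.01):
    # Representation: fixed-length binary strings (a canonical choice).
    pop = [[random.randint(0, 1) for _ in range(n_genes)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: binary tournament.
        def pick():
            a, b = random.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            # Crossover: one-point.
            if random.random() < crossover_rate:
                cut = random.randrange(1, n_genes)
                p1, p2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            # Mutation: independent bit-flips.
            nxt.extend([[g ^ 1 if random.random() < mutation_rate else g
                         for g in child] for child in (p1, p2)])
        pop = nxt[:pop_size]
    return max(pop, key=fitness)

# Toy usage: maximize the number of ones (placeholder fitness).
best = evolve(fitness=sum)
print(best, sum(best))
```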
Recently, Genetic Algorithms (GAs) have frequently been used to optimize the solution of estimation problems. One of the main advantages of these techniques is that they require no knowledge or gradient information about the response surface. The poor behavior of genetic algorithms on some problems, sometimes attributed to their design operators, has led to the development of other types of algorithms. One such class is the compact Genetic Algorithm (cGA), which dramatically reduces the number of bits required to store the population and has a faster convergence speed. In this paper, the compact Genetic Algorithm is used to optimize the maximum likelihood estimator of the first-order moving average model MA(1). Simulation results …
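A minimal sketch of the cGA idea, assuming the standard formulation: the population is compressed into a probability vector that is nudged toward the winner of pairwise competitions; the fitness below is a toy placeholder, not the MA(1) likelihood:

```python
import random

def cga(fitness, n_bits, pop_size=100, max_evals=10_000):
    # The probability vector stands in for the whole population:
    # p[i] is the probability that bit i equals 1.
    p = [0.5] * n_bits
    step = 1.0 / pop_size
    for _ in range(max_evals // 2):   # two evaluations per iteration
        # Sample two competing individuals from the vector.
        a = [int(random.random() < pi) for pi in p]
        b = [int(random.random() < pi) for pi in p]
        winner, loser = (a, b) if fitness(a) >= fitness(b) else (b, a)
        # Shift the vector toward the winner where the two disagree.
        for i in range(n_bits):
            if winner[i] != loser[i]:
                p[i] += step if winner[i] == 1 else -step
                p[i] = min(1.0, max(0.0, p[i]))
        if all(pi in (0.0, 1.0) for pi in p):  # vector has converged
            break
    return [int(pi >= 0.5) for pi in p]

# Toy usage: maximize the number of ones (placeholder fitness).
print(cga(fitness=sum, n_bits=16))
```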
Providing useful information for estimating the amount, timing, and degree of uncertainty of future cash flows is one of the three main objectives of the financial reporting system, achieved through the main financial statements. The interest of standard-setting bodies in forecasting future cash flows is clear: the Financial Accounting Standards Board (FASB) explained in its 1978 Concepts Statement No. 1, "Objectives of Financial Reporting by Business Enterprises," paragraph 37, that accounting profits are better than cash flows for forecasting future cash flows. In contrast, IAS 7, as amended in 1992, aims to compel economic units to prepare a statement of cash flows …
In this research, two algorithms are applied: the Fuzzy C-Means (FCM) algorithm and the Hard K-Means (HKM) algorithm, to determine which of them performs better. The two algorithms are applied to a dataset collected from the Ministry of Planning on the water turbidity of five areas in Baghdad, to determine which of these areas has the clearest (least turbid) water and in which months of the year the water in the specified area is least turbid.
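As a sketch of the FCM side of the comparison, here is the standard fuzzy C-means update loop; the synthetic data, cluster count, and fuzzifier m below are illustrative assumptions, not the study's settings:

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, eps=1e-5, seed=0):
    """X: (n_samples, n_features). Returns (centers, memberships)."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)      # memberships sum to 1 per point
    for _ in range(iters):
        W = U ** m
        # Center update: weighted mean with fuzzified memberships.
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        # Distance of every point to every center.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        d = np.fmax(d, 1e-12)
        # Membership update: u_ij = 1 / sum_k (d_ij / d_ik)^(2/(m-1)).
        U_new = 1.0 / ((d[:, :, None] / d[:, None, :])
                       ** (2 / (m - 1))).sum(axis=2)
        if np.abs(U_new - U).max() < eps:
            return centers, U_new
        U = U_new
    return centers, U

# Toy usage with synthetic turbidity-like readings (illustrative only).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(5, 1, (20, 1)), rng.normal(20, 2, (20, 1))])
centers, U = fuzzy_c_means(X, c=2)
hard_labels = U.argmax(axis=1)             # "hard" assignment, as in HKM
print(centers.ravel(), np.bincount(hard_labels))
```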
The main purpose of this study is to improve the competitive position of economic units' products using the target-costing technique and the reverse-engineering method, through their application in one of the public sector companies (the General Company for Vegetable Oils). These tools are important for detecting the prices accepted in the market for items similar to the company's products and for treating the problem of high cost, drawing the attention of managerial and technical leadership to the weaknesses that need to be improved by introducing innovative solutions that make appropriate changes to satisfy consumers' needs more cheaply and influence customers' decisions to buy, especially from the economic unit …
In this research, we study the inverse Gompertz (IG) distribution and estimate its survival function. The survival function was estimated using three methods (the maximum likelihood, least squares, and percentile estimators), and the best estimation method was chosen. It was found that the best method for estimating the survival function is the least-squares method, because it has the lowest integrated mean squared error (IMSE) for all sample sizes.
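As a hedged illustration, the sketch below simulates from one common parameterization of the inverse Gompertz distribution, F(x) = exp(-(α/β)(e^(β/x) − 1)) for x > 0, and fits it by least squares against the empirical CDF; the paper's exact parameterization, estimators, and IMSE computation may differ:

```python
import numpy as np
from scipy.optimize import minimize

def ig_cdf(x, a, b):
    # One common inverse Gompertz parameterization (an assumption here):
    # F(x) = exp(-(a/b) * (exp(b/x) - 1)), x > 0.
    return np.exp(-(a / b) * (np.exp(b / x) - 1.0))

def ig_sample(n, a, b, rng):
    # Inverse-transform sampling: solve F(x) = u for x.
    u = rng.random(n)
    return b / np.log(1.0 - (b / a) * np.log(u))

def ls_fit(x):
    # Least-squares estimator: match the model CDF to the plotting
    # positions F_i ~ i / (n + 1) at the order statistics.
    x = np.sort(x)
    pp = np.arange(1, len(x) + 1) / (len(x) + 1)
    def sse(theta):
        a, b = np.exp(theta)          # keep both parameters positive
        return np.sum((ig_cdf(x, a, b) - pp) ** 2)
    res = minimize(sse, x0=np.log([1.0, 1.0]), method="Nelder-Mead")
    return np.exp(res.x)

rng = np.random.default_rng(0)
x = ig_sample(500, a=1.5, b=0.8, rng=rng)
a_hat, b_hat = ls_fit(x)
print(a_hat, b_hat)   # survival estimate at x: 1 - ig_cdf(x, a_hat, b_hat)
```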
The main challenge is to protect the environment from future deterioration due to pollution and the depletion of natural resources. Therefore, one of the most important things to pay attention to, and whose negative impact must be eliminated, is solid waste. Solid waste is a double-edged sword depending on how it is dealt with: neglecting it poses a serious environmental risk through water, air, and soil pollution, while dealing with it in the right way makes it an important resource for preserving the environment. Accordingly, the proper management of solid waste and its reuse or recycling is the most important factor. Attention has therefore been drawn to using solid waste in different ways, and the most common way is to use it as an alternative …
Face recognition is a crucial biometric technology used in various security and identification applications. Ensuring accuracy and reliability in facial recognition systems requires robust feature extraction and secure processing methods. This study presents an accurate facial recognition model using a feature-extraction approach within a cloud environment. First, the facial images undergo preprocessing, including grayscale conversion, histogram equalization, Viola-Jones face detection, and resizing. Then, features are extracted using a hybrid approach that combines Linear Discriminant Analysis (LDA) and the Gray-Level Co-occurrence Matrix (GLCM). The extracted features are encrypted using the Data Encryption Standard (DES) for security …
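A minimal sketch of the preprocessing and GLCM-feature steps described above, using OpenCV and scikit-image; the file path and parameter values are illustrative assumptions, and the LDA and DES stages are omitted for brevity:

```python
import cv2
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def preprocess(path, size=(128, 128)):
    img = cv2.imread(path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)      # grayscale conversion
    gray = cv2.equalizeHist(gray)                     # histogram equalization
    # Viola-Jones face detection via OpenCV's bundled Haar cascade.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    x, y, w, h = faces[0]             # assumes at least one face was found
    return cv2.resize(gray[y:y + h, x:x + w], size)   # crop and resize

def glcm_features(face):
    # Texture features from the gray-level co-occurrence matrix.
    glcm = graycomatrix(face, distances=[1],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

face = preprocess("subject01.jpg")    # illustrative path
print(glcm_features(face).shape)      # 4 properties x 4 angles = 16 values
```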
The aim of this research is to employ starch as a stabilizing and reducing agent in the production of CdS nanoparticles with less environmental risk, easy scaling, stability, economic feasibility, and suitability for large-scale production. CdS nanoparticles were successfully produced by employing starch as a reducing agent in a simple green-synthesis technique and then doped with Sn in certain proportions (1%, 2%, 3%, 4%, and 5%). According to the XRD data, the samples crystallized in a hexagonal pattern; the average crystal size of pure CdS is 5.6 nm and fluctuates in response to changes in the doping concentration (1, 2, 3, 4, and 5 wt% Sn), becoming 4.8, 3.9, 11.5, 13.1, and 9.3 nm, respectively. An increase in crystal …