Honeywords are fake passwords that accompany the real password, which is called the “sugarword.” The honeyword system is an effective password-cracking detection system designed to detect password cracking easily in order to improve the security of hashed passwords. For every user, the password file of the honeyword system holds one real hashed password accompanied by numerous fake hashed passwords. If an intruder steals the password file from the system, successfully cracks the passwords, and attempts to log in to users’ accounts, the honeyword system detects this attempt through the honeychecker. A honeychecker is an auxiliary server that distinguishes the real password from the fake passwords and triggers an alarm if an intruder signs in using a honeyword. Many honeyword generation approaches have been proposed by previous research, all with limitations in their generation processes, limited success in providing all required honeyword features, and susceptibility to many honeyword issues. This work presents a novel honeyword generation method that uses a proposed discrete salp swarm algorithm. The salp swarm algorithm (SSA) is a bio-inspired metaheuristic optimization algorithm that imitates the swarming behavior of salps in their natural environment and has been used to solve a variety of optimization problems. The presented honeyword generation method improves the generation process, improves honeyword features, and overcomes the issues of previous techniques. This study reviews numerous previous honeyword generation strategies, describes the proposed methodology, examines the experimental results, and compares the new honeyword generation method to those proposed in previous research.
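The honeychecker's role can be illustrated with a minimal sketch, assuming a simple index-based design in the spirit of the classic honeyword scheme; the table layout, sweetword values, and function names below are illustrative, not the generation method proposed here:

```python
import hmac, hashlib, secrets

class HoneyChecker:
    """Auxiliary server: stores, per user, only the index of the real password
    (sugarword) within that user's list of sweetwords."""
    def __init__(self):
        self._real_index = {}

    def set(self, user, index):
        self._real_index[user] = index

    def check(self, user, index):
        """True only for the sugarword; a honeyword triggers the alarm."""
        if index == self._real_index[user]:
            return True
        print(f"ALARM: honeyword submitted for account '{user}'")
        return False

def hash_pw(pw, salt):
    # Illustrative hash; a real system would use a slow KDF (bcrypt, Argon2, ...).
    return hmac.new(salt, pw.encode(), hashlib.sha256).hexdigest()

# Main server: stores all sweetword hashes and cannot tell which one is real.
salt = secrets.token_bytes(16)
sweetwords = ["blue42!", "red17!", "green09!"]          # honeywords + sugarword
stored = [hash_pw(w, salt) for w in sweetwords]

checker = HoneyChecker()
checker.set("alice", 1)                                 # "red17!" is the real password

def login(user, attempt):
    h = hash_pw(attempt, salt)
    if h not in stored:
        return "wrong password"
    return "ok" if checker.check(user, stored.index(h)) else "honeyword detected"

print(login("alice", "red17!"))    # ok
print(login("alice", "blue42!"))   # attacker used a honeyword -> alarm
```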
The aim of this paper is to find a Bayes estimator under a new loss function that combines symmetric and asymmetric loss functions, namely the proposed entropy loss function, which merges the entropy loss function with the squared-log error loss function, the latter being quite asymmetric in nature. The Bayes estimators of the exponential distribution under the proposed function and under its component loss functions are then compared using the standard mean square error (MSE) and the bias quantity (Mbias). The random data are generated by simulation to estimate the exponential distribution parameter for different sample sizes (n = 10, 50, 100) and N = 1000 replications, taking initial
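A Monte Carlo comparison of this kind can be sketched as follows; the gamma prior, the squared-error Bayes estimator, and the true parameter value below are illustrative assumptions, not the paper's proposed entropy-loss estimator:

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true = 2.0                 # assumed true rate of the exponential distribution
a, b = 1.0, 1.0                  # assumed gamma(a, b) prior hyperparameters
N = 1000                         # number of simulated samples

for n in (10, 50, 100):
    est_mle, est_bayes = [], []
    for _ in range(N):
        x = rng.exponential(scale=1 / theta_true, size=n)
        s = x.sum()
        est_mle.append(n / s)                    # maximum-likelihood estimator
        est_bayes.append((n + a) / (s + b))      # Bayes estimator under squared error loss
    for name, est in (("MLE", np.array(est_mle)), ("Bayes", np.array(est_bayes))):
        mse = np.mean((est - theta_true) ** 2)
        mbias = np.mean(est - theta_true)
        print(f"n={n:3d}  {name:5s}  MSE={mse:.4f}  Mbias={mbias:+.4f}")
```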
In light of the development in computer science and modern technologies, the impersonation crime rate has increased. Consequently, face recognition technology and biometric systems have been employed for security purposes in a variety of applications, including human-computer interaction, surveillance systems, etc. Building an advanced, sophisticated model to tackle impersonation-related crimes is essential. This study proposes classification Machine Learning (ML) and Deep Learning (DL) models, utilizing Viola-Jones, Linear Discriminant Analysis (LDA), Mutual Information (MI), and Analysis of Variance (ANOVA) techniques. The two proposed facial classification systems are J48 with the LDA feature extraction method as input, and a one-dimen
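The first of the two systems, LDA features feeding a decision-tree classifier, can be sketched as follows; scikit-learn's LFW faces stand in for the study's dataset and DecisionTreeClassifier stands in for Weka's J48 (C4.5), so this is a hedged illustration rather than the authors' exact pipeline:

```python
from sklearn.datasets import fetch_lfw_people
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

# Cropped face images (downloaded on first use) replace a Viola-Jones detection step.
faces = fetch_lfw_people(min_faces_per_person=70)
X_train, X_test, y_train, y_test = train_test_split(
    faces.data, faces.target, test_size=0.25, random_state=0, stratify=faces.target)

model = make_pipeline(
    LinearDiscriminantAnalysis(),          # LDA as the feature-extraction stage
    DecisionTreeClassifier(random_state=0))  # decision tree in place of J48
model.fit(X_train, y_train)
print(f"test accuracy: {model.score(X_test, y_test):.3f}")
```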
In modern hydraulic control systems, the trend in hydraulic power applications is to improve efficiency and performance. “Proportional valve” generally refers to pressure, flow, and directional-control valves that continuously convert a variable input signal into a smooth and proportional hydraulic output signal. The valve creates a variable resistance (orifice) upstream and downstream of a hydraulic actuator, forming a meter-in/meter-out circuit, and hence pressure drops and power losses are inevitable. If velocity (position) feedback is used, flow-pattern control is possible; without such flow-pattern control, control is very “loose” and relies on “visual” feedback by the operator. At this point, we should examine how this valv
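The variable-orifice resistance described above can be illustrated with the standard orifice equation; the discharge coefficient, orifice area, and fluid density below are placeholder values, not data from this study:

```python
import math

def orifice_flow(delta_p, area, cd=0.6, rho=870.0):
    """Volumetric flow (m^3/s) through an orifice: Q = Cd * A * sqrt(2*dp/rho).
    cd, area, and rho are illustrative placeholder values."""
    return cd * area * math.sqrt(2.0 * delta_p / rho)

# Flow through a 5 mm^2 valve opening at several pressure drops (Pa):
for dp in (1e5, 5e5, 1e6):
    q = orifice_flow(dp, area=5e-6)
    print(f"dp = {dp / 1e5:4.1f} bar  ->  Q = {q * 6e4:.2f} L/min")
```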
Microdrilled and nanodrilled holes are produced by a Q-switched Nd:YAG laser (1064 nm) interacting with 8009 Al alloy using nanoparticles. Two kinds of nanoparticles were used with this alloy: tungsten carbide (WC) and silicon carbide (SiC). In this work, the microholes and nanoholes were investigated with different laser pulse energies (600, 700, and 800 mJ), different repetition rates (5 Hz and 10 Hz), and different concentrations of nanoparticles (90%, 50%, and 5%). The results indicate that the microholes and nanoholes were achieved when the laser pulse energy is 600 mJ, the laser repetition rate is 5 Hz, and the concentration of the nanoparticles (for the two types of n
A common approach to color image compression starts by transforming the red, green, and blue (RGB) color model into a desired color model, then applying compression techniques, and finally transforming the results back into the RGB model. In this paper, a new color image compression method based on multilevel block truncation coding (MBTC) and vector quantization is presented. By exploiting the human visual system's response to color, a bit allocation process is implemented to distribute the bits for encoding in a more effective way. To improve the performance efficiency of vector quantization (VQ), modifications have been implemented. To combine the simple computational and edge-preservation properties of MBTC with high c
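Classic single-level block truncation coding, the building block behind MBTC, can be sketched as follows; this is a minimal illustration of standard BTC, not the multilevel MBTC + VQ scheme proposed here:

```python
import numpy as np

def btc_block(block):
    """Encode/decode one grayscale block with classic block truncation coding:
    each pixel is quantized to one of two levels chosen to preserve the
    block's mean and standard deviation."""
    m, sd = block.mean(), block.std()
    bitmap = block >= m                      # 1-bit plane transmitted per pixel
    q = bitmap.sum()                         # number of "high" pixels
    n = block.size
    if q in (0, n):                          # flat block: both levels equal the mean
        low = high = m
    else:
        low = m - sd * np.sqrt(q / (n - q))
        high = m + sd * np.sqrt((n - q) / q)
    return np.where(bitmap, high, low)

# Example: compress and reconstruct a random 4x4 block (the usual BTC block size).
rng = np.random.default_rng(1)
block = rng.integers(0, 256, size=(4, 4)).astype(float)
recon = btc_block(block)
print("mean preserved:", np.isclose(block.mean(), recon.mean()))
```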
In this paper, a new technique is offered for solving three types of linear integral equations of the 2nd kind: Volterra-Fredholm integral equations (LVFIE) as the general case, and Volterra integral equations (LVIE) and Fredholm integral equations (LFIE) as special cases. The new technique depends on approximating the solution by a polynomial of a given degree, thereby reducing the problem to a linear programming problem (LPP), which is solved to find the approximate solution of the LVFIE. Moreover, quadrature methods including the trapezoidal rule (TR), Simpson 1/3 rule (SR), Boole rule (BR), and Romberg integration formula (RI) are used to approximate the integrals that appear in the LVFIE. Also, a comparison between those
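The simpler quadrature rules mentioned above can be sketched as follows; this is a minimal illustration of the composite trapezoidal and Simpson 1/3 rules on a known integrand, not the paper's LPP formulation:

```python
import numpy as np

def trapezoidal(f, a, b, n):
    """Composite trapezoidal rule with n subintervals."""
    x = np.linspace(a, b, n + 1)
    y = f(x)
    h = (b - a) / n
    return h * (y[0] / 2 + y[1:-1].sum() + y[-1] / 2)

def simpson(f, a, b, n):
    """Composite Simpson 1/3 rule; n must be even."""
    x = np.linspace(a, b, n + 1)
    y = f(x)
    h = (b - a) / n
    return h / 3 * (y[0] + 4 * y[1:-1:2].sum() + 2 * y[2:-2:2].sum() + y[-1])

# Integral of exp(x) on [0, 1]; the exact value is e - 1.
exact = np.e - 1
for n in (4, 16, 64):
    print(f"n={n:3d}  TR error={abs(trapezoidal(np.exp, 0, 1, n) - exact):.2e}  "
          f"SR error={abs(simpson(np.exp, 0, 1, n) - exact):.2e}")
```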
Single Point Incremental Forming (SPIF) is a sheet-material forming technique based on layered manufacturing principles. The sheet part is locally deformed through horizontal slices. The moving locus of the forming tool (called the toolpath) through these slices, building up the finished part, was executed by CNC technology. The toolpath was created directly from the CAD model of the final product. The forming tool is a ball-end forming tool, which was moved along the toolpath while the edges of the sheet material were clamped rigidly on a fixture.
This paper presents an investigation of the thinning distribution of conical shapes produced by incremental forming and the validation of the finite element method to evaluate the limits of the p
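The slice-by-slice toolpath generation described above can be sketched for a simple cone; the dimensions, wall angle, and step-down below are placeholder values, not the toolpath generator used in this study:

```python
import numpy as np

def cone_contour_toolpath(top_radius=50.0, depth=30.0, wall_angle_deg=60.0,
                          step_down=1.0, points_per_contour=90):
    """Generate z-level circular contours for a conical SPIF part.
    All dimensions (mm) and the wall angle are illustrative assumptions."""
    path = []
    z = 0.0
    while z <= depth:
        # The contour radius shrinks with depth according to the wall angle.
        r = top_radius - z / np.tan(np.radians(wall_angle_deg))
        if r <= 0:
            break
        theta = np.linspace(0.0, 2 * np.pi, points_per_contour, endpoint=False)
        for t in theta:
            path.append((r * np.cos(t), r * np.sin(t), -z))
        z += step_down
    return np.array(path)

toolpath = cone_contour_toolpath()
print(f"{len(toolpath)} tool positions, final depth {abs(toolpath[-1, 2]):.1f} mm")
```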
Evolutionary algorithms are better than heuristic algorithms at finding protein complexes in protein-protein interaction networks (PPINs). Many of these algorithms depend on their standard frameworks, which are based on topology. Further, many of these algorithms have been examined exclusively on networks containing only reliable interaction data. The main objective of this paper is to extend the design of the canonical, topology-based evolutionary algorithms suggested in the literature to cope with noisy PPINs. The design of the evolutionary algorithm is extended based on the functional domain of the proteins rather than on the topological domain of the PPIN. The gene ontology annotation in each molecular function, biological proce
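A functional-domain fitness of this kind can be sketched with a toy (1+1)-style evolutionary search; the GO-term table, the Jaccard similarity measure, and the search parameters below are illustrative assumptions, not the algorithm proposed here:

```python
import random

# Toy data: proteins annotated with hypothetical GO terms.
go = {
    "P1": {"GO:1", "GO:2"}, "P2": {"GO:1", "GO:2", "GO:3"}, "P3": {"GO:2", "GO:3"},
    "P4": {"GO:7", "GO:8"}, "P5": {"GO:7", "GO:9"},         "P6": {"GO:8", "GO:9"},
}
proteins = list(go)

def jaccard(a, b):
    return len(go[a] & go[b]) / len(go[a] | go[b])

def fitness(members):
    """Average pairwise functional similarity of a candidate complex."""
    if len(members) < 2:
        return 0.0
    pairs = [(a, b) for i, a in enumerate(members) for b in members[i + 1:]]
    return sum(jaccard(a, b) for a, b in pairs) / len(pairs)

def mutate(members):
    """Add or drop one protein at random, keeping at least two members."""
    out = set(members)
    out.symmetric_difference_update({random.choice(proteins)})
    return sorted(out) if len(out) >= 2 else members

random.seed(0)
best = random.sample(proteins, 2)          # random initial candidate complex
for _ in range(200):
    child = mutate(best)
    if fitness(child) >= fitness(best):    # accept if functionally at least as coherent
        best = child
print("candidate complex:", best, "fitness:", round(fitness(best), 3))
```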
Great scientific progress has led to widespread information. As information accumulates in large databases, it becomes important to revise and compile this vast amount of data, with the purpose of extracting hidden information or classifying data according to their relations with each other in order to take advantage of them for technical purposes.
Working with data mining (DM) is appropriate in this area because of the importance of research on the K-Means algorithm for clustering data; in the practical application, the effect on the variables can be observed by changing the sample size (n) and the number of clusters (K).
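The effect of varying n and K can be sketched with a standard K-Means run; the synthetic data and parameter grid below are illustrative, not the study's data set:

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

# Synthetic data with 3 underlying groups; vary sample size n and cluster count K.
for n in (100, 500, 1000):
    X, _ = make_blobs(n_samples=n, centers=3, cluster_std=1.5, random_state=0)
    for k in (2, 3, 5):
        km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)
        print(f"n={n:4d}  K={k}  inertia={km.inertia_:10.1f}")
```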
Radiation treatment has long been the conventional approach for treating nasopharyngeal carcinoma (NPC) tumors due to their anatomic features, biological characteristics, and radiosensitivity; the most common treatment for nasopharyngeal carcinoma is radiotherapy. This study aimed to assess which of two radiotherapy treatment techniques, intensity-modulated radiotherapy (IMRT) and volumetric-modulated arc therapy (VMAT), provides the better plan quality. Forty patients with nasopharyngeal carcinoma referred for radiotherapy were treated with both advanced techniques, IMRT and VMAT, using Eclipse software from Varian. The x-ray energy was set at 6 MV, and the total prescribed dose was 70 Gy. The results show that the