A botnet is one of many attack types that can execute malicious tasks and that evolves continuously. This research therefore introduces a comparison framework, called BotDetectorFW, with classification and complexity improvements for the detection of botnet attacks using the CICIDS2017 dataset, a freely available dataset consisting of several attacks with high-dimensional features. Feature selection is a significant step toward obtaining the fewest features by eliminating irrelevant ones, and consequently it reduces detection time. This process is implemented inside BotDetectorFW in two steps: data clustering and five distance measures (cosine, Dice, Driver & Kroeber, overlap, and Pearson correlation) implemented in C#, followed by selection of the best N features, which are used as input to four classifier algorithms evaluated in the WEKA machine learning toolkit: MultilayerPerceptron, JRip, IBk, and random forest. In BotDetectorFW, thorough cleaning of the dataset in the preprocessing stage, together with normalization, binary clustering of its features, feature selection based on suitable feature-distance techniques, and testing of the selected classification algorithms, all contributed to achieving high performance with fewer features (a minimum of 8), outperforming other methods in the literature that use 10 or more features on the same dataset. Furthermore, the results and performance evaluation of BotDetectorFW show a competitive impact in terms of classification accuracy (ACC), precision (Pr), recall (Rc), and F-measure (F1) metrics.
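As an illustration of the feature-selection step, the following is a minimal Python sketch (not the paper's C# implementation) of scoring binarized feature columns against the label vector with the named measures and keeping the top N; all function and variable names here are assumed for illustration.

```python
# Minimal sketch of similarity/distance-based feature ranking, assuming
# binarized feature columns (per the binary clustering step) and a binary
# label vector. Not the paper's C# code; names are illustrative.
import numpy as np

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

def dice(a, b):
    return 2 * (a * b).sum() / (a.sum() + b.sum() + 1e-12)

def driver_kroeber(a, b):
    # For binary vectors this coincides with the cosine (Ochiai) measure.
    return (a * b).sum() / (np.sqrt(a.sum() * b.sum()) + 1e-12)

def overlap(a, b):
    return np.minimum(a, b).sum() / (min(a.sum(), b.sum()) + 1e-12)

def pearson(a, b):
    return np.corrcoef(a, b)[0, 1]

def select_top_n(X, y, measure, n=8):
    """Score each feature column against the labels, keep the n best."""
    scores = np.array([measure(X[:, j], y) for j in range(X.shape[1])])
    return np.argsort(scores)[::-1][:n]

# Toy usage: 200 samples, 25 binarized features, botnet/benign labels.
rng = np.random.default_rng(1)
X = rng.integers(0, 2, (200, 25)).astype(float)
y = rng.integers(0, 2, 200).astype(float)
print(select_top_n(X, y, dice, n=8))
```

The selected feature indices would then be used to slice the dataset before handing it to the WEKA classifiers.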
This paper proposes a three-stage learning algorithm for a deep multilayer perceptron (DMLP) with effective weight initialisation based on a sparse auto-encoder, which aims to overcome the difficulty of training deep neural networks with limited training data in a high-dimensional feature space. At the first stage, unsupervised learning with a sparse auto-encoder is used to obtain the initial weights of the feature extraction layers of the DMLP. At the second stage, error back-propagation is used to train the DMLP while the weights obtained at the first stage are kept fixed in its feature extraction layers. At the third stage, all the weights of the DMLP obtained at the second stage are refined by error back-propagation. Network structures an…
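A compact PyTorch sketch of the three stages, under assumed layer sizes, hyperparameters, and an L1 sparsity proxy (the abstract does not fix an architecture):

```python
# Hedged sketch of the three-stage scheme; sizes and penalties are assumptions.
import torch
import torch.nn as nn

x = torch.randn(256, 100)           # toy high-dimensional inputs
y = torch.randint(0, 2, (256,))     # toy binary labels

# Stage 1: unsupervised sparse auto-encoder pre-training of the
# feature-extraction layer (L1 penalty on the code as a sparsity proxy).
enc, dec = nn.Linear(100, 32), nn.Linear(32, 100)
opt = torch.optim.Adam(list(enc.parameters()) + list(dec.parameters()), lr=1e-3)
for _ in range(100):
    code = torch.sigmoid(enc(x))
    loss = ((dec(code) - x) ** 2).mean() + 1e-3 * code.abs().mean()
    opt.zero_grad(); loss.backward(); opt.step()

# Stage 2: back-propagation trains the classifier head while the
# pre-trained feature-extraction weights stay fixed.
head = nn.Linear(32, 2)
for p in enc.parameters():
    p.requires_grad = False
opt = torch.optim.Adam(head.parameters(), lr=1e-3)
ce = nn.CrossEntropyLoss()
for _ in range(100):
    loss = ce(head(torch.sigmoid(enc(x))), y)
    opt.zero_grad(); loss.backward(); opt.step()

# Stage 3: unfreeze everything and refine all weights end to end.
for p in enc.parameters():
    p.requires_grad = True
opt = torch.optim.Adam(list(enc.parameters()) + list(head.parameters()), lr=1e-4)
for _ in range(50):
    loss = ce(head(torch.sigmoid(enc(x))), y)
    opt.zero_grad(); loss.backward(); opt.step()
```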
Since there is no market for bond issuance by companies in the Iraqi market, and given the difficulty of borrowing, companies must resort to equity financing to finance their investments. However, within the literature of financial management, the type of financing used by a company sends signals to investors and is therefore reflected in its market value. The problem of the study thus revolves around its variables: equity financing within the framework of signal theory, and the price of common stock in the Iraqi market.
The study aims to verify the impact of a capital increase through the issuance of new stock on the price of…
Computer systems and networks are used in almost every aspect of our daily life; as a result, security threats to computers and networks have also increased significantly. Traditionally, password-based authentication is widely used to authenticate legitimate users, but this method has many loopholes, such as password sharing, shoulder surfing, brute-force attacks, dictionary attacks, guessing, phishing, and many more. The aim of this paper is to enhance password authentication by presenting keystroke dynamics with a back-propagation neural network as a transparent layer of user authentication. Keystroke dynamics is a well-known and inexpensive behavioral biometric technology, which identifies…
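For concreteness, here is a hedged sketch of how the timing features such a back-propagation network would consume might be extracted from raw key events; the event format and all names are assumptions, not the paper's code:

```python
# Hypothetical keystroke-dynamics feature extraction for a fixed passphrase.
def keystroke_features(events):
    """events: list of (key, press_time_ms, release_time_ms) in typing order.
    Returns dwell times (release - press per key) and flight times
    (next press - current release between consecutive keys)."""
    dwell = [r - p for _, p, r in events]
    flight = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    return dwell + flight  # fixed-length vector for a fixed passphrase

sample = [("p", 0, 95), ("a", 180, 260), ("s", 340, 430), ("s", 520, 600)]
print(keystroke_features(sample))  # -> [95, 80, 90, 80, 85, 80, 90]
```

Vectors like this, collected per login attempt, would form the training input for the neural network layered on top of the password check.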
When optimizing the performance of neural-network-based chatbots, choosing the optimizer is one of the most important decisions. Optimizers control the adjustment of model parameters such as weights and biases to minimize a loss function during training. Adaptive optimizers such as ADAM have become a standard choice and are widely used because their parameter-update magnitudes are invariant to rescaling of the gradient, but they often pose generalization problems. Alternatively, Stochastic Gradient Descent (SGD) with momentum, and ADAMW, an extension of ADAM, offer several advantages. This study compares and examines the effects of these optimizers on the chatbot CST dataset. The effectiveness of each optimizer is evaluated…
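A small, self-contained sketch of such a comparison on a stand-in task (the CST chatbot data and model are not reproduced here; all hyperparameters are illustrative):

```python
# Compare SGD+momentum, Adam, and AdamW on the same toy model and data.
import torch
import torch.nn as nn

def train(make_opt, steps=200):
    torch.manual_seed(0)
    model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 5))
    opt = make_opt(model.parameters())
    ce = nn.CrossEntropyLoss()
    x, y = torch.randn(512, 20), torch.randint(0, 5, (512,))
    for _ in range(steps):
        loss = ce(model(x), y)
        opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()

for name, make in [
    ("SGD+momentum", lambda p: torch.optim.SGD(p, lr=0.05, momentum=0.9)),
    ("Adam",         lambda p: torch.optim.Adam(p, lr=1e-3)),
    # AdamW decouples weight decay from the adaptive update, the usual
    # argument for better generalization than Adam's coupled L2 penalty.
    ("AdamW",        lambda p: torch.optim.AdamW(p, lr=1e-3, weight_decay=1e-2)),
]:
    print(name, train(make))
```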
Sorting and grading agricultural crops manually is a cumbersome and arduous process; it also incurs high costs and increased labor, and yields lower sorting and grading quality than automatic sorting. Deep learning, which includes artificial neural networks for prediction, demonstrates the value of automated sorting in terms of efficiency, quality, and accuracy of sorting and grading. This work applies an artificial neural network to predict values and select what is good and suitable among agricultural crops, especially local lemons.
The analytical study of optical bistability in a fully optimized laser Fabry-Perot system is presented, including the related phenomena of switching dynamics and the optimization procedure. From the steady-state optical bistability equation, the incident intensity can be plotted against the round-trip phase shift (φ) for different values of dark mistuning (φ0 = π/12, π/6, π/3, π/1.5, 0) or finesse (F = 1, 5, 20, 100) in order to obtain different optical bistable loops. The input-output characteristics of a nonlinear Fabry-Perot etalon for different values of finesse (F) and different initial detuning (φ0) are used in this research.
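As a hedged illustration of this steady-state construction (not the paper's code or parameter values), the standard Airy relation for a Kerr-type etalon can be used to plot incident intensity against round-trip phase:

```python
# Sketch of the standard graphical construction for a Kerr-type Fabry-Perot
# etalon: with round-trip phase phi = phi0 + k*I_cav, the cavity intensity is
# I_cav ∝ (phi - phi0), and the Airy relation
#   I_inc = I_cav * (1 + (2F/pi)**2 * sin(phi/2)**2)
# gives the incident intensity as a multivalued (bistable) curve versus phi.
# Parameter pairings below are illustrative, not taken from the paper.
import numpy as np
import matplotlib.pyplot as plt

phi = np.linspace(0.01, 2 * np.pi, 1000)
for F, phi0 in [(1, 0.0), (5, np.pi / 12), (20, np.pi / 6), (100, np.pi / 3)]:
    I_cav = np.clip(phi - phi0, 0, None)   # Kerr phase-intensity line
    I_inc = I_cav * (1 + (2 * F / np.pi) ** 2 * np.sin(phi / 2) ** 2)
    plt.plot(phi, I_inc, label=f"F={F}, phi0={phi0:.2f}")
plt.xlabel("round-trip phase shift φ (rad)")
plt.ylabel("incident intensity (arb. units)")
plt.legend(); plt.show()
```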
In this work, animal bones of different shapes and sizes were used to study the characteristics of the ground-penetrating radar (GPR) waves reflected by these bones. The bones were buried underground at different depths and in different surrounding media. The resulting data showed that detecting buried bones with GPR technology depends strongly on the medium in which the bones are buried. Humidity is the main source of signal loss in this application, because it lowers the signal-to-noise ratio, making it impossible to distinguish the signal reflected by the bones from that reflected by other objects in the medium, such as rocks.
In computer-based applications, there is a need for simple, low-cost devices for user authentication. Biometric authentication methods, namely keystroke dynamics, are increasingly being used to strengthen the common knowledge-based method (for example, a password) effectively and cheaply in many types of applications. Because of the semi-independent nature of typing behavior, it is difficult to masquerade, which makes it useful as a biometric. In this paper, the C4.5 approach is used to classify a user as an authenticated user or an impostor by combining unigraph features (namely dwell time (DT) and flight time (FT)) and digraph features (namely up-up time (UUT) and down-down time (DDT)). The results show that DT enhances the performance of the digraph features by i…
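A brief sketch of the classification step, using scikit-learn's entropy-criterion decision tree as a C4.5-like stand-in (scikit-learn implements CART rather than C4.5 exactly); the feature values below are made up:

```python
# Decision-tree classification of keystroke samples into genuine/impostor.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Each row: [DT, FT, UUT, DDT] averaged over one typing sample (ms).
X = np.array([[95, 120, 215, 210],   # genuine-user samples
              [90, 130, 220, 205],
              [150, 60, 210, 260],   # impostor samples
              [160, 55, 215, 270]])
y = np.array([1, 1, 0, 0])           # 1 = authenticated user, 0 = impostor

tree = DecisionTreeClassifier(criterion="entropy").fit(X, y)
print(tree.predict([[93, 125, 218, 208]]))  # -> [1], accepted as genuine
```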