With the widespread use of computers and networks today, the number of security threats has increased. The study of intrusion detection systems (IDS) has received much attention throughout the computer science field. The main objective of this study is to examine the existing literature on various approaches to intrusion detection. This paper presents an overview of different intrusion detection systems and a detailed analysis of multiple techniques used in these systems, including their advantages and disadvantages. These techniques include artificial neural networks, bio-inspired computing, evolutionary techniques, machine learning, and pattern recognition.
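As a concrete illustration of one surveyed family, the minimal sketch below trains a machine-learning classifier (a random forest) on hypothetical network-flow features. The feature names, labels, and synthetic data are illustrative stand-ins, not from the paper; real systems train on labelled flow records such as NSL-KDD or CICIDS.

```python
# Illustrative machine-learning-based IDS classifier (not the paper's method).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
# Hypothetical flow features: duration, bytes sent, bytes received, packet rate
X = rng.random((1000, 4))
y = (X[:, 3] > 0.8).astype(int)  # toy label: "attack" when packet rate is high

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```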
Most Internet-tomography problems, such as shared congestion detection, depend on network measurements. Usually, such measurements are carried out at multiple locations inside the network and rely on local clocks. These clocks usually skew over time, leaving the measurements unsynchronized and thereby degrading the performance of most techniques. Recently, shared congestion detection has become an important issue in many networked applications such as multimedia streaming and peer-to-peer file sharing. One of the most powerful techniques employed in the literature is based on the Discrete Wavelet Transform (DWT) combined with a cross-correlation operation to determine the state of congestion. The wavelet transform is used as a de-noising tool before the cross-correlation.
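A minimal sketch of this two-step idea, assuming synthetic one-way-delay series and illustrative choices of wavelet, decomposition level, and threshold (the paper's exact settings are not given here):

```python
# DWT de-noising followed by cross-correlation for shared congestion detection.
import numpy as np
import pywt

def dwt_denoise(x, wavelet="db4", level=4):
    coeffs = pywt.wavedec(x, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745       # noise estimate
    thr = sigma * np.sqrt(2 * np.log(len(x)))            # universal threshold
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(x)]

rng = np.random.default_rng(1)
shared = np.cumsum(rng.normal(size=2048))                # common congestion trend
d1 = shared + rng.normal(scale=2.0, size=2048)           # path-1 delay samples
d2 = shared + rng.normal(scale=2.0, size=2048)           # path-2 delay samples

a, b = dwt_denoise(d1), dwt_denoise(d2)
xcorr = np.corrcoef(a, b)[0, 1]
print("cross-correlation:", xcorr)  # near 1.0 suggests shared congestion
```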
Humans use deception daily, since it can significantly affect their lives and provide a way out of undesired situations. Deception is either low-stakes (e.g., innocuous) or high-stakes (e.g., involving harmful consequences). The importance of deception investigation has grown over the years, becoming a critical issue as security levels have risen around the globe. Technology has made remarkable achievements in many fields of human life, including deception detection. Automated deception detection systems (DDSs) are widely used in different fields, especially for security purposes. A DDS comprises multiple stages, each of which should be built and trained to perform intelligently so that the whole system can give the correct final decision.
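A hedged sketch of such a staged design, where feature normalisation and classification are separate trainable stages; the behavioural-cue features and labels below are hypothetical placeholders, not the paper's pipeline:

```python
# Toy multi-stage deception-detection pipeline.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 6))             # hypothetical behavioural cues
y = rng.integers(0, 2, size=200)          # 0 = truthful, 1 = deceptive (toy)

dds = Pipeline([("scale", StandardScaler()),     # stage 1: normalisation
                ("classify", SVC(kernel="rbf"))  # stage 2: classifier
               ]).fit(X, y)
print("training accuracy:", dds.score(X, y))
```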
Concrete columns with hollow-core sections find widespread application owing to their excellent structural efficiency and economical use of material. However, corrosion poses a challenge in concrete structures with steel reinforcement. This paper explores the possibility of using glass fiber-reinforced polymer (GFRP) reinforcement as a non-corrosive and economically viable substitute for steel reinforcement in short square hollow concrete columns. Twelve short hollow columns were prepared in the laboratory and subjected to pure axial compressive loads until failure. All columns featured a hollow square section with exterior dimensions of 180 × 180 mm and a height of 900 mm. The columns were categorized into groups.
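For orientation only, a back-of-the-envelope calculation of the net concrete area and the concrete share of axial capacity. The exterior dimension (180 × 180 mm) comes from the abstract; the hole size and concrete strength are assumed values for illustration, not the paper's data:

```python
# Rough axial-capacity arithmetic for a hollow square column.
b_ext = 180.0            # mm, from the abstract
b_hole = 80.0            # mm, ASSUMED interior opening
fc = 30.0                # MPa, ASSUMED concrete cylinder strength

A_net = b_ext**2 - b_hole**2                  # net concrete area, mm^2
P_concrete = 0.85 * fc * A_net / 1000.0       # kN, concrete contribution
print(f"net area = {A_net:.0f} mm^2, concrete share ~ {P_concrete:.0f} kN")
```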
Piperine, a crystalline alkaloid isolated from Piper nigrum, Piper longum, and other Piper species, has many valuable pharmacological properties for preventing and treating specific diseases, including analgesic, anti-inflammatory, hepatoprotective, antimetastatic, antithyroid, immunomodulatory, and antitumor effects; benefits in rheumatoid arthritis, osteoarthritis, and Alzheimer's disease; and the ability to improve the bioavailability of other drugs. However, its potential for clinical use via the oral route is hindered by poor water solubility and low bioavailability. The low oral bioavailability is caused by low solubility in water; piperine is also photosensitive and susceptible to isomerization under UV light, which reduces its concentration. Many different strategies have been explored to overcome these limitations.
A three-stage learning algorithm for the deep multilayer perceptron (DMLP), with effective weight initialisation based on a sparse auto-encoder, is proposed in this paper; it aims to overcome the difficulties of training deep neural networks with limited training data in a high-dimensional feature space. At the first stage, unsupervised learning is adopted using a sparse auto-encoder to obtain the initial weights of the feature extraction layers of the DMLP. At the second stage, error back-propagation is used to train the DMLP while the weights obtained at the first stage are kept fixed for its feature extraction layers. At the third stage, all the weights of the DMLP obtained at the second stage are refined by error back-propagation. Network structures and parameter settings are also discussed.
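A condensed sketch of the three-stage scheme in PyTorch. Layer sizes, the sparsity penalty weight, learning rates, and epoch counts are illustrative assumptions, not the paper's settings:

```python
# Stage 1: sparse auto-encoder pre-training; stage 2: frozen-feature head
# training; stage 3: end-to-end fine-tuning.
import torch
import torch.nn as nn

torch.manual_seed(0)
X = torch.randn(256, 100)                    # toy high-dimensional inputs
y = torch.randint(0, 2, (256,))              # toy binary labels

encoder = nn.Sequential(nn.Linear(100, 32), nn.Sigmoid())
decoder = nn.Linear(32, 100)
head = nn.Linear(32, 2)

# Stage 1: unsupervised sparse auto-encoder (L1 penalty on activations).
opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)
for _ in range(50):
    h = encoder(X)
    loss = nn.functional.mse_loss(decoder(h), X) + 1e-3 * h.abs().mean()
    opt.zero_grad(); loss.backward(); opt.step()

# Stage 2: back-propagation on the classifier head only, features frozen.
opt = torch.optim.Adam(head.parameters(), lr=1e-3)
for _ in range(50):
    loss = nn.functional.cross_entropy(head(encoder(X).detach()), y)
    opt.zero_grad(); loss.backward(); opt.step()

# Stage 3: refine all weights end-to-end by back-propagation.
opt = torch.optim.Adam(list(encoder.parameters()) + list(head.parameters()), lr=1e-4)
for _ in range(50):
    loss = nn.functional.cross_entropy(head(encoder(X)), y)
    opt.zero_grad(); loss.backward(); opt.step()
print("final loss:", loss.item())
```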
This research deals with estimating the reliability function of the two-parameter exponential distribution using different estimation methods: maximum likelihood, median/first-order statistics, ridge regression, modified Thompson-type shrinkage, and single-stage shrinkage. Comparisons among the estimators were made using Monte Carlo simulation based on the mean squared error (MSE) criterion, and conclude that the shrinkage methods perform better than the other methods.
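A minimal Monte Carlo sketch of this comparison for the maximum-likelihood estimator alone, where the reliability function is R(t) = exp(-(t - mu)/theta) for t >= mu; the sample size, evaluation point t, true parameters, and replication count are illustrative choices:

```python
# Monte Carlo MSE of the MLE of R(t) for the two-parameter exponential.
import numpy as np

mu, theta, t, n, reps = 1.0, 2.0, 3.0, 30, 5000
R_true = np.exp(-(t - mu) / theta)

rng = np.random.default_rng(3)
errors = []
for _ in range(reps):
    x = mu + rng.exponential(theta, size=n)
    mu_hat = x.min()                    # MLE of the location parameter
    theta_hat = x.mean() - mu_hat       # MLE of the scale parameter
    errors.append(np.exp(-(t - mu_hat) / theta_hat) - R_true)

print("MLE MSE of R(t):", np.mean(np.square(errors)))
```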
The objective of this work is to combine human biometric characteristics with unique attributes of the computer in order to protect computer networks and resource environments through improved authentication and authorization techniques. On the biometric side, the best available methods and algorithms were studied, and the fingerprint was found to be the best choice, although it has some flaws. The fingerprint algorithm was therefore improved so that its performance enhances the clarity of the ridge and valley structures of fingerprint images, taking into account the estimation of local ridge orientation and frequency. On the computer side, a computer and its components, like a human, have unique attributes.
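A hedged sketch of the fusion idea: a biometric template combined with a machine-specific attribute to derive an authentication token. The fingerprint template bytes are a placeholder, and binding a MAC-derived hardware id this way only illustrates the concept, not the paper's exact scheme:

```python
# Fuse a fingerprint template with a machine attribute into an auth token.
import hashlib
import uuid

def auth_token(fingerprint_template: bytes) -> str:
    machine_id = uuid.getnode().to_bytes(6, "big")   # MAC-derived hardware id
    return hashlib.sha256(fingerprint_template + machine_id).hexdigest()

template = b"\x12\x34\x56\x78"  # placeholder for an enhanced fingerprint template
print(auth_token(template))
```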
This paper proposes a new encryption method. It combines two cipher algorithms, DES and AES, to generate hybrid keys. This combination strengthens the proposed W-method by generating highly randomized keys. The reliability of any encryption technique rests on two points. The first is key generation: the approach merges 64 bits of DES with 64 bits of AES to produce a 128-bit root key for the 15 remaining keys, each obtained by shifting the previous key one bit to the right; this complexity raises the level of the ciphering process. The second is the nature of the encryption process itself: it uses two keys and mixes one round of DES with one round of AES to reduce processing time. The W-method deals with …
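A sketch of the key schedule as described: 64 bits of DES key material and 64 bits of AES key material concatenated into a 128-bit root key, with each subsequent key a one-bit right rotation of the previous one. The sample key values are hypothetical, and the rotation (rather than a plain shift) is an assumption about how the one-bit move is applied:

```python
# Hybrid 128-bit key schedule: root key plus 15 derived keys (16 total).
def rotate_right_128(k: int) -> int:
    return ((k >> 1) | ((k & 1) << 127)) & ((1 << 128) - 1)

des_half = 0x0123456789ABCDEF          # 64 bits of DES key material (example)
aes_half = 0xFEDCBA9876543210          # 64 bits of AES key material (example)

keys = [(des_half << 64) | aes_half]   # 128-bit root key
for _ in range(15):                    # derive the 15 remaining keys
    keys.append(rotate_right_128(keys[-1]))

for i, k in enumerate(keys):
    print(f"K{i:02d} = {k:032x}")
```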
This research describes a new model inspired by MobileNetV2 that was trained on a very diverse dataset. The goal is to enable fire detection in open areas, replacing physical sensor-based fire detectors and reducing false fire alarms, so as to minimize losses in open areas via deep learning. A diverse fire dataset was created that combines images and videos from several sources. In addition, another self-collected dataset was taken from the farms of the holy shrine of Al-Hussainiya in the city of Karbala. The model was then trained on the collected dataset. The test accuracy on the fire dataset trained with the new model reached 98.87%.
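A minimal transfer-learning sketch in the spirit of the described model: a MobileNetV2 backbone with a small binary fire/no-fire head. The input size, head layout, and training settings are assumptions; the paper's own architecture may differ:

```python
# MobileNetV2-based binary fire detector (illustrative configuration).
import tensorflow as tf

base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False                       # start from frozen ImageNet features

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # fire / no fire
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # with a fire dataset
```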