Intrusion detection systems detect attacks inside computers and networks, where detection must be fast and achieve a high detection rate. Various proposed methods have achieved high detection rates, either by improving the algorithm or by hybridizing it with another algorithm; however, they suffer from long processing times, especially after the algorithm is improved and when dealing with large volumes of traffic data. On the other hand, past research has successfully applied DNA-sequence detection approaches to intrusion detection systems; the achieved detection rates were very low, but the processing time was fast. In addition, feature selection is used to reduce computation and complexity, which speeds up the system. A new feature selection method is proposed based on DNA encoding and on DNA key positions. The current system has three phases: the first, the pre-processing phase, extracts the keys and their positions; the second, the training phase, selects features based on the key positions gained from pre-processing; and the third, the testing phase, classifies network traffic records as either normal or attack using the selected features. Performance is calculated based on the detection rate, false alarm rate, accuracy, and time, where time includes both encoding time and matching time. All results are based on using two or three keys and are evaluated on two datasets, KDD Cup 99 and NSL-KDD. For all corrected KDD Cup records (311,029 records), the detection rate, false alarm rate, accuracy, encoding time, and matching time using two and three keys are 96.97%, 33.67%, 91%, 325 s, and 13 s, and 92.74%, 7.41%, 92.71%, 325 s, and 20 s, respectively. For all NSL-KDD records (22,544 records), the corresponding results using two and three keys are 89.34%, 28.94%, 81.46%, 20 s, and 1 s, and 82.93%, 11.40%, 85.37%, 20 s, and 1 s, respectively. The proposed system is evaluated and compared with previous systems based on encoding time and matching time; the outcomes show that the present system's detection is faster than the previous ones.
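The reported metrics follow directly from a confusion matrix over classified records. A minimal sketch of how they are computed (the counts below are illustrative placeholders, not the paper's experimental counts):

```python
# Detection rate, false alarm rate, and accuracy from confusion counts.
# The counts passed in below are illustrative, not the paper's results.

def ids_metrics(tp, fn, fp, tn):
    """tp: attacks flagged, fn: attacks missed,
    fp: normal traffic flagged as attack, tn: normal traffic passed."""
    detection_rate = tp / (tp + fn)            # fraction of attacks caught
    false_alarm_rate = fp / (fp + tn)          # fraction of normal flagged
    accuracy = (tp + tn) / (tp + fn + fp + tn)
    return detection_rate, false_alarm_rate, accuracy

dr, far, acc = ids_metrics(tp=9697, fn=303, fp=3367, tn=6633)
print(f"DR={dr:.2%}  FAR={far:.2%}  ACC={acc:.2%}")
```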
In this paper, Bayes estimators of the parameter of the Maxwell distribution are derived along with the maximum likelihood estimator. The non-informative priors, Jeffreys prior and the extension of Jeffreys prior, are considered under two different loss functions, the squared error loss function and the modified squared error loss function, for comparison purposes. A simulation study is developed in order to gain insight into the performance on small, moderate, and large samples. The performance of these estimators is explored numerically under different conditions, and their efficiency is compared according to the mean square error (MSE). The results of the comparison by MSE show that the efficiency of the Bayes est…
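The simulation-study design can be sketched for the maximum likelihood estimator (the Bayes estimators are compared the same way). Assuming the common scale parameterization f(x; a) = √(2/π) x² e^(−x²/(2a²)) / a³, the MLE is â = √(Σx²/(3n)); the sample sizes, replication count, and true parameter value below are illustrative, not the paper's settings:

```python
import numpy as np

# Monte Carlo estimate of the MSE of the Maxwell MLE on small, moderate,
# and large samples. Under f(x; a) = sqrt(2/pi) x^2 exp(-x^2/(2a^2)) / a^3,
# the MLE is a_hat = sqrt(sum(x^2) / (3n)). A Maxwell(a) draw is the norm
# of a 3-D N(0, a^2 I) vector.

rng = np.random.default_rng(0)
true_a = 2.0

def maxwell_sample(n):
    return np.linalg.norm(rng.normal(0.0, true_a, size=(n, 3)), axis=1)

def mle(x):
    return np.sqrt(np.sum(x**2) / (3 * len(x)))

mses = []
for n in (10, 50, 200):                       # small, moderate, large
    estimates = np.array([mle(maxwell_sample(n)) for _ in range(2000)])
    mses.append(float(np.mean((estimates - true_a) ** 2)))
    print(f"n={n:4d}  MSE={mses[-1]:.5f}")
```

As expected, the MSE shrinks as the sample size grows (asymptotically at rate a²/(6n)), which is the pattern the paper's tables compare across estimators.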
Progression in computer networks and the emergence of new technologies in this field help in finding new protocols and frameworks that provide new computer-network-based services. E-government services, a modernized version of conventional government, have been created through the steady evolution of technology together with societies' growing need for numerous services. Government services are deeply tied to citizens' daily lives; it is therefore important to keep pace with technological developments and to move from traditional methods of managing government work to cutting-edge technical approaches that improve the effectiveness of government systems in providing services to citizens. Blockchain technology is among…
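The core property blockchain brings to government records, tamper evidence through hash linking, can be sketched minimally (the record fields and node names are illustrative, not taken from the paper):

```python
import hashlib
import json

# Minimal hash-linked ledger: each block commits to the previous block's
# hash, so altering any stored record invalidates every later block.
# The record fields are illustrative, not from the paper.

def block_hash(block):
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, record):
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"record": record, "prev_hash": prev})

def verify(chain):
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

ledger = []
append_block(ledger, {"citizen_id": "A-100", "service": "license renewal"})
append_block(ledger, {"citizen_id": "B-221", "service": "tax filing"})
print(verify(ledger))                        # True: chain intact
ledger[0]["record"]["service"] = "forged"    # tamper with an early record
print(verify(ledger))                        # False: tampering detected
```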
A geographic information system (GIS) is a very effective management and analysis tool that relies on geographic location data. The use of artificial neural networks (ANNs) for the interpretation of natural resource data has been shown to be beneficial, and back-propagation neural networks are one of the most widespread and prevalent designs. Combining geographic information systems with artificial neural networks reduces the cost of landscape-change studies by shortening the time required to evaluate data. Numerous ANN designs and kinds have been created, the majority of them in PC-based service domains. Using the ArcGIS Network Analyst add-on, service regions can be located around any network…
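A back-propagation network of the kind referenced above can be sketched in a few lines; the XOR toy data here is a stand-in for the (much larger) GIS attribute data, and the layer sizes and learning rate are illustrative:

```python
import numpy as np

# Minimal back-propagation network: one hidden layer, sigmoid units,
# full-batch gradient descent on squared error. XOR stands in for the
# GIS attribute data discussed in the text.

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)   # input -> hidden
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)   # hidden -> output
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

losses = []
for _ in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((out - y) ** 2)))
    # backward pass: error derivatives propagated layer by layer
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * h.T @ d_out;  b2 -= 0.5 * d_out.sum(axis=0)
    W1 -= 0.5 * X.T @ d_h;    b1 -= 0.5 * d_h.sum(axis=0)

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```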
Fire incidents are classed as catastrophic events, meaning that people may experience mental distress and trauma. The development of a robotic vehicle specifically designed for fire-extinguishing purposes has significant implications, as it not only addresses the fire itself but also safeguards human lives and minimizes the damage caused by indoor fires. The primary goal of the AFRC is to undergo a metamorphosis that allows it to operate autonomously as a specialized support vehicle designed exclusively for identifying and extinguishing fires. Researchers have undertaken the tasks of constructing an autonomous vehicle with robotic capabilities and devising a universal algorithm to be employed…
The evolution of the Internet of Things (IoT) has led to connecting billions of heterogeneous physical devices to improve the quality of human life by collecting data from their environments. However, there is a need to store this huge data in big storage with high computational capabilities, and cloud computing can be used to store such big data. The data of IoT devices is transferred using two types of protocols: Message Queuing Telemetry Transport (MQTT) and Hypertext Transfer Protocol (HTTP). This paper aims at a high-performance and more reliable system through efficient use of resources; thus, load balancing in cloud computing is used to dynamically distribute the workload across nodes to avoid overloading any individual resource…
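The dynamic workload distribution described above can be illustrated with a least-loaded dispatch policy; the node names and task costs are illustrative, and the paper's actual balancing algorithm may differ:

```python
import heapq

# Least-loaded dispatch: each incoming task goes to the node with the
# smallest current load, tracked in a min-heap of (load, node) pairs.
# Node names and task costs are illustrative.

def balance(tasks, node_names):
    heap = [(0.0, name) for name in node_names]
    heapq.heapify(heap)
    assignment = {name: [] for name in node_names}
    for task_id, cost in tasks:
        load, name = heapq.heappop(heap)      # current least-loaded node
        assignment[name].append(task_id)
        heapq.heappush(heap, (load + cost, name))
    return assignment, {name: load for load, name in heap}

tasks = [("t1", 4), ("t2", 2), ("t3", 1), ("t4", 3), ("t5", 2)]
assignment, loads = balance(tasks, ["node-a", "node-b"])
print(assignment)   # {'node-a': ['t1', 't5'], 'node-b': ['t2', 't3', 't4']}
print(loads)        # both nodes end with load 6.0
```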
Future generations of wireless communication systems are expected to evolve toward massive ubiquitous connectivity and toward achieving ultra-reliable and low-latency communications (URLLC) with extremely high data rates. Massive multiple-input multiple-output (m-MIMO) is a crucial transmission technique for fulfilling the high-data-rate demands of upcoming wireless systems. However, obtaining a downlink (DL) training sequence (TS) feasible for fast channel estimation, i.e., one that meets the low-latency requirement of future wireless generations, in m-MIMO with frequency-division duplex (FDD) when users have different channel correlations is very challenging. Therefore, a low-complexity solution for…
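Training-based DL channel estimation of the kind discussed can be sketched with a standard least-squares estimator; the dimensions and the DFT-based orthogonal training matrix are illustrative, and the paper's low-complexity TS design is not reproduced here:

```python
import numpy as np

# Least-squares channel estimation from a downlink training sequence.
# Model: y = S h + n, where S (T x M) stacks the training symbols sent
# from M base-station antennas over T training slots. The dimensions and
# the scaled-DFT orthogonal training matrix are illustrative.

rng = np.random.default_rng(7)
M, T = 8, 8                                      # antennas, training length
S = np.fft.fft(np.eye(T))[:, :M] / np.sqrt(T)    # orthonormal columns
h = (rng.normal(size=M) + 1j * rng.normal(size=M)) / np.sqrt(2)

noise = 0.01 * (rng.normal(size=T) + 1j * rng.normal(size=T))
y = S @ h + noise

# LS estimate: h_hat = (S^H S)^{-1} S^H y (here S^H S = I, so h_hat = S^H y)
h_hat = np.linalg.solve(S.conj().T @ S, S.conj().T @ y)
print(np.linalg.norm(h_hat - h))                 # small residual error
```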