Network security is defined as the set of policies and actions taken by a network administrator to prevent unauthorized access, penetration of the defenses, and infiltration of the network. Network security also involves granting access to data according to a pre-defined policy. A network firewall, on the other hand, is a network appliance that controls incoming and outgoing traffic by examining the packets flowing through the network. This security measure establishes a protective wall (the firewall) between a trusted internal network and the outside world, where a security threat in the shape of a hacker or a virus might exist.
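The controlling of incoming and outgoing traffic described above can be illustrated with a toy first-match packet filter. The rule fields, addresses, and default-deny policy below are illustrative assumptions, not a description of any particular firewall product.

```python
# Toy sketch of first-match packet filtering between a trusted and an
# untrusted network. A rule field of None acts as a wildcard.

def allowed(packet, rules, default=False):
    """Return the action of the first rule matching the packet."""
    for rule in rules:
        if all(rule.get(k) in (None, packet[k])
               for k in ("src", "dst", "port", "proto")):
            return rule["allow"]
    return default  # default deny for unmatched traffic

# Example ruleset (hypothetical addresses): allow HTTPS to one server,
# block all traffic from one known-bad host.
rules = [
    {"src": None, "dst": "10.0.0.5", "port": 443, "proto": "tcp", "allow": True},
    {"src": "192.0.2.66", "dst": None, "port": None, "proto": None, "allow": False},
]
```

Real firewalls add stateful connection tracking on top of this stateless matching, but the first-match evaluation order is the same idea.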
Achieving reliable operation under the influence of deep-submicrometer noise sources, including crosstalk noise at low-voltage operation, is a major challenge for network-on-chip (NoC) links. In this paper, we propose a coding scheme that simultaneously addresses crosstalk effects on signal delay and detects up to seven random errors through wire duplication and simple parity checks calculated over the rows and columns of the two-dimensional data. This high error-detection capability enables a reduction of the operating voltage on the wire, leading to energy savings. The results show that the proposed scheme reduces energy consumption by up to 53% compared to other schemes at iso-reliability performance, despite the increase in the overhead number o
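The row-and-column parity idea can be sketched generically as follows. This is a minimal illustration of two-dimensional parity, not the paper's exact code construction (which also duplicates wires for crosstalk avoidance):

```python
# Illustrative sketch of 2D parity over data laid out as rows x cols.
# A single flipped bit disturbs exactly one row parity and one column
# parity, so its position can be located (and hence corrected).

def parity_2d(bits, rows, cols):
    """Compute row and column parity bits for a rows*cols bit matrix."""
    assert len(bits) == rows * cols
    row_par = [0] * rows
    col_par = [0] * cols
    for r in range(rows):
        for c in range(cols):
            b = bits[r * cols + c]
            row_par[r] ^= b
            col_par[c] ^= b
    return row_par, col_par

def locate_single_error(bits, rows, cols, row_par, col_par):
    """Recompute parities; one row and one column mismatch pinpoint a
    single-bit error. Returns the flat index, or None."""
    new_rp, new_cp = parity_2d(bits, rows, cols)
    bad_rows = [r for r in range(rows) if new_rp[r] != row_par[r]]
    bad_cols = [c for c in range(cols) if new_cp[c] != col_par[c]]
    if len(bad_rows) == 1 and len(bad_cols) == 1:
        return bad_rows[0] * cols + bad_cols[0]
    return None  # no error, or a pattern this check cannot localize
```

Multi-bit errors show up as multiple row/column mismatches, which is what gives the scheme its detection (rather than correction) capability for several random errors.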
The paper subjects neural synchronization to intensive study in order to address the challenges preventing its adoption as an alternative key-exchange algorithm. The results obtained from the implementation of neural synchronization in the proposed system address two challenges: verifying that synchronization has been established between the two neural networks, and publicly initializing the input vector for each party. Solutions are presented and a mathematical model is developed; since the proposed system focuses on stream ciphers, a system of LFSRs (linear feedback shift registers) with a balanced memory has been used to generate the key. The initializations of these LFSRs are neural weights after achiev
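The keystream-generation building block can be sketched generically. The register length and tap positions below are arbitrary examples; the paper's LFSR system, balanced memory, and weight-derived initialization are not reproduced here.

```python
# Minimal Fibonacci LFSR sketch: the register is seeded (e.g. from
# neural weights reduced to bits, as the abstract suggests) and shifted
# once per output bit, with feedback formed by XORing the tap stages.

def lfsr_stream(state_bits, taps, n):
    """Return n keystream bits from an LFSR.

    state_bits : list of 0/1, initial register contents
    taps       : stage indices XORed to form the feedback bit
    """
    state = list(state_bits)
    out = []
    for _ in range(n):
        out.append(state[-1])       # emit the last stage
        fb = 0
        for t in taps:
            fb ^= state[t]          # feedback = XOR of tap stages
        state = [fb] + state[:-1]   # shift right, insert feedback
    return out
```

Note the classic caveat: an all-zero seed produces an all-zero keystream, which is one reason the choice of initialization (here, the synchronized neural weights) matters.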
The traditional centralized network-management approach presents severe efficiency and scalability limitations in large-scale networks. The process of data collection and analysis typically involves huge transfers of management data to the manager, which consume considerable network throughput and create bottlenecks at the manager side. All of these problems are addressed using agent technology as a solution for distributing the management functionality over the network elements. The proposed system consists of a server agent that works together with client agents to monitor the logging (on, off) of the client computers and which user is working on each. A file-system-watcher mechanism is used to indicate any change in files. The results were presente
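The file-system-watcher mechanism can be sketched in its simplest polling form. Production systems normally use OS change notifications rather than polling, and the class below is an illustrative stand-in, not the system described in the abstract:

```python
# Minimal polling file watcher: remember each file's modification time
# and report files whose mtime changed since the last poll.

import os

class FileWatcher:
    def __init__(self, paths):
        self.mtimes = {p: os.stat(p).st_mtime_ns for p in paths}

    def poll(self):
        """Return the watched files changed since the previous poll."""
        changed = []
        for path, old in self.mtimes.items():
            now = os.stat(path).st_mtime_ns
            if now != old:
                changed.append(path)
                self.mtimes[path] = now
        return changed
```

A client agent would run such a watcher locally and forward only change events to the server agent, which is exactly the traffic reduction the agent-based approach aims for.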
The main aim of image compression is to reduce an image's size so that it can be transmitted and stored; therefore many methods have appeared to compress images, one of which is the Multilayer Perceptron (MLP), an artificial neural network based on the back-propagation algorithm. Since this algorithm depends on the number of neurons in the hidden layer alone, that will not be enough to reach the desired results, so the standards on which the compression process depends must also be taken into consideration to get the best results. In our research we trained a group of TIFF images of size (256*256) and compressed them by using MLP for each
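The compression idea behind an MLP codec is a bottleneck: the hidden layer is smaller than the input, so its activations are the compressed representation. The sketch below uses untrained random weights purely to show the data flow; in the paper the weights are learned with back-propagation, and the layer sizes here are illustrative assumptions.

```python
# Bottleneck-MLP sketch: a 64-16-64 network over a flattened 8x8 pixel
# block. The 16 hidden activations are the stored/transmitted "code".

import math, random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def layer(vec, weights):
    """Fully connected layer: one weight row per output neuron."""
    return [sigmoid(sum(w * v for w, v in zip(row, vec))) for row in weights]

def make_weights(n_out, n_in, rng):
    return [[rng.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_out)]

rng = random.Random(0)
block = [rng.random() for _ in range(64)]   # an 8x8 block, flattened
enc = make_weights(16, 64, rng)             # encoder: 64 -> 16 (4:1)
dec = make_weights(64, 16, rng)             # decoder: 16 -> 64

code = layer(block, enc)                    # compressed representation
recon = layer(code, dec)                    # reconstructed block
```

Training adjusts `enc` and `dec` so that `recon` approximates `block`; the compression ratio is set by the hidden-layer width, which is why the abstract stresses that neuron count alone does not determine quality.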
This study was planned with the aim of constructing models that can be used to forecast trip production in the Al-Karada region of Baghdad city, incorporating socioeconomic features through the use of various statistical approaches to trip-generation modeling, such as artificial neural networks (ANN) and multiple linear regression (MLR). The research region was split into 11 zones to accomplish the study aim. Forms were issued based on the needed sample size of 1,170. Only 1,050 forms with responses were received, giving a response rate of 89.74% for the research region. The collected data were processed using the ANN technique in MATLAB v20. The same database was utilized to
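The MLR side of such a study fits trips produced per zone as a linear function of socioeconomic predictors. The sketch below shows ordinary least squares via the normal equations; the predictors and data are made up for illustration and are not the study's fitted model.

```python
# Ordinary least squares for trip generation: y = b0 + b1*x1 + b2*x2 + ...
# Solves (X'X) b = X'y by Gaussian elimination; X includes a column of 1s.

def fit_mlr(X, y):
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    c = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for i in range(k):                      # elimination with pivoting
        p = max(range(i, k), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        c[i], c[p] = c[p], c[i]
        for r in range(i + 1, k):
            f = A[r][i] / A[i][i]
            for j in range(i, k):
                A[r][j] -= f * A[i][j]
            c[r] -= f * c[i]
    b = [0.0] * k                           # back substitution
    for i in range(k - 1, -1, -1):
        b[i] = (c[i] - sum(A[i][j] * b[j] for j in range(i + 1, k))) / A[i][i]
    return b
```

Model quality is then judged by R-squared and residual checks per zone, and compared against the ANN fitted on the same database.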
The Stochastic Network Calculus Methodology. Deah J. Kadhim, Saba Q. Jobbar, Wei Liu & Wenqing Cheng. Chapter in Computer and Information Science 2009, part of the Studies in Computational Intelligence book series (SCI, volume 208). Abstract: The stochastic network calculus is an evolving new methodology for backlog and delay analysis of networks that can account for statistical multiplexing gain. This paper advances the stochastic network calculus by deriving a network service curve, which expresses the service given to a flow by the network as a whole in terms of a probabilistic bound. The presented network service curve permits the calculation of statistical end-to-end delay and backlog bounds for broad
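As background, the deterministic min-plus identity that a stochastic network service curve generalizes can be sketched as follows (this is the standard network-calculus relation, not the paper's probabilistic bound):

```latex
% End-to-end service of H tandem nodes is the min-plus convolution
% of the per-node service curves S_1, ..., S_H:
S_{\mathrm{net}}(t) = (S_1 \otimes S_2 \otimes \cdots \otimes S_H)(t),
\qquad
(f \otimes g)(t) = \inf_{0 \le s \le t} \{\, f(s) + g(t - s) \,\}.
```

The stochastic version attaches a violation probability to such bounds, which is what makes statistical end-to-end delay and backlog bounds, and hence multiplexing gain, expressible.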
Information from 54 Magnetic Resonance Imaging (MRI) brain tumor images (27 benign and 27 malignant) was collected and subjected to the multilayer perceptron artificial neural network available in the well-known IBM SPSS 17 software (Statistical Package for the Social Sciences). After many attempts, automatic architecture selection was adopted in this research work. Thirteen shape and statistical characteristics of the images were considered. The neural network achieved 89.1% correct classification for the training sample and 100% correct classification for the test sample. The normalized importance of the considered characteristics showed that kurtosis accounted for 100%, which means that this variable has a substantial effect
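Since kurtosis is singled out as the dominant feature, a minimal definition helps: it is the fourth standardized moment of the pixel-intensity distribution. Whether SPSS applies bias corrections or subtracts 3 (excess kurtosis) is not stated in the abstract, so the plain moment-ratio estimator below is an assumption.

```python
# Sample kurtosis as the fourth standardized moment m4 / m2^2,
# computed over a flat list of pixel intensities.

def kurtosis(xs):
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n   # variance (biased)
    m4 = sum((x - mean) ** 4 for x in xs) / n   # fourth central moment
    return m4 / (m2 * m2)
```

Heavy-tailed intensity histograms (e.g. a few very bright voxels in a tumor region) drive this ratio up, which is a plausible reason the feature discriminates well.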
Malaysia is served by a high-speed fiber internet service called TM UniFi. TM UniFi has become very familiar as a medium for applying the Small Office Home Office (SOHO) concept due to the COVID-19 pandemic. Most communication vendors offer a variety of network services to fulfill customers' needs and satisfaction during the pandemic. Quality of Service is a common concern among users, given that the number of users increases from time to time. Therefore, it is crucial to know how network performance varies with the number of devices connected to the TM UniFi network. The main objective of this research is to analyze TM UniFi performance under the impact of multiple device connections or users' services. The study was conducted
Elliptic Curve Cryptography (ECC) is one of the public-key cryptosystems that works based on algebraic models in the form of elliptic curves. Usually, to implement encryption in ECC, the data must be encoded onto the elliptic curve, which is a preprocessing step. Similarly, after decryption, a post-processing step must be conducted to map or decode the corresponding data back to the exact point on the elliptic curve. Memory Mapping (MM) and Koblitz Encoding (KE) are the commonly used encoding models, but both have drawbacks: MM needs more memory for processing, and KE needs more computational resources. To overcome these issues the proposed enhanced Koblitz encodi
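For context, classic Koblitz encoding (the baseline being enhanced) maps an integer message m to a curve point by searching x = m*K + j until x^3 + ax + b is a quadratic residue mod p. The tiny curve parameters below are illustrative only and are not from the paper.

```python
# Classic Koblitz encoding sketch for y^2 = x^3 + ax + b over F_p,
# restricted to primes p = 3 (mod 4) so a square root is pow(r,(p+1)//4,p).

def koblitz_encode(m, p, a, b, K=20):
    for j in range(K):
        x = (m * K + j) % p
        rhs = (x * x * x + a * x + b) % p
        # Euler's criterion: rhs is a square mod p iff rhs^((p-1)/2) == 1
        if rhs == 0 or pow(rhs, (p - 1) // 2, p) == 1:
            y = pow(rhs, (p + 1) // 4, p)  # valid because p % 4 == 3
            return x, y
    raise ValueError("no encodable x found; increase K")

def koblitz_decode(x, K=20):
    return x // K  # drop the search offset j to recover m
```

The repeated exponentiations in the search loop are the "more computational resources" the abstract attributes to KE, and the motivation for an enhanced encoding.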
The present article delves into the examination of groundwater quality for drinking purposes in Baghdad City, based on the Water Quality Index (WQI). The data were collected from the Ministry of Water Resources of Baghdad and represent water samples drawn from 114 wells on the Al-Karkh and Al-Rusafa sides of Baghdad city. To determine the WQI, four water parameters were taken into consideration: (i) pH, (ii) chloride (Cl), (iii) sulfate (SO4), and (iv) total dissolved solids (TDS). According to the computed WQI, the distribution of the groundwater samples with respect to their quality classes, such as excellent, good, poor, very poor, and unfit for human drinking purposes, was found to be
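A common way to compute such an index is the weighted-arithmetic WQI; the standards, ideal values, and class thresholds in the sketch below are illustrative assumptions, not necessarily the exact values or formula used in the study.

```python
# Weighted-arithmetic WQI sketch: quality rating per parameter,
# weighted inversely to its permissible standard.

def wqi(measured, standard, ideal):
    """q_i = 100 * (C_i - I_i) / (S_i - I_i);  w_i proportional to 1/S_i."""
    k = 1.0 / sum(1.0 / s for s in standard.values())
    num = den = 0.0
    for name, c in measured.items():
        s, i = standard[name], ideal[name]
        q = 100.0 * (c - i) / (s - i)   # quality rating for this parameter
        w = k / s                        # unit weight
        num += w * q
        den += w
    return num / den

def classify(index):
    if index <= 50: return "excellent"
    if index <= 100: return "good"
    if index <= 200: return "poor"
    if index <= 300: return "very poor"
    return "unfit for drinking"
```

With the four parameters of the study (pH, Cl, SO4, TDS), each well's sample yields one index value, and the distribution of those values across the classes is what the article reports.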