Wireless communications have seen the fastest growth in the history of communications, driven by ever-evolving technologies that have allowed them to spread widely. Every day, communication technology introduces a new invention with features that differ from its predecessors. Bell Laboratories first proposed mobile wireless communication services for the general population in the late 1940s, but at that time large-scale use was impractical due to high costs. This paper aims to describe the state of cellular mobile networks by comparing the sources of electromagnetic pollution caused by these networks, measuring the power density level in some residential areas, and comparing the measurements with the international standards adopted for determining power density levels, calculated from the transmitted power and the transmission angle of the station antenna. The importance of the paper lies in the fact that it investigates the levels of non-ionizing radiation produced by cellular mobile networks and identifies other types of pollution caused by these networks.
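The power density at a distance d from a base-station antenna is commonly estimated with the far-field formula S = Pt·G / (4πd²), which can then be checked against the ICNIRP general-public reference level (f/200 W/m² for frequencies of 400–2000 MHz, so about 4.5 W/m² at 900 MHz). A minimal sketch, where the 20 W transmit power and 17 dBi gain are hypothetical values, not figures from the paper:

```python
import math

def power_density(p_tx_w: float, gain_linear: float, distance_m: float) -> float:
    """Far-field power density S = Pt * G / (4 * pi * d^2), in W/m^2."""
    return p_tx_w * gain_linear / (4 * math.pi * distance_m ** 2)

# Hypothetical GSM-900 sector: 20 W transmit power, 17 dBi antenna gain.
p_tx = 20.0
gain = 10 ** (17 / 10)  # convert dBi to a linear gain factor

for d in (10, 50, 100):  # distances in metres
    s = power_density(p_tx, gain, d)
    # ICNIRP general-public reference level at 900 MHz: 900/200 = 4.5 W/m^2
    verdict = "OK" if s < 4.5 else "EXCEEDED"
    print(f"d = {d:3d} m  S = {s:.4f} W/m^2  (limit 4.5 W/m^2: {verdict})")
```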
Image Fusion Using A Convolutional Neural Network
The traditional centralized network management approach presents severe efficiency and scalability limitations in large-scale networks. Data collection and analysis typically involve huge transfers of management data to the manager, which consume considerable network throughput and create bottlenecks at the manager side. These problems are addressed using agent technology to distribute the management functionality over the network elements. The proposed system consists of a server agent that works together with client agents to monitor the logging (on, off) of the client computers and which user is working on each. A file-system-watcher mechanism is used to detect any change in files. The results were presented.
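The abstract does not name the library behind its file-system-watcher mechanism; a minimal analogue of the idea, sketched with the third-party Python watchdog package (install with pip; the handler name and the ten-second watch window are illustrative choices):

```python
# Client-agent-style file monitoring: report every file event under a path.
import time
from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler

class ChangeLogger(FileSystemEventHandler):
    """Log every create/modify/delete/move event, as a client agent might."""
    def on_any_event(self, event):
        print(f"{event.event_type}: {event.src_path}")

observer = Observer()
observer.schedule(ChangeLogger(), path=".", recursive=True)
observer.start()
try:
    time.sleep(10)  # watch the current directory for ten seconds
finally:
    observer.stop()
    observer.join()
```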
Classification of network traffic is an important topic for network management, traffic routing, safe traffic discrimination, and better service delivery. Traffic examination is the entire process of examining traffic data, from intercepting it to discovering patterns, relationships, misconfigurations, and anomalies in a network. Traffic classification is a sub-domain of this field whose purpose is to classify network traffic into predefined classes such as usual or abnormal traffic and application type. Most Internet applications encrypt their traffic, and classifying encrypted traffic is not possible with traditional methods; statistical and intelligent methods, however, can still find and model traffic patterns.
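Statistical classification works because flow-level features such as packet sizes and inter-arrival times survive encryption. A sketch of the idea with synthetic flows and a random forest; the feature set, class labels, and data are assumptions for illustration, not the paper's dataset or model:

```python
# Classify flows as usual/abnormal from statistical features of the traffic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Per-flow features: mean packet size, std of packet size, mean inter-arrival time.
normal = rng.normal([800, 150, 0.05], [100, 30, 0.01], size=(500, 3))
abnormal = rng.normal([300, 400, 0.30], [80, 60, 0.05], size=(500, 3))
X = np.vstack([normal, abnormal])
y = np.array([0] * 500 + [1] * 500)  # 0 = usual, 1 = abnormal

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(f"hold-out accuracy: {clf.score(X_te, y_te):.3f}")
```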
The main aim of image compression is to reduce image size for transmission and storage, and many methods have appeared to compress images. One of these methods is the Multilayer Perceptron (MLP), an artificial neural network based on the back-propagation algorithm. If this algorithm depended on the number of neurons in the hidden layer alone, that would not be enough to reach the desired results; the criteria on which the compression process depends must also be taken into consideration to get the best results. In this research we trained a group of TIFF images of size 256×256 and compressed them using the MLP.
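The compression idea is a bottleneck network: image blocks are fed in, squeezed through a hidden layer smaller than the input, and reconstructed at the output, so each block is represented by the hidden-layer code. A minimal sketch on a synthetic 256×256 image with 8×8 blocks; the 16-neuron bottleneck and all other settings are illustrative, not the paper's configuration:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic 256x256 grayscale "image", split into 1024 flattened 8x8 blocks.
img = np.random.default_rng(1).random((256, 256))
blocks = img.reshape(32, 8, 32, 8).swapaxes(1, 2).reshape(-1, 64)

# Autoencoder: 64 inputs -> 16 hidden neurons (the compressed code) -> 64 outputs.
mlp = MLPRegressor(hidden_layer_sizes=(16,), max_iter=500, random_state=1)
mlp.fit(blocks, blocks)  # train the network to reproduce its own input

recon = mlp.predict(blocks)
mse = np.mean((blocks - recon) ** 2)
print(f"64 -> 16 values per block (4:1), reconstruction MSE = {mse:.5f}")
```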
This paper explores VANET topics: architecture, characteristics, security, routing protocols, applications, simulators, and 5G integration. We update, edit, and summarize some of the published data as we analyze each notion. For ease of comprehension and clarity, we present part of the data as tables and figures. This survey also raises issues for potential future research, such as how to integrate VANET with a 5G cellular network and how to use trust mechanisms to enhance security, scalability, effectiveness, and other VANET features and services. In short, this review may aid academics and developers in choosing the key VANET characteristics for their objectives in a single document.
The investigation of signature validation is crucial to the field of personal authentication. Biometrics-based systems have been developed to support some information security features. A person's signature, an essential biometric trait of a human being, can be used to verify their identity. This study suggests a mechanism for automatically verifying signatures, highlighting the offline properties of handwritten signatures and aiming to verify, using computer-based machine learning techniques, whether a handwritten signature is real or forged. The main goal of developing such systems is to verify people through the validity of their signatures. In this research, images of a group of handwritten signatures were used.
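An offline verifier of this kind typically extracts global features from the binarized signature image and feeds them to a genuine/forged classifier. A sketch under stated assumptions: the four features, the synthetic stand-in images, and the SVM are illustrative choices, not the paper's method:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

def features(binary_img: np.ndarray) -> np.ndarray:
    """Simple global descriptors of a binary (ink = 1) signature image."""
    ys, xs = np.nonzero(binary_img)
    density = binary_img.mean()                            # fraction of ink pixels
    aspect = (np.ptp(xs) + 1) / (np.ptp(ys) + 1)           # bounding-box aspect ratio
    cx = xs.mean() / binary_img.shape[1]                   # normalized ink centroid
    cy = ys.mean() / binary_img.shape[0]
    return np.array([density, aspect, cx, cy])

def make_signature(forged: int, rng) -> np.ndarray:
    """Synthetic stand-in image: 'forgeries' here simply use denser ink."""
    p = 0.20 if forged else 0.12
    return (rng.random((64, 128)) < p).astype(int)

rng = np.random.default_rng(2)
labels = [0] * 100 + [1] * 100                             # 0 = genuine, 1 = forged
X = np.array([features(make_signature(f, rng)) for f in labels])
y = np.array(labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=2, stratify=y)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
print(f"hold-out accuracy: {clf.score(X_te, y_te):.2f}")
```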
Keywords provide the reader with a summary of the contents of a document and play a significant role in information retrieval systems, especially in search engine optimization and bibliographic databases. Furthermore, keywords help to classify the document into its related topic. Keyword extraction has traditionally been manual, depending on the content of the document or article and the judgment of its author; manual extraction of keywords is costly, consumes effort and time, and is error-prone. In this research, an automatic Arabic keyword extraction model based on deep learning algorithms is proposed. The model consists of three main steps: preprocessing, feature extraction, and classification of the document.
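One common way to frame the extraction step is as a classifier over candidate words: preprocess the text, compute per-candidate features, and predict keyword/not-keyword. A toy sketch of that three-step pipeline; the English toy document, feature choices, and logistic-regression classifier are stand-ins, since the paper itself targets Arabic text with deep learning models:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def candidate_features(word: str, doc: str) -> list:
    """Per-candidate features: relative frequency, word length, first position."""
    tokens = doc.lower().split()  # trivial preprocessing step
    tf = tokens.count(word) / len(tokens)
    first = tokens.index(word) / len(tokens) if word in tokens else 1.0
    return [tf, len(word), first]

doc = ("keyword extraction assigns keyword labels to candidate words "
       "keyword models use features of each candidate")
candidates = ["keyword", "extraction", "candidate", "labels", "use", "of"]
X = np.array([candidate_features(w, doc) for w in candidates])
y = np.array([1, 1, 1, 0, 0, 0])  # 1 = keyword (toy labels for illustration)

clf = LogisticRegression().fit(X, y)
print(dict(zip(candidates, clf.predict(X))))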
In this research, the Artificial Neural Network (ANN) technique was applied to study the filtration process in water treatment. Eight models were developed and tested using data from a pilot filtration plant working under different process design criteria: influent turbidity, bed depth, grain size, filtration rate, and running time (length of the filtration run), with effluent turbidity and head losses recorded. The ANN models were constructed for the prediction of different performance criteria of the filtration process: effluent turbidity, head losses, and running time. The results indicate that it is quite possible to use artificial neural networks to predict effluent turbidity, head losses, and running time in the filtration process.
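Each such model maps the five process inputs to one performance output. A sketch of one model of this shape, predicting effluent turbidity; the synthetic input ranges and the assumed toy relation below merely stand in for the pilot-plant data, which is not reproduced here:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
n = 400
# Inputs: influent turbidity (NTU), bed depth (m), grain size (mm),
# filtration rate (m/h), running time (h) -- ranges are assumptions.
X = rng.uniform([5, 0.5, 0.4, 4, 0], [50, 1.5, 1.2, 12, 24], size=(n, 5))
# Assumed toy relation: effluent turbidity rises with hydraulic load and
# running time, and falls with bed depth, plus measurement noise.
y = 0.05 * X[:, 0] * X[:, 3] / X[:, 1] + 0.1 * X[:, 4] + rng.normal(0, 0.2, n)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10,),
                                   max_iter=2000, random_state=3))
model.fit(X[:300], y[:300])
print(f"R^2 on held-out runs: {model.score(X[300:], y[300:]):.3f}")
```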
Among the many problems that reduce network performance, especially in Wide Area Networks, is congestion, which occurs when traffic demand reaches or exceeds the available capacity of a route, resulting in blocking and lower throughput per unit time. Congestion management mechanisms try to handle such cases. The work presented in this paper deals with an important issue, Quality of Service (QoS) techniques. QoS is the combined effect on service level that determines the user's degree of satisfaction with the service. In this paper, packet schedulers (FIFO, WFQ, CQ, and PQ) were implemented and evaluated under different applications with different priorities. The results show that the WFQ scheduler gives acceptable results.
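The schedulers differ in the order they dequeue packets: FIFO serves strictly by arrival, PQ always drains the highest-priority queue first, and WFQ generalizes PQ by sharing capacity among queues according to weights. A toy sketch contrasting the first two orderings (the packet list and application names are illustrative):

```python
# Contrast FIFO and PQ dequeue order on the same arrival sequence.
import heapq
from collections import deque

packets = [("voice", 0), ("ftp", 2), ("voice", 0), ("web", 1), ("ftp", 2)]

# FIFO: one queue, arrival order preserved.
fifo = deque(packets)
fifo_order = [fifo.popleft()[0] for _ in range(len(packets))]

# PQ: lower number = higher priority; ties keep arrival order via a sequence no.
pq = [(prio, seq, app) for seq, (app, prio) in enumerate(packets)]
heapq.heapify(pq)
pq_order = [heapq.heappop(pq)[2] for _ in range(len(packets))]

print("FIFO order:", fifo_order)  # ['voice', 'ftp', 'voice', 'web', 'ftp']
print("PQ order:  ", pq_order)    # ['voice', 'voice', 'web', 'ftp', 'ftp']
```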