Recently, new concepts such as free data and Volunteered Geographic Information (VGI) have emerged from Web 2.0 technologies. OpenStreetMap (OSM) is one of the most representative projects of this trend. Geospatial data from different sources often have variable accuracy levels due to different data collection methods; the most pressing problem with OSM is therefore its unknown quality. This study develops a tool, written in the Matlab programming language, that analyzes and assesses the matching of OSM road features against a reference dataset. The tool was applied to two study areas in Iraq (Baghdad and Karbala) in order to verify whether the OSM data has the same quality in both areas. The program consists of three parts for assessing OSM data accuracy: data input, measurement and analysis, and output of results. The output of the Matlab program is represented as graphs showing the number of roads falling within successive intervals, such as every half meter or one meter for length differences and every half degree for direction differences. For both case studies, a large number of roads fell within the first interval, indicating that the differences between the compared datasets were small. The results showed that the Baghdad case study was more accurate than the holy Karbala case study.
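The interval-counting idea behind the graphs can be sketched in outline. The following Python sketch is illustrative only (the paper's tool is written in Matlab, and the polyline representation, interval width, and pre-matched road pairs here are assumptions): it bins the length differences of matched OSM/reference road pairs into fixed-width intervals.

```python
import math

def length(poly):
    """Planar length of a polyline given as [(x, y), ...] vertices."""
    return sum(math.dist(a, b) for a, b in zip(poly, poly[1:]))

def bearing_deg(poly):
    """Overall direction of a polyline, first to last vertex, in degrees."""
    (x0, y0), (x1, y1) = poly[0], poly[-1]
    return math.degrees(math.atan2(y1 - y0, x1 - x0)) % 360.0

def bin_differences(osm_roads, ref_roads, width):
    """Count matched road pairs per length-difference interval of the given width."""
    counts = {}
    for o, r in zip(osm_roads, ref_roads):
        d = abs(length(o) - length(r))
        k = int(d // width)          # interval index: [0, w), [w, 2w), ...
        counts[k] = counts.get(k, 0) + 1
    return counts
```

Direction differences can be binned the same way (e.g. half-degree intervals) using `bearing_deg` in place of `length`; a tall first bin then indicates small disagreement between the datasets.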
Information systems and data exchange between government institutions are growing rapidly around the world, and with them grow the threats to information within government departments. In recent years, research into the development and construction of secure information systems in government institutions has been very active. Based on information system principles, this study proposes a model for providing and evaluating security for all departments of government institutions. The requirements of any information system begin with the organization's surroundings and objectives. Most prior techniques did not take into account the organizational component on which the information system runs, despite the relevance of
The current research aims to measure the impact of the skills of human resources management professionals on information technology in Iraqi private banks. The skills of human resources management professionals constitute a modern trend in banks' interest in employees with outstanding performance, and the presence of information technology in banks is a prerequisite for dealing with the huge amount of data and converting it into information to support the decision-maker in a complex environment. The field research problem was a clear lack of interest in the skills of human resource management professionals and weak adoption of information technology, which reflected negatively on the compet
ABSTRACT
The study aimed to evaluate the information labels of some local pickle products and to estimate the sodium benzoate therein. 85 samples of locally made pickles were collected from Baghdad city markets, drawn randomly from five different areas of Baghdad (Al-Shula, Al-Bayaa, Al-Nahrawan, Al-Taji, and Abu Ghraib), and divided into groups P1, P2, P3, P4 and P5, respectively, according to those areas. The samples' information labels were examined and compared with the Iraqi standard specification for the information card of packaged and canned food, IQS 230. The results showed that 25.9% of the samples were devoid of the indication card informa
Neural cryptography deals with the problem of "key exchange" between two neural networks by using the mutual-learning concept. The two networks exchange their outputs (in bits), and the key between the two communicating parties is eventually represented in the final learned weights, at which point the two networks are said to be synchronized. The security of neural synchronization is put at risk if an attacker is capable of synchronizing with either of the two parties during the training process.
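Mutual learning of this kind is usually formulated with tree parity machines. The following Python sketch is a toy illustration under assumed parameters (K=3 hidden units, N=20 inputs per unit, weight bound L=3, and a plain Hebbian update), not the exact protocol of the work above: two networks see the same public random inputs, exchange only their output bits, and update only when those bits agree, until their weights coincide.

```python
import math
import random

random.seed(42)
K, N, L = 3, 20, 3  # hidden units, inputs per unit, weight bound (toy sizes, assumed)

def rand_weights():
    return [[random.randint(-L, L) for _ in range(N)] for _ in range(K)]

def output(w, x):
    """Tree parity machine: tau is the product of the hidden-unit signs."""
    sigma = [1 if sum(wi * xi for wi, xi in zip(w[k], x[k])) >= 0 else -1
             for k in range(K)]
    return sigma, math.prod(sigma)

def hebbian(w, x, sigma, tau):
    """Update only hidden units that agree with the common output; clip to [-L, L]."""
    for k in range(K):
        if sigma[k] == tau:
            for i in range(N):
                w[k][i] = max(-L, min(L, w[k][i] + x[k][i] * tau))

wA, wB = rand_weights(), rand_weights()
steps = 0
while wA != wB and steps < 100_000:
    # Both parties see the same public random input and exchange only tau.
    x = [[random.choice((-1, 1)) for _ in range(N)] for _ in range(K)]
    sA, tauA = output(wA, x)
    sB, tauB = output(wB, x)
    if tauA == tauB:  # learn only when the exchanged output bits agree
        hebbian(wA, x, sA, tauA)
        hebbian(wB, x, sB, tauB)
    steps += 1
# After synchronization the identical weight matrices can serve as the shared key.
```

The attack surface mentioned above follows directly: an eavesdropper running the same update rule on the public inputs and outputs is also pulled toward synchronization, only more slowly, since it cannot influence which steps are accepted.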
The Stochastic Network Calculus Methodology. Deah J. Kadhim, Saba Q. Jobbar, Wei Liu & Wenqing Cheng. Chapter in the Studies in Computational Intelligence book series (SCI, volume 208), 2009. Abstract: The stochastic network calculus is an evolving new methodology for backlog and delay analysis of networks that can account for statistical multiplexing gain. This paper advances the stochastic network calculus by deriving a network service curve, which expresses the service given to a flow by the network as a whole in terms of a probabilistic bound. The presented network service curve permits the calculation of statistical end-to-end delay and backlog bounds for broad
Due to the huge variety of 5G services, network slicing is a promising mechanism for dividing the physical network resources into multiple logical network slices according to the requirements of each user. A highly accurate and fast traffic classification algorithm is required to ensure better Quality of Service (QoS) and effective network slicing. Fine-grained resource allocation can be realized by Software Defined Networking (SDN) with centralized control of network resources. However, the relevant research activities have concentrated on deep learning systems, whose enormous computation and storage requirements on the SDN controller limit the speed and accuracy of the traffic classification mechanism. To fill thi
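As a minimal illustration of flow-based traffic classification for slice selection, the Python sketch below uses a nearest-neighbour vote over hand-picked flow features. The features, labels, and classifier are illustrative assumptions for exposition; they are neither the deep-learning systems the abstract critiques nor the method it proposes.

```python
import math

# Toy flow features: (mean packet size in bytes, mean inter-arrival time in ms).
# Labels follow the common 5G slice types; all values here are invented examples.
training = [
    ((1200.0, 20.0), "eMBB"),   # large packets, steady stream (video-like)
    ((1100.0, 15.0), "eMBB"),
    ((200.0, 1.0),   "URLLC"),  # small packets, very frequent (control-like)
    ((150.0, 2.0),   "URLLC"),
    ((100.0, 500.0), "mMTC"),   # small packets, sparse (sensor-like)
    ((120.0, 800.0), "mMTC"),
]

def classify(flow, k=3):
    """Majority vote among the k labelled flows nearest to the new flow."""
    ranked = sorted(training, key=lambda t: math.dist(t[0], flow))
    votes = [label for _, label in ranked[:k]]
    return max(set(votes), key=votes.count)
```

In an SDN setting, such a classifier would run at the controller on per-flow statistics and map each new flow to a slice; the paper's point is that this mapping must be both fast and accurate, which is where heavyweight models struggle.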
Abstract
In this paper, we present a method to analyze five types with fifteen wavelet families for eighteen different EMG signals. A comparison study is also given to show the performance of the various families after refining the results with a back-propagation neural network. This should help researchers with the first step of EMG analysis. Huge sets of results (more than 100 sets) are produced and then classified to be discussed before reaching the final conclusion.
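The wavelet-decomposition step can be illustrated with the simplest family. This Python sketch implements a plain multi-level Haar transform; the choice of Haar and the orthonormal normalization are assumptions for illustration (the paper evaluates fifteen families, presumably with a dedicated wavelet toolbox).

```python
import math

def haar_step(signal):
    """One level of the orthonormal Haar transform for an even-length signal.
    Returns (approximation, detail) coefficient lists."""
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
    return approx, detail

def haar_decompose(signal, levels):
    """Multi-level decomposition: repeatedly split the approximation band."""
    details = []
    approx = list(signal)
    for _ in range(levels):
        approx, d = haar_step(approx)
        details.append(d)
    return approx, details
```

The detail coefficients at each level would then serve as candidate features for a classifier such as the back-propagation network mentioned above; the orthonormal form preserves signal energy across levels, which makes band energies directly comparable.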
Specialized hardware implementations of Artificial Neural Networks (ANNs) can offer faster execution than general-purpose microprocessors by taking advantage of reusable modules, parallel processes and specialized computational components. Modern high-density Field Programmable Gate Arrays (FPGAs) offer the required flexibility and fast design-to-implementation time with the possibility of exploiting highly parallel computations like those required by ANNs in hardware. The bounded width of the data in FPGA ANNs adds an additional error to the result of the output. This paper derives the equations for the additional error that arises from the bounded width of the data and proposes a method to reduce the effect of the error to give
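The effect of bounded data width can be demonstrated numerically. In this Python sketch, a neuron's weights are rounded to a fixed-point grid with a given number of fractional bits; the rounding scheme, toy weights, and error bound shown are illustrative assumptions, not the equations derived in the paper.

```python
def quantize(x, frac_bits):
    """Round x to a fixed-point grid with frac_bits fractional bits."""
    step = 2.0 ** -frac_bits
    return round(x / step) * step

def neuron(weights, inputs, frac_bits=None):
    """Weighted sum; optionally quantize weights to emulate a bounded FPGA word."""
    if frac_bits is not None:
        weights = [quantize(w, frac_bits) for w in weights]
    return sum(w * x for w, x in zip(weights, inputs))

w = [0.333, -0.777, 0.123]
x = [1.0, 1.0, 1.0]
exact = neuron(w, x)
err8 = abs(neuron(w, x, frac_bits=8) - exact)  # 8 fractional bits
err4 = abs(neuron(w, x, frac_bits=4) - exact)  # 4 fractional bits
# Fewer fractional bits -> coarser grid -> larger error at the neuron output.
```

With round-to-nearest, each weight is off by at most half a grid step, so for unit inputs the output error of one neuron is bounded by n * 2^-(frac_bits+1) for n weights; accumulating such bounds across layers is the kind of analysis the derived equations would formalize.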