Recently, new concepts such as free data and Volunteered Geographic Information (VGI) have emerged from Web 2.0 technologies, and OpenStreetMap (OSM) is one of the most representative projects of this trend. Geospatial data from different sources often have variable accuracy levels because of differing collection methods; the main concern with OSM is therefore its unknown quality. This study develops a tool, written in the Matlab programming language, that analyzes and assesses how well OSM road features match a reference dataset. The tool was applied to two study areas in Iraq (Baghdad and Karbala) to verify whether the OSM data has the same quality in both. The program consists of three parts: input data, measurement and analysis, and output of results. The output is presented as graphs showing the number of roads falling within successive intervals, such as every half meter or one meter of length difference and every half degree of direction difference. For both case studies, the largest number of roads fell within the first interval, indicating that the differences between the compared datasets were small. The results also showed that the Baghdad case study was more accurate than that of holy Karbala.
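To make the interval-counting step concrete, the sketch below restates it in Python rather than the paper's Matlab; the function name, bin width, and sample values are illustrative assumptions, not taken from the study.

```python
# Hypothetical sketch of the histogram step described above: given length
# differences (in meters) between matched OSM and reference road segments,
# count how many roads fall in each half-meter interval. Names and data are
# illustrative only; the paper's actual tool was written in Matlab.
import numpy as np

def count_per_interval(differences, bin_width=0.5):
    """Return (bin_edges, counts) for |differences| grouped into fixed bins."""
    diffs = np.abs(np.asarray(differences, dtype=float))
    edges = np.arange(0.0, diffs.max() + bin_width, bin_width)
    counts, _ = np.histogram(diffs, bins=edges)
    return edges, counts

# Toy example: most differences land in the first interval, which the study
# reads as good positional agreement between the two datasets.
length_diffs = [0.2, 0.4, 0.1, 0.7, 1.3, 0.3, 0.05, 2.1]
edges, counts = count_per_interval(length_diffs)
for lo, hi, n in zip(edges[:-1], edges[1:], counts):
    print(f"{lo:.1f}-{hi:.1f} m: {n} roads")
```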
The traditional centralized approach to network management suffers severe efficiency and scalability limitations in large-scale networks. Data collection and analysis typically involve huge transfers of management data to the manager, which consume considerable network throughput and create bottlenecks on the manager side. Agent technology addresses these problems by distributing the management functionality over the network elements. The proposed system consists of a server agent working together with client agents to monitor the logging on and off of the client computers and which user is working on each machine; a file system watcher mechanism is used to detect any change in files. The results were presented…
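The abstract does not include the agents' code, so the following is only a minimal sketch of the file-watching idea, using a simple polling loop in Python; the real system is agent-based and likely event-driven, so treat this as an approximation under stated assumptions.

```python
# Minimal sketch (not the paper's implementation) of file-change detection:
# poll a directory and report created, modified, and deleted files by
# comparing modification-time snapshots. Path and interval are placeholders.
import os
import time

def snapshot(path):
    """Map each regular file in `path` to its last-modified time."""
    return {f: os.stat(os.path.join(path, f)).st_mtime
            for f in os.listdir(path)
            if os.path.isfile(os.path.join(path, f))}

def watch(path, interval=2.0):
    before = snapshot(path)
    while True:
        time.sleep(interval)
        after = snapshot(path)
        for f in after.keys() - before.keys():
            print("created:", f)
        for f in before.keys() - after.keys():
            print("deleted:", f)
        for f in before.keys() & after.keys():
            if before[f] != after[f]:
                print("modified:", f)
        before = after

# watch("/path/to/monitor")  # runs until interrupted
```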
This paper explores VANET topics: architecture, characteristics, security, routing protocols, applications, simulators, and 5G integration. We update, edit, and summarize previously published material as we analyze each notion, and for ease of comprehension and clarity we present part of the data as tables and figures. This survey also raises issues for potential future research, such as how to integrate VANETs with 5G cellular networks and how to use trust mechanisms to enhance security, scalability, effectiveness, and other VANET features and services. In short, this review may aid academics and developers in choosing the key VANET characteristics for their objectives in a single document.
The investigation of signature validation is crucial to the field of personal authentication. Biometrics-based systems have been developed to support information security features. A person's signature, an essential biometric trait of a human being, can be used to verify their identity. In this study, a mechanism for automatically verifying signatures is proposed. The study highlights the offline properties of handwritten signatures and aims to verify whether a handwritten signature is genuine or forged using computer-based machine learning techniques. The main goal of developing such systems is to verify people through the validity of their signatures. In this research, images of a group of…
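Since the classifier and features are not specified in the visible text, the sketch below assumes a generic offline pipeline: signature images flattened to feature vectors and separated into genuine/forged classes with a support-vector machine. The data is a synthetic placeholder, not the study's dataset.

```python
# Illustrative sketch only: the abstract does not name the classifier or
# features, so this assumes flattened grayscale signature images classified
# as genuine (1) or forged (0) with an SVM. All data below is synthetic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.random((200, 64 * 64))   # placeholder for preprocessed 64x64 images
y = rng.integers(0, 2, 200)      # placeholder genuine/forged labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
model.fit(X_tr, y_tr)
print("held-out accuracy:", model.score(X_te, y_te))
```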
The main aim of image compression is to reduce an image's size so that it can be transmitted and stored; many methods have therefore been developed, one of which is the Multilayer Perceptron (MLP), an artificial neural network based on the back-propagation algorithm. If the algorithm depended only on the number of neurons in the hidden layer, that alone would not be enough to reach the desired results, so the standards on which the compression process depends must also be taken into consideration to obtain the best results. In this research we trained a group of TIFF images of size 256*256 and compressed them using MLP for each…
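As a hedged illustration of the MLP-as-compressor idea: an autoencoder whose narrow hidden layer holds the compressed representation of each pixel block. The block size, hidden-layer width, and training data below are assumptions for demonstration, not the paper's settings.

```python
# Sketch of MLP image compression (settings are assumptions, not the paper's):
# a back-propagation network trained to reproduce its input, so the small
# hidden layer becomes the compressed code for each pixel block.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
blocks = rng.random((500, 64))  # stand-in for 8x8 pixel blocks of an image

# Hidden layer of 16 units -> roughly 4:1 compression of each 64-pixel block.
autoencoder = MLPRegressor(hidden_layer_sizes=(16,), activation="logistic",
                           max_iter=2000, random_state=0)
autoencoder.fit(blocks, blocks)   # train the network to reconstruct its input

reconstructed = autoencoder.predict(blocks)
mse = np.mean((blocks - reconstructed) ** 2)
print(f"reconstruction MSE: {mse:.5f}")
```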
Image Fusion Using A Convolutional Neural Network
In this research, an Artificial Neural Network (ANN) technique was applied to study the filtration process in water treatment. Eight models were developed and tested using data from a pilot filtration plant operating under different process design criteria: influent turbidity, bed depth, grain size, filtration rate, and running time (length of the filtration run), with effluent turbidity and head losses recorded. The ANN models were constructed to predict different performance criteria of the filtration process: effluent turbidity, head losses, and running time. The results indicate that it is quite possible to use artificial neural networks to predict effluent turbidity, head losses, and running time in the filtration process, with…
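As a hedged illustration of one such model, the sketch below maps the five stated inputs (influent turbidity, bed depth, grain size, filtration rate, running time) to a single output with a small feed-forward network; the architecture and the synthetic data are assumptions, since the paper's configuration is not given here.

```python
# Hedged sketch of an ANN filtration model: a small feed-forward regressor
# from the five process inputs to effluent turbidity. The network size and
# the toy data are assumptions, not the paper's actual setup.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# columns: influent turbidity, bed depth, grain size, filtration rate, run time
X = rng.random((300, 5))
y = 0.3 * X[:, 0] - 0.1 * X[:, 1] + 0.05 * rng.standard_normal(300)  # toy target

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10,), max_iter=3000,
                                   random_state=1))
model.fit(X, y)
print("training R^2:", round(model.score(X, y), 3))
```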
The selective information broadcasting service is one of the most important services in libraries and information centers: it is the link between the source and the beneficiary, and between related sources, since it connects similar sources through keywords and then sends them to beneficiaries, which reduces the time and effort beneficiaries spend obtaining sources or information. Applying this service is therefore important and gives a positive indicator of a library's progress toward integrating its services. From this standpoint, this research answers several questions, including: 1. What are the outlets through which beneficiaries (the research community) obtain information sources? 2. Wh…
This research sheds light on the nature of material misrepresentations and on the extent to which the quality of accounting information systems contributes to reducing them. On the theoretical side, a number of sources were relied upon in addressing the research problem and presenting the topic, while the practical side relied on a questionnaire: the research sample consisted of accountants and auditors, 50 forms were distributed and 50 received, and the data were analyzed and the hypotheses tested using the statistical program SPSS to show the relationship between the variables. The research reached a number of conclusions, the most important of which is t…
Structural changes in the business and financial environment, the spread of business, the diversity of transactions between economic organizations, and the expansion of commerce worldwide have made it clearly necessary for accounting to keep up with these variables, since accounting is a social science that affects and is affected by its surrounding environment through various economic, social, technical, legal, and other factors.
As a result of these variables, a new field of accounting emerged called Forensic Accounting, which draws on expertise from multiple disciplines that ultimately feeds into the accounting profession; Forensic Accounting covers a large area of disciplines, including strengthening…
Organizations use many techniques and methods to ensure that they succeed and adapt to rapid change in the internal and external environment through decision making, especially strategic decisions.
Strategic decisions are very important for organizational success because they anticipate the future and deal with uncertainty; in these circumstances, organizations need accurate and comprehensive information to make effective strategic decisions.
To achieve that purpose, an organization must own a successful Strategic Information System (SIS) and determine the critical success factors for this system, which can help workers focus on the important activities needed to develop it.