Recently, new concepts such as free data and Volunteered Geographic Information (VGI) have emerged on top of Web 2.0 technologies. OpenStreetMap (OSM) is one of the most representative projects of this trend. Geospatial data from different sources often has variable accuracy levels due to different data collection methods; therefore, the most concerning problem with OSM is its unknown quality. This study aims to develop a specific tool, written in the MATLAB programming language, that can analyze and assess how well OSM road features match a reference dataset. The tool was applied to two different study areas in Iraq (Baghdad and Karbala) in order to verify whether the OSM data has the same quality in both areas. The program consists of three parts for assessing OSM data accuracy: data input, measurement and analysis, and output of results. The output of the MATLAB program is represented as graphs. These graphs show the number of roads falling within successive intervals, such as every half meter or one meter for length differences and every half degree for direction differences. For both case studies, the comparison placed the largest number of roads in the first interval, which indicates that the differences between the compared datasets were small. The results showed that the Baghdad case study was more accurate than the Karbala case study.
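The interval-based comparison described above can be sketched as follows. This is a minimal illustration, assuming matched OSM/reference road pairs with known lengths and directions; the sample arrays and bin widths are hypothetical placeholders written in Python/NumPy, not the paper's MATLAB code.

```python
import numpy as np

# Hypothetical matched road pairs: lengths in metres, directions in degrees.
# Real input would come from the OSM and reference datasets described above.
osm_lengths = np.array([120.4, 85.1, 240.9, 60.2])
ref_lengths = np.array([120.0, 85.9, 241.3, 61.0])
osm_dirs = np.array([45.2, 90.1, 10.4, 179.6])
ref_dirs = np.array([45.0, 90.8, 10.1, 179.9])

# Absolute differences between the compared datasets.
len_diff = np.abs(osm_lengths - ref_lengths)
dir_diff = np.abs(osm_dirs - ref_dirs)

# Count roads falling in each interval: 0.5 m steps for length and
# 0.5 degree steps for direction, as in the abstract.
len_bins = np.arange(0, len_diff.max() + 0.5, 0.5)
dir_bins = np.arange(0, dir_diff.max() + 0.5, 0.5)
len_counts, _ = np.histogram(len_diff, bins=len_bins)
dir_counts, _ = np.histogram(dir_diff, bins=dir_bins)

# A large count in the first interval indicates small differences,
# i.e. good agreement between OSM and the reference data.
print("roads per 0.5 m length-difference interval:", len_counts)
print("roads per 0.5 deg direction-difference interval:", dir_counts)
```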
Sewer sediment deposition is an important issue, as it relates to several operational and environmental problems. It concerns municipalities because it affects the sewer system and contributes to sewer failure, which has catastrophic effects if it occurs in trunks or interceptors. Sewer rehabilitation is a costly process and is complex in terms of choosing both the rehabilitation method and the individual sewers to be rehabilitated. For such a complex process, inspection techniques assist in the decision-making process; however, they may add to the total expenditure of the project, as they require special tools and trained personnel. For developing countries, inspection costs can prevent rehabilitation from proceeding. In this study, the researchers propose …
The dynamic development of computer and software technology in recent years has been accompanied by the expansion and widespread implementation of artificial intelligence (AI) based methods in many aspects of human life. A prominent field where rapid progress has been observed is high-throughput biology, whose methods generate large amounts of data that need to be processed and analyzed. Therefore, AI methods are increasingly applied in the biomedical field, among others for RNA-protein binding site prediction, DNA sequence function prediction, protein-protein interaction prediction, and biomedical image classification. Stem cells are widely used in biomedical research, e.g., in leukemia and other disease studies. Our proposed approach of …
The turning process involves various factors that affect machinability and should be investigated: surface roughness, tool life, power consumption, cutting temperature, machining force components, tool wear, and chip thickness ratio. These factors make the process nonlinear and complicated. This work aims to build neural network models that correlate the cutting parameters, namely cutting speed, depth of cut, and feed rate, with the machining force and chip thickness ratio. The turning process was performed on high-strength aluminum alloy 7075-T6. Three radial basis neural networks were constructed for the cutting force, passive force, and feed force. In addition, a radial basis network was constructed to model the chip thickness ratio. …
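A radial basis network of the kind described can be sketched as below. This is a minimal Python/NumPy illustration with hypothetical training data and a single target (cutting force); it is not the paper's actual networks or measurements.

```python
import numpy as np

# Illustrative training data (NOT the paper's measurements):
# columns = cutting speed (m/min), depth of cut (mm), feed rate (mm/rev)
X = np.array([[100, 0.5, 0.10],
              [150, 1.0, 0.15],
              [200, 1.5, 0.20],
              [250, 2.0, 0.25],
              [300, 2.5, 0.30]], dtype=float)
# Target: main cutting force in N (hypothetical values).
y = np.array([180.0, 320.0, 470.0, 640.0, 810.0])

# Normalise inputs so a single spread works for all three parameters.
Xn = (X - X.mean(axis=0)) / X.std(axis=0)

# Radial basis network: every training point acts as a Gaussian centre.
centers = Xn
spread = 1.0

def rbf_activations(Xq):
    # Distance of each query point to each centre -> Gaussian response.
    d2 = ((Xq[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * spread ** 2))

# The output layer is linear; solve for its weights by least squares.
H = rbf_activations(Xn)
w, *_ = np.linalg.lstsq(H, y, rcond=None)

# Predict the force for a new cutting condition.
x_new = np.array([[175.0, 1.2, 0.18]])
x_new_n = (x_new - X.mean(axis=0)) / X.std(axis=0)
print("predicted cutting force [N]:", rbf_activations(x_new_n) @ w)
```

The same construction would be repeated for the passive force, feed force, and chip thickness ratio, each with its own output weights.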
The research aims to show the impact of the information provided by the supporting bodies, with its dimensions (information credibility, efficiency and effectiveness of information, cooperation with the tax administration, obligating the taxpayer, accuracy and completeness of information, and appropriate timing), on tax inventory, as well as to clarify the significant differences in the responses of the surveyed sample according to the personal variables (gender, educational attainment, scientific specialization, job title, and years of service). The descriptive analytical approach was adopted, and in light of it a questionnaire was designed as the main tool for collecting data from the sample of (80) …
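A test for significant differences in responses by a personal variable might look like the following sketch; the Likert-scale responses and the choice of an independent-samples t-test are assumptions made for illustration, not the study's actual data or exact procedure.

```python
import numpy as np
from scipy import stats

# Hypothetical Likert-scale responses (1-5) for one questionnaire dimension,
# split by a personal variable such as gender; not the study's actual data.
group_a = np.array([4, 5, 3, 4, 4, 5, 3, 4])
group_b = np.array([3, 4, 4, 3, 5, 4, 4, 3])

# Independent-samples t-test for a significant difference in mean response.
t_stat, p_value = stats.ttest_ind(group_a, group_b, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
print("significant at 0.05" if p_value < 0.05 else "no significant difference")
```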
Interaction in the city is reflected in the movement of people, motivated by their activities and their economic and social goals. This movement involves many variables that are subject to the planning process, which interprets the movement and maps trends in transport intensity through the concept of the transport function and its relationships with land uses, so as to sustain and make effective the movement of transport, economic activity, and the population. Transport planners are concerned with land-use requirements, which are linked to and included in the transport planning process as a factor in future transport needs. There is a strong relationship between the transport system …
This paper presents a study of a syndrome coding scheme for different binary linear error-correcting codes from families such as BCH, BKLC, Golay, and Hamming. The study is implemented on Wyner's wiretap channel model, where the main channel is error-free and the eavesdropper channel is a binary symmetric channel with crossover error probability (0 < Pe ≤ 0.5), to show the security performance of error-correcting codes, in terms of equivocation rate, when they are used in the single-stage syndrome coding scheme. Generally, these codes are not designed for secure information transmission, and they have low equivocation rates when used in the syndrome coding scheme. Therefore, to improve the transmission …
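A single-stage syndrome coding step of the kind studied can be sketched as follows. The choice of the (7,4) Hamming code and all values here are illustrative assumptions, not the paper's exact construction: the secret message is carried by the syndrome, and the transmitter sends a random vector from the corresponding coset.

```python
import numpy as np

rng = np.random.default_rng(0)

# Parity-check matrix of the (7,4) Hamming code in systematic form [P^T | I3]
# and the matching generator matrix [I4 | P]; any binary linear code would do.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def syndrome_encode(message_bits):
    # Map the 3-bit secret message to a coset of the code: pick a fixed
    # vector whose syndrome equals the message (easy because H ends with an
    # identity block) and add a uniformly random codeword for randomisation.
    x0 = np.concatenate([np.zeros(4, dtype=int), message_bits])
    c = rng.integers(0, 2, size=4) @ G % 2
    return (x0 + c) % 2

def syndrome_decode(received_bits):
    # The error-free main channel lets the legitimate receiver recover
    # the syndrome, and hence the message, exactly.
    return H @ received_bits % 2

message = np.array([1, 0, 1])
x = syndrome_encode(message)

# The eavesdropper observes x through a BSC with crossover probability Pe,
# which is what produces the equivocation measured in the paper.
Pe = 0.1
z = (x + (rng.random(7) < Pe).astype(int)) % 2

print("transmitted word :", x)
print("receiver decodes :", syndrome_decode(x))  # equals the message
print("eavesdropper sees:", z)
```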
This study aims to identify both the importance of using LinkedIn and its drawbacks for researchers and specialists in the field of information and knowledge technologies. The study relied mainly on the analytical statistical method, with a questionnaire as the data-collection tool, distributed electronically (Google Forms) to a sample community of (55) instructors. The feedback received shows that (46) of the instructors who completed the questionnaire subscribed to LinkedIn and the rest did not. Their data were analyzed statistically, and the general arithmetic mean and the hypothetical mean were computed in order to achieve the objectives of the study and test its hypotheses. The site positively …
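The comparison of the general arithmetic mean with the hypothetical mean might be sketched as follows; the responses and the one-sample t-test shown are assumptions made for illustration, not the study's data or exact analysis.

```python
import numpy as np
from scipy import stats

# Hypothetical 5-point Likert responses for one questionnaire item from the
# LinkedIn subscribers; not the study's actual data.
responses = np.array([4, 5, 3, 4, 4, 2, 5, 4, 3, 4])

# General arithmetic mean of the item versus the hypothetical mean
# (the scale midpoint, 3 on a 1-5 scale).
arithmetic_mean = responses.mean()
hypothetical_mean = 3.0

# One-sample t-test: is the observed mean significantly above the midpoint?
t_stat, p_value = stats.ttest_1samp(responses, hypothetical_mean)
print(f"arithmetic mean = {arithmetic_mean:.2f}, hypothetical mean = {hypothetical_mean}")
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
```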
The aim of this study was to identify the rate of return on stocks through the financial information disclosed in the financial statements of both service and insurance companies listed in the Iraqi market for securities. The study used descriptive statistical methods and a correlation matrix for the independent factors, in addition to a regression model, for data analysis and hypothesis testing. The model included a number of independent variables: company size (measured by sales or revenue), leverage, asset structure, the book value of owners' equity, and the general price index. Based on the data of (11) companies over three years, the results showed …
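The regression described can be sketched as an ordinary least-squares model of the rate of return on the listed independent variables; the randomly generated panel below is purely illustrative and stands in for the published financial-statement data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative panel of 33 observations (11 companies x 3 years); the real
# values would come from the published financial statements, not random data.
n = 33
size = rng.uniform(1e6, 5e7, n)         # sales or revenue
leverage = rng.uniform(0.1, 0.8, n)     # debt ratio
asset_structure = rng.uniform(0.2, 0.9, n)
book_value = rng.uniform(1e5, 1e7, n)   # book value of owners' equity
price_index = rng.uniform(90, 130, n)   # general price index
rate_of_return = rng.normal(0.08, 0.03, n)

# Multiple linear regression of the stock rate of return on the independent
# variables, estimated by ordinary least squares with an intercept.
X = np.column_stack([np.ones(n), size, leverage, asset_structure,
                     book_value, price_index])
beta, *_ = np.linalg.lstsq(X, rate_of_return, rcond=None)

names = ["intercept", "size", "leverage", "asset structure",
         "book value", "price index"]
for name, b in zip(names, beta):
    print(f"{name:>16}: {b: .3e}")
```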
This research aims to identify the level of information awareness among fourth-year chemistry students at the Ibn Al-Haytham College of Education for Pure Sciences, University of Baghdad. The research sample consisted of (107) male and female students out of a total of (153) students enrolled during the (2017-2018) academic year; the sample thus represents about 70% of the total students. The research methodology consisted of two parts. The first part measured information awareness using a multiple-choice test covering (40) issues, in which students were required to choose from among (5) alternative answers for each issue. The objectives of the test and the issues used are to measure the …