The General Directorate of Surveying is considered one of the most important sources of maps in Iraq. Over the last six years it has produced digital maps covering the whole of Iraq. These maps are compiled from different data sources of unknown accuracy; therefore, their quality needs to be assessed. The main aim of this study is to evaluate the positional accuracy of the digital maps produced by the General Directorate of Surveying. Two study areas were selected, AL-Rusafa and AL-Karkh in Baghdad, Iraq, with areas of 172.826 and 135.106 square kilometres, respectively. Statistical analyses were conducted to calculate the elements of positional accuracy assessment (mean µ, root mean square error RMSE, and minimum and maximum errors). According to the results obtained, the maps of the General Directorate of Surveying can be used for reconnaissance or for work that requires low or specified positional accuracy (e.g. ±5 m), but they cannot be used for applications that need high accuracy (e.g. precise surveying).
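The accuracy elements named in the abstract (mean, RMSE, minimum and maximum error) can be sketched as a small helper. The residual values below are hypothetical illustrations, not the study's data:

```python
import math

def accuracy_stats(errors):
    """Compute mean, RMSE, minimum and maximum of positional errors (metres)."""
    n = len(errors)
    mean = sum(errors) / n
    # RMSE is the square root of the mean of squared residuals
    rmse = math.sqrt(sum(e * e for e in errors) / n)
    return {"mean": mean, "rmse": rmse, "min": min(errors), "max": max(errors)}

# Hypothetical residuals between map coordinates and reference coordinates:
stats = accuracy_stats([1.2, -0.8, 2.5, -3.1, 0.4])
```

In a real assessment the residuals would be the differences between map coordinates and higher-accuracy check points (e.g. GNSS observations), computed separately for easting and northing.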
The Quality Function Deployment (QFD) tool is an important tool of total quality management because it links two key elements, the customer and the production process, through an advanced House of Quality, which helps provide more detail about improving the product before a vision of its future improvement is formed. A survey of retailers identified five competing products (Alwazeer, Altouri, Ferry, Jif, Dina) for the product (Zahi), from which the two main competitors (Alwazeer, Altouri) were selected. A House of Quality for the product (Zahi) was then developed using a Kano model to classify the customer requirements for the …
Data generated from modern applications and the internet in healthcare is extensive and rapidly expanding. Therefore, one of the significant success factors for any application is understanding and extracting meaningful information using digital analytics tools. These tools positively impact the application's performance and handle the challenges faced in creating highly consistent, logical, and information-rich summaries. This paper has three main objectives. First, it presents several analytics methodologies that help to analyse datasets and extract useful information from them, as preprocessing steps in any classification model, to determine the dataset characteristics. It also provides a comparative st…
Audit evidence represents the reconciliation tool between the financial data shown in the financial statements and the auditor's level of satisfaction with those statements. Accordingly, the auditor tries to obtain the greatest quantity of such evidence, and the most satisfactory of it, but this is sometimes difficult when the internal control system is weak, or when the auditor holds evidence that is persuasive but not conclusive. This research therefore examines the relation between the quantity of evidence and the level of satisfaction, and the argument that such evidence supports. It assumes that obtaining sufficient evidence reduces errors, improves the audit process, and avoids risk. The research …
The research aims to apply a modified SERVQUAL model to evaluate the quality of educational services via exploratory research on students of the College of Administration and Economics, Department of Business Administration, evening studies, at the University of Baghdad. A two-part questionnaire was distributed to a sample of (72) students out of (720) 2nd-, 3rd- and 4th-year students at the beginning of the second semester of the 2008-2009 academic year, to measure expectations of and perceptions about the quality of the educational services. Five major dimensions were analysed to identify the gaps across (22) variables. The study concluded that there were (13) variables that confirmed the …
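The SERVQUAL gap analysis described here reduces to a simple per-item computation: gap = perception score − expectation score. A minimal sketch, using hypothetical Likert-scale means rather than the study's data:

```python
def servqual_gaps(expectations, perceptions):
    """Per-item SERVQUAL gap scores: perception minus expectation.
    A negative gap means the service falls short of expectations."""
    return [p - e for e, p in zip(expectations, perceptions)]

# Hypothetical 5-point Likert means for three of the 22 items:
gaps = servqual_gaps([4.5, 4.2, 4.8], [3.9, 4.3, 4.1])
```

In the full model, item gaps are then averaged within each of the five dimensions (tangibles, reliability, responsiveness, assurance, empathy) to rank where service quality lags.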
Big data analysis has important applications in many areas, such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms such a…
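Entropy discretization, one of the two algorithms named above, chooses cut points on a numeric attribute that minimise the class entropy of the resulting bins. A minimal single-split sketch (the paper's multi-resolution variant is more elaborate; the toy data are illustrative):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_split(values, labels):
    """Find the cut point that minimises the size-weighted entropy of the
    two resulting bins (one step of ID3/MDLP-style discretization)."""
    pairs = sorted(zip(values, labels))
    n = len(pairs)
    best = (None, float("inf"))
    for i in range(1, n):
        left = [l for _, l in pairs[:i]]
        right = [l for _, l in pairs[i:]]
        w = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
        cut = (pairs[i - 1][0] + pairs[i][0]) / 2  # midpoint between neighbours
        if w < best[1]:
            best = (cut, w)
    return best

cut, score = best_split([1, 2, 3, 10, 11, 12], [0, 0, 0, 1, 1, 1])
```

Applied recursively with a stopping criterion, this yields the discretization intervals a Bayes classifier can then use as categorical bins.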
In regression testing, test case prioritization (TCP) is a technique for ordering the available test cases. TCP techniques can improve fault detection performance, which is measured by the average percentage of faults detected (APFD). History-based TCP is one of the TCP techniques that considers past execution data to prioritize test cases. The allocation of equal priority to multiple test cases is a common problem for most TCP techniques; however, this problem has not been explored in history-based TCP techniques. To resolve such ties, most researchers resort to random ordering of the test cases. This study aims to investigate equal priority in history-based TCP techniques. The first objective is to implement …
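The APFD metric mentioned above rewards orderings that reveal faults early: APFD = 1 − (TF₁ + … + TFₘ)/(n·m) + 1/(2n), where TFᵢ is the position of the first test exposing fault i, n the number of tests and m the number of faults. A small sketch with a hypothetical fault matrix:

```python
def apfd(order, faults_detected, num_faults):
    """Average Percentage of Faults Detected for a prioritised test order.
    faults_detected maps each test id to the set of faults it reveals."""
    n = len(order)
    first_pos = {}
    for pos, test in enumerate(order, start=1):
        for fault in faults_detected.get(test, set()):
            first_pos.setdefault(fault, pos)  # keep earliest detecting position
    total = sum(first_pos.values())
    return 1 - total / (n * num_faults) + 1 / (2 * n)

# Hypothetical matrix: 4 tests, 3 faults
matrix = {"t1": {1}, "t2": {2, 3}, "t3": set(), "t4": {1, 3}}
score = apfd(["t2", "t1", "t3", "t4"], matrix, 3)
```

Comparing APFD across orderings is how ties broken randomly versus by a history-based rule can be evaluated.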
Classification of imbalanced data is an important issue. Many algorithms have been developed for classification, such as Back Propagation (BP) neural networks, decision trees, Bayesian networks, etc., and have been used repeatedly in many fields. These algorithms suffer from the problem of imbalanced data, where some classes have many more instances than others. Imbalanced data result in poor performance and a bias towards one class at the expense of the others. In this paper, we propose three techniques based on over-sampling (O.S.) for processing an imbalanced dataset, redistributing it, and converting it into a balanced dataset. These techniques are Improved Synthetic Minority Over-sampling Technique (Improved SMOTE), Border…
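The core idea behind SMOTE-style over-sampling is to synthesise new minority points by interpolating between a minority sample and one of its k nearest minority neighbours. A simplified sketch of that idea (not the paper's Improved SMOTE; data and parameters are illustrative):

```python
import random

def smote_like(minority, k, n_new, seed=0):
    """Generate synthetic minority samples by interpolating between a point
    and a random one of its k nearest minority-class neighbours."""
    rng = random.Random(seed)

    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    synthetic = []
    for _ in range(n_new):
        p = rng.choice(minority)
        neighbours = sorted((q for q in minority if q is not p),
                            key=lambda q: dist(p, q))[:k]
        q = rng.choice(neighbours)
        gap = rng.random()  # random point on the segment p -> q
        synthetic.append(tuple(x + gap * (y - x) for x, y in zip(p, q)))
    return synthetic

pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
new = smote_like(pts, k=2, n_new=5)
```

Variants such as Borderline-SMOTE restrict interpolation to minority points near the class boundary, where misclassification risk is highest.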
Computer-aided diagnosis (CAD) has proved over the years to be an effective and accurate method of diagnostic prediction. This article focuses on the development of an automated CAD system intended to perform diagnosis as accurately as possible. Deep learning methods have produced impressive results on medical image datasets. This study employs deep learning methods in conjunction with meta-heuristic algorithms and supervised machine-learning algorithms to perform an accurate diagnosis. Pre-trained convolutional neural networks (CNNs) or auto-encoders are used for feature extraction, whereas feature selection is performed using an ant colony optimization (ACO) algorithm. Ant colony optimization helps to search for the bes…
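ACO-based feature selection lets artificial ants build candidate feature subsets guided by pheromone levels, which are reinforced when a subset scores well. A deliberately simplified pheromone-based subset search, not the paper's exact algorithm; `score_fn`, `subset_size` and the toy objective are assumptions for illustration:

```python
import random

def aco_feature_selection(score_fn, n_features, n_ants=10, n_iters=20,
                          subset_size=3, rho=0.1, seed=0):
    """Minimal ant-colony search over fixed-size feature subsets.
    score_fn rates a subset of feature indices; higher is better."""
    rng = random.Random(seed)
    pheromone = [1.0] * n_features
    best_subset, best_score = None, float("-inf")
    for _ in range(n_iters):
        for _ in range(n_ants):
            subset, candidates = [], list(range(n_features))
            while len(subset) < subset_size:
                # roulette-wheel pick proportional to pheromone
                total = sum(pheromone[f] for f in candidates)
                r, acc = rng.random() * total, 0.0
                for f in candidates:
                    acc += pheromone[f]
                    if acc >= r:
                        subset.append(f)
                        candidates.remove(f)
                        break
            s = score_fn(subset)
            if s > best_score:
                best_subset, best_score = subset, s
            for f in subset:          # deposit pheromone proportional to quality
                pheromone[f] += s
        pheromone = [(1 - rho) * p for p in pheromone]  # evaporation
    return sorted(best_subset), best_score

# Toy objective: pretend features 0, 2 and 4 are the informative ones.
target = {0, 2, 4}
best, val = aco_feature_selection(lambda sub: len(target & set(sub)), n_features=8)
```

In the CAD pipeline described above, `score_fn` would instead be the validation accuracy of a classifier trained on the CNN-extracted features indexed by the subset.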
The accumulation of sediment in reservoirs poses a major challenge that impacts storage capacity, water quality, and the efficiency of hydroelectric power generation. Geospatial methods, including Geographic Information Systems (GIS) and Remote Sensing (RS), were used to assess sediment quantities in Dukan Reservoir. Satellite and reservoir water-level data from 2010 to 2022 were used for the sedimentation assessment. The satellite data were used to analyse the water-spread area, employing the Normalized Difference Water Index (NDWI) and the Modified Normalized Difference Water Index (MNDWI) to enhance the water surface in the satellite imagery of Dukan Reservoir. The cone formula was employed to calculate the live storag…
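The indices and the cone (prismoidal) formula named above have standard forms: NDWI = (Green − NIR)/(Green + NIR), MNDWI = (Green − SWIR)/(Green + SWIR), and the volume between two level surfaces is V = (Δh/3)(A₁ + A₂ + √(A₁A₂)). A sketch with hypothetical band reflectances and areas, not the study's measurements:

```python
def ndwi(green, nir):
    """Normalized Difference Water Index (McFeeters)."""
    return (green - nir) / (green + nir)

def mndwi(green, swir):
    """Modified NDWI (Xu), using a shortwave-infrared band."""
    return (green - swir) / (green + swir)

def cone_volume(area1, area2, dh):
    """Prismoidal ('cone') formula: volume between two water-spread
    surfaces of areas area1 and area2 separated by elevation difference dh."""
    return dh / 3.0 * (area1 + area2 + (area1 * area2) ** 0.5)

# Hypothetical: water-spread areas (km^2) at levels 2 m (0.002 km) apart
v = cone_volume(170.0, 150.0, 0.002)  # -> volume in km^3
```

Summing the inter-level volumes over the full range of observed water levels gives the live storage; comparing it across years yields the sediment accumulation.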