The substantial advantages offered by the 3D modelling industry have pushed practitioners to improve capture techniques and processing pipelines towards minimising labour requirements, saving time and reducing project risk. In digital 3D documentation and conservation projects, laser scanning and photogrammetry are routinely compared in order to choose between the two. Since both techniques have pros and cons, this paper examines the potential issues of each technique in terms of time, budget, accuracy, density, methodology and ease of use. A terrestrial laser scanner and close-range photogrammetry are tested to document a unique, invaluable artefact (the Lady of Hatra) located in Iraq for future data fusion.
A database is an organised, distributed collection of data that allows users to access stored information simply and conveniently. In the era of big data, however, traditional data analytics methods may not be able to manage and process such large volumes of data. To develop an efficient way of handling big data, this work studies the use of the MapReduce technique to process big data distributed on the cloud. The approach was evaluated on a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r…
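The abstract does not include an implementation, but the MapReduce pattern it describes can be sketched briefly. The per-channel EEG averaging below is a hypothetical workload, and the local driver merely emulates Hadoop's shuffle phase so the example is self-contained; on a real cluster the map and reduce functions would run as separate Hadoop Streaming tasks.

```python
# Minimal MapReduce sketch for per-channel EEG averaging (hypothetical workload).
# On Hadoop this would run as two Streaming scripts; here a local driver
# emulates the shuffle/sort phase so the example is self-contained.
from collections import defaultdict

def map_fn(record):
    # record: "channel_id,sample_value" -- assumed input format
    channel, value = record.split(",")
    yield channel, float(value)

def reduce_fn(channel, values):
    # One reducer call per key after the shuffle phase
    yield channel, sum(values) / len(values)

def run_local(records):
    groups = defaultdict(list)
    for rec in records:                 # map phase
        for key, val in map_fn(rec):
            groups[key].append(val)     # emulated shuffle/sort
    out = {}
    for key, vals in groups.items():    # reduce phase
        for k, v in reduce_fn(key, vals):
            out[k] = v
    return out

if __name__ == "__main__":
    data = ["ch1,4.2", "ch1,3.8", "ch2,1.0", "ch2,3.0"]
    print(run_local(data))              # {'ch1': 4.0, 'ch2': 2.0}
```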
The deployment of UAVs is one of the key challenges in UAV-based communications for IoT applications. This article presents a new scheme for energy-efficient data collection with a deadline for the Internet of Things (IoT) using unmanned aerial vehicles (UAVs). We provide a data collection method that deploys and moves multiple UAVs efficiently to collect data from ground IoT devices within a given deadline. In the proposed method, data collection is performed with minimum energy consumption for both the IoT nodes and the UAVs. To find an optimal solution to this problem, we first provide a mixed-integer linear programming model.
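The abstract's actual MILP formulation is not reproduced, but a toy model in the same spirit can be sketched with PuLP: binary variables assign UAVs to IoT nodes, the objective minimises total collection energy, and a per-UAV constraint caps tour time at the deadline. All coefficients below are hypothetical.

```python
# Toy MILP in the spirit of the abstract: assign UAVs to IoT nodes so that
# every node is visited, total energy is minimised, and each UAV's tour time
# stays within the deadline. All numbers are hypothetical.
import pulp

uavs, nodes = range(2), range(4)
energy = {(u, n): 1.0 + 0.5 * u + 0.3 * n for u in uavs for n in nodes}
time_cost = {(u, n): 2.0 + 0.2 * n for u in uavs for n in nodes}
DEADLINE = 6.0

prob = pulp.LpProblem("uav_data_collection", pulp.LpMinimize)
x = pulp.LpVariable.dicts("x", (uavs, nodes), cat=pulp.LpBinary)

# Objective: total energy spent collecting data
prob += pulp.lpSum(energy[u, n] * x[u][n] for u in uavs for n in nodes)

# Each IoT node is served by exactly one UAV
for n in nodes:
    prob += pulp.lpSum(x[u][n] for u in uavs) == 1

# Each UAV finishes its collection within the deadline
for u in uavs:
    prob += pulp.lpSum(time_cost[u, n] * x[u][n] for n in nodes) <= DEADLINE

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for u in uavs:
    print(f"UAV {u} serves:", [n for n in nodes if x[u][n].value() == 1])
```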
The Tor (The Onion Router) network was designed to enable users to browse the Internet anonymously. It is known for the anonymity and privacy protection it offers against agents who wish to observe users' locations or trace their browsing habits. This anonymity stems from the layered encryption and decryption of Tor traffic: a client's traffic is encrypted and decrypted at every relay before being sent and received, which introduces delay and can even interrupt the data flow. The exchange of cryptographic keys between network devices plays a pivotal and critical role in facilitating secure communication and ensuring the integrity of cryptographic procedures. This essential process is time-consuming, which causes delays.
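The layered encryption the abstract refers to can be illustrated compactly. Note this is an analogy rather than Tor's actual protocol: Tor negotiates per-circuit keys and uses AES in counter mode over fixed-size cells, whereas the sketch below substitutes Fernet purely to stay short and runnable.

```python
# Illustration of layered ("onion") encryption/decryption. Tor itself uses
# AES in counter mode over fixed-size cells with keys negotiated per circuit;
# Fernet is substituted here purely to keep the sketch short and runnable.
from cryptography.fernet import Fernet

hop_keys = [Fernet.generate_key() for _ in range(3)]  # guard, middle, exit

def wrap(payload: bytes, keys) -> bytes:
    # Client encrypts innermost-first, so the first hop peels the outer layer
    for key in reversed(keys):
        payload = Fernet(key).encrypt(payload)
    return payload

def route(cell: bytes, keys) -> bytes:
    # Each relay removes exactly one layer -- this repeated per-hop
    # cryptographic work is the source of delay the abstract identifies
    for key in keys:
        cell = Fernet(key).decrypt(cell)
    return cell

cell = wrap(b"GET / HTTP/1.1", hop_keys)
assert route(cell, hop_keys) == b"GET / HTTP/1.1"
```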
The transmission and reception of data consume the most resources in Wireless Sensor Networks (WSNs). The energy supplied by the battery is the most important resource affecting a WSN's lifespan at the sensor node. Because sensor nodes run on limited batteries, energy saving is necessary. Data aggregation is a procedure for eliminating redundant transmissions; it delivers fused information to the base stations, which in turn improves energy efficiency and extends the lifespan of energy-constrained WSNs. In this paper, a Perceptually Important Points based Data Aggregation (PIP-DA) method for wireless sensor networks is proposed to reduce redundant data before sending them to the base station.
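The PIP-DA internals are not given in the abstract, but the standard PIP selection it builds on is easy to sketch: starting from the two endpoints of a window of readings, the sample farthest from the chord joining its neighbouring PIPs is promoted repeatedly until k points remain. The sensor readings below are hypothetical, and vertical distance is used as the salience measure (other PIP variants use perpendicular or Euclidean distance).

```python
# Sketch of Perceptually Important Points (PIP) selection: starting from the
# two endpoints, repeatedly promote the sample farthest (vertical distance)
# from the chord joining its neighbouring PIPs. Readings are hypothetical.
def vertical_distance(p, a, b):
    (xa, ya), (xb, yb), (xp, yp) = a, b, p
    # y-value of the chord a-b at xp, versus the actual reading
    y_line = ya + (yb - ya) * (xp - xa) / (xb - xa)
    return abs(yp - y_line)

def select_pips(series, k):
    pts = list(enumerate(series))
    pips = [pts[0], pts[-1]]
    while len(pips) < k:
        best, best_d = None, -1.0
        for i in range(len(pips) - 1):
            a, b = pips[i], pips[i + 1]
            for p in pts[a[0] + 1 : b[0]]:      # candidates between two PIPs
                d = vertical_distance(p, a, b)
                if d > best_d:
                    best, best_d = p, d
        # keep the PIP list sorted by sample index
        pips.insert(next(i for i, q in enumerate(pips) if q[0] > best[0]), best)
    return pips

readings = [20.1, 20.3, 25.7, 20.2, 20.0, 19.8, 24.9, 20.1]
print(select_pips(readings, 4))   # endpoints plus the most salient interior points
```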
Association rule mining (ARM) is a fundamental and widely used data mining technique for extracting useful information from data. Traditional ARM algorithms degrade computational efficiency by mining too many association rules that are not appropriate for a given user. Recent ARM research investigates metaheuristic algorithms that search for only a subset of high-quality rules. In this paper, a modified discrete cuckoo search algorithm for association rule mining (DCS-ARM) is proposed for this purpose. The effectiveness of our algorithm is tested against a set of well-known transactional databases. Results indicate that the proposed algorithm outperforms the existing metaheuristic methods.
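The paper's exact DCS-ARM operators are not reproduced in the abstract; the skeleton below shows the generic shape of a discrete cuckoo search over binary rule encodings, with a random bit-flip standing in for the discretised Lévy flight and the usual support × confidence product as fitness. The transaction set is a toy example.

```python
# Generic discrete cuckoo search skeleton over binary rule encodings.
# A random bit-flip stands in for the discretised Levy flight, and fitness
# is the usual support * confidence score over a toy transaction set.
import random

TRANSACTIONS = [{"a", "b", "c"}, {"a", "b"}, {"b", "c"}, {"a", "c"}, {"a", "b", "c"}]
ITEMS = sorted({i for t in TRANSACTIONS for i in t})

def fitness(rule):
    # rule: (antecedent_mask, consequent_mask) over ITEMS, must be disjoint
    ant = {i for i, bit in zip(ITEMS, rule[0]) if bit}
    con = {i for i, bit in zip(ITEMS, rule[1]) if bit}
    if not ant or not con or ant & con:
        return 0.0
    n_ant = sum(ant <= t for t in TRANSACTIONS)
    n_both = sum(ant | con <= t for t in TRANSACTIONS)
    if n_ant == 0:
        return 0.0
    return (n_both / len(TRANSACTIONS)) * (n_both / n_ant)

def flip(rule):
    # Stand-in for a discretised Levy flight: flip one random bit
    side, pos = random.randrange(2), random.randrange(len(ITEMS))
    new = [list(rule[0]), list(rule[1])]
    new[side][pos] ^= 1
    return tuple(map(tuple, new))

def random_rule():
    return tuple(tuple(random.randint(0, 1) for _ in ITEMS) for _ in range(2))

def cuckoo_search(n_nests=10, iters=200, pa=0.25):
    nests = [random_rule() for _ in range(n_nests)]
    for _ in range(iters):
        cuckoo = flip(nests[random.randrange(n_nests)])  # new candidate
        j = random.randrange(n_nests)
        if fitness(cuckoo) > fitness(nests[j]):          # replace a worse nest
            nests[j] = cuckoo
        for k in range(n_nests):                         # abandon a fraction pa
            if random.random() < pa:
                nests[k] = max(nests[k], random_rule(), key=fitness)
    return max(nests, key=fitness)

best = cuckoo_search()
print(best, fitness(best))
```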
Bootstrap is an important resampling technique that has received researchers' attention in recent years. The presence of outliers in the original data set may cause serious problems for the classical bootstrap when the percentage of outliers in a resample is higher than in the original sample. Many methods have been proposed to overcome this problem, such as Dynamic Robust Bootstrap for LTS (DRBLTS) and Weighted Bootstrap with Probability (WBP). This paper examines the accuracy of parameter estimation by comparing the results of both methods. The bias, MSE and RMSE are considered. The accuracy criterion is based on the RMSE value, since the method that provides a smaller RMSE value than the other is considered more accurate.
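The internals of DRBLTS and WBP are not shown in the abstract, but the comparison criterion itself is straightforward to sketch: bootstrap an estimator, then score it by bias, MSE and RMSE against the known true value. Below, a plain bootstrap of the sample mean versus the (more robust) median stands in for the two methods; the data, outliers and true mean are all hypothetical.

```python
# Sketch of the accuracy criterion: resample, estimate, then compare
# estimators by bias, MSE and RMSE. A plain bootstrap of the mean vs. the
# median stands in for DRBLTS/WBP, whose internals are not shown here.
import numpy as np

rng = np.random.default_rng(0)
true_mu = 5.0
sample = rng.normal(true_mu, 2.0, size=50)
sample[:3] = 40.0                      # inject a few outliers

def bootstrap_estimates(data, stat, B=2000):
    n = len(data)
    idx = rng.integers(0, n, size=(B, n))   # B bootstrap resamples
    return np.array([stat(data[row]) for row in idx])

def accuracy(estimates, truth):
    bias = estimates.mean() - truth
    mse = np.mean((estimates - truth) ** 2)
    return bias, mse, np.sqrt(mse)

for name, stat in [("mean", np.mean), ("median (robust)", np.median)]:
    b, m, r = accuracy(bootstrap_estimates(sample, stat), true_mu)
    print(f"{name:16s} bias={b:+.3f}  MSE={m:.3f}  RMSE={r:.3f}")
```

With outliers present, the robust statistic yields the smaller RMSE, which under the paper's criterion marks it as the more accurate method.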
The aim of this research is to construct a three-dimensional maritime transport model for moving non-homogeneous goods (k) by different transport modes (v) from their sources (i) to their destinations (j), determining the optimum quantities $x_{ijk}^{v}$ to be transported at the lowest possible cost $c_{ijk}^{v}$ and time $t_{ijk}^{v}$ using a heuristic algorithm. Transportation problems have been widely studied in computer science and operations research; they are typically used to minimise the cost or time of shipping goods from a number of sources to a number of destinations by various transport modes while meeting supply and demand constraints. Transport models are a key tool in logistics and supply chain management.
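The paper's heuristic is not specified in the abstract. As a baseline, the textbook least-cost method below solves a flat (single-good, single-mode) transportation problem by greedily filling the cheapest routes first; the three-dimensional model would extend this with the extra good (k) and mode (v) indices. Supplies, demands and costs are hypothetical.

```python
# Textbook least-cost heuristic for a flat transportation problem: allocate
# along the cheapest routes first until supply and demand are exhausted.
def least_cost(supply, demand, cost):
    supply, demand = supply[:], demand[:]
    alloc = [[0] * len(demand) for _ in supply]
    cells = sorted(
        (cost[i][j], i, j) for i in range(len(supply)) for j in range(len(demand))
    )
    for c, i, j in cells:                     # cheapest routes first
        qty = min(supply[i], demand[j])
        alloc[i][j] = qty
        supply[i] -= qty
        demand[j] -= qty
    return alloc

supply = [30, 40]                             # units available at each source i
demand = [20, 25, 25]                         # units required at each destination j
cost = [[8, 6, 10],                           # unit cost c[i][j]
        [9, 12, 7]]
plan = least_cost(supply, demand, cost)
total = sum(plan[i][j] * cost[i][j] for i in range(2) for j in range(3))
print(plan, "total cost:", total)
```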
…teen sites in Baghdad are investigated. The sites are divided into two groups, one in Karkh and the other in Rusafa. The underground conditions are assessed by drilling vertical exploratory borings into the ground, obtaining disturbed and undisturbed soil samples, and testing these samples in a laboratory (the civil engineering laboratory of the University of Baghdad). On the disturbed samples, the tests comprised grain-size analysis (followed by soil classification), Atterberg limits, and chemical tests (organic content, sulphate content, gypsum content and chloride content). On the undisturbed samples, the testing comprised the consolidation test, from which the following parameters are obtained: initial void ratio e0, compression index Cc, and swelling index Cs.
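A typical downstream use of the consolidation parameters listed above is estimating primary consolidation settlement of a clay layer; the standard one-dimensional formula is sketched below with hypothetical values, since the paper's own measurements are not reproduced in the abstract.

```python
# Typical use of the consolidation parameters e0 and Cc: estimating primary
# consolidation settlement of a normally consolidated clay layer.
# All numbers are hypothetical; the paper's own values are not reproduced.
import math

e0 = 0.9          # initial void ratio (from the consolidation test)
Cc = 0.25         # compression index
H = 4.0           # thickness of the clay layer, m
sigma0 = 100.0    # initial effective vertical stress, kPa
d_sigma = 50.0    # stress increase from the new load, kPa

# S = Cc * H / (1 + e0) * log10((sigma0 + d_sigma) / sigma0)
S = Cc * H / (1 + e0) * math.log10((sigma0 + d_sigma) / sigma0)
print(f"primary consolidation settlement = {S * 1000:.0f} mm")   # about 93 mm
```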