The vast advantages of the 3D modelling industry have pushed competitors to improve capture techniques and processing pipelines so as to minimize labour requirements, save time and reduce project risk. When it comes to digital 3D documentation and conservation projects, laser scanning and photogrammetry are commonly compared in order to choose between the two. Since both techniques have pros and cons, this paper examines the potential issues of each technique in terms of time, budget, accuracy, density, methodology and ease of use. A terrestrial laser scanner and close-range photogrammetry are used to document a unique, invaluable artefact (the Lady of Hatra), located in Iraq, for a future data-fusion scenario. The factors affecting data processing and modelling in each of the compared techniques are investigated, discussed and analysed. Qualitative and quantitative statistical analysis was applied using multiple criteria, such as level of automation (LOA), accuracy and point-cloud integrity, towards the adoption of data-fusion approaches and co-registration frameworks for optimal deliverables.
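As a minimal sketch of one quantitative criterion of this kind, the snippet below computes cloud-to-cloud nearest-neighbour distances between a laser-scan point cloud and a photogrammetric point cloud. It assumes both clouds are already co-registered and loaded as Nx3 XYZ arrays; the file names and summary statistics are illustrative, not the paper's workflow.

```python
import numpy as np
from scipy.spatial import cKDTree

def cloud_to_cloud_distance(reference: np.ndarray, test: np.ndarray) -> dict:
    """Nearest-neighbour (cloud-to-cloud) distance from each test point to the
    reference cloud; both inputs are (N, 3) XYZ arrays assumed to be expressed
    in the same coordinate frame (i.e. already co-registered)."""
    tree = cKDTree(reference)
    dist, _ = tree.query(test, k=1)
    return {
        "mean": float(dist.mean()),
        "rmse": float(np.sqrt((dist ** 2).mean())),
        "p95": float(np.percentile(dist, 95)),
    }

# Hypothetical usage: laser scan as reference, photogrammetric cloud as test.
# laser = np.loadtxt("laser_scan.xyz")       # file names are placeholders only
# photo = np.loadtxt("photogrammetry.xyz")
# print(cloud_to_cloud_distance(laser, photo))
```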
Cryptography is the process of transforming a message to prevent unauthorized access to data. One of the main problems, and an important element, in secret-key cryptography is the key itself: the key plays a central role in achieving a high level of secure communication. To raise the level of security in any communication, both parties must hold a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard (Triple DES) algorithm is weak because of its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective and strong; the encryption key enhances the security of the Triple DES algorithm. This paper proposes a combination of two efficient encryption algorithms to …
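The sketch below illustrates one generic way of hardening Triple DES key generation (it is not the scheme proposed in the paper): the 24-byte key is derived from a shared passphrase with PBKDF2 instead of using a weak or static key. It assumes the pycryptodome package; the function names and parameters are illustrative.

```python
from Crypto.Cipher import DES3
from Crypto.Protocol.KDF import PBKDF2
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

def derive_3des_key(passphrase: str, salt: bytes) -> bytes:
    # Stretch the passphrase into a 24-byte (three-key) Triple DES key.
    key = PBKDF2(passphrase, salt, dkLen=24, count=200_000)
    # Fix parity bits; raises if the key degenerates (extremely unlikely here).
    return DES3.adjust_key_parity(key)

def encrypt(passphrase: str, plaintext: bytes) -> bytes:
    salt, iv = get_random_bytes(16), get_random_bytes(8)
    cipher = DES3.new(derive_3des_key(passphrase, salt), DES3.MODE_CBC, iv)
    return salt + iv + cipher.encrypt(pad(plaintext, DES3.block_size))

def decrypt(passphrase: str, blob: bytes) -> bytes:
    salt, iv, body = blob[:16], blob[16:24], blob[24:]
    cipher = DES3.new(derive_3des_key(passphrase, salt), DES3.MODE_CBC, iv)
    return unpad(cipher.decrypt(body), DES3.block_size)
```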
This paper presents a fast and robust approach to English text encryption and decryption based on the Pascal matrix. The technique encrypts Arabic text, English text, or both, and the results show how, when the method is applied to the plain text (the original message), the intelligible plain text becomes unintelligible, so that the information is secured against unauthorized access and theft; an encryption scheme usually uses a pseudo-random encryption key generated by an algorithm. All of this is achieved using the Pascal matrix. Encryption and decryption are implemented in MATLAB, with Notepad++ used to write the input text.
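A minimal sketch of the Pascal-matrix idea, in Python rather than the paper's MATLAB code: character codes are grouped into blocks, multiplied by a lower-triangular Pascal matrix modulo 256, and decrypted with its exact integer inverse (whose entries are the same binomial coefficients with alternating signs). The block size and padding are assumptions for illustration only.

```python
import numpy as np
from math import comb

def pascal_lower(n):
    return np.array([[comb(i, j) if j <= i else 0 for j in range(n)]
                     for i in range(n)], dtype=np.int64)

def pascal_lower_inverse(n):
    # Inverse of the lower-triangular Pascal matrix: entries (-1)**(i+j) * C(i, j).
    return np.array([[(-1) ** (i + j) * comb(i, j) if j <= i else 0
                      for j in range(n)] for i in range(n)], dtype=np.int64)

def encrypt(text, block=8):
    data = list(text.encode("utf-8"))
    data += [0] * (-len(data) % block)            # zero-pad to a full block
    P, out = pascal_lower(block), []
    for i in range(0, len(data), block):
        v = np.array(data[i:i + block], dtype=np.int64)
        out.extend((P @ v % 256).tolist())
    return out

def decrypt(cipher, block=8):
    Pinv, data = pascal_lower_inverse(block), []
    for i in range(0, len(cipher), block):
        v = np.array(cipher[i:i + block], dtype=np.int64)
        data.extend((Pinv @ v % 256).tolist())
    return bytes(data).rstrip(b"\x00").decode("utf-8")

# print(decrypt(encrypt("Pascal matrix cipher")))  # round-trips the message
```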
Reliable data transfer and energy efficiency are essential considerations for network performance in resource-constrained underwater environments. One of the efficient approaches to data routing in underwater wireless sensor networks (UWSNs) is clustering, in which data packets are transferred from sensor nodes to a cluster head (CH). The data packets are then forwarded to a sink node in a single hop or over multiple hops, which can increase the energy depletion of the CH compared to other nodes. While several mechanisms have been proposed for cluster formation and CH selection to ensure efficient delivery of data packets, less attention has been given to massive data co…
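As an illustration of the CH-selection step (not any specific protocol surveyed here), the sketch below picks, within each cluster, the node that maximises a weighted score of residual energy and proximity to the sink. The node attributes and weights are assumptions.

```python
from dataclasses import dataclass
import math

@dataclass
class Node:
    node_id: int
    x: float
    y: float
    depth: float
    residual_energy: float   # joules remaining
    cluster: int

def select_cluster_heads(nodes, sink=(0.0, 0.0, 0.0), w_energy=0.7, w_dist=0.3):
    """Return {cluster id: chosen cluster head}; higher residual energy and
    shorter distance to the sink both raise a node's score."""
    def score(n):
        return w_energy * n.residual_energy - w_dist * math.dist((n.x, n.y, n.depth), sink)
    heads = {}
    for n in nodes:
        best = heads.get(n.cluster)
        if best is None or score(n) > score(best):
            heads[n.cluster] = n
    return heads
```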
A two-time-step stochastic multi-variable, multi-site hydrological data forecasting model was developed and verified using a case study. The philosophy of the model is to use the cross-variable correlations, cross-site correlations and two-step time-lag correlations simultaneously to estimate the model parameters, which are then modified using the mutation process of a genetic algorithm optimization model. The objective function to be minimized is the Akaike test value. The case study covers four variables and three sites. The variables are the monthly air temperature, humidity, precipitation and evaporation; the sites are Sulaimania, Chwarta and Penjwin, located in northern Iraq. The model performance was …
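For reference, the sketch below computes the Akaike Information Criterion used as a calibration objective of this kind, assuming a Gaussian residual model; the variable names are illustrative rather than the study's exact formulation.

```python
import numpy as np

def aic(observed: np.ndarray, predicted: np.ndarray, n_params: int) -> float:
    """AIC = 2k - 2 ln(L), with the Gaussian maximum-likelihood residual variance."""
    residuals = observed - predicted
    n = residuals.size
    sigma2 = np.mean(residuals ** 2)            # ML estimate of residual variance
    log_lik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    return 2 * n_params - 2 * log_lik
```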
In this study, we compare the LASSO and SCAD methods, two specialised penalty methods for models in partial quantile regression. The Nadaraya-Watson kernel was used to estimate the non-parametric part; in addition, the rule-of-thumb method was used to estimate the smoothing bandwidth (h). The penalty methods proved efficient in estimating the regression coefficients, but the SCAD method was the best according to the mean squared error (MSE) criterion after the missing data were estimated using the mean imputation method.
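A minimal sketch of the non-parametric building block mentioned above: a Nadaraya-Watson estimator with a Gaussian kernel and a Silverman-type rule-of-thumb bandwidth. This is illustrative and not the study's exact implementation.

```python
import numpy as np

def rule_of_thumb_bandwidth(x: np.ndarray) -> float:
    # Silverman-style rule of thumb: 1.06 * min(sd, IQR/1.349) * n^(-1/5).
    n = x.size
    sigma = min(np.std(x, ddof=1),
                (np.percentile(x, 75) - np.percentile(x, 25)) / 1.349)
    return 1.06 * sigma * n ** (-1 / 5)

def nadaraya_watson(x: np.ndarray, y: np.ndarray, x_new: np.ndarray) -> np.ndarray:
    """Kernel-weighted local average of y at the points x_new."""
    h = rule_of_thumb_bandwidth(x)
    u = (x_new[:, None] - x[None, :]) / h        # pairwise scaled distances
    w = np.exp(-0.5 * u ** 2)                    # Gaussian kernel weights
    return (w * y).sum(axis=1) / w.sum(axis=1)
```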
Modern civilization increasingly relies on sustainable and eco-friendly data centers as the core hubs of intelligent computing. However, these data centers, while vital, also face heightened vulnerability to hacking because they serve as the convergence points of numerous network connection nodes. Recognizing and addressing this vulnerability, particularly within green data centers, is a pressing concern. This paper proposes a novel approach to mitigating this threat by leveraging swarm intelligence techniques to detect prospective and hidden compromised devices within the data center environment. The core objective is to ensure sustainable intelligent computing through a colony strategy. The research primarily focuses on the …
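Purely as a toy illustration of a colony-style scan (the paper's actual strategy is not reproduced here): lightweight agents repeatedly probe devices, reinforce a "pheromone" suspicion score whenever an anomaly indicator fires, and let scores evaporate otherwise, so persistently anomalous devices rise to the top. All names and parameters below are hypothetical.

```python
import random

def colony_scan(devices, anomaly_indicator, agents=50, rounds=100,
                deposit=1.0, evaporation=0.05):
    """devices: list of device ids; anomaly_indicator(device) -> bool.
    Returns devices ranked by accumulated suspicion (pheromone)."""
    pheromone = {d: 0.0 for d in devices}
    for _ in range(rounds):
        for _ in range(agents):
            d = random.choice(devices)          # each agent visits a random device
            if anomaly_indicator(d):
                pheromone[d] += deposit         # reinforce suspicion
        for d in devices:
            pheromone[d] *= (1 - evaporation)   # evaporation step
    return sorted(pheromone.items(), key=lambda kv: kv[1], reverse=True)
```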
In data mining, classification is a form of data analysis that can be used to extract models describing important data classes. Two of the well-known algorithms used in data mining classification are the Backpropagation Neural Network (BNN) and Naïve Bayes (NB). This paper investigates the performance of these two classification methods using the Car Evaluation dataset. A model was built for each algorithm and the results were compared. Our experimental results indicated that the BNN classifier yields higher accuracy than the NB classifier, but it is less efficient because it is time-consuming and difficult to analyse due to its black-box nature.
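A minimal sketch of this kind of comparison using scikit-learn stand-ins: MLPClassifier as a backpropagation network versus a naive Bayes classifier on the UCI Car Evaluation data. The file name, encoding and split below are assumptions, not the paper's exact setup.

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import OneHotEncoder
from sklearn.neural_network import MLPClassifier
from sklearn.naive_bayes import MultinomialNB
from sklearn.metrics import accuracy_score

cols = ["buying", "maint", "doors", "persons", "lug_boot", "safety", "class"]
df = pd.read_csv("car.data", names=cols)        # hypothetical local copy of the dataset
X = OneHotEncoder().fit_transform(df[cols[:-1]])   # one-hot encode categorical features
y = df["class"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, clf in [("BNN", MLPClassifier(hidden_layer_sizes=(32,), max_iter=500)),
                  ("NB", MultinomialNB())]:
    clf.fit(X_tr, y_tr)
    print(name, accuracy_score(y_te, clf.predict(X_te)))
```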
The study's primary purpose is to explore an appropriate way of monitoring and assessing water depths in Al Habbaniyah Lake, Iraq, using satellite remote sensing techniques. This research examined empirically derived conditions (thresholds) on different bands of multi-temporal satellite image data from different sensors (Landsat 5 TM and EO-1 ALI) over the same region, in order to recognize regions of different water depths. The threshold values were chosen, as a supervised method, to separate Al Habbaniyah Lake into the required depth classes (shallow, deep and very deep). A three-dimensional feature-space plot was used to represent these regions. The relationship of the mean values of the three separated water regions with all TM and A…
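The sketch below illustrates the supervised thresholding idea: lake pixels are assigned to shallow, deep or very deep zones from a single reflectance band using descending thresholds (reflectance generally decreases as water deepens). The band choice and threshold values are placeholders, not the study's calibrated numbers.

```python
import numpy as np

def classify_depth(band: np.ndarray, water_mask: np.ndarray,
                   t_shallow: float, t_deep: float) -> np.ndarray:
    """band: 2-D reflectance/DN array; water_mask: boolean lake mask.
    Returns 0 = land, 1 = shallow, 2 = deep, 3 = very deep."""
    zones = np.zeros(band.shape, dtype=np.uint8)
    zones[water_mask & (band >= t_shallow)] = 1
    zones[water_mask & (band < t_shallow) & (band >= t_deep)] = 2
    zones[water_mask & (band < t_deep)] = 3
    return zones
```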
Background: Chronic cigarette smoking is one of the major risk factors for coronary artery disease. However, it has additional adverse cardiac effects independent of coronary atherosclerosis. Patients and Methods: After informed consent and permission from the review board of the hospital, 80 healthy subjects classified as smokers or non-smokers were included in the study. They were examined using a standard echocardiography protocol, followed by two-dimensional speckle tracking to assess the functions of the right ventricle. Results: The tricuspid annular plane systolic excursion (TAPSE) was significantly reduced in smokers compared to non-smokers (P < 0.05). The tricuspid flow peak late diastolic velocity (A wave) was sig…
The purpose of this study is to examine the dimensions of strategic intent (SI; see Appendix 1) according to the Hamel and Prahalad model as a means of building the future, relying on today's knowledge-based, proactive strategic directions of management: long-term, deep-perspective creative directions, objective vision and rational analysis, integration in work, a structure for survival and comprehensiveness of perception.
The quantitative approach was used, based on research, detection and proof, as data were collected from leader…