Registration techniques are still considered challenging tasks for remote sensing users, especially after the enormous increase in the volume of remotely sensed data being acquired by an ever-growing number of earth observation sensors. This surge mandates the development of accurate and robust registration procedures that can handle data with varying geometric and radiometric properties. This paper aims to develop the traditional registration scenarios to reduce discrepancies between registered datasets in two-dimensional (2D) space for remote sensing images. This is achieved by designing a computer program written in the Visual Basic language that follows two main stages. The first stage is a traditional registration process: a set of control point pairs is defined by manual selection, then the parameters of a global affine transformation model are computed to match them and resample the images. The second stage refines the matching process by determining the shift in control point (CP) locations based on a radiometric similarity measure. A shift-map technique is then applied to adjust the process using a 2nd-order polynomial transformation function. This function was chosen after statistical analyses comparing the common transformation functions (similarity, affine, projective, and 2nd-order polynomial). The results showed that the developed approach reduced the root mean square error (RMSE) of the registration process and decreased the discrepancies between the registered datasets by 60%, 57%, and 48% for the three tested datasets, respectively.
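To make the first stage concrete, the sketch below shows a minimal least-squares fit of a global affine transformation from manually selected control point pairs, together with the RMSE used to judge registration quality. It is written in Python rather than the Visual Basic program described above, and the control-point coordinates are hypothetical, for illustration only.

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares fit of a 2D affine transform mapping src -> dst.

    src, dst: (N, 2) arrays of matched control-point coordinates.
    Returns the 2x3 affine matrix [[a, b, tx], [c, d, ty]].
    """
    n = src.shape[0]
    A = np.zeros((2 * n, 6))
    A[0::2, 0:2] = src     # rows for x' = a*x + b*y + tx
    A[0::2, 2] = 1.0
    A[1::2, 3:5] = src     # rows for y' = c*x + d*y + ty
    A[1::2, 5] = 1.0
    b = dst.reshape(-1)
    params, *_ = np.linalg.lstsq(A, b, rcond=None)
    return params.reshape(2, 3)

def rmse(src, dst, M):
    """Root mean square error of the fitted transform at the control points."""
    pred = src @ M[:, :2].T + M[:, 2]
    return np.sqrt(np.mean(np.sum((pred - dst) ** 2, axis=1)))

# toy control-point pairs (hypothetical pixel coordinates)
src = np.array([[10, 20], [200, 35], [120, 240], [300, 310]], dtype=float)
dst = np.array([[12, 18], [203, 30], [118, 244], [297, 313]], dtype=float)
M = fit_affine(src, dst)
print("affine matrix:\n", M, "\nRMSE:", rmse(src, dst, M))
```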
We propose a novel strategy to optimize the test suite required for testing both hardware and software in a production line. The strategy is based on two processes: a Quality Signing Process and a Quality Verification Process. Unlike earlier work, the proposed strategy is based on the integration of black-box and white-box techniques in order to derive an optimum test suite during the Quality Signing Process. In this case, the generated optimal test suite significantly improves the Quality Verification Process. Considering both processes, the novelty of the proposed strategy is that the optimization and reduction of the test suite are performed by selecting only mutant-killing test cases from cumulating t-way test cases.
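As a rough illustration of the t-way side of the Quality Signing Process, the sketch below greedily builds a 2-way (pairwise) covering test suite for a small configuration model. The parameter names and values are hypothetical, and the mutant-based filtering of the resulting suite described above is not shown.

```python
from itertools import combinations, product

def pairwise_suite(parameters):
    """Greedy construction of a 2-way (pairwise) covering test suite.

    parameters: dict mapping parameter name -> list of values.
    Returns a list of dicts (complete test cases) such that every pair
    of parameter values appears in at least one test case.
    """
    names = list(parameters)
    # all value pairs that still need to be covered
    uncovered = set()
    for p1, p2 in combinations(names, 2):
        for v1 in parameters[p1]:
            for v2 in parameters[p2]:
                uncovered.add(((p1, v1), (p2, v2)))

    suite = []
    while uncovered:
        best, best_gain = None, -1
        # candidate test cases: exhaustive product (fine for small models)
        for values in product(*(parameters[n] for n in names)):
            case = dict(zip(names, values))
            gain = sum(1 for a, b in uncovered
                       if case[a[0]] == a[1] and case[b[0]] == b[1])
            if gain > best_gain:
                best, best_gain = case, gain
        suite.append(best)
        uncovered = {(a, b) for a, b in uncovered
                     if not (best[a[0]] == a[1] and best[b[0]] == b[1])}
    return suite

# hypothetical hardware/software configuration model
model = {"os": ["linux", "rtos"], "cpu": ["armv7", "armv8"], "net": ["wifi", "eth", "can"]}
print(pairwise_suite(model))
```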
This research includes automated lineament extraction using the PCI Geomatica program, based on a satellite image, and lineament analysis using a GIS program. The analysis included density analysis, length density analysis, and intersection density analysis. After calculating the slope map for the study area, a relationship was found between the slope and the lineament density.
The lineament density increases in the regions that have high slope values, showing that lineaments play an important role in the classification process, as they isolate one class from the others; this was clearly observed in Iranian territory. The results also show that one of the lineaments hits the shoulders of the Galal Badra dam and the areas surrounding the dam, so this should be taken into consideration.
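The density analysis step can be illustrated with the minimal sketch below, which rasterizes lineament length per grid cell. It stands in for the PCI Geomatica/GIS workflow described above, and the segment coordinates, extent, and cell size are hypothetical.

```python
import numpy as np

def lineament_density(segments, extent, cell=100.0, step=1.0):
    """Approximate lineament length density (length per cell) on a regular grid.

    segments: list of ((x1, y1), (x2, y2)) lineament endpoints in map units.
    extent:   (xmin, ymin, xmax, ymax) of the study area.
    cell:     grid cell size; step: sampling interval along each segment.
    """
    xmin, ymin, xmax, ymax = extent
    nx = int(np.ceil((xmax - xmin) / cell))
    ny = int(np.ceil((ymax - ymin) / cell))
    density = np.zeros((ny, nx))
    for (x1, y1), (x2, y2) in segments:
        length = np.hypot(x2 - x1, y2 - y1)
        n = max(int(length / step), 1)
        xs = np.linspace(x1, x2, n)
        ys = np.linspace(y1, y2, n)
        cols = np.clip(((xs - xmin) / cell).astype(int), 0, nx - 1)
        rows = np.clip(((ys - ymin) / cell).astype(int), 0, ny - 1)
        # each sample contributes length/n, so the contributions sum to the segment length
        np.add.at(density, (rows, cols), length / n)
    return density

# hypothetical lineaments, for illustration only
segs = [((0, 0), (500, 400)), ((120, 300), (480, 60))]
print(lineament_density(segs, extent=(0, 0, 500, 500), cell=100))
```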
This study aims to estimate the accuracy of digital elevation models (DEM) created by exploiting open-source Google Earth data, and to compare them with widely available DEM datasets: the Shuttle Radar Topography Mission (SRTM), version 3, and the Advanced Spaceborne Thermal Emission and Reflection Radiometer Global Digital Elevation Model (ASTER GDEM), version 2. The GPS technique is used in this study to produce a digital elevation raster with a high level of accuracy, as a reference raster against which the DEM datasets are compared. Baghdad University, Al Jadriya campus, is selected as the study area. In addition, 151 reference points were created within the study area to evaluate the results based on RMS values. Furthermore, th
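A minimal sketch of the accuracy comparison is given below, assuming elevations from each DEM have already been sampled at the GPS reference points; the elevation values are hypothetical and only illustrate the RMSE computation.

```python
import numpy as np

def dem_rmse(dem_values, reference_values):
    """RMSE between DEM elevations and GPS reference elevations at check points."""
    dem_values = np.asarray(dem_values, dtype=float)
    reference_values = np.asarray(reference_values, dtype=float)
    diff = dem_values - reference_values
    return np.sqrt(np.mean(diff ** 2))

# hypothetical elevations (metres) sampled at the same reference points
gps_ref  = [34.2, 35.1, 33.8, 36.0, 34.9]
srtm_v3  = [35.0, 34.6, 34.5, 36.8, 35.7]
aster_v2 = [36.1, 33.9, 35.2, 37.4, 33.8]
print("SRTM v3 RMSE :", round(dem_rmse(srtm_v3, gps_ref), 2), "m")
print("ASTER GDEM v2:", round(dem_rmse(aster_v2, gps_ref), 2), "m")
```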
With the wide development of computer applications and networks, the security of information has received great attention in many fields of everyday life. The most important issue is how to control and prevent unauthorized access to secure information; therefore, this paper presents a combination of two efficient encryption algorithms to satisfy the purpose of information security by adding a new level of encryption to the Rijndael-AES algorithm. This paper presents a proposed Rijndael encryption and decryption process with the NTRU algorithm; the Rijndael algorithm is widely accepted due to its strong encryption, complex processing, and resistance to brute-force attack. The proposed modifications are implemented by encrypting and decrypting the Rijndael S-Box using the NTRU algorithm.
With the wide development of computer science and network applications, the security of information must be increased and made more complex. The most important issue is how to control and prevent unauthorized access to secure information; therefore, this paper presents a combination of two efficient encryption algorithms to satisfy the purpose of information security by adding a new level of encryption to the Rijndael-AES algorithm. This paper presents a proposed Rijndael encryption and decryption process with the NTRU algorithm; the Rijndael algorithm is important because of its strong encryption. The proposed updates are represented by encrypting and decrypting the Rijndael S-Box using the NTRU algorithm. These modifications enhance the degree of
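For context, the sketch below reconstructs the standard Rijndael S-Box (GF(2^8) inversion followed by the affine transform), which is the component the proposed NTRU-based modification targets; the NTRU encryption of the S-Box itself is not reproduced here.

```python
def gf_mul(a, b, mod=0x11B):
    """Multiply two bytes in GF(2^8) with the AES reduction polynomial."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x100:
            a ^= mod
        b >>= 1
    return r

def gf_inv(a):
    """Multiplicative inverse in GF(2^8); the inverse of 0 is defined as 0 in AES."""
    if a == 0:
        return 0
    result, power, e = 1, a, 254  # a^254 = a^{-1} in GF(2^8)
    while e:
        if e & 1:
            result = gf_mul(result, power)
        power = gf_mul(power, power)
        e >>= 1
    return result

def rotl8(x, n):
    """Rotate a byte left by n bits."""
    return ((x << n) | (x >> (8 - n))) & 0xFF

def aes_sbox():
    """Standard Rijndael S-Box: GF(2^8) inversion followed by the affine transform."""
    box = []
    for b in range(256):
        inv = gf_inv(b)
        s = inv ^ rotl8(inv, 1) ^ rotl8(inv, 2) ^ rotl8(inv, 3) ^ rotl8(inv, 4) ^ 0x63
        box.append(s)
    return box

sbox = aes_sbox()
# spot-check against known S-Box entries
assert sbox[0x00] == 0x63 and sbox[0x01] == 0x7C and sbox[0x53] == 0xED
```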
High peak-to-average power ratio (PAPR) in orthogonal frequency division multiplexing (OFDM) is an important problem, which increases the cost and complexity of high-power amplifiers. One of the techniques used to reduce the PAPR in OFDM systems is the tone reservation (TR) method. In our work we propose a modified tone reservation method that decreases the PAPR with low complexity compared with the conventional TR method by processing the high and low amplitudes at the same time. An image of size 128×128 is used as the source of data transmitted over the OFDM system. The proposed method decreases the PAPR by 2 dB compared with the conventional method while keeping the performance unchanged. The performance of the proposed method is tested with
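A minimal sketch of how the PAPR of one OFDM symbol can be measured, assuming QPSK data on 64 subcarriers and an oversampled IFFT; it illustrates the quantity being reduced rather than the modified tone reservation method itself.

```python
import numpy as np

def papr_db(symbols, n_subcarriers=64, oversample=4):
    """Peak-to-average power ratio (in dB) of one OFDM symbol.

    symbols: complex frequency-domain data mapped to the subcarriers.
    Oversampling the IFFT gives a better estimate of the analogue peak.
    """
    n_fft = n_subcarriers * oversample
    spectrum = np.zeros(n_fft, dtype=complex)
    spectrum[:n_subcarriers] = symbols
    x = np.fft.ifft(spectrum) * n_fft        # time-domain OFDM signal
    power = np.abs(x) ** 2
    return 10 * np.log10(power.max() / power.mean())

# hypothetical QPSK data on 64 subcarriers
rng = np.random.default_rng(0)
qpsk = (2 * rng.integers(0, 2, 64) - 1 + 1j * (2 * rng.integers(0, 2, 64) - 1)) / np.sqrt(2)
print("PAPR:", round(papr_db(qpsk), 2), "dB")
```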
Among metaheuristic algorithms, population-based algorithms are explorative search algorithms superior to local search algorithms in terms of exploring the search space to find globally optimal solutions. However, the primary downside of such algorithms is their low exploitative capability, which prevents the search from expanding around promising neighborhoods toward more optimal solutions. The firefly algorithm (FA) is a population-based algorithm that has been widely used in clustering problems. However, FA suffers from premature convergence when no neighborhood search strategies are employed to improve the quality of clustering solutions in the neighborhood region and to explore the global regions of the search space. On the
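A minimal sketch of the basic firefly algorithm on a continuous toy objective, assuming the standard attractiveness update beta0*exp(-gamma*r^2); it shows the baseline FA whose premature convergence is discussed above, not the clustering extension.

```python
import numpy as np

def firefly_minimize(f, dim, n=20, iters=100, alpha=0.2, beta0=1.0, gamma=1.0,
                     bounds=(-5.0, 5.0), seed=0):
    """Minimal firefly algorithm for continuous minimization.

    Brightness is the negative objective; dimmer fireflies move toward
    brighter ones with attractiveness beta0 * exp(-gamma * r^2) plus a
    small random walk scaled by alpha.
    """
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pos = rng.uniform(lo, hi, size=(n, dim))
    cost = np.array([f(p) for p in pos])
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if cost[j] < cost[i]:              # firefly j is brighter than i
                    r2 = np.sum((pos[i] - pos[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    step = alpha * (rng.random(dim) - 0.5)
                    pos[i] = np.clip(pos[i] + beta * (pos[j] - pos[i]) + step, lo, hi)
                    cost[i] = f(pos[i])
    best = np.argmin(cost)
    return pos[best], cost[best]

# toy objective: sphere function
x, fx = firefly_minimize(lambda p: np.sum(p ** 2), dim=3)
print("best solution:", np.round(x, 3), "objective:", round(fx, 4))
```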