Image fusion is one of the most important technologies in remote sensing applications and geographic information systems. In this study, camera images were prepared for fusion by resizing them with three interpolation methods (nearest-neighbor, bilinear, and bicubic). Statistical techniques were then used as an efficient merging mechanism in the image-integration process, employing two models, Local Mean Matching (LMM) and Regression Variable Substitution (RVS), in addition to a spatial-frequency technique, the High-Pass Filter Additive (HPFA) method. Statistical measures were used to check the quality of the merged images by calculating the correlation and several traditional metrics for the images before and after integration. The results showed that the adopted fusion process and statistical measures efficiently determined the preference among images after the merge process, indicated which techniques are best, and helped estimate homogeneous regions.
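As a hedged illustration of the resizing and HPFA steps described above, the following Python sketch upsamples a low-resolution multispectral band with a chosen interpolation order and adds the high-pass component of a panchromatic band. The array names, window size, and use of scipy are assumptions for illustration, not the paper's implementation.

```python
import numpy as np
from scipy.ndimage import zoom, uniform_filter

def hpfa_fuse(ms_band, pan, interp_order=3, window=5):
    """High-Pass Filter Additive (HPFA) fusion sketch.

    ms_band      : 2-D array, low-resolution multispectral band
    pan          : 2-D array, high-resolution panchromatic image
    interp_order : 0 = nearest, 1 = bilinear, 3 = bicubic
    """
    # Resize the multispectral band to the panchromatic grid.
    scale = (pan.shape[0] / ms_band.shape[0], pan.shape[1] / ms_band.shape[1])
    ms_up = zoom(ms_band.astype(float), scale, order=interp_order)
    ms_up = ms_up[:pan.shape[0], :pan.shape[1]]   # guard against rounding

    # High-pass component of the panchromatic image (original minus local mean).
    pan_low = uniform_filter(pan.astype(float), size=window)
    high_pass = pan - pan_low

    # Additive injection of spatial detail into the upsampled band.
    return ms_up + high_pass

# Example with synthetic data (hypothetical sizes, not the study's images).
ms = np.random.rand(64, 64)
pan = np.random.rand(256, 256)
fused = hpfa_fuse(ms, pan, interp_order=1)   # bilinear upsampling
print(fused.shape)
```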
Regions around the world need to report their survey results relative to the local geoid, yet each region has a different topography and gravity field. The recent global Earth gravity model of 2008 is now used successfully for many purposes in geoscience research. This research presents an overview of preliminary evaluation results for the Earth Gravitational Model (EGM08) in the middle of Iraq. For completeness, the evaluation tests were also performed for EGM96 by examining 31 stations distributed over four Iraqi provinces. The national orthometric heights were compared with the GPS/leveling data obtained from these stations. This study illustrated that the GPS/leveling based on EGM…
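A minimal sketch of the kind of comparison described above, assuming hypothetical arrays of ellipsoidal heights h from GPS, orthometric heights H from leveling, and model geoid undulations interpolated from EGM08 or EGM96: the geometric undulation is N_GPS = h - H, and the fit statistics are computed on the differences. The benchmark values below are invented for illustration, not the paper's data.

```python
import numpy as np

def evaluate_geoid_model(h_ellipsoidal, H_orthometric, N_model):
    """Compare GPS/leveling undulations with a global model (sketch).

    N_GPS = h - H  (geometric undulation at each benchmark)
    Returns mean, standard deviation, and RMS of (N_GPS - N_model).
    """
    N_gps = np.asarray(h_ellipsoidal) - np.asarray(H_orthometric)
    diff = N_gps - np.asarray(N_model)
    return diff.mean(), diff.std(ddof=1), np.sqrt(np.mean(diff ** 2))

# Hypothetical values for a handful of benchmarks (meters).
h = np.array([652.41, 630.12, 598.77])        # ellipsoidal heights from GPS
H = np.array([660.05, 637.90, 606.39])        # orthometric heights from leveling
N_egm08 = np.array([-7.60, -7.74, -7.58])     # interpolated model undulations
print(evaluate_geoid_model(h, H, N_egm08))
```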
An image retrieval system is a computer system for browsing, searching, and retrieving pictures from a large database of digital images. The objective of Content-Based Image Retrieval (CBIR) methods is essentially to extract, from large image databases, a specified number of images similar in visual and semantic content to a so-called query image. The researchers developed a new retrieval mechanism based mainly on two procedures. The first procedure relies on extracting the statistical features of both the original and the traditional image by using the histogram and statistical characteristics (mean, standard deviation). The second procedure relies on the T-…
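A minimal sketch of the first procedure above, assuming gray-level images and Euclidean distance as the similarity measure: each image is described by its normalized histogram plus mean and standard deviation, and the database is ranked by distance to the query's feature vector. The function names and parameters are illustrative, not the authors'.

```python
import numpy as np

def statistical_features(image, bins=256):
    """Histogram plus first-order statistics for one gray-level image (sketch)."""
    img = np.asarray(image, dtype=float).ravel()
    hist, _ = np.histogram(img, bins=bins, range=(0, 255), density=True)
    return np.concatenate([hist, [img.mean(), img.std()]])

def retrieve(query, database, top_k=5):
    """Rank database images by Euclidean distance between feature vectors."""
    q = statistical_features(query)
    dists = [np.linalg.norm(q - statistical_features(img)) for img in database]
    return np.argsort(dists)[:top_k]

# Hypothetical 8-bit images.
db = [np.random.randint(0, 256, (64, 64)) for _ in range(20)]
print(retrieve(db[3], db, top_k=3))   # index 3 should rank first
```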
Segmentation is the process of partitioning a digital image into different parts depending on texture, color, or intensity, and it can be used in many fields to isolate the region of interest. In this work, images of the Moon were obtained through observations at the Department of Astronomy and Space, College of Science, University of Baghdad, using telescopes and a CCD camera. Different segmentation methods were used to segment lunar craters. Craters are formed when celestial objects such as asteroids and meteorites crash into the surface of the Moon. Thousands of craters appear on the Moon's surface, ranging in size from a meter to many kilometers, and they provide insights into the age and geology…
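The abstract does not name the specific segmentation methods used; as a hedged illustration of one common intensity-based approach, the following sketch thresholds a lunar frame and labels connected dark regions as candidate craters. The threshold rule and minimum area are assumptions, not the authors' method.

```python
import numpy as np
from scipy import ndimage

def segment_craters(image, threshold=None, min_area=20):
    """Intensity-threshold segmentation sketch for crater candidates."""
    img = np.asarray(image, dtype=float)
    if threshold is None:
        threshold = img.mean() - img.std()        # shadowed crater floors are dark
    mask = img < threshold
    labels, n = ndimage.label(mask)               # connected components
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    keep = np.where(sizes >= min_area)[0] + 1     # discard tiny noise regions
    return np.isin(labels, keep), len(keep)

# Hypothetical lunar frame.
frame = np.random.rand(128, 128) * 255
mask, n_craters = segment_craters(frame)
print(n_craters)
```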
Two unsupervised classifiers for optimal multithresholding are presented: fast Otsu and k-means. These non-parametric methods provide an efficient procedure for separating regions (classes) by selecting optimal levels, either on the gray levels of the image histogram (Otsu classifier) or on the gray levels of the image intensities (k-means classifier); these levels represent the threshold values of the classes. To compare the experimental results of the two classifiers, the computation time and the number of iterations needed for the k-means classifier to converge to the optimal class centers were recorded. The variation in the recorded computation time of the k-means classifier is discussed.
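A minimal sketch of the k-means side only, assuming one-dimensional clustering of gray values: class centers are refined until they stop moving, and the thresholds are taken as midpoints between adjacent centers. Initialization, tolerance, and k below are assumptions.

```python
import numpy as np

def kmeans_gray_thresholds(image, k=3, max_iter=100, tol=1e-3):
    """1-D k-means on gray levels: returns class centers, thresholds, iterations."""
    g = np.asarray(image, dtype=float).ravel()
    centers = np.linspace(g.min(), g.max(), k)          # simple initialization
    for it in range(1, max_iter + 1):
        labels = np.argmin(np.abs(g[:, None] - centers[None, :]), axis=1)
        new = np.array([g[labels == c].mean() if np.any(labels == c) else centers[c]
                        for c in range(k)])
        converged = np.max(np.abs(new - centers)) < tol
        centers = new
        if converged:
            break
    centers = np.sort(centers)
    thresholds = (centers[:-1] + centers[1:]) / 2       # midpoints between centers
    return centers, thresholds, it

# Hypothetical 8-bit image.
img = np.random.randint(0, 256, (100, 100))
centers, thr, iters = kmeans_gray_thresholds(img, k=3)
print(centers, thr, iters)
```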
Image compression is a serious issue in computer storage and transmission; it simply makes efficient use of the redundancy embedded within an image itself and, in addition, may exploit the limitations of human vision to discard imperceptible information. Polynomial coding is a modern image compression technique based on a modelling concept that effectively removes the spatial redundancy embedded within the image; it is composed of two parts, the mathematical model and the residual. In this paper, a two-stage technique is proposed: the first stage utilizes a lossy predictor model along with a multiresolution base and thresholding techniques, and the second stage incorporates near-lossless com…
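Polynomial coding of this kind typically fits a low-order polynomial of the pixel coordinates to each block and stores the coefficients plus a quantized residual. The sketch below shows that split for a single block; the block size, first-order model, and quantization step are assumptions, not the paper's parameters.

```python
import numpy as np

def encode_block(block, q_step=4):
    """Fit a first-order polynomial a0 + a1*x + a2*y to one block (sketch)."""
    h, w = block.shape
    y, x = np.mgrid[0:h, 0:w]
    A = np.column_stack([np.ones(h * w), x.ravel(), y.ravel()])
    coeffs, *_ = np.linalg.lstsq(A, block.ravel().astype(float), rcond=None)
    model = (A @ coeffs).reshape(h, w)
    residual_q = np.round((block - model) / q_step).astype(int)  # quantized residual
    return coeffs, residual_q

def decode_block(coeffs, residual_q, shape, q_step=4):
    h, w = shape
    y, x = np.mgrid[0:h, 0:w]
    A = np.column_stack([np.ones(h * w), x.ravel(), y.ravel()])
    return (A @ coeffs).reshape(h, w) + residual_q * q_step

# Hypothetical 8x8 block: reconstruction error is bounded by q_step / 2.
blk = np.random.randint(0, 256, (8, 8)).astype(float)
c, r = encode_block(blk)
print(np.max(np.abs(decode_block(c, r, blk.shape) - blk)))
```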
A newly proposed technique for secure agent communication is used to transfer data and instructions between an agent and a server in a local wireless network. The technique depends on two stages of encryption processing (the AES algorithm and a proposed Lagrange encryption-key generation supported by an XOR gate) for packet encryption. The AES key is manipulated using the proposed Lagrange-interpolation-generated key in order to avoid weak encryption keys. The result is a good multi-stage encryption operation with a fast encryption time and a high-quality connection.
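The abstract does not publish the key-generation details; as a hedged sketch of the idea, the following code evaluates a Lagrange interpolating polynomial through secret points to derive key bytes, XORs them with a base AES key, and encrypts a packet with AES-CTR from the cryptography package. The point values, modulus, and cipher mode are assumptions, not the authors' scheme.

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def lagrange_eval(points, x, mod=257):
    """Evaluate the Lagrange polynomial through `points` at x, modulo a prime."""
    total = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = (num * (x - xj)) % mod
                den = (den * (xi - xj)) % mod
        total = (total + yi * num * pow(den, -1, mod)) % mod
    return total % 256                       # fold into one byte

def derive_key(base_key, points):
    """XOR the base AES key with bytes sampled from the Lagrange polynomial."""
    lagrange_bytes = bytes(lagrange_eval(points, x) for x in range(len(base_key)))
    return bytes(a ^ b for a, b in zip(base_key, lagrange_bytes))

# Hypothetical shared secrets (illustrative only).
base_key = os.urandom(16)                        # 128-bit AES key
points = [(1, 123), (2, 45), (3, 201), (4, 7)]   # secret interpolation points
key = derive_key(base_key, points)

nonce = os.urandom(16)
enc = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
ciphertext = enc.update(b"agent packet payload") + enc.finalize()
print(ciphertext.hex())
```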
In this paper, the behavior of structural concrete linear bar members was studied using a numerical model implemented in a computer program written in MATLAB. The numerical model is based on a modified version of the procedure developed by Oukaili, using the real stress-strain diagrams of concrete and steel and their secant moduli of elasticity at different loading stages. The behavior, represented by the normal force-axial strain and bending moment-curvature relationships, is studied by calculating the secant sectional stiffness of the member. Being based on secant methods, this methodology can easily be implemented as an iterative procedure for solving the non-linear equations. A comparison between numerical and experimental data illustrated…
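The paper's program is written in MATLAB; as a hedged Python sketch of the secant-stiffness idea only, the following code iterates on the axial strain of a bar under a target normal force, updating the secant modulus from an assumed parabolic concrete stress-strain curve until equilibrium converges. The section properties and material law are illustrative assumptions, not the authors' model.

```python
def concrete_stress(eps, fc=30.0, eps0=0.002):
    """Parabolic (Hognestad-type) stress-strain curve in MPa (illustrative)."""
    if eps <= 0:
        return 0.0
    if eps >= eps0:
        return fc
    return fc * (2 * eps / eps0 - (eps / eps0) ** 2)

def axial_strain_secant(N_target, area=90000.0, tol=1e-6, max_iter=100):
    """Solve N_target = sigma(eps) * area using secant stiffness K = E_sec * area."""
    eps = 1e-4                                  # initial strain guess
    for _ in range(max_iter):
        sigma = concrete_stress(eps)
        E_sec = sigma / eps                     # secant modulus at this loading stage
        K = E_sec * area                        # secant axial stiffness
        eps_new = N_target / K                  # updated strain from equilibrium
        if abs(eps_new - eps) < tol:
            return eps_new
        eps = eps_new
    return eps

# Hypothetical axial load of 1.8e6 N on a 300 x 300 mm section (area in mm^2).
print(axial_strain_secant(N_target=1.8e6))
```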