Image fusion gathers important information from multiple input images and combines it into a single output image that is more meaningful and usable than any of the inputs alone. Image fusion improves both the quality and the applicability of data, and the required accuracy of the fused image depends on the application. It is widely used in smart robotics, audio-camera fusion, photonics, system control, construction and inspection of electronic circuits, computer and software diagnostics, and smart assembly-line robots. This paper provides a literature review of image fusion techniques in the spatial and frequency domains, such as averaging, min-max, block substitution, Intensity-Hue-Saturation (IHS), Principal Component Analysis (PCA), pyramid-based techniques, and transform-based methods. Quality metrics for the quantitative evaluation of these approaches are also discussed.
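As a concrete illustration of the simplest spatial-domain rules named above (averaging and min-max selection), the following is a minimal sketch assuming two pre-registered grayscale images of equal size held as NumPy arrays; the function and variable names are illustrative, not from the surveyed papers.

```python
import numpy as np

def fuse_average(img_a: np.ndarray, img_b: np.ndarray) -> np.ndarray:
    """Pixel-wise average of two registered source images."""
    mean = (img_a.astype(np.float64) + img_b.astype(np.float64)) / 2.0
    return mean.astype(img_a.dtype)

def fuse_max(img_a: np.ndarray, img_b: np.ndarray) -> np.ndarray:
    """Pixel-wise maximum: keeps the stronger response from either source."""
    return np.maximum(img_a, img_b)
```

Averaging suppresses noise at the cost of contrast, while max selection preserves the more salient pixel from either input; the transform-domain methods reviewed later trade off these behaviors at multiple scales.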
Image merging is one of the most important technologies in remote sensing applications and geographic information systems. In this study, camera images were fused in a simulation process by resizing images with three interpolation methods (nearest-neighbor, bilinear, and bicubic). Statistical techniques were used as efficient merging methods in the image-integration process, employing two models, Local Mean Matching (LMM) and Regression Variable Substitution (RVS), alongside spatial-frequency techniques including the high-pass filter additive method (HPFA). Statistical measures were then used to check the quality of the merged images; this was carried out by calculating the correlation a…
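Below is a sketch of the resizing step with the three named interpolation methods, followed by a simple high-pass-filter additive merge in the spirit of HPFA, using OpenCV and NumPy; the file names, kernel size, and the choice of the bicubic band are assumptions for illustration only.

```python
import cv2
import numpy as np

# Hypothetical inputs: a low-resolution band and a high-resolution image.
ms = cv2.imread("multispectral.png", cv2.IMREAD_GRAYSCALE)
pan = cv2.imread("panchromatic.png", cv2.IMREAD_GRAYSCALE)

h, w = pan.shape
resized = {
    "nearest":  cv2.resize(ms, (w, h), interpolation=cv2.INTER_NEAREST),
    "bilinear": cv2.resize(ms, (w, h), interpolation=cv2.INTER_LINEAR),
    "bicubic":  cv2.resize(ms, (w, h), interpolation=cv2.INTER_CUBIC),
}

# HPFA-style merge: add the pan image's high-frequency detail to the
# upsampled band.
low_pass = cv2.boxFilter(pan.astype(np.float64), -1, (5, 5))
high_pass = pan.astype(np.float64) - low_pass
fused = np.clip(resized["bicubic"] + high_pass, 0, 255).astype(np.uint8)
```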
In this paper, a new tunable approach is presented for fusing satellite images that fall in different electromagnetic wave ranges. It makes it possible to give the features of one image slight precedence over the other without reducing the overall quality of the fused result; the approach is based on the principal component analysis (PCA) fusion method. A comparison is made between the results of the proposed approach and two existing fusion methods (the PCA fusion method and the projection-of-eigenvectors-on-the-bands fusion method), and the comparison results show the validity of the new method.
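For orientation, here is a minimal sketch of the standard PCA fusion rule that the proposed approach builds on: the leading eigenvector of the 2×2 covariance of the two source bands supplies the fusion weights. The `alpha` bias is a hypothetical stand-in for the paper's tuning control, included only to show where such a knob would act.

```python
import numpy as np

def pca_weights(img_a: np.ndarray, img_b: np.ndarray):
    """Fusion weights from the leading eigenvector of the band covariance."""
    data = np.vstack([img_a.ravel(), img_b.ravel()]).astype(np.float64)
    _, eigvecs = np.linalg.eigh(np.cov(data))  # eigenvalues ascending
    v = np.abs(eigvecs[:, -1])                 # leading eigenvector
    return v / v.sum()

def pca_fuse(img_a: np.ndarray, img_b: np.ndarray, alpha: float = 0.0):
    """alpha > 0 nudges the result toward img_a (illustrative tuning knob)."""
    w_a, w_b = pca_weights(img_a, img_b)
    return (w_a + alpha) * img_a.astype(np.float64) + \
           (w_b - alpha) * img_b.astype(np.float64)
```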
Signature verification involves ambiguous situations in which a signature may resemble many reference samples or may differ because of natural handwriting variation. By representing the features and similarity score of signatures from the matching algorithm as fuzzy sets and capturing the degrees of membership, non-membership, and indeterminacy, a neutrosophic engine can contribute significantly to signature verification by addressing the inherent uncertainties and ambiguities present in signatures. However, type-1 neutrosophic logic assigns fixed values to these membership functions, which may not adequately capture the varying degrees of uncertainty in signature characteristics. Type-1 neutrosophic representation is also unable to adjust to various…
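The toy sketch below shows the type-1 representation being criticized: a similarity score from a hypothetical matcher is mapped to fixed degrees of truth (T), indeterminacy (I), and falsity (F). The mapping is an illustrative assumption, not the paper's formulation.

```python
from dataclasses import dataclass

@dataclass
class NeutrosophicScore:
    truth: float          # degree the signature is genuine
    indeterminacy: float  # degree the evidence is ambiguous
    falsity: float        # degree the signature is forged

def to_neutrosophic(similarity: float) -> NeutrosophicScore:
    """Map a matcher similarity in [0, 1] to a fixed (T, I, F) triple."""
    # This fixed mapping is exactly the type-1 limitation noted above:
    # it cannot adapt to varying levels of uncertainty.
    t = similarity
    f = 1.0 - similarity
    i = 1.0 - abs(similarity - 0.5) * 2.0  # peaks at the decision boundary
    return NeutrosophicScore(t, i, f)
```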
The basic solution for overcoming the difficulties posed by the huge size of digital images is to employ image compression techniques that reduce image size for efficient storage and fast transmission. In this paper, a new pixel-based scheme is proposed for grayscale image compression that implicitly combines a spatial-modelling technique based on minimum residuals with the Discrete Wavelet Transform (DWT), mixing lossless and lossy techniques to ensure high performance in terms of both compression ratio and quality. The proposed technique was applied to a set of standard test images, and the results obtained are significantly encouraging compared with the Joint Photographic Experts Group (JPEG) standard.
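As a sketch of the transform stage described above, the snippet below performs a single-level 2-D DWT with PyWavelets and coarsely quantizes the detail sub-bands (the lossy part), leaving the approximation band for lossless coding. The Haar wavelet and the quantization step are illustrative assumptions, not the paper's parameters.

```python
import numpy as np
import pywt

def dwt_stage(img: np.ndarray, q_step: float = 8.0):
    """Single-level 2-D Haar DWT with coarse quantization of detail bands."""
    cA, (cH, cV, cD) = pywt.dwt2(img.astype(np.float64), "haar")

    def quantize(band: np.ndarray) -> np.ndarray:
        # A larger q_step discards more detail: higher compression,
        # lower quality -- the lossy half of the hybrid scheme.
        return np.round(band / q_step).astype(np.int32)

    return cA, quantize(cH), quantize(cV), quantize(cD)
```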
Plagiarism is becoming more of a problem in academia, made worse by the ease with which a wide range of resources can be found on the internet and then copied and pasted. It is academic theft, since the perpetrator has "taken" the work of others and presented it as his or her own. Manual detection of plagiarism is difficult, imprecise, and time-consuming, because it is hard for anyone to compare a piece of work against all existing material. Plagiarism is a major problem in higher education, and it can occur in any subject. Plagiarism detection has been studied in many scientific articles, and recognition methods have been developed utilizing Plagiarism analysis, Authorship identification, and Near-duplicate detection (PAN)…
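As general background only (the abstract does not specify which matching method the surveyed systems use), one common detection primitive is word n-gram overlap between two documents, sketched here as a Jaccard score.

```python
def ngrams(text: str, n: int = 3) -> set:
    """Set of word n-grams from a document."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard_similarity(doc_a: str, doc_b: str, n: int = 3) -> float:
    """Overlap of n-gram sets; values near 1.0 suggest copied passages."""
    a, b = ngrams(doc_a, n), ngrams(doc_b, n)
    return len(a & b) / len(a | b) if (a | b) else 0.0
```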
The maximization of the net present value of investment in oil-field development is greatly aided by the optimization of well locations, which play a significant role in oil production. However, using optimization methods in well-placement development is exceedingly difficult, since the well-placement optimization problem involves a large number of decision variables, objective functions, and constraints. In addition, a wide variety of computational approaches, both traditional and unconventional, have been applied in order to maximize the efficiency of well-placement operations. This research demonstrates how optimization approaches used in well placement have progressed since the last time they were reviewed. Fol…
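To make the optimization setting concrete, here is a schematic sketch of the simplest search loop that the surveyed methods refine: candidate (x, y) well locations are scored by a black-box NPV evaluation and the best is kept. The `npv` function is a hypothetical placeholder for a reservoir simulation, not a method from the review.

```python
import random

def npv(x: int, y: int) -> float:
    """Hypothetical stand-in for a reservoir-simulator NPV evaluation."""
    return -((x - 30) ** 2 + (y - 45) ** 2)  # toy objective, one optimum

def random_search(grid: int = 100, iters: int = 500, seed: int = 0):
    """Pure random search over an assumed grid of candidate locations."""
    rng = random.Random(seed)
    best, best_val = None, float("-inf")
    for _ in range(iters):
        x, y = rng.randrange(grid), rng.randrange(grid)
        val = npv(x, y)
        if val > best_val:
            best, best_val = (x, y), val
    return best, best_val
```

Realistic formulations replace both pieces: the toy objective with a full simulation run, and the random proposals with the traditional and unconventional optimizers the review tracks.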
In this review, we highlight the most important research of the past seven years on mixed-ligand complexes of the drug trimethoprim (TMP), in which the drug has been used as a chelating ligand that lends stability to complexes with metal ions. We summarize how these complexes were prepared and characterized and, in some of the studies, how their antibacterial activity was evaluated against different types of bacteria.
Over the past few years, ear biometrics has attracted considerable attention. The ear is a trusted biometric for human identification and recognition due to its consistent shape and rich texture variation. It presents an attractive solution since it is visible, ear images are easily captured, and the ear's structure remains relatively stable over time. In this paper, a comprehensive review of prior research was conducted to establish the efficacy of using ear features for individual identification with both manually-crafted features and deep-learning approaches. The objective of this review is to present the accuracy rates of person-identification systems based on either manually-crafted features such as D…
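As one example of the manually-crafted-feature pipelines such reviews compare against deep learning, here is a sketch of Local Binary Pattern (LBP) histograms extracted from an ear image with scikit-image; LBP and its parameters are chosen here for illustration, not because the review names them.

```python
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_histogram(ear_img: np.ndarray, points: int = 8, radius: int = 1) -> np.ndarray:
    """Normalized LBP histogram, usable as an identification feature vector."""
    lbp = local_binary_pattern(ear_img, points, radius, method="uniform")
    n_bins = points + 2  # uniform patterns plus one non-uniform bin
    hist, _ = np.histogram(lbp, bins=n_bins, range=(0, n_bins), density=True)
    return hist
```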
Lentic ecosystems are important for fish production and provide critical habitat for waterfowl and numerous migratory birds. In this study we gathered data on the primary productivity of lakes across Iraq to provide updated information for conservation and management planning. The Tigris and Euphrates rivers are the primary sources feeding the major lakes in Iraq. The overall assessment shows that primary productivity depends on algal composition and environmental factors, with a coincident role played by macrophytes. Primary productivity ranging from 37 to 637 mg carbon/m³/day was calculated for most of the lakes, dominated by Bacillariophyceae and followed by…