Analysis Evolution of Image Caption Techniques: Combining Conventional and Modern Methods for Improvement

This study explores the challenges Artificial Intelligence (AI) systems face in generating image captions, a task that requires effective integration of computer vision and natural language processing techniques. It presents a comparative analysis between traditional approaches (such as retrieval-based methods and linguistic templates) and modern deep-learning approaches (such as encoder-decoder models, attention mechanisms, and transformers). Theoretical results show that modern models achieve better accuracy and can generate more complex descriptions, while traditional methods retain advantages in speed and simplicity. The paper proposes a hybrid framework that combines the strengths of both approaches: conventional methods produce an initial description, which is then contextually enriched and refined by modern models. Preliminary estimates indicate that this approach could reduce the initial computational cost by up to 20% compared with relying entirely on deep models, while maintaining high accuracy. The study recommends further research to develop effective coordination mechanisms between traditional and modern methods and to move to experimental validation of the hybrid model, in preparation for its application in environments that require a balance between speed and accuracy, such as real-time computer vision applications.
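The retrieve-then-refine idea in this abstract can be sketched in a few lines. This is a minimal illustration, not the paper's actual pipeline: a nearest-neighbour lookup over toy feature vectors stands in for the conventional retrieval stage, and a simple string-level rewrite stands in for the deep refinement model. All names, vectors, and captions here are hypothetical.

```python
import math

# Hypothetical "image features": 3-dim vectors standing in for CNN embeddings.
GALLERY = {
    "a dog running on grass": [0.9, 0.1, 0.2],
    "a boat on the sea":      [0.1, 0.8, 0.7],
    "a city street at night": [0.2, 0.3, 0.9],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve_caption(query_vec):
    """Stage 1 (conventional): nearest-neighbour retrieval of a draft caption."""
    return max(GALLERY, key=lambda cap: cosine(GALLERY[cap], query_vec))

def refine_caption(draft, context):
    """Stage 2 (stand-in for a deep model): contextual refinement of the draft."""
    return f"{draft}, {context}"

query = [0.85, 0.15, 0.25]        # a new image's feature vector
draft = retrieve_caption(query)    # cheap initial description
final = refine_caption(draft, "in bright daylight")
print(final)
```

The point of the split is that the cheap retrieval stage runs on every image, while the expensive refinement model only has to edit a draft rather than generate from scratch, which is where the claimed cost saving would come from.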

Scopus Crossref
Publication Date
Wed Feb 12 2025
Journal Name
Chemchemtech
A Review of Analytical Methods for the Analysis of Pharmaceuticals in Environmental Samples

One of the main causes for concern is the widespread presence of pharmaceuticals in the environment, which may be harmful to living organisms. They are often referred to as emerging chemical pollutants in water bodies because they are either still unregulated or undergoing regulation. Pharmaceutical pollution of the environment may have detrimental effects on ecosystem viability, human health, and water quality. This study reviews straightforward methods for determining residual pharmaceutical compounds in environmental waters. Pharmaceutical production and consumption have increased with medical advances, raising concerns about their environmental impact and potential harm to living organisms due to their increa…

Scopus Crossref
Publication Date
Wed Oct 17 2018
Journal Name
Journal Of Economics And Administrative Sciences
The Use Of Some Parametric And Non-parametric Methods For Analysis Of Factorial Experiments With Application

Summary

In this research, we examined factorial experiments and studied the significance of the main effects, the interaction of the factors, and their simple effects using the F test (ANOVA) to analyse the data of the factorial experiment. The analysis of variance requires several assumptions to hold; therefore, when one of these conditions is violated, the data are transformed in order to meet the conditions of the analysis of variance. It has been noted, however, that such transformations do not produce accurate results, so we resort to non-parametric tests or methods that serve as a solution or alternative to the parametric tests; these method…
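The parametric/non-parametric contrast discussed here can be illustrated with the two classic test statistics. This is a minimal sketch on made-up data: the one-way ANOVA F statistic and its rank-based alternative, the Kruskal-Wallis H statistic (assuming no tied observations, so ranks are unambiguous).

```python
def anova_f(groups):
    """One-way ANOVA F statistic (parametric): between-group over
    within-group mean square."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

def kruskal_h(groups):
    """Kruskal-Wallis H statistic (non-parametric), assuming no ties:
    computed from rank sums of the pooled sample."""
    pooled = sorted(x for g in groups for x in g)
    rank = {x: i + 1 for i, x in enumerate(pooled)}   # 1-based ranks
    n = len(pooled)
    h = sum(sum(rank[x] for x in g) ** 2 / len(g) for g in groups)
    return 12.0 / (n * (n + 1)) * h - 3 * (n + 1)

# Made-up measurements for three treatment levels of one factor.
a = [12.1, 13.4, 11.8, 12.9]
b = [14.2, 15.1, 14.8, 15.5]
c = [12.5, 13.0, 12.2, 13.3]
print(round(anova_f([a, b, c]), 2), round(kruskal_h([a, b, c]), 2))
```

Both statistics flag the same group difference, but H uses only ranks, which is why it remains valid when the normality and homogeneity assumptions behind the F test fail.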

Crossref
Publication Date
Sun Feb 25 2024
Journal Name
Baghdad Science Journal
Oil spill classification based on satellite image using deep learning techniques

An oil spill is a leakage from pipelines, vessels, oil rigs, or tankers that releases petroleum products into the marine environment or onto land, whether occurring naturally or through human action, resulting in severe damage and financial loss. Satellite imagery is one of the powerful tools currently used for capturing vital information from the Earth's surface, but the complexity and vast amount of the data make it challenging and time-consuming for humans to process. With the advancement of deep learning techniques, these processes are now automated to extract vital information from real-time satellite images. This paper applied three deep-learning algorithms for satellite image classification…

Scopus (5)
Crossref (3)
Publication Date
Wed Jan 01 2025
Journal Name
Aip Conference Proceedings
Comparative analysis of parameter estimation methods for Meixner process using wavelet packet transform

The current research presents an overall comparative analysis of the estimation of Meixner process parameters via the wavelet packet transform. Of particular relevance, it compares the moment method and the wavelet packet estimator for the four parameters of the Meixner process. The research focuses on finding the best threshold value using the square-root-log and modified square-root-log methods with wavelet packets in the presence of noise, to enhance the efficiency and effectiveness of denoising the financial asset market signal. In this regard, a simulation study compares the performance of moment estimation and wavelet packets for different sample sizes. The results show that wavelet p…
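The square-root-log thresholding idea can be illustrated with the standard universal (VisuShrink-style) threshold λ = σ√(2 ln n) combined with soft thresholding. This is a minimal sketch on made-up coefficients, with σ estimated from the median absolute deviation; it illustrates the general technique only, not the paper's modified estimator.

```python
import math
import statistics

def universal_threshold(coeffs):
    """Universal threshold lambda = sigma * sqrt(2 ln n), with sigma
    estimated robustly from the median absolute deviation (MAD)."""
    n = len(coeffs)
    med = statistics.median(coeffs)
    mad = statistics.median(abs(c - med) for c in coeffs)
    sigma = mad / 0.6745          # MAD-to-std factor for Gaussian noise
    return sigma * math.sqrt(2 * math.log(n))

def soft_threshold(coeffs, lam):
    """Shrink each coefficient toward zero by lam (soft thresholding)."""
    return [math.copysign(max(abs(c) - lam, 0.0), c) for c in coeffs]

# Made-up wavelet-packet coefficients: mostly noise plus two large spikes.
noisy = [0.1, -0.2, 5.0, 0.05, -4.2, 0.15, -0.1, 0.2]
lam = universal_threshold(noisy)
print(soft_threshold(noisy, lam))   # small coefficients vanish, spikes survive
```

The robust MAD estimate keeps the two signal spikes from inflating σ, so the threshold suppresses only the small noise-like coefficients.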

Scopus Crossref
Publication Date
Sun Jun 07 2015
Journal Name
Baghdad Science Journal
A comparative study between conventional methods and Vidas UP Salmonella (SPT) to investigate Salmonella species from local and imported meat

The study was performed to investigate Salmonella in meat and to compare Vidas UP Salmonella (SPT) with the traditional methods of Salmonella isolation. Forty-two meat samples (beef and chicken), local and imported, were collected from local markets in the city of Baghdad between December 2013 and February 2014. The samples were cultured on enrichment and differential media and examined with Vidas; isolates were confirmed by cultivation on chromogenic agar, biochemical tests, and the Api 20E system, in addition to serological tests, and the serotypes were determined at the Central Public Health Laboratory / National Institute of Salmonella. The results showed that contamination in imported meat was higher than in local meat, 11.9% and 2…

Crossref
Publication Date
Mon Jan 01 2024
Journal Name
Lecture Notes In Electrical Engineering
A Method Combining Compressive Sensing-Based Method of Moment and LU Decomposition for Solving Monostatic RCS

Scopus Clarivate Crossref
Publication Date
Sun Nov 19 2017
Journal Name
Journal Of Al-qadisiyah For Computer Science And Mathematics
Image Compression based on Fixed Predictor Multiresolution Thresholding of Linear Polynomial Nearlossless Techniques

Image compression is a serious issue in computer storage and transmission that makes efficient use of the redundancy embedded within an image itself; in addition, it may exploit human vision or perception limitations to discard imperceptible information. Polynomial coding is a modern image compression technique based on a modelling concept that effectively removes the spatial redundancy embedded within the image; it is composed of two parts, the mathematical model and the residual. In this paper, a two-stage technique is proposed: the first stage utilises a lossy predictor model along with multiresolution base and thresholding techniques; the second incorporates the near-lossless com…
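The predictor/residual decomposition with a near-lossless guarantee can be sketched as follows. This is an illustrative 1-D previous-pixel predictor with uniform residual quantisation (step 2Δ+1), not the paper's actual fixed-predictor multiresolution scheme; the property it demonstrates is that the per-pixel reconstruction error never exceeds Δ.

```python
def encode(pixels, delta):
    """Near-lossless predictive coding: predict each pixel from the previous
    reconstructed one, quantise the residual with step q = 2*delta + 1."""
    q = 2 * delta + 1
    codes, recon = [], []
    prev = 0
    for p in pixels:
        r = p - prev                                   # prediction residual
        c = (r + delta) // q if r >= 0 else -((-r + delta) // q)
        codes.append(c)
        prev = min(255, max(0, prev + c * q))          # decoder-side reconstruction
        recon.append(prev)
    return codes, recon

def decode(codes, delta):
    """Rebuild the pixel stream from the quantised residual codes."""
    q = 2 * delta + 1
    prev, out = 0, []
    for c in codes:
        prev = min(255, max(0, prev + c * q))
        out.append(prev)
    return out

pixels = [100, 102, 101, 130, 128, 127]
codes, recon = encode(pixels, 2)
print(codes, recon)   # small codes to entropy-code; error bounded by delta=2
```

Setting Δ = 0 degenerates to lossless coding, which is how near-lossless schemes expose a single quality knob.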

Crossref (1)
Publication Date
Mon Apr 03 2023
Journal Name
Journal Of Al-qadisiyah For Computer Science And Mathematics
A General Overview on the Categories of Image Features Extraction Techniques: A Survey

In the fields of image processing and computer vision, it is important to represent an image by its information. Image information comes from the image's features, which are extracted using feature detection/extraction techniques and feature description. In computer vision, features define informative data. The human eye readily extracts information from a raw image, but a computer cannot recognise image information directly; this is why various feature extraction techniques have been presented and have progressed rapidly. This paper presents a general overview of the categories of feature extraction for images.
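A classic example of the low-level feature extractors surveyed in such overviews is the Sobel gradient magnitude, a building block for edge and corner detectors. The following is a minimal pure-Python sketch on a toy image, for illustration only.

```python
def sobel_magnitude(img):
    """Approximate gradient magnitude with the 3x3 Sobel kernels.
    Border pixels are left at zero for simplicity."""
    h, w = len(img), len(img[0])
    kx = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal-gradient kernel
    ky = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical-gradient kernel
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(kx[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(ky[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

# Toy grayscale image with a vertical edge between columns 1 and 2.
img = [[0, 0, 255, 255]] * 4
edges = sobel_magnitude(img)
print(edges[1])   # strong response along the edge, zero elsewhere
```

The resulting response map is the kind of "informative data" the abstract refers to: flat regions give zero, and only structure (here, the edge) produces a feature response.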

Crossref
Publication Date
Sat Oct 03 2009
Journal Name
Proceeding Of 3rd Scientific Conference Of The College Of Science
New Multispectral Image Classification Methods Based on Scatterplot Technique

Publication Date
Tue Feb 02 2016
Journal Name
International Journal Of Computer Science And Mobile Computing
Increasing Security in Steganography by Combining LSB and PRGN

With the increasing rate of unauthorized access and attacks, the security of confidential data is of utmost importance. Cryptography only encrypts the data; since communication takes place in the presence of third parties, the encrypted text can still be intercepted and destroyed. Steganography, on the other hand, hides the confidential data in some cover source so that the very existence of the data is concealed, arousing no suspicion about the communication taking place between the two parties. This paper presents a method to transfer secret data embedded into a master file (cover image), obtaining a new image (stego-image) that is practically indistinguishable from the original image, so that other than the indeed us…
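The LSB-plus-pseudo-random-generator combination in the title can be sketched as follows. This is an illustrative scheme, not necessarily the paper's exact algorithm: a seeded pseudo-random number generator chooses which pixel LSBs carry the message bits, so the seed acts as a shared stego key, and without it an attacker cannot even locate the payload.

```python
import random

def embed(pixels, message, seed):
    """Hide the message's bits in the LSBs of pseudo-randomly chosen pixels.
    The seed is the shared secret (stego key)."""
    bits = [b >> i & 1 for b in message.encode() for i in range(7, -1, -1)]
    assert len(bits) <= len(pixels), "cover image too small for message"
    stego = list(pixels)
    rng = random.Random(seed)
    positions = rng.sample(range(len(pixels)), len(bits))
    for pos, bit in zip(positions, bits):
        stego[pos] = (stego[pos] & ~1) | bit   # overwrite only the LSB
    return stego

def extract(stego, n_chars, seed):
    """Re-derive the same pixel positions from the seed and read the LSBs."""
    rng = random.Random(seed)
    positions = rng.sample(range(len(stego)), n_chars * 8)
    bits = [stego[p] & 1 for p in positions]
    data = bytes(
        sum(bit << (7 - i) for i, bit in enumerate(bits[k:k + 8]))
        for k in range(0, len(bits), 8)
    )
    return data.decode()

cover = list(range(256))            # toy 8-bit cover "image"
stego = embed(cover, "hi", seed=42)
print(extract(stego, 2, seed=42))
```

Because only the LSB of each chosen pixel changes, every pixel differs from the cover by at most 1, which is what makes the stego-image visually indistinguishable from the original.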
