This study explores the challenges that Artificial Intelligence (AI) systems face in generating image captions, a task that requires effective integration of computer vision and natural language processing techniques. A comparative analysis is conducted between traditional approaches (such as retrieval-based methods and linguistic templates) and modern deep-learning approaches (such as encoder-decoder models, attention mechanisms, and transformers). Theoretical results show that modern models achieve better accuracy and can generate more complex descriptions, while traditional methods excel in speed and simplicity. The paper proposes a hybrid framework that combines the advantages of both approaches: conventional methods produce an initial description, which is then contextually refined using modern models. Preliminary estimates indicate that this approach could reduce the initial computational cost by up to 20% compared to relying entirely on deep models, while maintaining high accuracy. The study recommends further research to develop effective coordination mechanisms between traditional and modern methods and to move to the experimental validation phase of the hybrid model, in preparation for its application in environments that require a balance between speed and accuracy, such as real-time computer vision applications.
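The two-stage pipeline described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the feature vectors, captions, and the `refine_caption` stub are all hypothetical, and in the proposed framework the refinement stage would be a deep encoder-decoder or transformer model rather than a placeholder.

```python
import numpy as np

def retrieve_caption(query_feat, gallery_feats, gallery_captions):
    """Stage 1 (traditional): nearest-neighbour retrieval of a draft caption."""
    dists = np.linalg.norm(gallery_feats - query_feat, axis=1)
    return gallery_captions[int(np.argmin(dists))]

def refine_caption(draft, context):
    """Stage 2 placeholder for the deep refinement model (encoder-decoder /
    transformer in the proposed framework); here it only appends context."""
    return f"{draft}, {context}" if context else draft

def hybrid_caption(query_feat, gallery_feats, gallery_captions, context=""):
    draft = retrieve_caption(query_feat, gallery_feats, gallery_captions)
    return refine_caption(draft, context)

# Tiny demonstration gallery (hypothetical features and captions).
gallery_feats = np.array([[0.0, 0.0], [1.0, 1.0]])
gallery_captions = ["a cat on a sofa", "a dog in a park"]
caption = hybrid_caption(np.array([0.9, 1.1]), gallery_feats,
                         gallery_captions, "on a sunny day")
```

The cost saving claimed in the abstract would come from running the cheap retrieval stage on every image and invoking the deep refiner only when needed.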
Simple and sensitive kinetic methods were developed for the determination of Paracetamol in pure form and in pharmaceutical preparations. The methods are based on the direct oxidative-coupling reaction of Paracetamol with o-cresol in the presence of sodium periodate in alkaline medium to form an intense, blue, water-soluble dye that is stable at room temperature and was followed spectrophotometrically at λmax = 612 nm. The reaction was studied kinetically by the initial-rate and fixed-time (at 25 minutes) methods, and the reaction conditions were optimized. The calibration graphs for drug determination were linear in the concentration ranges 1-7 μg·ml⁻¹ for the initial-rate method and 1-10 μg·ml⁻¹ for the fixed-time method at 25 min.
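The fixed-time method amounts to fitting a straight calibration line of absorbance (read at 25 min) against concentration and inverting it for unknowns. The numbers below are stand-ins, not the paper's experimental readings; only the linear-calibration procedure itself is being illustrated.

```python
import numpy as np

# Hypothetical calibration data for the fixed-time (25 min) method:
# concentrations in ug/ml within the reported 1-10 ug/ml linear range,
# and simulated absorbance readings at 612 nm.
conc = np.array([1.0, 2.0, 4.0, 6.0, 8.0, 10.0])
absorbance = 0.05 + 0.08 * conc   # stand-in for measured values

# Least-squares calibration line A = slope * C + intercept.
slope, intercept = np.polyfit(conc, absorbance, 1)

def predict_concentration(a):
    """Invert the calibration line to estimate drug concentration."""
    return (a - intercept) / slope
```

A sample whose 25-minute absorbance falls inside the calibrated range is then converted to a concentration with `predict_concentration`.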
Explaining the events of an artwork, or describing it, establishes an understanding of a phenomenon whose effects are still evident in art. Research of this kind can be very useful in the analysis of new art, which is considered one of the most important turns in the history of art. If we look at the human body in art as a presence in art, from its beginnings to the modern age, we can understand the meaning of this presence and its directions, which span the whole world and lead us to theories and suggestions that help in understanding this direction and the mutual influences between our arts and external directions.
The deep learning algorithm has recently achieved a lot of success, especially in the field of computer vision. This research aims to describe a classification method applied to a dataset of multiple types of images: Synthetic Aperture Radar (SAR) images and non-SAR images. For this classification, transfer learning was used, followed by fine-tuning. In addition, architectures pre-trained on the well-known ImageNet database were used. The VGG16 model was used as a feature extractor, and a new classifier was trained on the extracted features. The input data consisted of five classes: one SAR image class (houses) and four non-SAR image classes (Cats, Dogs, Horses, and Humans). The Conv
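The transfer-learning recipe in this abstract (freeze a pre-trained backbone, extract features, train only a new classifier head) can be sketched without the heavy VGG16 weights. Below, a fixed random projection stands in for the frozen convolutional stack, and the labels are synthetic; this is a schematic of the training pattern, not the paper's model or data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a frozen pre-trained backbone (in the paper, VGG16 without its
# classification head): a fixed random projection followed by ReLU.
W_frozen = rng.normal(size=(64, 16)) / 8.0

def extract_features(x):
    """Feature extraction with the frozen backbone; weights are never updated."""
    return np.maximum(x @ W_frozen, 0.0)

# Toy inputs standing in for flattened SAR / non-SAR images.
X = rng.normal(size=(200, 64))
F = extract_features(X)

# Synthetic binary labels, linearly separable in feature space.
w_true = rng.normal(size=16)
y = (F @ w_true > 0).astype(float)

# Train only the new classifier head (logistic regression via gradient descent);
# the backbone stays frozen throughout, exactly as in feature-extraction transfer.
w, b = np.zeros(16), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(F @ w + b)))
    w -= 0.5 * (F.T @ (p - y)) / len(y)
    b -= 0.5 * np.mean(p - y)

train_acc = np.mean(((F @ w + b) > 0) == (y == 1))
```

Fine-tuning, the second step mentioned in the abstract, would additionally unfreeze some backbone layers and continue training at a small learning rate.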
Computer-aided diagnosis (CAD) has proved to be an effective and accurate method for diagnostic prediction over the years. This article focuses on the development of an automated CAD system intended to perform diagnosis as accurately as possible. Deep learning methods have produced impressive results on medical image datasets. This study employs deep learning methods in conjunction with meta-heuristic algorithms and supervised machine-learning algorithms to perform an accurate diagnosis. Pre-trained convolutional neural networks (CNNs) or an auto-encoder are used for feature extraction, whereas feature selection is performed using an ant colony optimization (ACO) algorithm. Ant colony optimization helps to search for the bes
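A compact ACO-style feature selector can illustrate the idea: ants sample feature subsets with probabilities driven by pheromone levels, subsets are scored by how well a simple model fits, and pheromone is reinforced on the best subset. This is a minimal sketch on toy data with an R²-based fitness, not the paper's CAD pipeline or its exact ACO variant.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: only features 0 and 3 actually drive the response.
X = rng.normal(size=(120, 10))
y = X[:, 0] + X[:, 3]

def fitness(subset):
    """R^2 of an OLS fit on the chosen columns, lightly penalised by size."""
    if not subset:
        return -np.inf
    A = X[:, sorted(subset)]
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    r2 = 1.0 - (resid @ resid) / (y @ y)
    return r2 - 0.01 * len(subset)

pheromone = np.ones(X.shape[1])
best_subset, best_fit = set(), -np.inf

for _ in range(20):                               # ACO iterations
    p_sel = np.minimum(1.0, 5.0 * pheromone / pheromone.sum())
    for _ant in range(8):                         # ants per iteration
        subset = {j for j in range(X.shape[1]) if rng.random() < p_sel[j]}
        f = fitness(subset)
        if f > best_fit:
            best_subset, best_fit = subset, f
    pheromone *= 0.9                              # evaporation
    for j in best_subset:                         # deposit on best-so-far subset
        pheromone[j] += 1.0
```

On this toy problem the pheromone trail concentrates on the informative columns, which is the behaviour the abstract relies on for selecting diagnostic features.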
Communicative-based textbooks are developed and disseminated throughout the country. However, it is difficult for teachers who have themselves learnt English through traditional approaches to suddenly become familiar with CLT (Communicative Language Teaching) principles and teach communicatively. Therefore, many teachers remain somewhat confused about what exactly CLT is, while others are familiar with CLT but unable to achieve communicative classroom teaching. Consequently, those teachers need to be introduced to the CLT principles, and they need training in how to put those principles into practice. Accordingly, this study aims to find out the effect of combining video lectures and Kolb experiential learning on EFL student-t
In this research, kernel estimation methods (nonparametric density estimators) were relied upon in estimating the two-response logistic regression. A comparison was made between the Nadaraya-Watson method and the local scoring algorithm, and the optimal smoothing parameter (bandwidth) λ was estimated by the cross-validation and generalized cross-validation methods. The optimal bandwidth λ has a clear effect on the estimation process and plays a key role in smoothing the curve so that it approaches the real curve. The goal of using the kernel estimator is to modify the observations so that we can obtain estimators with characteristics close to the properties of the real parameters. Based on medical data for patients with chro
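The two ingredients named above, the Nadaraya-Watson estimator and bandwidth selection by cross-validation, can be sketched directly. The data here are simulated, not the medical data of the study, and a Gaussian kernel with a simple leave-one-out grid search stands in for the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0.0, 1.0, 80))
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.2, 80)

def nw_estimate(x0, xs, ys, h):
    """Nadaraya-Watson estimator with a Gaussian kernel and bandwidth h."""
    w = np.exp(-0.5 * ((x0 - xs) / h) ** 2)
    return (w @ ys) / w.sum()

def loo_cv_score(h):
    """Leave-one-out cross-validation error for bandwidth h."""
    errs = []
    for i in range(len(x)):
        mask = np.arange(len(x)) != i
        errs.append((y[i] - nw_estimate(x[i], x[mask], y[mask], h)) ** 2)
    return float(np.mean(errs))

# Choose the bandwidth minimising the cross-validation error over a grid.
bandwidths = np.linspace(0.02, 0.5, 25)
h_opt = bandwidths[int(np.argmin([loo_cv_score(h) for h in bandwidths]))]
```

As the abstract notes, the choice of λ (here `h`) controls the trade-off between an under-smoothed, noisy curve and an over-smoothed one that misses the true shape.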
In this article, we present a quasi-contraction mapping approach for the D iteration, and we prove that this iteration has the same convergence rate as the modified SP iteration. On the other hand, we prove that the D iteration approach for quasi-contraction maps is faster than certain current leading iteration methods, such as Mann and Ishikawa. We also give a numerical example.
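The kind of numerical comparison the abstract mentions can be sketched on a simple contraction. The exact D iteration scheme follows the paper's definition and is not reproduced here; the sketch below only sets up the two baselines it is compared against, the classical Mann and Ishikawa iterations, counting steps to a tolerance.

```python
import math

T = math.cos  # a contraction on [0, 1]; fixed point x* ~ 0.739

def picard(x=0.5, n=100):
    """Plain Picard iteration x_{k+1} = T(x_k), used to locate the fixed point."""
    for _ in range(n):
        x = T(x)
    return x

x_star = picard()

def steps_to_converge(step, x0=0.0, tol=1e-8, max_iter=10_000):
    """Count iterations of a one-step scheme until |x - x*| < tol."""
    x, k = x0, 0
    while abs(x - x_star) > tol and k < max_iter:
        x = step(x)
        k += 1
    return k

alpha = beta = 0.5
mann_step = lambda x: (1 - alpha) * x + alpha * T(x)
ishikawa_step = lambda x: (1 - alpha) * x + alpha * T((1 - beta) * x + beta * T(x))

n_mann = steps_to_converge(mann_step)
n_ishikawa = steps_to_converge(ishikawa_step)
```

Plugging the D iteration's update rule into `steps_to_converge` in the same way would reproduce the style of comparison reported in the article's numerical example.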
In this paper, the process of compounding two distributions is discussed using a new compounding procedure that connects a number of lifetime distributions (continuous distributions), where the number of these distributions is a random variable distributed according to one of the discrete distributions. Based on this procedure, the zero-truncated Poisson distribution was compounded with the Weibull distribution to produce a new three-parameter lifetime distribution. An advantage is that the failure-rate function covers many cases (increasing, decreasing, unimodal, bathtub). The properties of the resulting distribution were studied, such as: expectation, variance, cumulative function, reliability function, and fa
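The compound distribution can be simulated directly. One common construction of such compound lifetime models, assumed here since the abstract does not spell out its compounding rule, takes the observed lifetime as the minimum of N i.i.d. Weibull component lifetimes with N drawn from a zero-truncated Poisson; the parameter values below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)

def zero_truncated_poisson(lam, size, rng):
    """Sample N >= 1 from a zero-truncated Poisson via simple rejection."""
    out = np.zeros(size, dtype=int)
    for i in range(size):
        n = 0
        while n == 0:
            n = rng.poisson(lam)
        out[i] = n
    return out

def compound_lifetime(lam, shape, scale, size, rng):
    """Lifetime = min of N i.i.d. Weibull(shape, scale) components,
    with N ~ zero-truncated Poisson(lam)."""
    n = zero_truncated_poisson(lam, size, rng)
    return np.array([scale * rng.weibull(shape, k).min() for k in n])

samples = compound_lifetime(lam=2.0, shape=1.5, scale=1.0, size=5000, rng=rng)
```

Because the minimum over several components is stochastically smaller than a single Weibull lifetime, the simulated mean falls below the single-component mean, which is one way the compounding reshapes the failure-rate behaviour.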
The multiple linear regression model is an important regression model that has attracted many researchers in different fields, including applied mathematics, business, medicine, and the social sciences. Linear regression models involving a large number of independent variables perform poorly due to large variation and lead to inaccurate conclusions. One of the most important problems in regression analysis is multicollinearity, which is well known to many researchers, as are its effects on the multiple linear regression model. In addition to multicollinearity, the presence of outliers in the data is one of the difficulties in constructing the reg
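A standard diagnostic for the multicollinearity problem described above is the variance inflation factor (VIF): regress each predictor on the others and compute 1/(1-R²). The sketch below builds a toy design matrix in which one column nearly duplicates another; the data and threshold are illustrative, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy design matrix: column 2 is almost a copy of column 0,
# creating severe multicollinearity between them.
n = 100
x0 = rng.normal(size=n)
x1 = rng.normal(size=n)
x2 = x0 + rng.normal(0.0, 0.05, n)
X = np.column_stack([x0, x1, x2])

def vif(X, j):
    """Variance inflation factor of column j: 1 / (1 - R^2) from
    regressing column j on the remaining columns (with intercept)."""
    others = np.delete(X, j, axis=1)
    A = np.column_stack([np.ones(len(X)), others])
    coef, *_ = np.linalg.lstsq(A, X[:, j], rcond=None)
    resid = X[:, j] - A @ coef
    r2 = 1.0 - (resid @ resid) / np.sum((X[:, j] - X[:, j].mean()) ** 2)
    return 1.0 / (1.0 - r2)
```

A common rule of thumb flags VIF values above 10 as evidence of harmful collinearity; in this toy data the two near-duplicate columns far exceed that, while the independent column does not.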
This research investigates the production methods of investigative Arabic television programs, which have managed to establish themselves within a short period as a form of television program on Arab satellite channels that are growing in number and varied in content. The research aims to present qualitative and quantitative descriptions of the methods used in tackling the topics discussed in the program, and to determine whether they satisfy the scientific conditions and foundations of research, investigation, analysis, and interpretation. The researcher uses the survey method and the tool of content analysis, including a set of methodological steps that seek to discover the implied meaning of the research sample represented by the program