Image captioning, the task of automatically generating text that describes an image's visual content, has become feasible with developments in object recognition and image classification. Deep learning has received much interest from the scientific community and can be very useful in real-world applications. The proposed image captioning approach combines pre-trained Convolutional Neural Network (CNN) models with a Long Short-Term Memory (LSTM) network to generate image captions. The process includes two stages. The first stage trains the CNN-LSTM models using baseline hyper-parameters, and the second stage trains the CNN-LSTM models after optimizing and adjusting the hyper-parameters of the first stage. Improvements include the use of a new activation function, regular parameter tuning, and an improved learning rate in the later stages of training. The experimental results on the Flickr8k dataset showed a noticeable and satisfactory improvement in the second stage, with a clear increase in the evaluation metrics BLEU-1 to BLEU-4, METEOR, and ROUGE-L. This increase confirmed the effectiveness of the alterations and highlighted the importance of hyper-parameter tuning in improving the performance of CNN-LSTM models on image captioning tasks.
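As a rough illustration of the kind of CNN-LSTM decoder described above, the sketch below builds a standard "merge" captioning model in Keras. The 2048-dimensional image features, vocabulary size, and caption length are illustrative assumptions, not the paper's exact configuration.

# A minimal CNN-feature + LSTM caption decoder (merge architecture), assuming
# pre-extracted 2048-d image features from a pre-trained CNN and illustrative
# vocabulary size / caption length (assumptions, not the paper's settings).
from tensorflow.keras.layers import Input, Dense, Dropout, Embedding, LSTM, add
from tensorflow.keras.models import Model

vocab_size = 8000   # assumed vocabulary size (Flickr8k-scale)
max_len = 34        # assumed maximum caption length in tokens

# Image branch: project pre-extracted CNN features into the decoder space.
img_in = Input(shape=(2048,))
img_feat = Dropout(0.5)(img_in)
img_feat = Dense(256, activation='relu')(img_feat)

# Text branch: embed the partial caption and encode it with an LSTM.
txt_in = Input(shape=(max_len,))
txt_feat = Embedding(vocab_size, 256, mask_zero=True)(txt_in)
txt_feat = Dropout(0.5)(txt_feat)
txt_feat = LSTM(256)(txt_feat)

# Merge both branches and predict the next word of the caption.
decoder = add([img_feat, txt_feat])
decoder = Dense(256, activation='relu')(decoder)
out = Dense(vocab_size, activation='softmax')(decoder)

model = Model(inputs=[img_in, txt_in], outputs=out)
model.compile(loss='categorical_crossentropy', optimizer='adam')

In this arrangement, hyper-parameters such as the dropout rate, embedding and LSTM sizes, activation functions, and the optimizer's learning-rate schedule are exactly the knobs the two-stage tuning described above would adjust.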
The research material was prune plums (
The 3-parameter Weibull distribution is used as a failure model, since this distribution is appropriate when the failure rate is relatively high at the start of operation and decreases with increasing time.
On the practical side, a comparison was made between shrinkage and maximum likelihood estimators of the parameters and of the reliability function using simulation. We conclude that the shrinkage estimators of the parameters are better than the maximum likelihood estimators, while the maximum likelihood estimator of the reliability function is the better one, according to the statistical measures MAPE and MSE and for different sample sizes.
Note: ns = small sample size; nm = medium sample size.
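For reference, a standard parameterization of the 3-parameter Weibull model, with shape \beta, scale \alpha, and location (threshold) \gamma (notation assumed here, since the abstract does not fix it), gives the reliability and hazard functions

R(t) = \exp\!\left[-\left(\frac{t-\gamma}{\alpha}\right)^{\beta}\right], \qquad
h(t) = \frac{\beta}{\alpha}\left(\frac{t-\gamma}{\alpha}\right)^{\beta-1}, \qquad t \ge \gamma,

so the failure rate h(t) is decreasing in t whenever \beta < 1, which matches the early high-failure, later-decreasing behaviour described above.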
Bivariate time series modeling and forecasting have become a promising field of applied studies in recent times. For this purpose, the linear Autoregressive Moving Average with exogenous variables (ARMAX) model has been the most widely used technique over the past few years for modeling and forecasting this type of data. The most important assumptions of this model are linearity and homogeneity (constant variance) of the random errors of the fitted model. In practice, these two assumptions are often violated, so the Generalized Autoregressive Conditional Heteroscedasticity (ARCH and GARCH) models with exogenous variables ...
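A common way to realize this kind of ARMAX-plus-GARCH modeling in practice is a two-step fit: estimate the ARMAX mean equation, then model the conditional variance of its residuals. The sketch below assumes the statsmodels and arch packages and illustrative inputs y (endogenous series) and x (exogenous regressors); it is a generic illustration, not the paper's estimation procedure.

# Two-step sketch: ARMAX mean model, then GARCH(1,1) on its residuals.
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from arch import arch_model

def fit_armax_garch(y: pd.Series, x: pd.DataFrame, p: int = 1, q: int = 1):
    # ARMAX(p, q): ARIMA with d = 0 and exogenous regressors.
    mean_model = ARIMA(y, exog=x, order=(p, 0, q)).fit()
    resid = mean_model.resid

    # GARCH(1,1) on the ARMAX residuals to capture conditional heteroscedasticity.
    vol_model = arch_model(resid, mean='Zero', vol='GARCH', p=1, q=1).fit(disp='off')
    return mean_model, vol_model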
Many people take protein supplements in an effort to gain muscle. However, there is some controversy as to whether this is really effective, and there is evidence suggesting that consuming high levels of protein may in fact have negative side effects for health. The current study included 29 young Iraqi bodybuilders in two groups (taking and not taking protein supplements; age range 17-31 years); the cases were selected from family, friends, college students, and gyms, from November 2014 to March 2015. A careful history was obtained from each volunteer, including age, duration of sports activity, type of supplements, and family history of diseases. Some biochemical parameters such as glucose, urea, uric acid, creatinine, bilirubin, and serum protein ...
With the growth of mobile phones, the Short Message Service (SMS) became an essential text communication service. However, the low cost and ease of use of SMS have led to an increase in SMS spam. In this paper, the characteristics of SMS spam are studied and a set of features is introduced to filter out SMS spam. In addition, the problem of SMS spam detection is addressed as a clustering analysis that requires a metaheuristic algorithm to find the clustering structures. Three differential evolution variants, namely DE/rand/1, jDE/rand/1, and jDE/best/1, are adopted for solving the SMS spam problem. Experimental results illustrate that jDE/best/1 produces the best results over the other variants in terms of accuracy, false-positive rate, and false-negative rate ...
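To make the DE/rand/1 variant concrete, the sketch below performs one generation of mutation, binomial crossover, and greedy selection, where each candidate vector would encode the cluster centroids (flattened) in the SMS feature space and the fitness would be, for example, the total distance of messages to their nearest centroid. F, CR, and the fitness are illustrative assumptions; the jDE self-adaptation of those controls is not reproduced.

# Generic DE/rand/1 step with binomial crossover and greedy (minimizing) selection.
import numpy as np

def de_rand_1_step(pop, fitness, F=0.5, CR=0.9, rng=np.random.default_rng()):
    n, d = pop.shape
    new_pop = pop.copy()
    for i in range(n):
        # Pick three distinct individuals different from the target i.
        r1, r2, r3 = rng.choice([j for j in range(n) if j != i], size=3, replace=False)
        mutant = pop[r1] + F * (pop[r2] - pop[r3])      # DE/rand/1 mutation
        cross = rng.random(d) < CR                      # binomial crossover mask
        cross[rng.integers(d)] = True                   # ensure at least one gene crosses
        trial = np.where(cross, mutant, pop[i])
        if fitness(trial) < fitness(pop[i]):            # keep the trial if it is better
            new_pop[i] = trial
    return new_pop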
The need for steganography methods that hide secret messages inside images has been rising. Thereby, this study aims to develop a practical steganography procedure for hiding text in an image. This operation allows the user to provide the system with both the text and a cover image, and to obtain a resulting image that contains the hidden text. The suggested technique hides the text inside the header formats of a digital image. The Least Significant Bit (LSB) method is used to hide the message or text in order to preserve the features and characteristics of the original image. A new method is applied that uses the whole image (header formats) for hiding. From the experimental results, the suggested technique gives a higher embedding ...
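For orientation, the sketch below shows classic LSB embedding in pixel values; it is the generic variant of the technique, not the header-based hiding described in the abstract. It assumes an 8-bit grayscale or RGB NumPy array as the cover and a UTF-8 text message.

# Embed the message bits into the least significant bits of the cover samples.
import numpy as np

def lsb_embed(cover: np.ndarray, message: str) -> np.ndarray:
    bits = np.unpackbits(np.frombuffer(message.encode('utf-8'), dtype=np.uint8))
    flat = cover.flatten()
    if bits.size > flat.size:
        raise ValueError("message too long for this cover image")
    stego = flat.copy()
    # Clear the LSB of the first len(bits) samples, then set it to the message bit.
    stego[:bits.size] = (stego[:bits.size] & 0xFE) | bits
    return stego.reshape(cover.shape)

In a complete scheme, a length prefix or terminator would also be embedded so that the extractor knows where the hidden text ends.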
The end of the twentieth century witnessed, through technological evolution, a convergence between the aesthetic value of the visual arts and the objective representation of the image in fabric design, opening new and unconventional possibilities for atypical employment of imagery. By examining modern fabric designs that employ film imagery, including several scenes taken from a film, the research focuses on an analytical study that demonstrates the elements of the film image, the organization of its rules, and how it functions in fabric design. Thus, the research problem was identified through the following question: what are the elements of the film image, and how does its structure function ...
The university course timetabling problem (UCTP) is typically a combinatorial optimization problem. Manually producing a usable timetable requires many days of effort, and the results are still unsatisfactory. Various state-of-the-art methods (heuristic and meta-heuristic) are used to solve UCTP satisfactorily; however, these approaches typically produce instance-specific solutions. The hyper-heuristic framework adequately addresses this complex problem. This research proposes a Particle Swarm Optimizer-based Hyper-Heuristic (HH PSO) to solve UCTP efficiently. PSO is used as the higher-level method that selects a low-level heuristic (LLH) sequence, which in turn generates an optimal solution. The proposed ...
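The core hyper-heuristic idea can be sketched as follows: each particle's continuous position is decoded into a sequence of LLH indices, and the sequence is scored by applying those heuristics to a timetable. The decoding rule, LLH pool, and cost function below are illustrative assumptions, not the paper's exact design.

# Decode a particle position into an LLH sequence and evaluate it on a timetable.
import numpy as np

def decode_to_llh_sequence(position: np.ndarray, n_llh: int) -> list:
    # Map each continuous dimension in [0, 1) to a low-level heuristic index.
    return [int(p * n_llh) % n_llh for p in position]

def evaluate(position, llh_pool, initial_timetable, cost):
    timetable = initial_timetable
    for idx in decode_to_llh_sequence(position, len(llh_pool)):
        timetable = llh_pool[idx](timetable)   # apply the selected LLH (e.g. move, swap)
    return cost(timetable)                     # e.g. weighted hard/soft constraint violations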
This paper presents a combination of enhancement techniques for fingerprint images affected by different types of noise. These techniques were applied to improve image quality and achieve acceptable image contrast. The proposed method included five enhancement techniques: normalization, histogram equalization, binarization, skeletonization, and fusion. The normalization process standardized the pixel intensities, which facilitated the subsequent image enhancement stages. Subsequently, the histogram equalization technique increased the contrast of the images. Furthermore, the binarization and skeletonization techniques were implemented to differentiate between the ridge and valley structures and to obtain one ...
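A compact way to see how these stages chain together is the pipeline sketch below, using OpenCV and scikit-image. These are generic implementations of the named steps under assumed parameter choices; the fusion stage is omitted because the abstract does not detail it.

# Normalization -> histogram equalization -> Otsu binarization -> skeletonization.
import cv2
import numpy as np
from skimage.morphology import skeletonize

def enhance_fingerprint(gray: np.ndarray) -> np.ndarray:
    # Normalization: rescale pixel intensities to a standard 0-255 range.
    norm = cv2.normalize(gray, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    # Histogram equalization: spread intensities to raise contrast.
    eq = cv2.equalizeHist(norm)
    # Binarization: Otsu threshold separates ridges (dark) from valleys.
    _, binary = cv2.threshold(eq, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    # Skeletonization: thin the ridges to one-pixel-wide lines.
    skeleton = skeletonize(binary > 0)
    return skeleton.astype(np.uint8) * 255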
Optimizing Access Point (AP) deployment is of great importance in wireless applications, owing to the requirement to provide efficient and cost-effective communication. Highly targeted by many researchers and by industry, Quality of Service (QoS) is an important primary parameter and objective, along with AP placement and overall deployment cost. This study proposes and investigates a multi-level optimization algorithm based on Binary Particle Swarm Optimization (BPSO). It aims at an optimal multi-floor AP placement with effective coverage, making it more capable of supporting QoS and cost effectiveness. Five pairs of (coverage, AP placement) weights, signal threshold ...
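For context, the standard BPSO update that such an algorithm builds on is sketched below: velocities are updated as in continuous PSO and passed through a sigmoid transfer function to give the probability that each bit is 1, where a bit would indicate whether a candidate AP location is used. The encoding, coefficients, and fitness are illustrative assumptions, not the study's multi-level design.

# One BPSO iteration step with a sigmoid transfer function.
import numpy as np

def bpso_step(pos, vel, pbest, gbest, w=0.7, c1=1.5, c2=1.5, rng=np.random.default_rng()):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    prob = 1.0 / (1.0 + np.exp(-vel))                 # sigmoid: velocity -> P(bit = 1)
    pos = (rng.random(pos.shape) < prob).astype(int)  # resample each bit from that probability
    return pos, vel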