The need for steganography methods that hide secret messages inside images has been rising. This study therefore develops a practical steganography procedure for hiding text in an image. The operation allows the user to supply the system with both the text and a cover image, and produces a resulting image that contains the hidden text. The suggested technique hides the text inside the header formats of a digital image. The Least Significant Bit (LSB) method is used to hide the message or text so that the features and characteristics of the original image are preserved. A new method is applied that uses the whole image (header formats) to hide the text. The experimental results show that the suggested technique gives a higher embedding capacity across several stages of complexity, and that applying the LSB method over the whole image increases the security and robustness of the proposed method compared with state-of-the-art methods.
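As an illustration of the LSB embedding step described above, the following sketch hides a byte string in the least significant bits of an 8-bit image array; function and variable names are illustrative rather than taken from the proposed system, and the header-format aspect of the technique is not reproduced here.

```python
import numpy as np

def embed_lsb(cover: np.ndarray, message: bytes) -> np.ndarray:
    """Hide `message` in the least significant bits of a flattened 8-bit image."""
    bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
    flat = cover.flatten()
    if bits.size > flat.size:
        raise ValueError("message too large for cover image")
    stego = flat.copy()
    stego[:bits.size] = (stego[:bits.size] & 0xFE) | bits  # overwrite only the LSB
    return stego.reshape(cover.shape)

def extract_lsb(stego: np.ndarray, length: int) -> bytes:
    """Recover `length` bytes from the least significant bits."""
    bits = stego.flatten()[:length * 8] & 1
    return np.packbits(bits).tobytes()
```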
Data hiding is the process of encoding extra information in an image by making small modifications to its pixels. To be practical, the hidden data must be perceptually invisible yet robust to common signal processing operations. This paper introduces a scheme for hiding a signature image that can be as much as 25% of the host image data and hence can be used both for digital watermarking and for image/data hiding. The proposed algorithm uses an orthogonal discrete wavelet transform with two zero moments and improved time localization, called the discrete slantlet transform, for both the host and signature images. A scaling factor in the frequency domain controls the quality of the watermarked images. Experimental results of signature image
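A minimal sketch of the frequency-domain embedding idea is shown below. It uses a standard discrete wavelet transform from PyWavelets as a stand-in for the discrete slantlet transform, assumes host and signature images of equal size, and uses `alpha` as an assumed name for the scaling factor.

```python
import numpy as np
import pywt

def embed_signature(host: np.ndarray, signature: np.ndarray, alpha: float = 0.05) -> np.ndarray:
    """Add a scaled signature to the wavelet coefficients of the host image."""
    cA_h, (cH_h, cV_h, cD_h) = pywt.dwt2(host, "haar")
    cA_s, (cH_s, cV_s, cD_s) = pywt.dwt2(signature, "haar")
    # Scale the signature coefficients by alpha before adding them to the host's;
    # a smaller alpha preserves host quality, a larger one strengthens the signature.
    merged = (cA_h + alpha * cA_s,
              (cH_h + alpha * cH_s, cV_h + alpha * cV_s, cD_h + alpha * cD_s))
    return pywt.idwt2(merged, "haar")
```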
X-ray diffractometers deliver the best quality diffraction data while being easy to use and adaptable to various applications. When X-ray photons strike electrons in a material, the incident photons are scattered in a direction different from that of the incident beam. If the scattered beams do not change in wavelength, this is known as elastic scattering, which yields diffraction of appreciable amplitude and intensity through constructive interference. When the incident beam transfers some of its energy to the electrons, the scattered beam's wavelength differs from that of the incident beam, causing inelastic scattering, which leads to destructive interference and zero-intensity diffraction. In this study, the modified size-strain plot method was used to examine
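For reference, the Bragg condition for constructive interference and the conventional size-strain plot relation are given below (the study applies a modified form of the latter, not reproduced here); here D is the crystallite size, ε the microstrain, β_hkl the peak breadth, d_hkl the lattice spacing, and K the shape factor.

```latex
n\lambda = 2d\sin\theta,
\qquad
\left(d_{hkl}\,\beta_{hkl}\cos\theta\right)^{2}
  = \frac{K\lambda}{D}\left(d_{hkl}^{2}\,\beta_{hkl}\cos\theta\right)
  + \left(\frac{\varepsilon}{2}\right)^{2}
```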
The aim of this paper is to investigate numerically the simulation of ice melting in one and two dimensions using the cell-centered finite volume method. The mathematical model is based on the heat conduction equation associated with a fixed-grid, latent heat source approach. A fully implicit time scheme is selected for the time discretization. The ice conductivity is chosen to be the value of the approximated conductivity at the interface between adjacent ice and water control volumes. The predicted temperature distribution, percentage melt fraction, interface location, and interface velocity are compared with those obtained from the exact analytical solution. Good agreement is obtained when comparing the numerical results of one
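A minimal sketch of one fully implicit, cell-centered finite-volume conduction step in one dimension is shown below; it uses the harmonic mean to approximate the conductivity at the face between adjacent ice and water control volumes, and omits the latent heat source update for brevity. Names and boundary treatment are illustrative, not the paper's code.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import spsolve

def implicit_conduction_step(T, k, rho_c, dx, dt, T_left, T_right):
    """One fully implicit time step of 1-D heat conduction on a cell-centered grid.

    T      : cell-centre temperatures at the old time level
    k      : cell-centre conductivities (ice or water, per cell)
    rho_c  : volumetric heat capacity (assumed constant here)
    """
    n = T.size
    # Harmonic mean approximates the conductivity at each interior face,
    # i.e. the interface between adjacent ice and water control volumes.
    k_face = 2.0 * k[:-1] * k[1:] / (k[:-1] + k[1:])
    aW = np.zeros(n); aE = np.zeros(n)
    aW[1:] = k_face / dx
    aE[:-1] = k_face / dx
    aP = aW + aE + rho_c * dx / dt
    b = rho_c * dx / dt * T
    # Dirichlet boundary temperatures applied through half-cell face coefficients.
    aP[0] += 2.0 * k[0] / dx;  b[0] += 2.0 * k[0] / dx * T_left
    aP[-1] += 2.0 * k[-1] / dx; b[-1] += 2.0 * k[-1] / dx * T_right
    A = diags([-aW[1:], aP, -aE[:-1]], offsets=[-1, 0, 1], format="csc")
    return spsolve(A, b)
```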
The primary objective of this paper is to improve a biometric authentication and classification model using the ear as a distinct part of the face, since it does not change with time and is unaffected by facial expressions. The proposed model is a new scenario for enhancing ear recognition accuracy by modifying the AdaBoost algorithm to optimize adaptive learning. To overcome the limitations of image illumination, occlusion, and image registration problems, the Scale-Invariant Feature Transform (SIFT) technique was used to extract features. Several consecutive phases were used to improve classification accuracy: image acquisition, preprocessing, filtering, smoothing, and feature extraction. To assess the proposed
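As a rough sketch of the feature-extraction and classification stages, the snippet below pools SIFT descriptors into fixed-length vectors with OpenCV and trains a standard scikit-learn AdaBoost classifier as a stand-in for the paper's modified AdaBoost; paths and parameters are illustrative.

```python
import cv2
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

def sift_descriptor(path: str, n_features: int = 64) -> np.ndarray:
    """Extract a fixed-length SIFT feature vector from one ear image."""
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    img = cv2.GaussianBlur(img, (3, 3), 0)          # smoothing stage of the pipeline
    sift = cv2.SIFT_create(nfeatures=n_features)
    _, desc = sift.detectAndCompute(img, None)
    if desc is None:                                 # no keypoints found
        desc = np.zeros((1, 128), dtype=np.float32)
    # Pool the 128-D descriptors into one vector so every image has equal length.
    return desc.mean(axis=0)

# features: one pooled descriptor per image; labels: subject identities.
# clf = AdaBoostClassifier(n_estimators=200)
# clf.fit(np.vstack([sift_descriptor(p) for p in train_paths]), train_labels)
```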
The assignment model is a mathematical model that expresses a real problem facing factories and companies, where the goal is to make the appropriate decision about the best allocation of jobs or workers to machines in order to raise efficiency or profit to the highest possible level, or to reduce cost or time as far as possible. In this research, the labeling method was used to solve a fuzzy assignment problem with real data approved by the Diwaniya tire factory. The data included two factors, efficiency and cost, and the problem was solved manually through a number of iterations until the optimal solution was reached,
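For illustration only, the sketch below defuzzifies triangular fuzzy costs by their centroid and solves the resulting crisp assignment with the Hungarian method from SciPy; the cost values are hypothetical, and the paper itself uses the labeling method rather than this solver.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical triangular fuzzy costs (low, mode, high) for each worker/machine pair.
fuzzy_cost = np.array([
    [(2, 4, 6), (3, 5, 9), (1, 2, 4)],
    [(4, 6, 8), (2, 3, 5), (5, 7, 9)],
    [(3, 4, 7), (1, 3, 5), (2, 5, 8)],
], dtype=float)

# Defuzzify each triangular number by its centroid before assigning.
crisp_cost = fuzzy_cost.mean(axis=2)

rows, cols = linear_sum_assignment(crisp_cost)   # optimal crisp assignment
print(list(zip(rows, cols)), crisp_cost[rows, cols].sum())
```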
This work was conducted to study the extraction of eucalyptus oil from natural plants (Eucalyptus camaldulensis leaves) using the water distillation method with a Clevenger apparatus. The effects of the main operating parameters were studied: time to reach equilibrium, temperature (70 to 100°C), solvent-to-solid ratio (4:1 to 8:1 (v/w)), agitation speed (0 to 900 rpm), and particle size (0.5 to 2.5 cm) of the fresh leaves, in order to find the processing conditions that achieve maximum oil yield. The results showed that an agitation speed of 900 rpm, a temperature of 100°C, a solvent-to-solid ratio of 5:1 (v/w), and a particle size of 0.5 cm for 160 minutes gave the highest oil yield (46.25 wt.%). The extracted oil was examined by HPLC.
The main aim of this paper is to study how the different estimators of the two unknown parameters (shape and scale) of the generalized exponential distribution behave for different sample sizes and different parameter values. In particular, Maximum Likelihood, Percentile, and Ordinary Least Squares estimators were implemented for different sample sizes (small, medium, and large) and for several contrasting initial values of the two parameters. Two performance indicators, Mean Square Error and Mean Percentile Error, were used, and the comparisons between the different estimation methods were carried out using the Monte Carlo simulation technique. It was observed
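A compact sketch of the Monte Carlo comparison is given below for the Maximum Likelihood estimator only, using the generalized exponential distribution F(x) = (1 - exp(-λx))^α; the true parameter values, sample size, and replication count are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def sample_ge(alpha, lam, size):
    """Inverse-CDF sampling from F(x) = (1 - exp(-lam*x))**alpha."""
    u = rng.uniform(size=size)
    return -np.log(1.0 - u ** (1.0 / alpha)) / lam

def neg_loglik(params, x):
    alpha, lam = params
    if alpha <= 0 or lam <= 0:
        return np.inf
    return -np.sum(np.log(alpha) + np.log(lam)
                   + (alpha - 1) * np.log1p(-np.exp(-lam * x)) - lam * x)

def mle(x):
    """Maximum Likelihood estimate of (alpha, lam) via Nelder-Mead."""
    return minimize(neg_loglik, x0=[1.0, 1.0], args=(x,), method="Nelder-Mead").x

# Monte Carlo: repeat estimation over many samples and report the mean square error.
true_alpha, true_lam, n, reps = 2.0, 1.5, 50, 500
est = np.array([mle(sample_ge(true_alpha, true_lam, n)) for _ in range(reps)])
mse = ((est - [true_alpha, true_lam]) ** 2).mean(axis=0)
print("MSE(alpha), MSE(lambda):", mse)
```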
While analytical solutions to Quadratic Assignment Problems (QAP) have existed for a long time, the expanding use of Evolutionary Algorithms (EAs) for similar problems provides a framework for dealing with QAP of extraordinarily broad scope. The study's key contribution is that it normalizes all of the criteria onto a single scale, regardless of their measurement systems or whether they are to be minimized or maximized, relieving researchers of exhaustively quantifying the quality criteria. A tabu search algorithm for quadratic assignment problems (TSQAP) is proposed, which combines tabu search with the discrete assignment problem. The effectiveness of the proposed technique has been compared to well-established a
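The sketch below is a generic pairwise-swap tabu search for the QAP objective, not the TSQAP algorithm itself; the tenure, iteration budget, and aspiration rule are illustrative choices.

```python
import numpy as np

def tabu_qap(F, D, iters=500, tenure=10, seed=0):
    """Minimal tabu search for the QAP: minimise sum_{i,j} F[i,j] * D[p[i], p[j]]
    over permutations p, using pairwise swaps and a short-term tabu list."""
    rng = np.random.default_rng(seed)
    n = F.shape[0]

    def cost(q):
        return (F * D[np.ix_(q, q)]).sum()

    p = rng.permutation(n)
    best_p, best_c = p.copy(), cost(p)
    tabu = {}                                  # (i, j) -> iteration until which the swap is tabu
    for it in range(iters):
        move, move_c = None, np.inf
        for i in range(n - 1):
            for j in range(i + 1, n):
                q = p.copy(); q[i], q[j] = q[j], q[i]
                c = cost(q)
                allowed = tabu.get((i, j), -1) < it or c < best_c   # aspiration criterion
                if allowed and c < move_c:
                    move, move_c = (i, j), c
        if move is None:
            break
        i, j = move
        p[i], p[j] = p[j], p[i]                # accept the best allowed neighbour
        tabu[(i, j)] = it + tenure
        if move_c < best_c:
            best_p, best_c = p.copy(), move_c
    return best_p, best_c
```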