DNA computing is widely used in encryption and data hiding, and many researchers have proposed new encryption and hiding algorithms based on DNA sequences. In this paper, data hiding using the integer lifting wavelet transform based on DNA computing is presented. The transform is applied to the blue channel of the cover image. DNA encoding is used to encode the two most significant bits of the LL sub-band. The produced DNA sequence is used for two purposes: first, to construct the key for encrypting the secret data, and second, to select the pixels in the HL, LH, and HH sub-bands in which the data are hidden. Several measurement parameters, such as PSNR, MSE, and SSIM, are used to evaluate the performance of the proposed method. The experimental results show high performance with respect to different embedding rates.
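A minimal sketch of the two building blocks named in this abstract, assuming a Haar-like integer lifting step and one of the common 2-bit-to-nucleotide mappings; the paper's actual lifting scheme and DNA coding rule are not specified here.

```python
import numpy as np

# Hypothetical 2-bit -> nucleotide mapping (one of the common DNA coding rules);
# the paper's actual rule is not given in this excerpt.
BASES = "ACGT"   # 00 -> A, 01 -> C, 10 -> G, 11 -> T

def haar_integer_lifting(channel):
    """One level of a Haar-like integer lifting transform on a 2D array.
    Returns the LL, HL, LH, HH sub-bands as integer arrays."""
    c = channel.astype(np.int64)
    # predict/update along columns: average and difference
    s = (c[:, 0::2] + c[:, 1::2]) // 2
    d = c[:, 0::2] - c[:, 1::2]
    # then along rows
    ll = (s[0::2, :] + s[1::2, :]) // 2
    lh = s[0::2, :] - s[1::2, :]
    hl = (d[0::2, :] + d[1::2, :]) // 2
    hh = d[0::2, :] - d[1::2, :]
    return ll, hl, lh, hh

def encode_ll_msb_as_dna(ll):
    """Encode the two most significant bits of each LL coefficient as one DNA base."""
    msb2 = (np.clip(ll, 0, 255).astype(np.uint8) >> 6) & 0b11
    return "".join(BASES[b] for b in msb2.ravel())

blue = np.random.randint(0, 256, (8, 8))   # stand-in for the cover image's blue channel
ll, hl, lh, hh = haar_integer_lifting(blue)
print(encode_ll_msb_as_dna(ll))            # sequence used for key construction / pixel selection
```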
Linear programming currently occupies a prominent position in various fields and has wide applications; its importance lies in being a means of studying the behavior of a large number of systems. It is also the simplest and easiest type of model that can be created to address industrial, commercial, military, and other problems, and through it an optimal quantitative value can be obtained. In this research, we dealt with the post-optimality solution, or what is known as sensitivity analysis, using the principle of shadow prices. The scientific solution to any problem is not complete once the optimal solution is reached. Any change in the values of the model constants, or what is known as the model inputs, that will chan…
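A small sketch of how shadow prices can be read off a solved linear program; the LP below is an illustrative example, not data from the paper, and assumes SciPy's HiGHS backend, which reports constraint duals.

```python
from scipy.optimize import linprog

# Illustrative LP (not from the paper): maximize 3x + 5y subject to resource constraints.
# linprog minimizes, so the objective is negated.
c = [-3, -5]
A_ub = [[1, 0],    # resource 1:  x      <= 4
        [0, 2],    # resource 2:      2y <= 12
        [3, 2]]    # resource 3: 3x + 2y <= 18
b_ub = [4, 12, 18]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, method="highs")
print("optimal value:", -res.fun)
# With the HiGHS backend the dual values (shadow prices) of the <= constraints are
# reported in res.ineqlin.marginals; the sign convention may differ by SciPy version.
print("shadow prices:", res.ineqlin.marginals)
```

Each shadow price estimates how much the optimal value would change per unit change in the corresponding right-hand-side constant, which is exactly the post-optimality question the abstract raises.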
Objective: This study's goal was to screen participants from different settings in Baghdad for depression using the Beck Depression Inventory (BDI) scale and to identify factors influencing the levels of depression. Methods: This cross-sectional study included a convenience sample of 313 people from four settings (a teaching hospital, the college of medicine, the college of pharmacy, and a high school) in Baghdad, Iraq. The participants were screened using a paper survey based on the BDI scale during spring 2018. Using multiple linear regression analysis, we measured the association between depression scores and six participant factors. Results: The overall prevalence of depression in our sample was 57.2%. Female participants had higher BDI…
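A minimal sketch of the kind of multiple linear regression described, using made-up data and placeholder factor names; the study's actual six factors are not listed in this excerpt.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data frame: BDI score plus illustrative participant factors.
df = pd.DataFrame({
    "bdi":    [12, 25, 8, 31, 19, 14, 22, 9],
    "age":    [21, 34, 19, 45, 28, 23, 40, 20],
    "female": [1, 1, 0, 1, 0, 0, 1, 0],
    "setting": ["hospital", "medicine", "pharmacy", "hospital",
                "school", "medicine", "pharmacy", "school"],
})

# Multiple linear regression of the BDI score on the participant factors.
model = smf.ols("bdi ~ age + female + C(setting)", data=df).fit()
print(model.params)   # regression coefficients: association of each factor with the score
```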
Polyaniline organic semiconductor polymer thin films were prepared by oxidative polymerization at room temperature. The polymer was deposited on a glass substrate with a thickness of 900 nm, and FTIR spectra were recorded. The structural, optical, and electrical properties were studied through XRD, UV-Vis, and IR measurements. The results showed that the polymer thin film is sensitive to NH3 gas.
In this paper we used frequentist and Bayesian approaches to the linear regression model to predict future observations of unemployment rates in Iraq. Parameters are estimated using the ordinary least squares method and, for the Bayesian approach, the Markov Chain Monte Carlo (MCMC) method. Calculations are done using the R program. The analysis showed that the linear regression model using the Bayesian approach is better and can be used as an alternative to the frequentist approach. Two criteria, the root mean square error (RMSE) and the median absolute deviation (MAD), were used to compare the performance of the estimates. The results obtained showed that the unemployment rates will continue to increase in the next two decades…
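A compact sketch of the comparison described above, written in Python rather than the R used in the paper, with made-up data: an OLS fit, a very small random-walk Metropolis sampler standing in for the MCMC machinery, and the two criteria (RMSE and MAD, taken here as the median absolute residual).

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative yearly rate series (not the paper's Iraqi unemployment data).
t = np.arange(20.0)
y = 8.0 + 0.3 * t + rng.normal(0, 0.5, t.size)
X = np.column_stack([np.ones_like(t), t])

# Frequentist fit: ordinary least squares.
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Bayesian fit: random-walk Metropolis with flat priors and a fixed noise scale.
def log_post(beta, sigma=0.5):
    resid = y - X @ beta
    return -0.5 * np.sum(resid**2) / sigma**2

beta, samples = beta_ols.copy(), []
for _ in range(5000):
    prop = beta + rng.normal(0, 0.05, 2)
    if np.log(rng.uniform()) < log_post(prop) - log_post(beta):
        beta = prop
    samples.append(beta)
beta_bayes = np.mean(samples, axis=0)   # posterior mean estimate

# Comparison criteria used in the paper.
def rmse(pred): return np.sqrt(np.mean((y - pred) ** 2))
def mad(pred):  return np.median(np.abs(y - pred))

for name, b in [("OLS", beta_ols), ("Bayes", beta_bayes)]:
    pred = X @ b
    print(name, "RMSE=%.3f MAD=%.3f" % (rmse(pred), mad(pred)))
```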
An experimental study on a KIA Pride (SAIPA 131) car model at a scale of 1:14 in the wind tunnel was carried out alongside tests on the real car. Passive flow control modifications (a vortex generator, a spoiler, and a slice diffuser) were added to the car to reduce the drag force, an undesirable characteristic that increases fuel consumption and toxic exhaust gases. Two types of calculations were used to determine the drag force acting on the car body: first, by integrating the pressure values recorded along the pressure taps (for both the wind tunnel and the real-car tests); second, by using a one-component balance device (wind tunnel testing) to measure the force directly. The results show that the average drag estimated on…
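A simplified sketch of the first method (pressure-tap integration): summing the streamwise component of the surface pressure force over discrete taps and forming a drag coefficient. Geometry, tap pressures, and flow values below are illustrative, not the paper's measurements.

```python
import numpy as np

# Illustrative free-stream conditions and model frontal area.
rho, V, A_front = 1.225, 20.0, 0.015   # air density [kg/m^3], speed [m/s], frontal area [m^2]

p     = np.array([ 60.0, 35.0, -20.0, -45.0, -10.0])    # gauge pressure at each tap [Pa]
dA    = np.array([0.002, 0.003, 0.004, 0.003, 0.003])   # surface area assigned to each tap [m^2]
theta = np.deg2rad([0, 30, 80, 120, 170])                # angle between surface normal and flow

# Streamwise component of the pressure force summed over the taps.
drag = np.sum(p * dA * np.cos(theta))
Cd = drag / (0.5 * rho * V**2 * A_front)
print(f"pressure drag = {drag:.3f} N, Cd = {Cd:.3f}")
```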
In this paper, we have used the Hermite interpolation method to solve second-order regular boundary value problems for singular ordinary differential equations. The suggested method is applied after dividing the domain into many subdomains and then using Hermite interpolation on each subdomain; the solution of the equation is equal to the sum of the solutions on the subdomains. Finally, we give many examples to illustrate the suggested method and its efficiency.
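A minimal sketch of the building block named above, piecewise (subdomain) Hermite interpolation, using SciPy's cubic Hermite spline on a toy function; it is not the paper's full boundary value problem solver.

```python
import numpy as np
from scipy.interpolate import CubicHermiteSpline

f, df = np.sin, np.cos                    # toy function on [0, pi] and its derivative

# Divide the domain into three subdomains that share their boundary nodes.
nodes = np.linspace(0, np.pi, 13)
edges = [0, 4, 8, 12]
subdomains = [nodes[a:b + 1] for a, b in zip(edges[:-1], edges[1:])]

# Hermite interpolation (values + derivatives) on each subdomain.
pieces = [CubicHermiteSpline(xs, f(xs), df(xs)) for xs in subdomains]

x = 1.7
piece = next(p for p, xs in zip(pieces, subdomains) if xs[0] <= x <= xs[-1])
print(piece(x), np.sin(x))                # Hermite approximation vs exact value
```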
Submerged arc welding (SAW) is an essential metal joining process in industry. Weld quality is a very important aspect for the manufacturing and construction industries, and the challenge is to find the optimal process settings. The experiment was designed using the Taguchi method with an L9 orthogonal array (OA), considering three SAW parameters (welding current, arc voltage, and welding speed) at three levels (300-350-400 A, 32-36-40 V, and 26-28-30 cm/min). The study examined the effect of the SAW process parameters on the mechanical properties of a steel complying with ASTM A516 grade 70. The signal-to-noise (S/N) ratio was computed to determine the optimal process parameters. The percentage contributions of each parameter are validated by using an…
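A sketch of the Taguchi-style S/N analysis this abstract describes, assuming a larger-is-better response (e.g. tensile strength); the responses below are made-up numbers, not the paper's measurements, while the level values and the standard L9 array follow the abstract.

```python
import numpy as np

levels = {
    "current [A]":    [300, 350, 400],
    "voltage [V]":    [32, 36, 40],
    "speed [cm/min]": [26, 28, 30],
}
# Standard L9 orthogonal array (level indices 0..2 for the three factors).
L9 = [(0,0,0),(0,1,1),(0,2,2),(1,0,1),(1,1,2),(1,2,0),(2,0,2),(2,1,0),(2,2,1)]
response = [470, 485, 492, 500, 510, 488, 515, 495, 505]   # hypothetical strengths [MPa]

# Larger-is-better S/N ratio: -10*log10(mean(1/y_i^2)) over the replicates of a run;
# here each run has a single response value.
sn = [-10.0 * np.log10(np.mean(1.0 / np.asarray([y], float) ** 2)) for y in response]

# Mean S/N per factor level: the level with the highest mean S/N is taken as optimal.
for j, (name, lv) in enumerate(levels.items()):
    means = [np.mean([sn[i] for i, run in enumerate(L9) if run[j] == k]) for k in range(3)]
    best = int(np.argmax(means))
    print(f"{name}: best level = {lv[best]} (mean S/N per level: {np.round(means, 2)})")
```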
In this paper, an enhanced artificial potential field (EAPF) planner is introduced. This planner is proposed to rapidly find online solutions to mobile robot path planning problems when the underlying environment contains obstacles with unknown locations and sizes. The classical artificial potential field represents both the repulsive force due to the detected obstacle and the attractive force due to the target. These forces can be considered the primary directional indicator for the mobile robot. However, the classical artificial potential field has many drawbacks, so we suggest two secondary forces, which are called the midpoint…
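A sketch of the classical artificial potential field terms described above: an attractive force toward the target plus a repulsive force away from a detected obstacle. The gains, positions, and influence radius are illustrative, and the paper's enhanced secondary forces are not reproduced here.

```python
import numpy as np

def apf_force(q, goal, obstacle, k_att=1.0, k_rep=100.0, d0=2.0):
    """Resultant classical APF force on a robot at position q (all 2D numpy arrays)."""
    f_att = -k_att * (q - goal)                    # attractive: pulls the robot toward the goal
    diff = q - obstacle
    d = np.linalg.norm(diff)
    if 1e-9 < d < d0:                              # repulsion acts only inside the influence radius
        f_rep = k_rep * (1.0 / d - 1.0 / d0) / d**2 * (diff / d)
    else:
        f_rep = np.zeros_like(q)
    return f_att + f_rep

goal, obstacle = np.array([10.0, 10.0]), np.array([3.0, 3.2])
for q in (np.array([0.0, 0.0]), np.array([2.5, 2.0])):    # far from and near the obstacle
    print(q, "->", np.round(apf_force(q, goal, obstacle), 2))
```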
The main aim of this paper is to study how the different estimators of the two unknown parameters (shape and scale) of a generalized exponential distribution behave for different sample sizes and for different parameter values. In particular, Maximum Likelihood, Percentile, and Ordinary Least Squares estimators were implemented for different sample sizes (small, medium, and large) under several contrasting initial values for the two parameters. Two indicators of performance, Mean Square Error and Mean Percentile Error, were used, and the comparisons between the different methods of estimation were carried out using the Monte Carlo simulation technique. It was obse…
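A sketch of one of the estimators compared above: maximum likelihood for the generalized exponential distribution with CDF F(x) = (1 - exp(-lam*x))**alpha, evaluated by a small Monte Carlo mean-square-error study. The true parameter values, sample size, and replication count are illustrative choices, not the paper's settings.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
alpha_true, lam_true = 2.0, 1.5          # illustrative shape and scale (rate) parameters

def sample(n):
    """Inverse-CDF sampling from the generalized exponential distribution."""
    u = rng.uniform(size=n)
    return -np.log(1.0 - u ** (1.0 / alpha_true)) / lam_true

def neg_loglik(theta, x):
    a, lam = theta
    if a <= 0 or lam <= 0:
        return np.inf
    return -(len(x) * (np.log(a) + np.log(lam)) - lam * x.sum()
             + (a - 1.0) * np.log1p(-np.exp(-lam * x)).sum())

def mle(x):
    return minimize(neg_loglik, x0=[1.0, 1.0], args=(x,), method="Nelder-Mead").x

# Monte Carlo MSE of the MLE for one sample size.
reps = np.array([mle(sample(50)) for _ in range(200)])
mse = np.mean((reps - [alpha_true, lam_true]) ** 2, axis=0)
print("MSE(alpha_hat), MSE(lambda_hat):", np.round(mse, 4))
```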