Ground Penetrating Radar (GPR) is a nondestructive geophysical technique that uses electromagnetic waves in the radio band of the spectrum to evaluate subsurface information. A GPR unit emits a short pulse of electromagnetic energy and determines the presence or absence of a target by examining the energy reflected from that pulse. In this research, GPR was used to survey different buried objects (iron, plastic (PVC), and aluminum) at a depth of about 0.5 m using a 250 MHz antenna. The response of each object can be recognized from its shape, and this recognition was performed using image processing, in particular filtering: filters such as DC adjustment, triangular FIR, delete mean trace, and FIR were applied to the output image. The soil and buried-object layers were also simulated using a GPR simulation program.
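As an illustration of the filtering step, the sketch below applies two of the filters named above, DC adjustment and mean-trace removal, to a synthetic radargram. The array layout (samples by traces), the helper names, and the synthetic data are assumptions made for this example, not the study's actual processing chain.

```python
import numpy as np

def dc_adjust(bscan):
    """Remove the per-trace DC offset (mean of each trace)."""
    # bscan: 2D array of shape (n_samples, n_traces); each column is one trace
    return bscan - bscan.mean(axis=0, keepdims=True)

def remove_mean_trace(bscan):
    """Subtract the average trace from every trace (background removal)."""
    mean_trace = bscan.mean(axis=1, keepdims=True)
    return bscan - mean_trace

if __name__ == "__main__":
    # Synthetic radargram: noise, antenna ringing / DC drift, and one point reflector
    rng = np.random.default_rng(0)
    n_samples, n_traces = 256, 100
    bscan = rng.normal(0, 0.1, (n_samples, n_traces))
    bscan += np.linspace(1.0, 0.0, n_samples)[:, None]   # slow DC drift down the trace
    bscan[120:125, 45:55] += 2.0                          # hypothetical buried-object reflection
    filtered = remove_mean_trace(dc_adjust(bscan))
    print(filtered.shape, filtered.mean())
```

Mean-trace removal suppresses horizontal banding common to all traces, which is what makes the hyperbolic response of a buried object easier to recognize.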
Efficient sequencing techniques have significantly increased the number of genomes that are now available, including the genome of the crenarchaeon Sulfolobus solfataricus P2. The genome-scale metabolic pathways in Sulfolobus solfataricus P2 were predicted by implementing the "Pathway Tools" software with the MetaCyc database as the reference knowledge base. A Pathway/Genome Database (PGDB) specific to Sulfolobus solfataricus P2 was created. A curation approach was carried out for all the amino acid biosynthetic pathways. Experimental literature as well as homology-, orthology- and context-based protein function prediction methods were followed for the curation process. The "PathoLogic"
The present paper focuses on the study of some characteristics of cometary ions by a photometric method based on CCD camera images, in which brightness is represented on a grey scale from 0 to 255, where 0 corresponds to low light intensity and 255 to high light intensity. These differences in photometric intensity yield a curve that can be extracted from any line of the image. From these equations, the focus is on determining the temperature distribution, the velocity distribution, and the number-density distribution, which gives the number of particles per unit volume. The results explain the interaction near the cometary nucleus, which is mainly affected by the new ions added to the density of the solar wind, th
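A minimal sketch of the photometric step described above: extracting the 0-255 grey-level profile along one row of a CCD image. The synthetic coma image and the function name are hypothetical and stand in for real observational data.

```python
import numpy as np

def line_profile(image, row):
    """Return the grey-level (0-255) intensity profile along one image row."""
    return image[row, :].astype(float)

if __name__ == "__main__":
    # Synthetic 8-bit coma image: brightness falling off around a central nucleus
    size = 200
    y, x = np.mgrid[0:size, 0:size]
    r = np.hypot(x - size // 2, y - size // 2) + 1.0
    image = np.clip(20.0 * 255.0 / r, 0, 255).astype(np.uint8)
    profile = line_profile(image, row=size // 2)
    print("peak intensity:", profile.max(), "at pixel", int(profile.argmax()))
```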
Submerged arc welding (SAW) is an essential metal-joining process in industry. Weld quality is a very important aspect for the manufacturing and construction industries, and the challenge is to establish the optimal process environment. A design of experiments using the Taguchi method (an L9 orthogonal array (OA)) was carried out, considering three SAW parameters (welding current, arc voltage, and welding speed) at three levels (300-350-400 A, 32-36-40 V, and 26-28-30 cm/min). The study examined the effect of the SAW process parameters on the mechanical properties of steel complying with ASTM A516 grade 70. The signal-to-noise (S/N) ratio was computed to determine the optimal process parameters. The percentage contribution of each parameter was validated using an
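As a sketch of how the S/N ratio can be computed for the L9 runs, the snippet below uses the Taguchi larger-is-better formula, S/N = -10 log10((1/n) sum(1/y_i^2)), which is the usual choice when the response is a mechanical property such as tensile strength. The replicate values are hypothetical, not the study's measurements.

```python
import numpy as np

def sn_larger_is_better(responses):
    """Taguchi larger-is-better signal-to-noise ratio (dB) for one experimental run."""
    y = np.asarray(responses, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

# Hypothetical strength replicates (MPa) for three of the nine L9 runs
runs = {
    "run 1 (300 A, 32 V, 26 cm/min)": [520.0, 515.0],
    "run 2 (350 A, 36 V, 28 cm/min)": [545.0, 550.0],
    "run 3 (400 A, 40 V, 30 cm/min)": [530.0, 528.0],
}
for name, y in runs.items():
    print(f"{name}: S/N = {sn_larger_is_better(y):.2f} dB")
```

The parameter level with the highest mean S/N across its runs is taken as the optimal setting for that factor.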
The primary objective of this paper is to improve a biometric authentication and classification model using the ear as a distinct part of the face, since it is unchanged with time and unaffected by facial expressions. The proposed model is a new scenario for enhancing ear recognition accuracy by modifying the AdaBoost algorithm to optimize adaptive learning. To overcome the limitations of image illumination, occlusion, and image registration, the scale-invariant feature transform (SIFT) technique was used to extract features. Several consecutive phases were used to improve classification accuracy: image acquisition, preprocessing, filtering, smoothing, and feature extraction. To assess the proposed
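A minimal sketch of the feature-extraction and classification idea, pairing OpenCV's SIFT descriptors with scikit-learn's standard AdaBoostClassifier. The mean-pooling step, the dataset loader, and the parameter values are assumptions for illustration, and the paper's modified AdaBoost is not reproduced here.

```python
import cv2
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

def ear_feature_vector(gray_image, n_keypoints=32):
    """Pool SIFT descriptors of an ear image into one fixed-length vector."""
    sift = cv2.SIFT_create(nfeatures=n_keypoints)
    _, descriptors = sift.detectAndCompute(gray_image, None)
    if descriptors is None:                 # no keypoints found in the image
        return np.zeros(128)
    return descriptors.mean(axis=0)         # simple mean pooling of 128-D descriptors

# Hypothetical usage: X holds one pooled vector per ear image, y the subject labels
# images, labels = load_ear_dataset(...)   # assumed loader, not shown
# X = np.vstack([ear_feature_vector(img) for img in images])
# clf = AdaBoostClassifier(n_estimators=100).fit(X, labels)
```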
The main aim of this paper is to study how different estimators of the two unknown parameters (shape and scale) of a generalized exponential distribution behave for different sample sizes and different parameter values. In particular, Maximum Likelihood, Percentile, and Ordinary Least Squares estimators were implemented for different sample sizes (small, medium, and large) with several contrasting initial values for the two parameters. Two performance indicators, Mean Square Error and Mean Percentile Error, were used, and comparisons between the different estimation methods were carried out using a Monte Carlo simulation technique. It was obse
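A sketch of the Monte Carlo comparison for the maximum-likelihood leg only, assuming the generalized exponential distribution with CDF F(x) = (1 - exp(-lambda*x))^alpha and hypothetical true parameter values; the percentile and least-squares estimators are omitted for brevity.

```python
import numpy as np
from scipy.optimize import minimize

def sample_ge(alpha, lam, size, rng):
    """Inverse-CDF sampling from the generalized exponential distribution."""
    u = rng.uniform(size=size)
    return -np.log(1.0 - u ** (1.0 / alpha)) / lam

def neg_loglik(params, x):
    """Negative log-likelihood of the generalized exponential distribution."""
    alpha, lam = params
    if alpha <= 0 or lam <= 0:
        return np.inf
    return -np.sum(np.log(alpha) + np.log(lam) - lam * x
                   + (alpha - 1.0) * np.log1p(-np.exp(-lam * x)))

def mle(x):
    """Maximum-likelihood estimates of (alpha, lambda) via Nelder-Mead."""
    return minimize(neg_loglik, x0=[1.0, 1.0], args=(x,), method="Nelder-Mead").x

# Monte Carlo MSE of the MLE for hypothetical true (alpha, lambda) = (2, 1), n = 50
rng = np.random.default_rng(1)
true = np.array([2.0, 1.0])
estimates = np.array([mle(sample_ge(*true, size=50, rng=rng)) for _ in range(200)])
print("MSE (alpha, lambda):", ((estimates - true) ** 2).mean(axis=0))
```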
This work was conducted to study the extraction of eucalyptus oil from natural plants (Eucalyptus camaldulensis leaves) using a water-distillation method with a Clevenger apparatus. The effects of the main operating parameters were studied: time to reach equilibrium, temperature (70 to 100 °C), solvent-to-solid ratio (4:1 to 8:1 (v/w)), agitation speed (0 to 900 rpm), and particle size (0.5 to 2.5 cm) of the fresh leaves, to find the processing conditions that achieve the maximum oil yield. The results showed that an agitation speed of 900 rpm and a temperature of 100 °C, with a solvent-to-solid ratio of 5:1 (v/w) and a particle size of 0.5 cm for 160 minutes, gave the highest oil yield (46.25 wt.%). The extracted oil was examined by HPLC.
The aim of this paper is to investigate numerically the simulation of ice melting in one and two dimensions using the cell-centered finite volume method. The mathematical model is based on the heat conduction equation combined with a fixed-grid, latent-heat source approach. The fully implicit time scheme is selected for the time discretization. The ice conductivity is chosen to be the value of the approximated conductivity at the interface between adjacent ice and water control volumes. The predicted temperature distribution, percentage melt fraction, interface location, and interface velocity are compared with those obtained from the exact analytical solution. Good agreement is obtained when comparing the numerical results of one
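A much-simplified one-dimensional sketch of the fixed-grid latent-heat (enthalpy) idea on a cell-centered grid. It uses an explicit time step rather than the paper's fully implicit scheme, and the material constants, boundary conditions, and grid sizes are hypothetical.

```python
import numpy as np

# Hypothetical, simplified material data (water/ice, SI units)
k, rho, cp, L, Tm = 2.0, 1000.0, 2000.0, 334e3, 0.0   # conductivity, density, heat capacity, latent heat, melt temp.

def enthalpy_to_temperature(H):
    """Recover temperature and liquid fraction from volumetric enthalpy."""
    T = np.where(H < 0, Tm + H / (rho * cp),                     # fully solid cell
        np.where(H > rho * L, Tm + (H - rho * L) / (rho * cp),   # fully liquid cell
                 Tm))                                            # melting (mushy) cell stays at Tm
    frac = np.clip(H / (rho * L), 0.0, 1.0)
    return T, frac

n, dx, dt = 50, 0.01, 0.5
H = np.full(n, -rho * cp * 10.0)        # ice initially 10 K below the melting point
T_hot = 20.0                            # fixed hot temperature imposed at the left boundary

for step in range(20000):
    T, frac = enthalpy_to_temperature(H)
    Tb = np.concatenate(([T_hot], T, [T[-1]]))     # ghost cells: hot left, adiabatic right
    flux = -k * np.diff(Tb) / dx                   # heat flux at each cell face
    H -= dt / dx * np.diff(flux)                   # explicit finite-volume update of enthalpy

T, frac = enthalpy_to_temperature(H)
print("mean melt fraction:", frac.mean(), "first unmelted cell:", int(np.argmax(frac < 1.0)))
```

The fixed grid never moves; the interface is tracked implicitly through the cells whose liquid fraction lies between 0 and 1, which is what makes the latent-heat source approach convenient for two-dimensional extensions.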
While analytical solutions to the Quadratic Assignment Problem (QAP) have indeed been available for a long time, the expanding use of Evolutionary Algorithms (EAs) for similar problems provides a framework for dealing with the QAP with an extraordinarily broad scope. The study's key contribution is that it normalizes all of the criteria into a single scale, regardless of their measurement systems or whether they are to be minimized or maximized, relieving researchers of exhaustively quantifying the quality criteria. A tabu search algorithm for the quadratic assignment problem (TSQAP) is proposed, which combines tabu search with the discrete assignment problem. The effectiveness of the proposed technique has been compared to well-established a
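A minimal sketch of a tabu search over pairwise swaps for the QAP, with a short-term tabu list and an aspiration criterion. The tenure, iteration budget, and random instance are hypothetical and do not reproduce the TSQAP algorithm of the paper.

```python
import numpy as np

def qap_cost(perm, flow, dist):
    """Total cost when facility i is assigned to location perm[i]."""
    return np.sum(flow * dist[np.ix_(perm, perm)])

def tabu_search_qap(flow, dist, iters=500, tenure=7, seed=0):
    """Tabu search over pairwise swaps of the assignment (minimal variant)."""
    rng = np.random.default_rng(seed)
    n = len(flow)
    perm = rng.permutation(n)
    best, best_cost = perm.copy(), qap_cost(perm, flow, dist)
    tabu = {}                                      # (i, j) -> iteration until which the swap is tabu
    for it in range(iters):
        move, move_cost = None, np.inf
        for i in range(n):
            for j in range(i + 1, n):
                cand = perm.copy()
                cand[i], cand[j] = cand[j], cand[i]
                c = qap_cost(cand, flow, dist)
                tabu_active = tabu.get((i, j), -1) >= it
                # aspiration: a tabu move is allowed only if it beats the global best
                if (not tabu_active or c < best_cost) and c < move_cost:
                    move, move_cost = (i, j), c
        if move is None:
            break
        i, j = move
        perm[i], perm[j] = perm[j], perm[i]
        tabu[(i, j)] = it + tenure                 # forbid reversing this swap for a while
        if move_cost < best_cost:
            best, best_cost = perm.copy(), move_cost
    return best, best_cost

# Hypothetical 5x5 instance
rng = np.random.default_rng(3)
flow = rng.integers(0, 10, (5, 5))
dist = rng.integers(1, 10, (5, 5))
print(tabu_search_qap(flow, dist, iters=200))
```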