Human Interactive Proofs (HIPs) are automated inverse Turing tests intended to differentiate between people and malicious computer programs. Building a good HIP system is a challenging task, since the resulting HIP must be secure against attacks while remaining practical for humans. Text-based HIPs are one of the most popular HIP types; they exploit the fact that humans read text images better than Optical Character Recognition (OCR) software. Current text-based HIPs, however, have not kept pace with the rapid development of computer vision techniques: they are either very easily broken or very hard for humans to solve, which motivates continued effort on text-based HIP development. In this paper, a new scheme is proposed for an animated text-based HIP; it exploits the gap between human perception and a computer's ability to mimic that perception in order to achieve a more secure and more usable HIP. The scheme resists attacks because it is hard for a machine to recognize characters in an animated environment displayed as digital video, yet it remains easy and practical for humans, who are naturally attuned to perceiving motion. The proposed scheme has been tested against many Optical Character Recognition applications; it passed all of these tests and achieved a high usability rate of 95%.
Partial shading is one of the problems that affect the power production and efficiency of a photovoltaic module. A series of experiments on partial shading was performed on a monocrystalline PV module (50 W, Isc: 3.1 A, Voc: 22 V) with 36 cells in series. Non-linear power output responses of the module were observed under various cases of partial shading (vertical and horizontal shading of solar cells in the module). Shading a single cell (a corner cell) has the greatest impact on output energy. Horizontal or vertical shading reduced the power from 41 W to 18 W at a constant solar radiation of 1000 W/m² and steady-state conditions. Vertical blocking of a column ...
Internet image retrieval is an interesting task that requires efforts from image processing and relationship structure analysis. In this paper, a compression method is proposed for sending more than one photo via the Internet, based on image retrieval. First, face detection is performed using local binary patterns. The background is identified by matching global self-similarities and compared with the backgrounds of the remaining images. The proposed algorithm bridges the gap between present image indexing technology, developed in the pixel domain, and the fact that an increasing number of images stored on computers are already compressed by JPEG at the source. Similar images are found, and only a few images are sent instead ...
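As a concrete illustration of the local-binary-pattern step, here is a minimal sketch of the basic 8-neighbour LBP texture descriptor; the random input image and the histogram feature are illustrative assumptions, not the paper's actual face-detection pipeline.

```python
# Minimal 8-neighbour local binary pattern (LBP) sketch in pure NumPy.
import numpy as np

def lbp_basic(img: np.ndarray) -> np.ndarray:
    """Return the 8-bit LBP code of each interior pixel."""
    c = img[1:-1, 1:-1]  # centre pixels
    # Clockwise 8-neighbourhood offsets, starting at the top-left neighbour.
    offsets = [(0, 0), (0, 1), (0, 2), (1, 2),
               (2, 2), (2, 1), (2, 0), (1, 0)]
    codes = np.zeros(c.shape, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        neigh = img[dy:dy + c.shape[0], dx:dx + c.shape[1]]
        codes |= ((neigh >= c) << bit).astype(np.uint8)  # set bit if neighbour >= centre
    return codes

img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)  # stand-in grayscale patch
hist = np.bincount(lbp_basic(img).ravel(), minlength=256)  # LBP histogram feature
```

In LBP-based face detection, histograms like `hist` computed over image sub-regions are concatenated into a feature vector and compared against trained face templates.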
Roughness length is one of the key variables in micrometeorological and environmental studies describing the development of cities and urban environments. Using the three-dimensional ultrasonic anemometer installed at Mustansiriyah University, we determined the mean height of the roughness elements (trees, buildings and bridges) in the area surrounding the university within a radius of 1 km. We then calculated the zero-plane displacement length for eight sectors and the surface roughness length. The results show that these variables fall in the ranges ZH = 9.2-13.8 m, Zd = 4.3-8.1 m and Zo = 0.24-0.48 m.
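For context, the surface roughness length in such studies is commonly obtained from the neutral-stability logarithmic wind profile. The sketch below shows that relation with entirely illustrative numbers, not the paper's measurements; the Zd/ZH ratio used is one common rule of thumb that happens to be broadly consistent with the reported ranges.

```python
# Neutral log-law relation: u(z) = (u*/kappa) * ln((z - Zd)/Z0), solved for Z0.
import numpy as np

KAPPA = 0.4  # von Karman constant

def roughness_length(z: float, zd: float, u_mean: float, u_star: float) -> float:
    """Z0 from mean wind speed u_mean at height z and friction velocity u_star."""
    return (z - zd) / np.exp(KAPPA * u_mean / u_star)

zh = 11.0           # mean roughness-element height ZH, m (illustrative)
zd = 0.5 * zh       # Zd ~ 0.5*ZH, a common morphometric rule of thumb
# Illustrative sonic-anemometer values: u = 4.2 m/s at z = 15 m, u* = 0.5 m/s
print(roughness_length(z=15.0, zd=zd, u_mean=4.2, u_star=0.5))  # ~0.33 m
```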
Electrical Discharge Machining (EDM) is a non-traditional metal-removal technique that relies on the basic fact that negligible tool force is produced during the machining process. EDM is also used to machine very hard materials, provided they are electrically conductive. In the EDM process, the most significant measure of cutting performance is the surface roughness (Ra). The conventional trial-and-error method is time consuming and costly. The purpose of the present research is to develop a mathematical model using response graph modeling (RGM). The impact of various parameters (current, pulse-on time and pulse-off time) is studied on ...
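The abstract does not give the RGM model itself; as a generic illustration of fitting a response model of Ra to the three inputs, here is a least-squares sketch. All data values are made up and the linear form is an assumption, not the paper's model.

```python
# Generic least-squares fit of Ra to the three EDM inputs (illustrative data).
import numpy as np

# Columns: current (A), pulse-on time (us), pulse-off time (us)
X = np.array([[10,  50, 20],
              [10, 100, 40],
              [20,  50, 40],
              [20, 100, 20],
              [15,  75, 30]], dtype=float)
ra = np.array([2.1, 2.9, 3.4, 4.0, 3.1])   # measured Ra, um (made up)

A = np.column_stack([np.ones(len(X)), X])  # linear model with intercept
coef, *_ = np.linalg.lstsq(A, ra, rcond=None)

predict = lambda x: coef[0] + x @ coef[1:]
print(coef, predict(np.array([12.0, 60.0, 25.0])))  # predicted Ra at a new setting
```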
Bioethanol produced from lignocellulosic feedstock is a renewable substitute for declining fossil fuels. Ultrasound-assisted alkaline pretreatment was investigated to enhance the enzymatic digestibility of waste paper. The pretreatment was conducted over a wide range of conditions, including waste paper concentrations of 1-5%, reaction times of 10-30 min and temperatures of 30-70°C. The optimum conditions were 4% substrate loading and 25 min treatment time at 60°C, where the maximum reducing sugar obtained was 1.89 g/L. Hydrolysis was conducted with crude cellulolytic enzymes produced by Cellulomonas uda (PTCC 1259). The maximum amount of sugar released and the hydrolysis efficiency were 20.92 g/L and 78.4%, respectively. Sugars ...
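One common convention for the reported hydrolysis efficiency is released reducing sugar, corrected by the 0.9 anhydro factor (162/180 for glucan), relative to the carbohydrate content of the substrate. The sketch below assumes that convention and an assumed carbohydrate fraction; neither is stated in the abstract.

```python
# Hydrolysis-efficiency arithmetic under a common convention (an assumption here).
def hydrolysis_efficiency(sugar_g_per_l: float, substrate_g_per_l: float,
                          carb_fraction: float) -> float:
    """Percent of substrate carbohydrate converted to reducing sugar."""
    return 100.0 * sugar_g_per_l * 0.9 / (substrate_g_per_l * carb_fraction)

# 20.92 g/L sugar from a 40 g/L (4 %) waste-paper loading at an assumed ~60 %
# carbohydrate content lands near the reported 78.4 % efficiency.
print(hydrolysis_efficiency(20.92, 40.0, 0.60))
```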
This study estimates the scale parameter, location parameter and reliability function of the Extreme Value (EXV) distribution by two methods, namely:
- Maximum Likelihood Method (MLE).
- Probability Weighted Moments Method (PWM).
Simulation was used to generate the samples required to estimate the parameters and reliability function for different sample sizes (n = 10, 25, 50, 100); real values were assigned to the parameters, and the simulation experiments were replicated (RP = 1000) times.
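The procedure lends itself to a short sketch. The fragment below, an illustration rather than the paper's code, generates Gumbel (Extreme Value type I) samples of the stated sizes, fits (location, scale) by MLE via SciPy and by probability-weighted moments, and evaluates the reliability function R(t) = 1 - F(t); the "true" parameter values and the point t are assumed for illustration, since the abstract's values are elided.

```python
# MLE vs. PWM estimation for the Gumbel (EXV type I) distribution.
import numpy as np
from scipy.stats import gumbel_r

EULER = 0.5772156649  # Euler-Mascheroni constant

def pwm_gumbel(x):
    """PWM estimates (location, scale): scale = (2*b1 - b0)/ln 2, loc = b0 - gamma*scale."""
    xs = np.sort(x)
    n = len(xs)
    b0 = xs.mean()
    b1 = (np.arange(n) / (n - 1) * xs).mean()  # b1 = (1/n) sum (i-1)/(n-1) * x_(i)
    scale = (2 * b1 - b0) / np.log(2)
    return b0 - EULER * scale, scale

rng = np.random.default_rng(0)
loc, scale = 2.0, 1.5             # illustrative "real" parameter values
for n in (10, 25, 50, 100):       # the study replicates each case RP = 1000 times
    x = gumbel_r.rvs(loc, scale, size=n, random_state=rng)
    print(n, gumbel_r.fit(x), pwm_gumbel(x))  # scipy's fit is the MLE

loc_hat, scale_hat = gumbel_r.fit(gumbel_r.rvs(loc, scale, size=100,
                                               random_state=rng))
print("R(3) =", gumbel_r.sf(3.0, loc_hat, scale_hat))  # reliability R(t) = 1 - F(t)
```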
The study of statistical distributions aims to obtain the best description of sets of variable phenomena, each of which follows the behaviour of one of these distributions. Estimation for such distributions is an essential part of studying a variable's behaviour. This research is therefore an attempt to reach the best estimation method for the generalized linear failure rate distribution, studying its theoretical side using statistical estimation methods such as maximum likelihood, least squares and a mixing method (the suggested method).
The research ...
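As a sketch of the maximum-likelihood step for the generalized linear failure rate distribution, the fragment below assumes the common Sarhan-Kundu parameterization F(x) = (1 - exp(-(a*x + b*x^2/2)))^alpha with made-up true values; it illustrates the idea and is not the paper's method.

```python
# MLE for the generalized linear failure rate (GLFR) distribution (assumed form).
import numpy as np
from scipy.optimize import minimize

def neg_log_lik(theta, x):
    a, b, alpha = np.exp(theta)  # log-parameterized to keep all parameters positive
    g = a * x + 0.5 * b * x**2
    return -np.sum(np.log(alpha) + np.log(a + b * x) - g
                   + (alpha - 1) * np.log1p(-np.exp(-g)))

rng = np.random.default_rng(1)
a0, b0, alpha0 = 0.5, 0.3, 2.0   # illustrative true values
# Inverse-CDF sampling: solve a*x + b*x^2/2 = -log(1 - u**(1/alpha)) for x > 0.
u = rng.uniform(size=200)
c = -np.log1p(-u**(1.0 / alpha0))
x = (-a0 + np.sqrt(a0**2 + 2 * b0 * c)) / b0

fit = minimize(neg_log_lik, x0=np.log([1.0, 1.0, 1.0]), args=(x,),
               method="Nelder-Mead")
print(np.exp(fit.x))             # estimated (a, b, alpha)
```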
The calculation of oil density is complex because of the wide range of pressures and temperatures involved, which are always determined by specific conditions. Therefore, calculations that depend on the oil composition are more accurate and more convenient for such requirements. The analyses of twenty live-oil samples are utilized. The three-parameter Peng-Robinson equation of state is tuned to obtain a match between measured and calculated oil viscosity. The Lohrenz-Bray-Clark (LBC) technique is adopted to calculate the oil viscosity from the given composition, pressure and temperature for the 20 samples. The tuned equation of state is then used to generate oil viscosity values for a range of temperatures ...
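For orientation, here is a minimal pure-component Peng-Robinson sketch: solve the EOS cubic for the compressibility factor Z and convert to mass density. The component properties are illustrative (roughly n-decane); the paper's tuned three-parameter EOS and the LBC viscosity correlation are not reproduced.

```python
# Pure-component Peng-Robinson EOS: liquid-root Z and mass density.
import numpy as np

R = 8.314  # J/(mol K)

def pr_density(T, P, Tc, Pc, omega, M):
    """Mass density (kg/m3) at temperature T (K) and pressure P (Pa)."""
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
    alpha = (1 + kappa * (1 - np.sqrt(T / Tc)))**2
    a = 0.45724 * R**2 * Tc**2 / Pc * alpha
    b = 0.07780 * R * Tc / Pc
    A, B = a * P / (R * T)**2, b * P / (R * T)
    # Z^3 - (1 - B) Z^2 + (A - 3B^2 - 2B) Z - (AB - B^2 - B^3) = 0
    roots = np.roots([1.0, -(1 - B), A - 3 * B**2 - 2 * B,
                      -(A * B - B**2 - B**3)])
    z = roots[np.isreal(roots)].real.min()  # smallest real root = liquid branch
    return P * M / (z * R * T)

# n-decane-like pseudo-component at reservoir-ish conditions (illustrative)
print(pr_density(T=350.0, P=20e6, Tc=617.7, Pc=2.11e6, omega=0.49, M=0.142))
```

In a full workflow, per-component Z factors and molar volumes from a tuned multicomponent EOS like this feed the LBC correlation to predict viscosity.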