Ground-based active optical sensors (GBAOS) have been successfully used in agriculture to predict crop yield potential (YP) early in the season and to recommend N rates for optimal crop yield. However, the models were found to be weak or inconsistent due to environmental variation, especially rainfall. The objectives of the study were to evaluate whether GBAOS could predict YP across multiple locations, soil types, cultivation systems, and rainfall differences. The study was carried out from 2011 to 2013 on corn (Zea mays L.) in North Dakota and in 2017 on potatoes in Maine. Six N rates were used on 50 sites in North Dakota and 12 N rates on two sites, one dryland and one irrigated, in Maine. The two active GBAOS used in this study were the GreenSeeker and the Holland Scientific Crop Circle sensor, models ACS 470 (HSCCACS-470) and ACS 430 (HSCCACS-430). Including rainfall data, with or without crop height, improved the YP models in terms of reliability and consistency. The polynomial model performed better than the exponential model. In the North Dakota corn study, the relationship between sensor reading multiplied by rainfall and crop yield differed significantly between soil types (clay versus medium textured) and cultivation systems (conventional versus no-till). The two potato sites in Maine, irrigated and dryland, differed in total yield, and rainfall data helped improve the sensor YP models. In conclusion, this study strongly advocates incorporating rainfall data into sensor-based N rate calculator algorithms.
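As a minimal sketch of the kind of modeling described above, the snippet below fits a second-order polynomial and an exponential yield-potential model to sensor readings multiplied by rainfall; the variable names and sample values are hypothetical illustrations, not data from the study.

import numpy as np

# Hypothetical example data (not from the study): sensor NDVI, cumulative
# rainfall (mm), and grain yield (t/ha) for six plots.
ndvi = np.array([0.45, 0.52, 0.60, 0.68, 0.74, 0.80])
rainfall_mm = np.array([180.0, 210.0, 240.0, 260.0, 300.0, 330.0])
yield_t_ha = np.array([4.1, 5.0, 6.2, 7.1, 8.3, 9.0])

# Predictor combining the sensor reading with rainfall, as in the abstract.
x = ndvi * rainfall_mm

# Second-order polynomial yield-potential model: YP = a*x**2 + b*x + c.
poly_coef = np.polyfit(x, yield_t_ha, 2)
poly_pred = np.polyval(poly_coef, x)

# Exponential yield-potential model YP = a*exp(b*x), fitted on log(yield).
slope, intercept = np.polyfit(x, np.log(yield_t_ha), 1)
exp_pred = np.exp(intercept) * np.exp(slope * x)

def r2(obs, pred):
    # Coefficient of determination used to compare the two model forms.
    ss_res = np.sum((obs - pred) ** 2)
    ss_tot = np.sum((obs - obs.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

print("Polynomial R^2:", round(r2(yield_t_ha, poly_pred), 3))
print("Exponential R^2:", round(r2(yield_t_ha, exp_pred), 3))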
In this paper, the principles of stereoscopy are discussed, together with the phases of 3D image production, which are organized according to the Waterfall model. The results are based on one of the 3D technologies, anaglyph, which is known to use two colors (red and cyan).
An anaglyph image appears three-dimensional when viewed through red/cyan glasses, and anaglyph visualization is one of the technologies used and implemented for the production of 3D videos (movies). Using a development model to produce software that processes anaglyph video is therefore important; accordingly, our proposed work implements anaglyph processing within the Waterfall model to produce a 3D image extracted from a video.
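A minimal sketch of how a red/cyan anaglyph frame can be composed from a stereo pair is shown below, assuming OpenCV and NumPy are available; the file names and the use of OpenCV are illustrative assumptions, not part of the described system.

import cv2
import numpy as np

# Hypothetical stereo pair, e.g. two frames extracted from a video.
left = cv2.imread("left.png")    # image intended for the left eye
right = cv2.imread("right.png")  # image intended for the right eye

# Red/cyan anaglyph: red channel from the left image, green and blue
# channels from the right image (OpenCV stores images in BGR order).
anaglyph = np.zeros_like(left)
anaglyph[:, :, 2] = left[:, :, 2]   # red from the left view
anaglyph[:, :, 1] = right[:, :, 1]  # green from the right view
anaglyph[:, :, 0] = right[:, :, 0]  # blue from the right view

cv2.imwrite("anaglyph.png", anaglyph)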
In this paper, an approach for object tracking inspired by the human oculomotor system is proposed and verified experimentally. The developed approach is divided into two phases: a fast tracking (saccadic) phase and a smooth pursuit phase. In the first phase, the field of view is segmented into four regions that are analogous to the retinal periphery in the oculomotor system. When the object of interest enters these regions, the developed vision system responds by changing the pan and tilt angles so that the object lies in the fovea area, after which the second phase is activated. A fuzzy logic method is implemented in the saccadic phase as an intelligent decision maker to select the values of the pan and tilt angles based …
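The two-phase idea can be sketched as follows; note that this is an illustrative simplification that replaces the fuzzy decision maker with a plain proportional rule, and the image size, fovea radius, and gains are assumed values.

# Illustrative sketch of the saccadic / smooth pursuit pan-tilt update
# (a simplified stand-in, not the authors' fuzzy-logic implementation).
FOV_W, FOV_H = 640, 480      # assumed image size in pixels
FOVEA_RADIUS = 60            # assumed radius of the central "fovea" region

def update_pan_tilt(obj_x, obj_y, pan, tilt, k_saccade=0.08, k_pursuit=0.02):
    """Return new (pan, tilt) angles given the object position in the image."""
    err_x = obj_x - FOV_W / 2            # horizontal offset from image centre
    err_y = obj_y - FOV_H / 2            # vertical offset from image centre
    in_fovea = (err_x ** 2 + err_y ** 2) ** 0.5 < FOVEA_RADIUS

    # Saccadic phase: object in the periphery, make a large corrective jump.
    # Smooth pursuit phase: object near the fovea, make small smooth corrections.
    gain = k_pursuit if in_fovea else k_saccade
    return pan + gain * err_x, tilt + gain * err_y

pan, tilt = 0.0, 0.0
pan, tilt = update_pan_tilt(520, 110, pan, tilt)   # object in the periphery
print(round(pan, 2), round(tilt, 2))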
Rock mechanical properties are critical parameters for many development techniques related to tight reservoirs, such as hydraulic fracturing design and detecting failure criteria in wellbore instability assessment. When direct measurements of mechanical properties are not available, it is helpful to find sufficient correlations to estimate these parameters. This study summarizes experimentally derived correlations for estimating the shear velocity, Young's modulus, Poisson's ratio, and compressive strength. Also, a useful correlation is introduced to convert dynamic elastic properties obtained from log data into static elastic properties. Most of the derived equations in this paper show a good fit to the measured data, while some equations show scatter …
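As background to the dynamic-to-static conversion mentioned above, the snippet below evaluates the standard relations for dynamic Poisson's ratio and dynamic Young's modulus from compressional velocity, shear velocity, and bulk density; the input values are assumptions, and the study's own empirical correlations are not reproduced here.

# Standard dynamic elastic properties from log-derived velocities and density.
vp = 4500.0    # compressional velocity, m/s (assumed value)
vs = 2600.0    # shear velocity, m/s (assumed value)
rho = 2650.0   # bulk density, kg/m^3 (assumed value)

# Dynamic Poisson's ratio.
nu_dyn = (vp**2 - 2 * vs**2) / (2 * (vp**2 - vs**2))

# Dynamic Young's modulus (Pa), reported in GPa for readability.
e_dyn = rho * vs**2 * (3 * vp**2 - 4 * vs**2) / (vp**2 - vs**2)

print(f"Dynamic Poisson's ratio: {nu_dyn:.3f}")
print(f"Dynamic Young's modulus: {e_dyn / 1e9:.1f} GPa")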
The second leading cause of death and one of the most common causes of disability in the world is stroke. Researchers have found that brain–computer interface (BCI) techniques can result in better stroke patient rehabilitation. This study used the proposed motor imagery (MI) framework to analyze an electroencephalogram (EEG) dataset from eight subjects in order to enhance MI-based BCI systems for stroke patients. The preprocessing portion of the framework comprises conventional filters and the independent component analysis (ICA) denoising approach. Fractal dimension (FD) and Hurst exponent (Hur) were then calculated as complexity features, and Tsallis entropy (TsEn) and dispersion entropy (DispEn) were assessed as …
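A minimal sketch of one of the complexity features named above, Tsallis entropy computed from the amplitude histogram of an EEG segment, is given below; the entropic index q = 2, the bin count, and the synthetic signal are assumptions for illustration only.

import numpy as np

def tsallis_entropy(signal, q=2.0, bins=16):
    # Tsallis entropy S_q = (1 - sum(p_i**q)) / (q - 1) of the amplitude histogram.
    counts, _ = np.histogram(signal, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                 # ignore empty bins
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# Synthetic EEG-like segment (assumed data, for illustration only).
rng = np.random.default_rng(0)
segment = np.sin(np.linspace(0, 8 * np.pi, 512)) + 0.3 * rng.standard_normal(512)

print("Tsallis entropy (q=2):", round(tsallis_entropy(segment), 4))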
A database is characterized as an arrangement of data that is organized and distributed in a way that allows the user to access the stored data simply and conveniently. However, in the era of big data, traditional methods of data analytics may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique to handle big data distributed on the cloud. This approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear enhancement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r…
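The Map-Reduce idea described above can be sketched in the style of a Hadoop Streaming job; the assumed input format (one EEG sample per line as "channel<TAB>value") and the per-channel mean aggregation are illustrative assumptions, not the actual job used in the study.

import sys

def mapper(lines):
    # Emit "channel<TAB>value<TAB>1" for every EEG sample (assumed input format).
    for line in lines:
        channel, value = line.strip().split("\t")
        print(f"{channel}\t{value}\t1")

def reducer(lines):
    # Hadoop Streaming delivers mapper output sorted by key, so samples of the
    # same channel arrive together; aggregate them into a per-channel mean.
    current, total, count = None, 0.0, 0
    for line in lines:
        channel, value, n = line.strip().split("\t")
        if channel != current:
            if current is not None:
                print(f"{current}\t{total / count}")
            current, total, count = channel, 0.0, 0
        total += float(value)
        count += int(n)
    if current is not None:
        print(f"{current}\t{total / count}")

if __name__ == "__main__":
    # hadoop jar hadoop-streaming.jar -mapper "python job.py map" -reducer "python job.py reduce" ...
    (mapper if sys.argv[1] == "map" else reducer)(sys.stdin)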
From the experimental results of this work, we found that the change in the electrical conductivity properties of tin dioxide with gas concentration at temperatures of 260 °C and 360 °C after treatment with photon rays has a similar character to that observed after isothermal treatment. We found that intense, short-duration impulse annealing lasting fractions of a second leads to crystallization of the films and to high values of gas sensitivity.
In this study, we briefly review the ARIMA (p, d, q), EWMA, and DLM (dynamic linear modelling) procedures in order to accommodate the autocorrelation structure of the data. We consider recursive estimation and prediction algorithms based on Bayes and Kalman filtering (KF) techniques for correlated observations. We investigate the effect of these procedures on the MSE and compare them using generated data.
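As a concrete instance of the recursive KF estimation referred to above, the sketch below runs a Kalman filter for a local-level DLM (a random-walk state observed with noise) on generated data and reports the filtering MSE; the variance values and series length are assumptions for illustration.

import numpy as np

# Generate correlated observations from a local-level DLM: a random-walk state
# observed with additive noise (assumed variances, for illustration only).
rng = np.random.default_rng(1)
W, V = 0.05, 1.0
state = np.cumsum(rng.normal(0.0, np.sqrt(W), 200))
y = state + rng.normal(0.0, np.sqrt(V), 200)

# Recursive Kalman filter estimate of the level.
m, C = 0.0, 10.0                 # prior mean and variance
estimates = []
for obs in y:
    R = C + W                    # prior variance of the state at time t
    K = R / (R + V)              # Kalman gain
    m = m + K * (obs - m)        # posterior mean after observing obs
    C = (1.0 - K) * R            # posterior variance
    estimates.append(m)

mse = np.mean((np.array(estimates) - state) ** 2)
print("Filtering MSE:", round(mse, 4))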
Data compression offers an attractive approach to reducing communication costs by using the available bandwidth effectively. It therefore makes sense to pursue research on developing algorithms that can use the available network most effectively. It is also important to consider the security aspect, since the data being transmitted is vulnerable to attacks. The basic aim of this work is to develop a module that combines compression and encryption on the same set of data, performing these two operations simultaneously. This is achieved by embedding encryption into compression algorithms, since both cryptographic ciphers and entropy coders bear a certain resemblance in the sense of secrecy. First, in the secure compression module, the given text is p…
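The module described above embeds encryption inside the compression algorithm itself; the sequential compress-then-encrypt sketch below is only a simplified stand-in showing the two operations applied to the same data, and the use of zlib and the cryptography package's Fernet recipe is an assumption about tooling, not the paper's method.

import zlib
from cryptography.fernet import Fernet

# Illustrative plaintext (assumed sample data).
text = b"Data compression offers an attractive approach to reducing costs. " * 20

# Step 1: compress the text (DEFLATE entropy coding via zlib).
compressed = zlib.compress(text, level=9)

# Step 2: encrypt the compressed bytes with a symmetric key.
key = Fernet.generate_key()
ciphertext = Fernet(key).encrypt(compressed)

# Receiver side: decrypt, then decompress, and check round-trip correctness.
recovered = zlib.decompress(Fernet(key).decrypt(ciphertext))
assert recovered == text

print(len(text), "plaintext bytes ->", len(compressed), "compressed ->", len(ciphertext), "encrypted")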