Wireless sensor applications are susceptible to energy constraints, and most of the energy is consumed in communication between wireless nodes. Clustering and data aggregation are two widely used strategies for reducing energy usage and increasing the lifetime of wireless sensor networks. In target tracking applications, a large amount of redundant data is produced regularly, so the deployment of effective data aggregation schemes is vital to eliminate this redundancy. This work conducts a comparative study of research approaches that employ clustering techniques for efficiently aggregating data in target tracking applications, since the selection of an appropriate clustering algorithm can directly benefit the data aggregation process. In this paper, we highlight the gains of existing schemes for node-clustering-based data aggregation, along with a detailed discussion of their advantages and of the issues that may degrade their performance. The boundary issues in each type of clustering technique are also analyzed. Simulation results reveal that the efficacy and validity of these clustering-based data aggregation algorithms are limited to specific sensing situations, and that they fail to exhibit adaptive behavior under other environmental conditions.
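As a minimal illustration of the cluster-head aggregation idea discussed above (not any specific surveyed scheme), the following Python sketch shows how averaging readings at a cluster head reduces the number of transmissions reaching the sink; the node counts and reading values are hypothetical.

```python
import random

# Illustrative cluster-head data aggregation: members report readings to
# their cluster head, which forwards a single aggregated value to the sink
# instead of every raw reading.

def aggregate_at_cluster_head(readings):
    """Average the member readings to remove redundancy before forwarding."""
    return sum(readings) / len(readings)

# Assume 100 nodes grouped into 10 clusters of 10 members each.
clusters = [[20.0 + random.gauss(0, 0.5) for _ in range(10)] for _ in range(10)]

raw_transmissions = sum(len(c) for c in clusters)           # every node -> sink
aggregated = [aggregate_at_cluster_head(c) for c in clusters]
aggregated_transmissions = len(aggregated)                  # one packet per cluster head

print(f"Without aggregation: {raw_transmissions} packets to the sink")
print(f"With cluster-head aggregation: {aggregated_transmissions} packets to the sink")
```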
The aim of this work is to develop an axisymmetric, two-dimensional model based on a coupled simplified computational fluid dynamics (CFD) and Lagrangian method to predict the air flow patterns and the drying of particles, and then to use this predictive tool to design more efficient spray dryers. The approach is to model what particles experience in the drying chamber with respect to air temperature and humidity. These histories can be obtained by combining the particle trajectories with the air temperature/humidity pattern in the spray dryer. Results are presented and discussed in terms of the air velocity, temperature, and humidity profiles within the chamber, and are compared for the drying of a 42.5% solids solution in a spray chamber.
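A heavily simplified sketch of the Lagrangian side of such a coupled model is given below: a single droplet is integrated through a prescribed axial air-velocity field under Stokes drag while losing moisture at an assumed first-order rate. The property values, the rate constant, and the air_velocity function are placeholders, not the paper's CFD-derived fields.

```python
import numpy as np

# Illustrative Lagrangian droplet tracking (not the paper's coupled CFD model):
# explicit Euler integration of one droplet through a prescribed air field,
# recording its position/velocity/moisture history.

dt, n_steps = 1e-3, 5000            # time step [s], number of steps
rho_p, d_p = 1100.0, 100e-6         # droplet density [kg/m^3], diameter [m]
mu_air = 1.8e-5                     # air dynamic viscosity [Pa s]
k_dry = 2.0                         # assumed first-order drying-rate constant [1/s]

def air_velocity(z):
    """Prescribed axial air velocity [m/s]; a real model would interpolate CFD data."""
    return 0.5 * np.exp(-z / 1.5)

z, v = 0.0, 10.0                    # droplet axial position [m] and velocity [m/s]
moisture = 0.575                    # initial moisture mass fraction (42.5% solids)
tau = rho_p * d_p**2 / (18.0 * mu_air)   # Stokes relaxation time [s]

history = []
for step in range(n_steps):
    u = air_velocity(z)
    v += dt * (u - v) / tau         # Stokes drag relaxes the droplet toward the air velocity
    z += dt * v
    moisture *= np.exp(-k_dry * dt) # assumed first-order moisture loss
    history.append((step * dt, z, v, moisture))

print(f"Final position {z:.2f} m, residual moisture fraction {moisture:.3f}")
```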
The Gumbel distribution has received considerable attention from researchers and statisticians. There are traditional methods for estimating the two parameters of the Gumbel distribution, namely maximum likelihood, the method of moments, and, more recently, the resampling method known as the Jackknife. However, these methods involve mathematical difficulties when solved analytically. Accordingly, there are other, non-traditional methods, such as the nearest-neighbors principle used in computer science, particularly in artificial intelligence algorithms, including the genetic algorithm, the artificial neural network algorithm, and others that may be classified as meta-heuristic methods. Moreover, this nearest-neighbors principle has useful statistical features.
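For concreteness, the sketch below shows the two textbook baselines mentioned above for the two-parameter Gumbel distribution: the closed-form method-of-moments estimators and a numerical maximum-likelihood fit (via scipy, since the likelihood equations have no closed-form solution). The nearest-neighbor and meta-heuristic alternatives are not reproduced here.

```python
import numpy as np
from scipy.stats import gumbel_r

# Method-of-moments and maximum-likelihood estimates for the
# two-parameter Gumbel (maximum) distribution.

EULER_GAMMA = 0.5772156649  # Euler-Mascheroni constant

def gumbel_method_of_moments(x):
    """Closed-form moment estimators: var = pi^2 * beta^2 / 6, mean = mu + gamma * beta."""
    beta = np.std(x, ddof=1) * np.sqrt(6.0) / np.pi
    mu = np.mean(x) - EULER_GAMMA * beta
    return mu, beta

rng = np.random.default_rng(0)
sample = gumbel_r.rvs(loc=10.0, scale=2.0, size=500, random_state=rng)

mu_mom, beta_mom = gumbel_method_of_moments(sample)
mu_mle, beta_mle = gumbel_r.fit(sample)   # numerical MLE

print(f"Method of moments: mu = {mu_mom:.3f}, beta = {beta_mom:.3f}")
print(f"Maximum likelihood: mu = {mu_mle:.3f}, beta = {beta_mle:.3f}")
```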
Extracting a personal sprite from a whole image faces many problems in separating the sprite edge from the unneeded parts. Some image-editing software attempts to automate this process, but it often fails to find the edge or produces false results. In this paper, the authors enhance the use of Canny edge detection to locate the sprite within the whole image by adding several enhancement steps implemented in MATLAB. Moreover, all non-relevant information is removed from the image by selecting only the sprite and placing it on a transparent background. The results of comparing standard Canny edge detection with the proposed method show an improvement in edge detection.
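The paper's pipeline is implemented in MATLAB; the Python/OpenCV sketch below is only a rough analogue of the general idea (Canny edges, a mask around the largest contour, and an alpha channel for the transparent background), not the authors' specific enhancement steps. The input and output file names are placeholders.

```python
import cv2
import numpy as np

# Rough analogue of Canny-based sprite extraction: detect edges, keep the
# largest closed contour, and place the sprite on a transparent background.

image = cv2.imread("photo.jpg")                      # placeholder input path
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
blurred = cv2.GaussianBlur(gray, (5, 5), 0)          # suppress noise before Canny
edges = cv2.Canny(blurred, 50, 150)

# Close small gaps so the sprite outline forms a single contour.
kernel = np.ones((7, 7), np.uint8)
closed = cv2.morphologyEx(edges, cv2.MORPH_CLOSE, kernel)

contours, _ = cv2.findContours(closed, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
largest = max(contours, key=cv2.contourArea)

mask = np.zeros(gray.shape, np.uint8)
cv2.drawContours(mask, [largest], -1, 255, thickness=cv2.FILLED)

# Transparent background: alpha = 255 inside the sprite, 0 elsewhere.
sprite = cv2.cvtColor(image, cv2.COLOR_BGR2BGRA)
sprite[:, :, 3] = mask
cv2.imwrite("sprite.png", sprite)
```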
A new approach for estimating the baud time (or baud rate) of a random binary signal is presented. This approach utilizes the spectrum of the signal after nonlinear processing, in such a way that the estimation error can be reduced simply by increasing the number of processed samples instead of increasing the sampling rate. The spectrum of the new signal is shown to give an accurate estimate of the baud time when there is no a priori information or any restrictive prior assumptions. The performance of the estimator for random binary square waves perturbed by white Gaussian noise and inter-symbol interference (ISI) is evaluated and compared with that of the conventional zero-crossing detector estimator.
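As an illustration of the general idea (a nonlinearity followed by a spectral line search), the sketch below rectifies the differenced waveform of a random NRZ signal and reads the baud rate off the strongest spectral line; the specific nonlinearity and estimator of the paper are not reproduced, and the signal parameters are made up for the test.

```python
import numpy as np

# Illustrative baud-rate estimation: a nonlinearity (rectified differencing)
# turns symbol transitions into a pulse train, and the strongest line in its
# smoothed spectrum sits at the baud rate.

rng = np.random.default_rng(1)
fs = 960_000                   # sampling rate [Hz]
true_baud = 9600               # true baud rate [Hz] (used only to synthesize the test signal)
n_symbols = 2000

bits = rng.integers(0, 2, n_symbols)
signal = np.repeat(2.0 * bits - 1.0, fs // true_baud)     # NRZ +/-1 waveform
signal += 0.1 * rng.standard_normal(signal.size)          # additive white Gaussian noise

pulses = np.abs(np.diff(signal))                          # nonlinearity: transitions -> pulses
window = np.ones(25) / 25.0                               # short smoothing window (<< symbol period)
pulses = np.convolve(pulses, window, mode="same")         # suppress harmonics of the baud line

spectrum = np.abs(np.fft.rfft(pulses))
freqs = np.fft.rfftfreq(pulses.size, d=1.0 / fs)

search = freqs > 500.0                                    # skip the DC/low-frequency region
estimated_baud = freqs[search][np.argmax(spectrum[search])]
print(f"Estimated baud rate: {estimated_baud:.1f} Hz (true: {true_baud} Hz)")
```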
In this paper, we introduce three robust fuzzy estimators of a location parameter, based on Buckley's approach, in the presence of outliers. These estimators were compared using the variance-of-fuzzy-numbers criterion, and all of them performed better than Buckley's estimate. Among them, the fuzzy median was the best for small and medium sample sizes, whereas the fuzzy trimmed mean was the best for large sample sizes.
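The Buckley-style fuzzy estimators themselves are not reproduced here, but the crisp (non-fuzzy) sketch below illustrates the underlying robustness comparison: on data contaminated with outliers, the median and trimmed mean stay close to the bulk of the data while the plain mean is pulled away. The sample sizes and contamination level are arbitrary.

```python
import numpy as np
from scipy.stats import trim_mean

# Crisp illustration of the robustness comparison: the median and trimmed
# mean resist outliers better than the ordinary sample mean.

rng = np.random.default_rng(2)
clean = rng.normal(loc=5.0, scale=1.0, size=95)
outliers = rng.normal(loc=50.0, scale=5.0, size=5)
sample = np.concatenate([clean, outliers])

print(f"Mean:             {np.mean(sample):.3f}")
print(f"Median:           {np.median(sample):.3f}")
print(f"10% trimmed mean: {trim_mean(sample, 0.10):.3f}")
```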
Electrocoagulation is an electrochemical process for treating polluted water in which a sacrificial anode corrodes to release an active coagulant (usually aluminum or iron cations) into solution, while the accompanying electrolytic reactions evolve gas (usually hydrogen bubbles). The present study investigates the removal of phenol from water by this method. A 1-liter glass tank with two electrodes was used to perform the experiments, with the electrodes connected to a DC power supply. The effects of various factors on phenol removal (initial phenol concentration, electrode size, electrode gap, current density, pH, and treatment time) were studied. The results indicated that the removal efficiency decreased as the initial phenol concentration increased.
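The abstract does not give a dose calculation, but a common companion to such experiments is the theoretical coagulant release from the sacrificial anode via Faraday's law. The sketch below computes it assuming an aluminum anode; the current and treatment time are illustrative placeholders, and only the 1-liter cell volume comes from the abstract.

```python
# Theoretical coagulant dose from a sacrificial aluminum anode (Faraday's law):
# m = I * t * M / (z * F). Operating values below are illustrative placeholders.

FARADAY = 96485.0      # C/mol
M_AL = 26.98           # g/mol, molar mass of aluminum
Z_AL = 3               # electrons transferred per Al3+ ion

def aluminum_dose_mg_per_l(current_a, time_s, volume_l):
    """Mass of Al released by electrolysis, expressed as mg per liter of treated water."""
    mass_g = current_a * time_s * M_AL / (Z_AL * FARADAY)
    return 1000.0 * mass_g / volume_l

# Example: 0.5 A applied for 30 minutes in a 1-liter cell.
print(f"Theoretical Al dose: {aluminum_dose_mg_per_l(0.5, 30 * 60, 1.0):.1f} mg/L")
```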
The research aimed to: 1- identify perceived self-efficacy among university students; 2- determine the statistical significance of differences in perceived self-efficacy according to gender and specialization. The research sample consisted of (300) students chosen randomly from the original research population: (150) males from scientific and humanities specializations and (150) females from scientific and humanities specializations. As a research tool, a scale of perceived self-efficacy was prepared based on previous measures and literature on the subject. The researcher used a number of statistical methods, including the t-test and two-way analysis of variance, and the results showed that the research sample possessed perceived self-efficacy.