Big data analysis has important applications in many areas, such as sensor networks and connected healthcare. The high volume and velocity of big data pose many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms, such as decision trees and nearest neighbor search. The proposed method can handle streaming data efficiently and, for entropy discretization, provides the optimal split value.
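The core of entropy discretization — scanning candidate cut points and choosing the one that minimizes the weighted class entropy of the two resulting partitions — can be sketched as follows. This is a minimal illustration on raw values, not the paper's multi-resolution summarization structure, and the function names are ours:

```python
import math
from collections import Counter

def entropy(labels):
    # Shannon entropy (base 2) of a list of class labels.
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_entropy_split(values, labels):
    # Scan cut points between adjacent distinct sorted values and return
    # (weighted entropy, split value) for the cut minimizing the weighted
    # class entropy of the two halves.
    pairs = sorted(zip(values, labels))
    best = (float("inf"), None)
    for i in range(1, len(pairs)):
        if pairs[i - 1][0] == pairs[i][0]:
            continue  # no valid cut between equal values
        left = [l for _, l in pairs[:i]]
        right = [l for _, l in pairs[i:]]
        w = (len(left) * entropy(left) + len(right) * entropy(right)) / len(pairs)
        cut = (pairs[i - 1][0] + pairs[i][0]) / 2
        if w < best[0]:
            best = (w, cut)
    return best
```

For example, values `[1, 2, 8, 9]` with labels `['a', 'a', 'b', 'b']` yield the cut 5.0 with weighted entropy 0, since each half becomes pure.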
In this work, satellite image classification of the Al Chabaish marshes and the surrounding district in Dhi Qar province for the years 1990, 2000, and 2015, using two software packages (MATLAB 7.11 and ERDAS Imagine 2014), is presented. A proposed supervised classification method (Modified Vector Quantization) implemented in MATLAB and a supervised classification method (Maximum Likelihood Classifier) in ERDAS Imagine were used, in order to obtain the most accurate results and to compare these methods. The changes that took place in 2000 compared with 1990, and in 2015 compared with 2000, were calculated. The classification results indicated that water and vegetation decreased, while barren land, alluvial soil, and shallow water increased.
Many of the dynamic processes in different sciences are described by models of differential equations. These models explain the change in the behavior of the studied process over time by linking the behavior of the process with its derivatives. These models often contain constant and time-varying parameters that vary according to the nature of the process under study. In this work we estimate the constant and time-varying parameters with a sequential method in several stages. In the first stage, the state variables and their derivatives are estimated by the method of penalized splines (P-splines). In the second stage, pseudo least squares is used to estimate the constant parameters. For the third stage, the rem
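A first-stage smoother of the kind described can be sketched as below. This is a simplified stand-in (a truncated-power spline basis with a ridge penalty on the coefficients) rather than a full P-spline with B-splines and difference penalties; all names and default values are illustrative assumptions:

```python
import numpy as np

def p_spline_smooth(x, y, n_knots=10, penalty=1.0):
    # Stage-1 sketch: smooth noisy state observations by penalized least
    # squares on a cubic truncated-power basis. The penalized normal
    # equations are (B'B + penalty*I) c = B'y.
    knots = np.linspace(x.min(), x.max(), n_knots)
    B = np.column_stack([np.ones_like(x), x] +
                        [np.maximum(x - k, 0.0) ** 3 for k in knots])
    coef = np.linalg.solve(B.T @ B + penalty * np.eye(B.shape[1]), B.T @ y)
    return B @ coef
```

The smoothed curve (and, with the basis differentiated, its derivative) would then feed the later estimation stages.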
In this paper, we investigate the behavior of the Bayes estimators for the scale parameter of the Gompertz distribution under two different loss functions, the squared error loss function and the exponential loss function (proposed), based on different double prior distributions, represented as Erlang with inverse Levy prior, Erlang with non-informative prior, inverse Levy with non-informative prior, and Erlang with chi-square prior.
A simulation study was carried out to obtain the results, including the estimated values and the mean square error (MSE) for the scale parameter of the Gompertz distribution, for different cases of the scale parameter of the Gompertz distr
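A Monte Carlo study of this kind — repeatedly simulating Gompertz samples and averaging the squared estimation error — can be sketched as below. For brevity the sketch plugs in the closed-form MLE of the scale parameter (with the shape known) rather than a Bayes estimator; the parameterization F(x) = 1 − exp(−η(e^{bx} − 1)) and all function names are our assumptions:

```python
import numpy as np

def sample_gompertz(eta, b, size, rng):
    # Inverse-CDF sampling for F(x) = 1 - exp(-eta*(exp(b*x) - 1)).
    u = rng.uniform(size=size)
    return np.log(1.0 - np.log(1.0 - u) / eta) / b

def mle_eta(x, b):
    # Closed-form MLE of the scale eta when the shape b is known:
    # eta_hat = n / sum(exp(b*x) - 1).
    return len(x) / np.sum(np.exp(b * x) - 1.0)

def mc_mse(estimator, eta_true, b, n, reps, rng):
    # Monte Carlo estimate of the MSE of an estimator of eta.
    est = np.array([estimator(sample_gompertz(eta_true, b, n, rng), b)
                    for _ in range(reps)])
    return np.mean((est - eta_true) ** 2)

rng = np.random.default_rng(0)
mse = mc_mse(mle_eta, eta_true=2.0, b=1.0, n=50, reps=2000, rng=rng)
```

Replacing `mle_eta` with a posterior-mean (or other Bayes) estimator gives the comparisons reported in the paper.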
Abstract
We present a study on the estimation of the reliability of the exponential distribution based on the Bayesian approach. In the Bayesian approach, the parameter of the exponential distribution is assumed to be a random variable. We derive Bayes estimators of reliability under four types, where the prior distribution for the scale parameter of the exponential distribution is: inverse chi-squar
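For reference, the reliability function being estimated is R(t) = exp(−λt) for the exponential distribution with rate λ; in the Bayesian setting λ is random and R(t) is estimated through its posterior. A small sketch, using a Gamma posterior for λ as a standard conjugate illustration (the paper's priors differ, so this is only illustrative):

```python
import math

def exp_reliability(lam, t):
    # Reliability (survival) function of the exponential distribution.
    return math.exp(-lam * t)

def bayes_reliability_gamma(alpha, beta, t):
    # Bayes estimator of R(t) under squared-error loss when the posterior
    # of the rate lambda is Gamma(alpha, beta) (rate parameterization):
    # E[exp(-lambda*t)] = (beta / (beta + t)) ** alpha  (Gamma MGF at -t).
    return (beta / (beta + t)) ** alpha
```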
Maximum likelihood estimation, uniformly minimum variance unbiased estimation, and minimum mean square error estimation, as classical estimation procedures, are frequently used for parameter estimation in statistics; they assume the parameter is constant. The Bayes method, by contrast, assumes the parameter is a random variable, and the Bayes estimator is the one that minimizes the Bayes risk for each value of the random observable; for the squared error loss function, the Bayes estimator is the posterior mean. It is well known that Bayesian estimation is hardly used as a parameter estimation technique due to some difficulties in finding a prior distribution.
The interest of this paper is that
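The statement that the Bayes estimator under squared-error loss is the posterior mean can be made concrete with the conjugate Beta-Binomial case (our example, not from the paper):

```python
def posterior_mean_beta_binomial(a, b, k, n):
    # With a Beta(a, b) prior on a success probability and k successes
    # observed in n Bernoulli trials, the posterior is Beta(a+k, b+n-k).
    # Under squared-error loss the Bayes estimator is its mean:
    return (a + k) / (a + b + n)
```

With a uniform Beta(1, 1) prior and 7 successes in 10 trials, the estimate is 8/12, pulled slightly toward the prior mean 1/2 relative to the MLE 7/10.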
In this paper, we used four classification methods to classify objects and compared among them: K-Nearest Neighbors (KNN), Stochastic Gradient Descent learning (SGD), Logistic Regression (LR), and Multi-Layer Perceptron (MLP). We used the MCOCO dataset for classification and detection of objects; the dataset images were randomly divided into training and testing datasets at a ratio of 7:3, respectively. For the randomly selected training and testing images, the color images were converted to gray level, then enhanced using the histogram equalization method and resized to 20 x 20. Principal component analysis (PCA) was used for feature extraction, and finally we apply the four classification metho
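The preprocessing chain described — grayscale conversion, histogram equalization, 20 x 20 resizing, then PCA feature extraction — can be sketched in NumPy as follows. This is a minimal stand-in (resizing and the four classifiers themselves are omitted, and the image is assumed non-constant); all function names are ours:

```python
import numpy as np

def hist_equalize(gray):
    # Histogram equalization of an 8-bit grayscale image via its CDF:
    # remap intensities so the cumulative histogram becomes ~linear.
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = hist.cumsum().astype(float)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())
    return (cdf[gray] * 255).astype(np.uint8)

def pca_fit_transform(X, k):
    # Project flattened images (rows of X) onto the top-k principal
    # components, computed from the SVD of the centered data.
    Xc = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ vt[:k].T
```

Each 20 x 20 equalized image would be flattened to a 400-vector before the PCA step, and the k-dimensional projections fed to the classifiers.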
To accommodate utilities in buildings, openings of different sizes are provided in the web of reinforced concrete deep beams, which causes reductions in the beam's strength and stiffness. This paper aims to investigate, experimentally and numerically, the effectiveness of using carbon fiber reinforced polymer (CFRP) strips as a strengthening technique to externally strengthen reinforced concrete continuous deep beams (RCCDBs) with large openings. The experimental work included testing three RCCDBs under five-point bending. A reference specimen was prepared without openings to explore the reductions in strength and stiffness after providing large openings. Openings were created symmetrically at the center of the spans of the other specimens
Authentication is the process of determining whether someone or something is, in fact, who or what it is declared to be. As dependence upon computers and computer networks grows, the need for user authentication has increased. A user's claimed identity can be verified by one of several methods. One of the most popular of these methods is (something the user knows), such as a password or Personal Identification Number (PIN). Biometrics is the science and technology of authentication by identifying a living individual's physiological or behavioral attributes. Keystroke authentication is a behavioral access control system that identifies legitimate users via their typing behavior. The objective of this paper is to provide user
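Typing behavior is commonly summarized by dwell times (how long each key is held) and flight times (the gap between releasing one key and pressing the next); a toy feature extractor along those lines (illustrative names only, not the paper's system):

```python
import statistics

def keystroke_features(events):
    # events: list of (key, press_time, release_time) tuples in seconds,
    # in typing order. Returns (mean dwell time, mean flight time) as a
    # simple typing-rhythm signature to compare against an enrolled profile.
    dwell = [release - press for _, press, release in events]
    flight = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    return statistics.mean(dwell), statistics.mean(flight)
```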
Change detection is a technique for ascertaining the changes of specific features within a certain time interval. The use of remotely sensed images to detect changes in land use and land cover is widely preferred over other conventional survey techniques because this method is very efficient for assessing the change or degradation trends of a region. In this research, two remotely sensed images of Baghdad city gathered by Landsat-7 and Landsat-8 ETM+ for the two time periods 2000 and 2014 have been used to detect the most important changes. Registration and rectification of the two original images are the first preprocessing steps applied in this paper. Change detection using NDVI subtraction has been computed, subtrac
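NDVI-subtraction change detection of the kind described can be sketched per pixel as follows; the band arrays and the 0.2 change threshold are illustrative assumptions:

```python
import numpy as np

def ndvi(nir, red, eps=1e-6):
    # Normalized Difference Vegetation Index per pixel; eps guards
    # against division by zero over dark pixels.
    return (nir - red) / (nir + red + eps)

def ndvi_change(nir_t1, red_t1, nir_t2, red_t2, threshold=0.2):
    # Subtractive change detection between two dates: pixels whose
    # NDVI difference exceeds |threshold| are flagged as changed.
    diff = ndvi(nir_t2, red_t2) - ndvi(nir_t1, red_t1)
    return diff, np.abs(diff) > threshold
```

Applied to the co-registered 2000 and 2014 scenes, the flagged mask highlights vegetation gain or loss.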
Prediction of accurate values of residual entropy (SR) is a necessary step for the calculation of entropy. In this paper, different equations of state were tested against the available 2791 experimental data points of 20 pure superheated vapor compounds (14 pure nonpolar compounds + 6 pure polar compounds). The Average Absolute Deviation (AAD) for SR of the 2791 experimental data points of all 20 pure compounds (nonpolar and polar) when using the equations of Lee-Kesler, Peng-Robinson, Virial truncated to the second and to the third term, and Soave-Redlich-Kwong were 4.0591, 4.5849, 4.9686, 5.0350, and 4.3084 J/mol.K, respectively. It was found from these results that the Lee-Kesler equation was the best (most accurate) one
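The AAD figure of merit used above to rank the equations of state is simply the mean absolute difference between predicted and experimental SR values (function name ours):

```python
import numpy as np

def aad(predicted, experimental):
    # Average Absolute Deviation between predicted and experimental
    # residual entropy values, in the same units (here J/mol.K).
    predicted = np.asarray(predicted, dtype=float)
    experimental = np.asarray(experimental, dtype=float)
    return np.mean(np.abs(predicted - experimental))
```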