In this research, several estimators of the hazard function are introduced, based on the nonparametric kernel method for censored data with varying bandwidth and boundary kernels. Two types of bandwidth are used: local bandwidth and global bandwidth. Four types of boundary kernel are used, namely Rectangle, Epanechnikov, Biquadratic and Triquadratic, and the proposed function is employed with all of these kernel functions. Two different simulation techniques are also used in two experiments to compare the estimators. In most cases, the results show that the local bandwidth is the best for all types of boundary kernel function and suggest that the 2xRectangle and 2xEpanechnikov methods give the best results compared with the other estimators.
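For reference, the following is a minimal Python sketch of the basic smoothed (Ramlau-Hansen type) kernel hazard estimator for right-censored data with an Epanechnikov kernel and a single global bandwidth; the paper's boundary corrections, local-bandwidth rule and proposed kernel are not reproduced here, and the function names and toy data are illustrative only.

```python
import numpy as np

def epanechnikov(u):
    """Epanechnikov kernel, supported on [-1, 1]."""
    return np.where(np.abs(u) <= 1, 0.75 * (1 - u**2), 0.0)

def kernel_hazard(t_grid, times, events, bandwidth):
    """Kernel-smoothed hazard estimate from right-censored data:
    the Nelson-Aalen increments d_i / n_i at the ordered observed
    times are smoothed with the kernel.

    times  : observed times (event or censoring)
    events : 1 if the observation is an event, 0 if censored
    """
    order = np.argsort(times)
    t_sorted = np.asarray(times)[order]
    d_sorted = np.asarray(events)[order]
    n = len(t_sorted)
    at_risk = n - np.arange(n)          # number still at risk just before t_(i)
    increments = d_sorted / at_risk     # Nelson-Aalen jump at each observed time

    t_grid = np.asarray(t_grid, dtype=float)
    u = (t_grid[:, None] - t_sorted[None, :]) / bandwidth
    return (epanechnikov(u) * increments[None, :]).sum(axis=1) / bandwidth

# toy usage with a global bandwidth; a local bandwidth would simply replace
# `bandwidth` by an array that varies with each observed time
rng = np.random.default_rng(0)
t = rng.exponential(1.0, 200)
c = rng.exponential(1.5, 200)
obs, delta = np.minimum(t, c), (t <= c).astype(int)
grid = np.linspace(0.1, 2.0, 50)
print(kernel_hazard(grid, obs, delta, bandwidth=0.4)[:5])
```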
The main idea of this paper is to define further types of fuzzy local function, to study the advantages of and differences between them, and to discuss some definitions that lead to new fuzzy topologies. In addition, a new type of fuzzy closure is defined, and the relation between this new type and the different types of fuzzy local function is studied.
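For orientation, the LaTeX fragment below sketches the classical (crisp) local function and *-closure on which the fuzzy notions are modeled; the paper's specific fuzzy definitions are not reproduced here.

```latex
% Classical (crisp) background for local functions.
% For an ideal topological space $(X,\tau,\mathcal{I})$ and $A \subseteq X$:
\[
  A^{*}(\mathcal{I},\tau)
    \;=\;
  \{\, x \in X \;:\; U \cap A \notin \mathcal{I}
        \text{ for every open } U \ni x \,\},
\]
\[
  \operatorname{cl}^{*}(A) \;=\; A \cup A^{*}(\mathcal{I},\tau).
\]
% $\operatorname{cl}^{*}$ is a Kuratowski closure operator and hence
% generates a new, finer topology $\tau^{*}$ on $X$; the fuzzy variants
% replace sets by fuzzy sets and open neighborhoods by fuzzy neighborhoods.
```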
In this research, the focus is on estimating the parameters of the min-Gumbel distribution using the maximum likelihood method and the Bayes method. A genetic algorithm was employed to obtain the estimates for both the maximum likelihood method and the Bayes method. The comparison was made using the mean squared error (MSE), where the best estimator is the one with the smallest mean squared error. It was noted that the best estimator was (BLG_GE).
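The abstract does not give the likelihood or the genetic-algorithm settings, so the Python sketch below only illustrates the general idea: a small genetic algorithm minimizing the negative log-likelihood of the min-Gumbel (Gumbel-for-minima) distribution. All names, bounds and tuning constants are assumptions; the Bayes estimator (BLG_GE) is not reproduced.

```python
import numpy as np

def min_gumbel_negloglik(params, x):
    """Negative log-likelihood of the Gumbel (minimum) distribution."""
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    z = (x - mu) / sigma
    return -np.sum(z - np.exp(z)) + len(x) * np.log(sigma)

def genetic_mle(x, bounds, pop_size=60, generations=200, rng=None):
    """Very small genetic algorithm: tournament selection, blend crossover,
    Gaussian mutation and elitism; minimizes the negative log-likelihood."""
    rng = np.random.default_rng(rng)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    pop = rng.uniform(lo, hi, size=(pop_size, len(bounds)))
    for _ in range(generations):
        fit = np.array([min_gumbel_negloglik(p, x) for p in pop])
        elite = pop[np.argmin(fit)]
        # tournament selection of parents
        i, j = rng.integers(pop_size, size=(2, pop_size))
        parents = np.where((fit[i] < fit[j])[:, None], pop[i], pop[j])
        # blend crossover followed by Gaussian mutation
        mates = parents[rng.permutation(pop_size)]
        w = rng.uniform(size=(pop_size, 1))
        children = w * parents + (1 - w) * mates
        children += rng.normal(0, 0.05 * (hi - lo), children.shape)
        children = np.clip(children, lo, hi)
        children[0] = elite                      # keep the best individual
        pop = children
    fit = np.array([min_gumbel_negloglik(p, x) for p in pop])
    return pop[np.argmin(fit)]

# toy usage: simulate min-Gumbel data (negated max-Gumbel) and recover (mu, sigma)
rng = np.random.default_rng(1)
data = -rng.gumbel(loc=-2.0, scale=0.5, size=500)
print(genetic_mle(data, bounds=[(-5, 5), (0.01, 5)], rng=1))
```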
In this paper, the maximum likelihood estimates for the parameter ( ) of the two-parameter Weibull distribution are studied, as well as the White estimator and the Bain & Antle estimator, in addition to the Bayes estimator for the scale parameter ( ). Simulation procedures are used to obtain the estimators and to compare them using MSE. An application is also carried out on data for 20 patients suffering from headache disease.
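As a hedged illustration of the simulation design (Monte Carlo comparison of estimators by MSE), the sketch below compares the maximum likelihood estimator of the Weibull shape parameter with a generic probability-plot regression estimator; the exact White and Bain & Antle estimators and the Bayes estimator are not reproduced, and the parameter values and sample size are illustrative.

```python
import numpy as np
from scipy import stats

def mle_weibull(x):
    """Maximum likelihood estimates (shape, scale) with location fixed at 0."""
    c, loc, scale = stats.weibull_min.fit(x, floc=0)
    return c, scale

def regression_weibull(x):
    """Least-squares estimate from the Weibull probability plot
    (a stand-in for regression-type estimators; the exact White and
    Bain & Antle forms are not reproduced here)."""
    x = np.sort(x)
    n = len(x)
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)     # median-rank plotting positions
    y = np.log(-np.log(1.0 - F))
    slope, intercept = np.polyfit(np.log(x), y, 1)   # y = c*log(t) - c*log(scale)
    return slope, np.exp(-intercept / slope)

def mse_comparison(true_c=1.5, true_scale=2.0, n=20, reps=1000, seed=0):
    """Monte Carlo MSE of the shape-parameter estimators."""
    rng = np.random.default_rng(seed)
    est = {"MLE": [], "regression": []}
    for _ in range(reps):
        x = true_scale * rng.weibull(true_c, size=n)
        est["MLE"].append(mle_weibull(x)[0])
        est["regression"].append(regression_weibull(x)[0])
    return {k: np.mean((np.array(v) - true_c) ** 2) for k, v in est.items()}

print(mse_comparison())
```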
This paper deals with the modeling of a preventive maintenance strategy applied to a single-unit system subject to random failures. According to this policy, the system is subjected to imperfect periodic preventive maintenance restoring it to ‘as good as new’ with probability p and leaving it in the state ‘as bad as old’ with probability q. Imperfect repairs are performed following failures occurring between consecutive preventive maintenance actions, i.e. the times between failures follow a decreasing quasi-renewal process with parameter a. Considering the average durations of the preventive and corrective maintenance actions a …
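For readers unfamiliar with the terminology, the LaTeX fragment below sketches the standard (Wang-Pham type) decreasing quasi-renewal assumption and the (p, q) imperfect-maintenance assumption mentioned above; the paper's own cost model is not reproduced, since the abstract is truncated.

```latex
% Decreasing quasi-renewal process: the successive times between
% failures satisfy, in distribution,
\[
  X_{n} \;\stackrel{d}{=}\; a^{\,n-1} X_{1},
  \qquad 0 < a < 1 ,
\]
% so failures become more frequent as the system ages.
% Imperfect periodic PM: each PM action restores the unit to
% ``as good as new'' with probability $p$ and leaves it
% ``as bad as old'' with probability $q$, where
\[
  p + q = 1 .
\]
```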
Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all the data into a graph concept frame (CFG). As is well known, DBSCAN groups data points of the same kind into clusters, while points that fall outside the behaviour of any cluster are treated as noise or anomalies. DBSCAN can thus detect abnormal points that lie farther than a set threshold from the clusters (extreme values). However, anomalies are not only those cases that are unusual or far from a specific group; there is also a type of data that does not occur repeatedly but is nevertheless abnormal with respect to the known group. The analysis showed that DBSCAN using the …
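A minimal Python sketch of the noise-as-anomaly idea is given below, using scikit-learn's DBSCAN; the conversion of the data into the graph concept frame (CFG) described above is not reproduced, and the parameters eps and min_samples are illustrative.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def dbscan_anomalies(X, eps=0.5, min_samples=5):
    """Flag as anomalies the points DBSCAN leaves outside every cluster.

    DBSCAN labels such points -1 (noise); here they are reported as the
    detected anomalies. X is assumed to be an (n_samples, n_features)
    array already derived from the data representation.
    """
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(X)
    return np.flatnonzero(labels == -1), labels

# toy usage: two dense blobs plus a few isolated points
rng = np.random.default_rng(0)
X = np.vstack([
    rng.normal(0, 0.3, (100, 2)),
    rng.normal(5, 0.3, (100, 2)),
    rng.uniform(-3, 8, (5, 2)),        # likely anomalies
])
anomalies, labels = dbscan_anomalies(X, eps=0.6, min_samples=5)
print("anomalous indices:", anomalies)
```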
... Show MoreData centric techniques, like data aggregation via modified algorithm based on fuzzy clustering algorithm with voronoi diagram which is called modified Voronoi Fuzzy Clustering Algorithm (VFCA) is presented in this paper. In the modified algorithm, the sensed area divided into number of voronoi cells by applying voronoi diagram, these cells are clustered by a fuzzy C-means method (FCM) to reduce the transmission distance. Then an appropriate cluster head (CH) for each cluster is elected. Three parameters are used for this election process, the energy, distance between CH and its neighbor sensors and packet loss values. Furthermore, data aggregation is employed in each CH to reduce the amount of data transmission which le