In this paper, we investigate two stress-strength models (Bounded and Series) for system reliability based on the generalized inverse Rayleigh distribution. Shrinkage estimators are obtained using Bayesian methods under informative and non-informative prior assumptions. The presented methods are compared through Monte Carlo simulation using the mean squared error (MSE) criterion.
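For context, the quantity estimated in such models is the classical stress-strength reliability. A minimal statement in LaTeX, assuming independent stress Y and strength X; the series form shown is one standard formulation and is an assumption, since the abstract does not spell out its exact setup:

% Single-component stress-strength reliability:
R = P(Y < X) = \int_0^{\infty} F_Y(x)\, f_X(x)\, dx

% Series system of k independent components under a common stress Y:
R_s = P\bigl(Y < \min(X_1,\dots,X_k)\bigr)
    = \int_0^{\infty} \Bigl[\prod_{i=1}^{k} \bar{F}_{X_i}(y)\Bigr] f_Y(y)\, dy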
In this paper, we derive an estimator of the reliability function of the two-parameter Laplace distribution using the Bayes method with a squared error loss function, Jeffreys' prior, and the conditional probability of the observed random variable. The main objective of this study is to assess the efficiency of the derived Bayes estimator relative to the maximum likelihood and moment estimators of this function, using Monte Carlo simulation under different Laplace distribution parameters and sample sizes. The results show that the Bayes estimator is more efficient than the maximum likelihood and moment estimators at all sample sizes.
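As an illustration of the comparison framework only (the paper's derived Bayes estimator is not reproduced in the abstract), a minimal Monte Carlo sketch in Python that measures the MSE of the plug-in maximum likelihood estimator of the Laplace reliability function; the parameter values and evaluation point t are arbitrary assumptions:

import numpy as np

rng = np.random.default_rng(0)

def laplace_reliability(t, mu, b):
    # R(t) = P(T > t) for a Laplace(mu, b) variable
    if t < mu:
        return 1.0 - 0.5 * np.exp((t - mu) / b)
    return 0.5 * np.exp(-(t - mu) / b)

def mle_estimates(x):
    # Laplace MLEs: location = sample median,
    # scale = mean absolute deviation about the median
    mu_hat = np.median(x)
    return mu_hat, np.mean(np.abs(x - mu_hat))

def mc_mse(n, mu=0.0, b=1.0, t=1.0, reps=5000):
    # Monte Carlo MSE of the plug-in MLE reliability estimator at t
    true_r = laplace_reliability(t, mu, b)
    errs = []
    for _ in range(reps):
        sample = rng.laplace(mu, b, size=n)
        mu_hat, b_hat = mle_estimates(sample)
        errs.append((laplace_reliability(t, mu_hat, b_hat) - true_r) ** 2)
    return np.mean(errs)

for n in (10, 30, 100):
    print(n, mc_mse(n))  # MSE should shrink as the sample size grows

The paper's Bayes estimator would slot in alongside mle_estimates for a side-by-side MSE comparison.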
A non-parametric kernel method with the bootstrap technique was used to estimate confidence intervals for the system failure function of data following a log-normal distribution: the failure times of the machines of the spinning department of a weaving company in Wasit Governorate. The failure function was also estimated parametrically using the maximum likelihood estimator (MLE). The parametric and non-parametric methods were compared using the mean squared error (MSE) criterion. The bootstrap-based non-parametric method proved more efficient than the parametric method, and its estimated curve is more realistic and appropriate for the real data.
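A minimal sketch of the non-parametric side in Python, assuming a Gaussian smoothing kernel and percentile bootstrap intervals (the abstract specifies neither choice); the function names and bandwidth are illustrative:

import numpy as np
from scipy.stats import norm

def kernel_cdf(t_grid, data, h):
    # Smoothed empirical failure function F(t) = P(T <= t):
    # average of Gaussian kernel CDFs centered at the observations
    return norm.cdf((t_grid - data[:, None]) / h).mean(axis=0)

def bootstrap_ci(t_grid, data, h, B=1000, alpha=0.05, seed=0):
    # Percentile bootstrap confidence band for the kernel estimate
    rng = np.random.default_rng(seed)
    boots = np.empty((B, t_grid.size))
    for i in range(B):
        resample = rng.choice(data, size=data.size, replace=True)
        boots[i] = kernel_cdf(t_grid, resample, h)
    return (np.quantile(boots, alpha / 2, axis=0),
            np.quantile(boots, 1 - alpha / 2, axis=0))

# Example on synthetic log-normal failure times
times = np.random.default_rng(1).lognormal(mean=1.0, sigma=0.5, size=80)
grid = np.linspace(times.min(), times.max(), 50)
lo, hi = bootstrap_ci(grid, times, h=0.3)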
A roundabout is a highway engineering treatment meant to calm traffic, improve safety, reduce stop-and-go travel, and decrease accidents, congestion, and delay. It is circular and channels one-way traffic flow around a central island. The first part of this study evaluated the principles and methods used to compare roundabout capacity estimates under different traffic conditions and geometric configurations. These methods include gap acceptance, empirical, and simulation software approaches. The previous studies reviewed here used these methods as well as new models developed by several researchers. This paper's main aim, however, is to compare different roundabout capacity models.
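For reference, gap-acceptance capacity models typically take a form like Siegloch's formula, quoted here in LaTeX as one widely used example (the abstract does not name the specific models compared):

% Siegloch gap-acceptance capacity of an entry (veh/h), given a
% conflicting circulating flow q_c (veh/h), critical gap t_c (s),
% and follow-up time t_f (s):
C = \frac{3600}{t_f}\, \exp\!\left(-\frac{q_c\, t_0}{3600}\right),
\qquad t_0 = t_c - \frac{t_f}{2}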
In this work, a polyvinylpyrrolidone (PVP)/multi-walled carbon nanotube (MWCNT) nanocomposite was prepared and hybridized with graphene (Gr) by the casting method. The morphological and optical properties were investigated. Fourier transform infrared (FT-IR) spectroscopy shows the primary distinctive peaks of the vibration groups that characterize the prepared samples. Scanning electron microscopy (SEM) images showed a uniform dispersion of graphene within the PVP-MWCNT nanocomposite. The optical study shows a decrease in the energy gap with increasing MWCNT and graphene concentration. The absorption coefficient spectra show two absorption peaks at 282 and 287 nm attributed to π-π* electronic transitions.
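The abstract does not state how the energy gap was extracted; a common route in such optical studies is the Tauc relation, quoted here in LaTeX as an assumed example:

% Tauc relation linking the absorption coefficient \alpha to the
% optical band gap E_g (n = 1/2 for direct allowed transitions,
% n = 2 for indirect allowed transitions):
(\alpha h\nu)^{1/n} = B\,(h\nu - E_g)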
Big data analysis has important applications in many areas such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms.
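As a rough illustration of entropy discretization in Python (a sketch of the standard information-gain cut-point search, not the paper's summarization-based algorithm; the function names are hypothetical):

import numpy as np

def entropy(labels):
    # Shannon entropy of the class distribution, in bits
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def best_cut(values, labels):
    # Scan candidate boundaries and keep the cut that minimizes the
    # weighted class entropy of the two resulting intervals
    order = np.argsort(values)
    v, y = values[order], labels[order]
    best_point, best_score = None, np.inf
    for i in range(1, len(v)):
        if v[i] == v[i - 1]:
            continue  # no boundary between equal values
        score = (i * entropy(y[:i]) + (len(y) - i) * entropy(y[i:])) / len(y)
        if score < best_score:
            best_point, best_score = (v[i - 1] + v[i]) / 2, score
    return best_point

x = np.array([1.0, 1.5, 2.0, 6.0, 7.0, 8.0])
y = np.array([0, 0, 0, 1, 1, 1])
print(best_cut(x, y))  # -> 4.0, separating the two classes cleanly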
Big data analysis is essential for modern applications in areas such as healthcare, assistive technology, intelligent transportation, and environment and climate monitoring. Traditional data mining and machine learning algorithms do not scale well with data size. Mining and learning from big data require time- and memory-efficient techniques, albeit at the cost of a possible loss in accuracy. We have developed a data aggregation structure to summarize data with a large number of instances and data generated from multiple sources. Data are aggregated at multiple resolutions, and each resolution provides a trade-off between efficiency and accuracy. The structure is built once, updated incrementally, and serves as a common data input for multiple mining and learning algorithms.
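A minimal Python sketch of the multi-resolution aggregation idea, assuming one-dimensional data and bins holding additive sufficient statistics; the class name and layout are illustrative, not the paper's structure:

import numpy as np

class MultiResSummary:
    # Fine bins hold additive sufficient statistics (count, sum, sum of
    # squares), so coarser resolutions are obtained by merging bins
    # without revisiting the raw data.
    def __init__(self, lo, hi, n_bins=1024):
        self.lo, self.hi, self.n = lo, hi, n_bins
        self.stats = np.zeros((n_bins, 3))  # per-bin count, sum, sum_sq

    def update(self, batch):
        # Incremental update: new data fold into the existing bins
        idx = ((batch - self.lo) / (self.hi - self.lo) * self.n).astype(int)
        idx = np.clip(idx, 0, self.n - 1)
        np.add.at(self.stats[:, 0], idx, 1)
        np.add.at(self.stats[:, 1], idx, batch)
        np.add.at(self.stats[:, 2], idx, batch ** 2)

    def at_resolution(self, factor):
        # Merge groups of `factor` adjacent bins into one coarser bin
        return self.stats.reshape(self.n // factor, factor, 3).sum(axis=1)

s = MultiResSummary(0.0, 10.0, n_bins=8)
s.update(np.random.default_rng(2).uniform(0, 10, size=10_000))
coarse = s.at_resolution(4)  # 2 coarse bins, same totals as the 8 fine ones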