This paper deals with Bayesian estimation of the parameters of the Gamma distribution under a generalized weighted loss function, based on Gamma and Exponential priors for the shape and scale parameters, respectively. Moment and maximum likelihood estimators and Lindley's approximation have been used in the Bayesian estimation. Based on a Monte Carlo simulation, these estimators are compared in terms of their mean squared errors (MSEs).
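As a rough illustration of the Monte Carlo/MSE comparison framework described above, the sketch below simulates Gamma samples and compares the classical moment and maximum likelihood estimators; the true parameter values, sample size, and replication count are illustrative assumptions, and the Bayes estimator under the generalized weighted loss with Lindley's approximation is not reproduced here.

# Minimal sketch (assumption: only the classical moment and ML estimators of a
# Gamma(shape, scale) model are compared; the Bayes estimator is not shown).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
true_shape, true_scale = 2.0, 1.5   # hypothetical true parameter values
n, reps = 50, 1000                  # sample size and Monte Carlo replications

mom_est, mle_est = [], []
for _ in range(reps):
    x = rng.gamma(true_shape, true_scale, size=n)
    # Moment estimators: shape = mean^2 / variance, scale = variance / mean
    m, v = x.mean(), x.var(ddof=1)
    mom_est.append((m * m / v, v / m))
    # Maximum-likelihood estimators (location fixed at 0)
    k_hat, _, theta_hat = stats.gamma.fit(x, floc=0)
    mle_est.append((k_hat, theta_hat))

def mse(estimates, truth):
    e = np.asarray(estimates)
    return ((e - truth) ** 2).mean(axis=0)

print("MSE (moment):", mse(mom_est, (true_shape, true_scale)))
print("MSE (MLE):   ", mse(mle_est, (true_shape, true_scale)))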
In this paper, the parameters and the reliability function of the transmuted power function (TPF) distribution are estimated using several methods: a proposed new technique for White's method, the percentile, least squares, weighted least squares, and modified moment methods. A simulation was used to generate random data following the TPF distribution in three experiments (E1, E2, E3) of real parameter values, with sample sizes (n = 10, 25, 50, and 100), N = 1000 replicated samples, and reliability times (0 < t < 0). Comparisons were made between the results obtained from the estimators using the mean square error (MSE). The results showed the …
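A minimal sketch of generating TPF data and evaluating its reliability function is given below, assuming the TPF is the quadratic-rank transmutation of the power function distribution with base CDF G(x) = (x/beta)^alpha on (0, beta), so F(x) = (1 + lam)G(x) - lam G(x)^2; the parameter names and values are illustrative, not the paper's.

# Minimal sketch (assumed TPF parameterization; inverse-transformation sampling).
import numpy as np

def rtpf(n, alpha, beta, lam, rng=None):
    """Draw n variates from the assumed TPF distribution by inverse transformation."""
    rng = rng or np.random.default_rng()
    u = rng.uniform(size=n)
    if lam == 0:
        g = u
    else:
        # Solve lam*g**2 - (1 + lam)*g + u = 0 for the root in [0, 1].
        g = ((1 + lam) - np.sqrt((1 + lam) ** 2 - 4 * lam * u)) / (2 * lam)
    return beta * g ** (1.0 / alpha)

def reliability(t, alpha, beta, lam):
    """R(t) = 1 - F(t) for 0 < t < beta under the assumed parameterization."""
    g = (t / beta) ** alpha
    return 1 - ((1 + lam) * g - lam * g ** 2)

x = rtpf(1000, alpha=2.0, beta=1.0, lam=0.5, rng=np.random.default_rng(1))
print(x.mean(), reliability(0.5, 2.0, 1.0, 0.5))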
The analysis of survival and reliability is currently considered among the important topics and methods of vital statistics because of its importance in various demographic, medical, industrial, and engineering fields. This research focuses on generating random sample data from the generalized Gamma (GG) probability distribution using the inverse transformation method (ITM). Since the distribution function involves the incomplete Gamma integral, classical estimation becomes more difficult, so a numerical approximation method is illustrated and the survival function is then estimated. The survival function was estimated by Monte Carlo simulation. The entropy method was used for the …
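The sketch below illustrates inverse-transformation sampling from a generalized Gamma distribution and a Monte Carlo estimate of its survival function; it assumes a Stacy-type parameterization with CDF F(x) = P(alpha, (x/beta)^p), where P is the regularized lower incomplete Gamma function, and the numerical inversion is delegated to SciPy's gammaincinv. Parameter names and values are illustrative.

# Minimal sketch (assumed GG parameterization; ITM plus Monte Carlo survival estimate).
import numpy as np
from scipy.special import gammaincinv

def rgg(n, alpha, beta, p, rng=None):
    """Inverse-transformation sampling via the inverse regularized incomplete Gamma."""
    rng = rng or np.random.default_rng()
    u = rng.uniform(size=n)
    return beta * gammaincinv(alpha, u) ** (1.0 / p)

def survival_mc(t, alpha, beta, p, n=10_000, rng=None):
    """Monte Carlo estimate of S(t) = P(X > t) from a simulated sample."""
    x = rgg(n, alpha, beta, p, rng)
    return (x > t).mean()

print(survival_mc(1.0, alpha=2.0, beta=1.0, p=1.5, rng=np.random.default_rng(2)))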
To ascertain the stationarity or non-stationarity of time series, three versions of the Dickey-Fuller model were used in this paper. The aim of this study is to explain the extent of the impact of some economic variables, such as the money supply, gross domestic product, and national income, after establishing the stationarity of these variables. The results show that the money supply, GDP, and exchange rate variables were all stationary at the first difference of the time series, meaning that these series are integrated of order one. Hence, the gross fixed capital formation, national income, and interest rate variables …
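A minimal sketch of this kind of unit-root check, using the (augmented) Dickey-Fuller test from statsmodels on a series and its first difference, is shown below; the series here is a synthetic placeholder standing in for a variable such as the money supply, not the paper's data.

# Minimal sketch (placeholder I(1) series; adfuller with a constant term).
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(3)
money_supply = pd.Series(np.cumsum(rng.normal(size=120)))  # placeholder random-walk series

for label, series in [("level", money_supply),
                      ("first difference", money_supply.diff().dropna())]:
    stat, pvalue, *_ = adfuller(series, regression="c")  # "c": constant, no trend
    print(f"{label}: ADF stat = {stat:.3f}, p-value = {pvalue:.3f}")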
In this work, results from an optical technique (the laser speckle technique) for measuring surface roughness were obtained by using the statistical properties of the speckle pattern from the point of view of computer image texture analysis. Four calibration relationships were used to cover a wide range of measurement with the same laser speckle technique. The first is based on the intensity contrast of the speckle, the second on the analysis of the speckle binary image, the third on the size of the speckle pattern spot, and the last on the energy feature of the gray-level co-occurrence matrices of the speckle pattern. With these calibration relationships, the roughness of an object surface can be evaluated within the …
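As a rough illustration of two of the features mentioned above, the sketch below computes the intensity contrast of a speckle image and the energy feature of its gray-level co-occurrence matrix using scikit-image; the input image here is a random placeholder, and the calibration relationships themselves are not reproduced.

# Minimal sketch (assumes an 8-bit grayscale speckle image as a NumPy array).
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(4)
speckle = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)  # placeholder image

# Speckle (intensity) contrast: standard deviation over mean of the intensity.
contrast = speckle.std() / speckle.mean()

# Energy feature of the gray-level co-occurrence matrix.
glcm = graycomatrix(speckle, distances=[1], angles=[0], levels=256,
                    symmetric=True, normed=True)
energy = graycoprops(glcm, "energy")[0, 0]

print(f"speckle contrast = {contrast:.3f}, GLCM energy = {energy:.4f}")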
Since the introduction of HTTP/3, research has focused on evaluating its influence on existing adaptive streaming over HTTP (HAS). Within this research, the cross-protocol unfairness between HAS over HTTP/3 (HAS/3) and HAS over HTTP/2 (HAS/2), due to their different underlying transport protocols, has attracted considerable attention. It has been found that HAS/3 clients tend to request higher bitrates than HAS/2 clients because the QUIC transport obtains higher bandwidth for its HAS/3 clients than TCP does for its HAS/2 clients. As the problem originates from the transport layer, it is likely that server-based unfairness solutions can help the clients overcome it. Therefore, in this paper, an experimental study of the se…
Electromyogram (EMG)-based pattern recognition (PR) systems for upper-limb prosthesis control provide promising ways to enable intuitive control of prostheses with multiple degrees of freedom and fast reaction times. However, the lack of robustness of PR systems may limit their usability. In this paper, a novel adaptive time-windowing framework is proposed to enhance the performance of PR systems by focusing on their windowing and classification steps. The proposed framework estimates the output probabilities of each class and outputs a movement only if a decision with a probability above a certain threshold is achieved. Otherwise (i.e., all probability values are below the threshold), the window size of the EMG signal …
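A minimal sketch of the thresholded decision step described above is given below; it assumes a generic classifier exposing predict_proba and uses synthetic features in place of real EMG windows, so it only illustrates the "classify if confident, otherwise extend the window" idea rather than the paper's exact framework.

# Minimal sketch (hypothetical classify_or_extend helper; synthetic placeholder features).
import numpy as np
from sklearn.linear_model import LogisticRegression

def classify_or_extend(clf, window, threshold=0.8):
    """Return a class label if the top posterior exceeds the threshold, else None
    (signalling that the EMG window should be enlarged and the step repeated)."""
    probs = clf.predict_proba(window.reshape(1, -1))[0]
    best = int(np.argmax(probs))
    return clf.classes_[best] if probs[best] >= threshold else None

# Toy usage with synthetic features (placeholder for real EMG window features).
rng = np.random.default_rng(5)
X, y = rng.normal(size=(200, 8)), rng.integers(0, 4, size=200)
clf = LogisticRegression(max_iter=1000).fit(X, y)
print(classify_or_extend(clf, rng.normal(size=8)))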
In this article, the casting method was used to prepare poly(methyl methacrylate)/hydroxyapatite (PMMA/HA) nanocomposite films incorporating different contents (0.5, 1, and 1.5 wt%) of graphene nanoplatelets (Gnp). The chemical properties and surface morphology of the PMMA/HA blend and the PMMA/HA/Gnp nanocomposites were characterized using FTIR and SEM analyses. In addition, the thermal conductivity and the dielectric and electrical properties over 1–10⁷ Hz of the PMMA/HA blend and the PMMA/HA/Gnp composites were investigated. The structural analysis showed that the synthesized composites had a low agglomerated state, with multiple wrinkles of graphene flakes in the PMMA/HA blend. The thermal conductivity was improved by more than 35-fold its value for …
In this study, we introduce a new nanocomposite of functionalized graphene oxide (FGO) and functionalized multi-walled carbon nanotubes (F-MWCNT-FGO). The formation of the nanocomposite was confirmed by FT-IR, XRD, and SEM. The dielectric permittivity of the F-MWCNT-FGO nanocomposite is very high in the low-frequency range and shows a unique negative permittivity in the frequency range from 400 Hz to 4000 Hz. The AC conductivity of the nanocomposite reaches 23.8 S·m⁻¹ at 100 Hz.
Survival analysis is the analysis of data in the form of times from a time origin until the occurrence of an end event. In medical research, the time origin is the date of registration of the individual or patient in a study, such as a clinical trial comparing two or more types of medicine, where the endpoint is the death of the patient or the withdrawal of the individual. The data resulting from this process are called survival times; if the end event is not death, the resulting data are called time-to-event data. That is, survival analysis comprises the statistical steps and procedures for analyzing data when the variable of interest is the time to an event. It could be d…
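As a small worked illustration of such time-to-event data, the sketch below represents follow-up times with an event indicator (1 = the endpoint occurred, 0 = censored) and applies the Kaplan-Meier product-limit estimator to obtain a survival function; the data are invented and the estimator is shown only as a generic example, not as the method of the paper above.

# Minimal sketch (invented follow-up times; hand-rolled Kaplan-Meier estimate).
import numpy as np

time  = np.array([5, 8, 12, 12, 15, 21, 27, 30])   # months since registration
event = np.array([1, 0,  1,  1,  0,  1,  0,  1])   # 1 = death, 0 = censored

def kaplan_meier(time, event):
    """Return the distinct event times and the estimated S(t) just after each."""
    order = np.argsort(time)
    time, event = time[order], event[order]
    s, out = 1.0, []
    for t in np.unique(time[event == 1]):
        at_risk = (time >= t).sum()                  # still under observation at t
        deaths = ((time == t) & (event == 1)).sum()  # events occurring at t
        s *= 1 - deaths / at_risk
        out.append((int(t), round(s, 3)))
    return out

print(kaplan_meier(time, event))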