In this research, the semiparametric Bayesian method is compared with the classical method for estimating the reliability function of three systems: a k-out-of-n system, a series system, and a parallel system. Each system consists of three components. The first is the parametric component, whose failure times follow an exponential distribution, while the second and third are nonparametric components whose reliability estimates rely on the kernel method, with two approaches used to estimate the bandwidth parameter h, and on the Kaplan-Meier method. To identify the better method for estimating the system reliability function, a simulation procedure was used for comparison across sample sizes of 14, 30, 60, and 100, with the integrated mean square error (IMSE) as the comparison criterion. For the k-out-of-n system, the results indicate that the Bayesian method is preferable for samples of size 30, 60, and 100 and the classical method for samples of size 14; for the series system, the Bayesian method is best for samples of size 14, 60, and 100 and the classical method for samples of size 30; for the parallel system, the Bayesian method is preferable for all sample sizes.
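The structure of the three system types can be illustrated with a short sketch. The component reliabilities below are arbitrary illustrative values, not the kernel or Kaplan-Meier estimates studied in the research, and the k-out-of-n formula assumes identical components for simplicity:

```python
from math import comb

def series_reliability(rs):
    # A series system works only if every component works: R = prod(r_i).
    out = 1.0
    for r in rs:
        out *= r
    return out

def parallel_reliability(rs):
    # A parallel system fails only if every component fails: R = 1 - prod(1 - r_i).
    out = 1.0
    for r in rs:
        out *= 1.0 - r
    return 1.0 - out

def k_out_of_n_reliability(k, n, r):
    # A k-out-of-n system with n identical components of reliability r works
    # when at least k components work (a binomial tail probability).
    return sum(comb(n, i) * r**i * (1.0 - r)**(n - i) for i in range(k, n + 1))

rs = [0.9, 0.8, 0.7]                    # illustrative component reliabilities
r_series = series_reliability(rs)       # ≈ 0.504
r_parallel = parallel_reliability(rs)   # ≈ 0.994
r_2of3 = k_out_of_n_reliability(2, 3, 0.9)
```

Evaluating these formulas at each time point, with the component reliabilities replaced by their parametric or nonparametric estimates, yields the estimated system reliability function.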
In this paper, the method of singular value decomposition is used to estimate the ridge parameter of the ridge regression estimator, which is an alternative to the ordinary least squares estimator when the general linear regression model suffers from near multicollinearity.
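The link between the SVD and the ridge estimator can be sketched as follows. This is only a minimal illustration of the ridge solution in SVD form, not the ridge-parameter estimation rule proposed in the paper:

```python
import numpy as np

def ridge_via_svd(X, y, k):
    # With X = U diag(d) V', the ridge estimator (X'X + kI)^{-1} X'y
    # equals V diag(d / (d^2 + k)) U'y.  The factor d/(d^2 + k) shows how
    # k > 0 damps the small singular values that near multicollinearity
    # produces, which is what stabilizes the estimator relative to OLS.
    U, d, Vt = np.linalg.svd(X, full_matrices=False)
    return Vt.T @ ((d / (d**2 + k)) * (U.T @ y))

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
X[:, 2] = X[:, 0] + 0.01 * rng.normal(size=50)   # nearly collinear column
y = X @ np.array([1.0, 2.0, 0.0]) + rng.normal(size=50)
beta = ridge_via_svd(X, y, k=1.0)
```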
The temperature distributions are evaluated for the furnace of the Al-Mussaib power plant. A Monte Carlo simulation procedure is used to evaluate the radiative heat transfer inside the furnace, where radiative transfer is the most important process occurring there. The weighted-sum-of-gray-gases model is used to evaluate the radiative properties of the non-gray gas in the enclosure. Energy balance equations are applied to each gas and surface zone, and by solving these equations both the temperature and the heat flux are found.
A good degree of accuracy was obtained when comparing the simulation results with the data of the designing company and with the data obtained by the zonal method.
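The weighted-sum-of-gray-gases idea can be sketched in a few lines. The weights and absorption coefficients below are placeholder values for illustration only, not the coefficients used for the Al-Mussaib furnace:

```python
import math

def wsgg_emissivity(pL, a, kappa):
    # Weighted-sum-of-gray-gases total emissivity over a path length pL
    # (partial pressure times path length): eps = sum_i a_i * (1 - exp(-kappa_i * pL)).
    # The remaining weight 1 - sum(a_i) belongs to a clear (kappa = 0) gas
    # and contributes nothing, so the total emissivity stays below 1.
    return sum(ai * (1.0 - math.exp(-ki * pL)) for ai, ki in zip(a, kappa))

# Placeholder coefficients for a CO2/H2O-type mixture (illustrative only).
a = [0.4, 0.2, 0.1]
kappa = [0.5, 5.0, 50.0]          # absorption coefficients, 1/(atm*m)
eps = wsgg_emissivity(2.0, a, kappa)
```

Each gray gas in the sum can then be traced separately in the Monte Carlo procedure, with its weight a_i attached to the emitted energy bundles.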
In this paper, deterministic and stochastic models are proposed to study the interaction of the Coronavirus (COVID-19) with host cells inside the human body. In the deterministic model, the value of the basic reproduction number R0 determines the persistence or extinction of COVID-19. If R0 < 1, one infected cell will transmit the virus to less than one cell; as a result, the person carrying the Coronavirus will get rid of the disease. If R0 > 1, the infected cell will be able to infect all cells that contain ACE receptors. The stochastic model proves that if the noise intensities are sufficiently large, ultimate disease extinction may occur even though R0 > 1, and these facts are also proved by computer simulation.
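The threshold role of R0 can be sketched with a generic target-cell-limited within-host model. This is an illustrative system with assumed parameter values, not necessarily the exact equations or parameters of the paper:

```python
# Generic within-host model: T = susceptible cells with ACE receptors,
# I = infected cells, V = free virus.
#   dT/dt = lam - d*T - beta*T*V
#   dI/dt = beta*T*V - delta*I
#   dV/dt = p*I - c*V
# For this system the basic reproduction number is R0 = beta*lam*p / (d*delta*c).

def simulate(lam, d, beta, delta, p, c, T0, I0, V0, dt=1e-3, t_end=50.0):
    # Forward Euler integration with a small step (adequate for this sketch).
    T, I, V = T0, I0, V0
    for _ in range(int(t_end / dt)):
        dT = lam - d * T - beta * T * V
        dI = beta * T * V - delta * I
        dV = p * I - c * V
        T, I, V = T + dt * dT, I + dt * dI, V + dt * dV
    return T, I, V

lam, d, beta, delta, p, c = 10.0, 0.1, 1e-5, 0.5, 10.0, 5.0   # assumed values
R0 = beta * lam * p / (d * delta * c)    # 0.004 < 1: infection should die out
T, I, V = simulate(lam, d, beta, delta, p, c, T0=100.0, I0=0.0, V0=10.0)
```

With R0 < 1 the virus load V decays to zero and the target-cell population T returns to its infection-free level lam/d, illustrating the extinction case of the deterministic model.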
This research studies panel (paired) data models with mixed random parameters, which contain two types of parameters: one random and the other fixed. The random parameter arises from differences in the marginal slopes of the cross-sections, while the fixed parameter arises from differences in the fixed intercepts and the random errors of each section. The random errors bear the property of heteroskedasticity of variance, in addition to first-order serial correlation. The main objective of this research is to use efficient methods suited to paired data in the case of small samples; to achieve this goal, the feasible general least squa
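The feasible (weighted) least squares idea for cross-sections with unequal error variances can be sketched as follows. This is a minimal illustration on synthetic data, with the first-order serial-correlation correction omitted for brevity:

```python
import numpy as np

def fgls_heteroskedastic(X, y, groups):
    # Step 1: ordinary least squares on the pooled data.
    b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ b_ols
    # Step 2: estimate one error variance per cross-section from the residuals.
    w = np.empty_like(y)
    for g in np.unique(groups):
        mask = groups == g
        w[mask] = 1.0 / max(resid[mask].var(), 1e-12)
    # Step 3: weighted least squares with the estimated inverse-variance weights.
    sw = np.sqrt(w)
    b, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    return b

rng = np.random.default_rng(1)
n_per, sections = 10, 4                           # small samples per section
groups = np.repeat(np.arange(sections), n_per)
X = np.column_stack([np.ones(n_per * sections), rng.normal(size=n_per * sections)])
sigma = np.array([0.1, 0.5, 1.0, 2.0])[groups]    # heteroskedastic errors
y = X @ np.array([1.0, 2.0]) + sigma * rng.normal(size=n_per * sections)
b = fgls_heteroskedastic(X, y, groups)            # intercept and slope estimates
```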
Encryption of data translates data into another shape or symbol so that only people with access to the secret key or a password can read it. Encrypted data are generally referred to as ciphertext, while unencrypted data are known as plaintext. Entropy can be used as a measure that gives the number of bits needed to code the data of an image; as the pixel values within an image are spread over more gray-levels, the entropy increases. The aim of this research is to compare the CAST-128 method with a proposed adaptive key and the RSA encryption method for video frames, to determine the more accurate method with the highest entropy. The first method is achieved by applying the "CAST-128" and
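The entropy measure described above can be sketched for an 8-bit gray-scale frame. The two test images below are synthetic illustrations, not frames from the videos studied in the research:

```python
import numpy as np

def image_entropy(img):
    # Shannon entropy in bits per pixel over the 256 gray-levels:
    # H = -sum_g p(g) * log2 p(g), where p(g) is the relative frequency of
    # gray-level g.  H ranges from 0 (one gray-level) to 8 (all levels equal).
    counts = np.bincount(np.asarray(img, dtype=np.uint8).ravel(), minlength=256)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

flat = np.zeros((64, 64), dtype=np.uint8)                         # one gray-level
full = np.arange(256, dtype=np.uint8).repeat(16).reshape(64, 64)  # all levels equal
```

A well-encrypted frame should look like the second case: pixel values spread nearly uniformly over the gray-levels, giving entropy close to the 8-bit maximum.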
Ferritin is a key mediator of immune dysregulation, particularly under extreme hyperferritinemia, through direct immunosuppressive and pro-inflammatory effects. We conclude that there is a significant association between ferritin levels and the severity of COVID-19. In this paper we introduce a semiparametric method for prediction by combining neural network (NN) and regression models. Two methodologies are therefore adopted, a neural network (NN) and a regression model, in designing the model. The data were collected from the Private Nursing Home Hospital for the period 11/7/2021 to 23/7/2021, comprising 100 persons: 12 female and 38 male COVID-19 patients out of 50, and 26 female and 24 male non-COVID persons out of 50. The input variables of the NN m
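One common way to combine a regression model with a neural network is to fit the linear part first and let a small network model the residuals. This is offered only as an illustrative sketch on synthetic data; the paper's own architecture, variables, and hospital data are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: x = a standardized predictor (e.g. ferritin level),
# y = a response with a linear part plus a nonlinear component.
x = rng.uniform(-2, 2, size=(200, 1))
y = 1.5 * x[:, 0] + np.sin(2 * x[:, 0]) + 0.1 * rng.normal(size=200)

# Step 1: the parametric (linear regression) part.
X = np.column_stack([np.ones(len(x)), x])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ b

# Step 2: a tiny one-hidden-layer network fit to the residuals by full-batch
# gradient descent.  W2 starts at zero so the network begins as the zero
# function and training can only improve on the purely linear fit.
H, lr = 8, 0.05
W1 = rng.normal(scale=0.5, size=(1, H)); b1 = np.zeros(H)
W2 = np.zeros(H);                        b2 = 0.0
for _ in range(2000):
    h = np.tanh(x @ W1 + b1)                    # hidden layer, shape (200, H)
    err = (h @ W2 + b2) - resid                 # gradient of 0.5 * mean(err^2)
    gW2 = h.T @ err / len(x); gb2 = err.mean()
    gh = np.outer(err, W2) * (1.0 - h**2)       # backprop through tanh
    gW1 = x.T @ gh / len(x); gb1 = gh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

# Semiparametric prediction = linear part + network correction.
final = X @ b + np.tanh(x @ W1 + b1) @ W2 + b2
mse_linear = np.mean((y - X @ b) ** 2)
mse_combined = np.mean((y - final) ** 2)
```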
The aim of this paper is to find the Bayes estimator under a new loss function that combines symmetric and asymmetric loss functions, namely the proposed entropy loss function, which merges the entropy loss function and the squared-log error loss function, the latter being quite asymmetric in nature. The Bayes estimators of the exponential distribution under the proposed function and under its component loss functions are then compared using the standard mean square error (MSE) and the bias quantity (Mbias), where random data are generated by simulation to estimate the exponential distribution parameter for different sample sizes (n = 10, 50, 100) and (N = 1000) replications, taking initial
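The simulation design can be sketched as follows. Since the proposed entropy loss function is not fully reproduced above, the sketch uses the familiar squared-error Bayes estimator (the posterior mean under a gamma prior) as a stand-in, alongside the MLE; the prior parameters and true rate are assumed values:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, N = 2.0, 10, 1000        # assumed true rate, sample size, replications
a, b = 1.0, 1.0                    # assumed gamma(a, b) prior on the rate

mle = np.empty(N); bayes = np.empty(N)
for i in range(N):
    x = rng.exponential(scale=1.0 / theta, size=n)   # exponential(rate theta) sample
    mle[i] = n / x.sum()                             # maximum likelihood estimator
    bayes[i] = (a + n) / (b + x.sum())               # posterior mean: gamma posterior
                                                     # is gamma(a + n, b + sum(x))

def mse(est):    # standard mean square error against the true parameter
    return np.mean((est - theta) ** 2)

def mbias(est):  # bias quantity
    return float(np.mean(est) - theta)
```

Replacing the posterior-mean line with the Bayes estimator derived under the proposed loss function gives the comparison studied in the paper.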
The aim of this research is to use a robust technique based on trimming, since maximum likelihood (ML) analysis often fails in the presence of outliers in the studied phenomenon, where the MLE loses its advantages because of the bad influence caused by the outliers. To address this problem, new statistical methods that are not affected by the outliers have been developed; these methods possess robustness or resistance. The maximum trimmed likelihood (MTL) is therefore a good alternative for achieving more acceptable and comparable results, but weights can be used to increase the efficiency of the resulting estimates and the strength of the estimation using the maximum weighted trimmed likelihood (MWTL). In order to perform t
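For the normal location case, the trimming idea behind the MTL estimator can be sketched by repeatedly keeping the h observations with the highest likelihood under the current fit. This is a minimal unweighted illustration, not the full MTL or MWTL procedure of the research:

```python
import numpy as np

def mtl_mean(x, h, iters=50):
    # Maximum trimmed likelihood estimate of a normal location parameter:
    # keep the h observations with the largest likelihood contribution
    # (smallest squared deviation from the current estimate), refit the mean
    # on them, and repeat until the estimate stops changing.
    mu = float(np.median(x))
    for _ in range(iters):
        keep = np.argsort((x - mu) ** 2)[:h]
        mu_new = float(x[keep].mean())
        if mu_new == mu:
            break
        mu = mu_new
    return mu

x = np.array([0.1, -0.2, 0.0, 0.2, 100.0])  # one gross outlier
robust = mtl_mean(x, h=4)                    # near 0: the outlier is trimmed away
naive = x.mean()                             # near 20: the plain mean is ruined
```

Attaching weights to the kept observations in the refitting step, rather than treating them all equally, gives the MWTL refinement mentioned above.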
The modern French novel gained a distinctive status in the history of French literature during the first half of the twentieth century. This is due to many factors, including the new objective, descriptive literary style adopted by novelists such as Alain Robbe-Grillet, who has long been regarded as the outstanding writer of the nouveau roman, as well as its major spokesman, a representative writer, and a leading theoretician of the new novel, which broke the classical rule of the single hero and evolved, by questioning the relationship between man and the world and re-evaluating the limits of contemporary fiction, into a new form of narrative.