In this study, we derive estimators for the reliability of the Exponential distribution using the Bayesian approach, in which the parameter of the Exponential distribution is treated as a random variable. We derive the posterior distribution of the scale parameter of the Exponential distribution under four prior distributions: the Inverse Chi-square distribution, the Inverted Gamma distribution, an improper prior, and a non-informative prior. Reliability estimators are then obtained under the two loss functions proposed in this study, both based on the natural logarithm of the reliability function. A simulation study is used to compare the resulting estimators in terms of their mean squared errors (MSE). Several values of the Exponential parameter were assumed when generating data for different sample sizes (small, medium, and large). The simulations were implemented in programs written in MATLAB R2008a. In general, good reliability estimates for the Exponential distribution were obtained under the second proposed loss function, which gave the smallest MSE values for all sample sizes (n) compared with the MSE values obtained under the first proposed loss function.
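The simulation procedure described above can be sketched as follows. Since the study's exact priors and log-reliability loss functions are not reproduced here, this illustrative stand-in uses a conjugate Gamma prior on the rate parameter and the squared-error Bayes estimator of R(t):

```python
import math, random

def simulate_mse(theta=1.0, t=0.5, n=30, reps=2000, a0=1.0, b0=1.0, seed=1):
    """Monte Carlo MSE of a Bayes estimator of exponential reliability R(t)=exp(-theta*t).

    Assumes a Gamma(a0, b0) prior on the rate theta, so the posterior is
    Gamma(a0 + n, b0 + sum(x)).  The posterior mean of R(t) is (b/(b+t))**a
    (the Gamma moment-generating function at -t) -- an illustrative
    squared-error Bayes estimator, not the paper's proposed loss functions.
    """
    rng = random.Random(seed)
    true_R = math.exp(-theta * t)
    sq_err = 0.0
    for _ in range(reps):
        s = sum(rng.expovariate(theta) for _ in range(n))
        a, b = a0 + n, b0 + s
        est = (b / (b + t)) ** a  # E[exp(-theta*t) | data] under the Gamma posterior
        sq_err += (est - true_R) ** 2
    return sq_err / reps
```

Running this for several sample sizes reproduces the qualitative pattern the abstract reports: MSE shrinks as n grows, so estimators can be ranked by their simulated MSE.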
Classifying overlapping objects is one of the main challenges faced by researchers working in object detection and recognition. Most available algorithms can only classify or recognize objects that are either individually separated from each other or appear as a single object in a scene, but not overlapping kitchen-utensil objects. In this project, the Faster R-CNN and YOLOv5 algorithms are proposed to detect and classify overlapping objects in a kitchen area. YOLOv5 and Faster R-CNN were applied to overlapping objects, where the filters (kernels) are expected to be able to separate the overlapping objects in the dedicated layers of the applied models. A kitchen utensil benchmark image database and
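Overlap between boxes is conventionally quantified by intersection-over-union (IoU), the quantity both Faster R-CNN and YOLOv5 rely on during non-maximum suppression to decide whether two detections cover the same object. A minimal illustrative implementation (not taken from the project) might look like:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Intersection rectangle (zero area if the boxes do not overlap).
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    return inter / (area_a + area_b - inter) if inter else 0.0
```

High IoU between two candidate boxes of the same class is exactly the situation that makes overlapping utensils hard: suppression may discard a genuine second object.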
The main purpose of this paper is to introduce a new kind of soft open sets in soft
topological spaces called soft omega open sets, and we show that the collection of
all soft omega open sets in a soft topological space (X, τ, E) forms a soft topology
τ_ω on X which is soft finer than τ. Moreover, we use soft omega open sets to define
and study new classes of soft functions called weakly soft omega open functions and
weakly soft omega closed functions which are weaker than weakly soft open functions
and weakly soft closed functions respectively. We obtain their basic properties, their
characterizations, and their relationships with other kinds of soft functions between
soft topological spaces.
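For orientation, the classical (non-soft) notion underlying the construction can be stated as follows; this is the standard definition of ω-openness in ordinary topology, given here only as background, with the soft analogue replacing points and open sets by their soft counterparts:

```latex
% Classical omega-openness (background only; the paper's soft version is analogous):
\[
A \subseteq X \text{ is } \omega\text{-open in } (X,\tau)
\iff
\forall x \in A\ \exists U \in \tau:\; x \in U \ \text{and}\ U \setminus A \text{ is countable}.
\]
```

The family of all ω-open sets forms a topology finer than τ, which mirrors the statement above that the soft omega open sets form a soft topology τ_ω soft finer than τ.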
The linear non-polynomial spline is used here to solve the fractional partial differential equation (FPDE). The fractional derivatives are described in the Caputo sense. Tensor products are used to extend the one-dimensional linear non-polynomial spline to a two-dimensional spline for solving the heat equation. In this paper, the convergence of the method to the exact solution is proved, and numerical examples show the validity of the method. All computations were implemented in Mathcad 15.
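As a self-contained illustration of the Caputo derivative used above (independent of the spline method and of Mathcad), the classical L1 finite-difference approximation for order 0 < α < 1 can be sketched as:

```python
import math

def caputo_l1(f, t, alpha, n=1000):
    """L1 approximation of the Caputo derivative of order 0 < alpha < 1 at time t.

    Uses the standard L1 scheme on a uniform grid of n steps; the error is
    O(h^{2-alpha}).  Illustrative only -- not the paper's spline method.
    """
    h = t / n
    c = h ** (-alpha) / math.gamma(2.0 - alpha)
    total = 0.0
    for k in range(n):
        # Weights b_k = (k+1)^{1-alpha} - k^{1-alpha} of the L1 scheme.
        b_k = (k + 1) ** (1.0 - alpha) - k ** (1.0 - alpha)
        total += b_k * (f((n - k) * h) - f((n - k - 1) * h))
    return c * total

# Sanity check: for f(t) = t^2 the exact Caputo derivative is 2 t^{2-alpha} / Gamma(3-alpha).
approx = caputo_l1(lambda s: s * s, t=1.0, alpha=0.5)
exact = 2.0 / math.gamma(2.5)
```

Comparing `approx` against `exact` on a refined grid is a quick way to verify an implementation before coupling it to any spatial discretization.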
Survival analysis is the analysis of data in the form of times from a time origin until the occurrence of an end event. In medical research, the time origin is the date of registration of the individual or patient in a study, such as a clinical trial comparing two or more types of medicine, where the endpoint is the death of the patient or the individual's disappearance from the study. The data resulting from this process are called survival times. If the end event is not death, the resulting data are called time-to-event data. That is, survival analysis comprises the statistical steps and procedures for analyzing data when the variable of interest is the time to an event. It could be d
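A core tool in this setting is the Kaplan-Meier estimator of the survival function, which handles right-censored observations (patients lost before the event). A minimal from-scratch sketch, illustrative and not tied to any particular study:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve for right-censored data.

    times:  observed times (time of event or of censoring).
    events: 1 if the event (e.g. death) occurred, 0 if the subject was censored.
    Returns a list of (time, S(t)) pairs at the distinct event times,
    where S(t) is the product of (1 - deaths/at_risk) over event times <= t.
    """
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    surv, curve = 1.0, []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = removed = 0
        while i < len(order) and times[order[i]] == t:  # group tied times
            deaths += events[order[i]]
            removed += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / at_risk
            curve.append((t, surv))
        at_risk -= removed
    return curve
```

For example, with times (1, 2, 3, 4, 5) and event indicators (1, 1, 0, 1, 0), the censored subjects at times 3 and 5 still contribute to the at-risk counts before they leave.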
In the present work, the buildup factor for gamma rays was studied in shields made from epoxy reinforced by lead powder and by aluminum powder, using a NaI(Tl) scintillation detector of size ( ×? ) and two radioactive sources (Co-60 and Cs-137). The shields used were epoxy reinforced by lead powder at concentrations of (10-60)% and epoxy reinforced by aluminum powder at concentrations of (10-50)%, each 6 mm thick, as well as epoxy reinforced by lead powder at a concentration of 50% with thicknesses of (2, 4, 6, 8, 10) mm. The experimental results show that the linear absorption factor and the buildup factor increase with increasing concentration of the reinforcement powders, are higher for aluminum powder than for lead powder, and decrease with inc
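The buildup factor B is conventionally defined through the broad-beam correction to the Beer-Lambert law, I = B · I0 · e^(−μx). A small sketch of this relationship, using made-up illustrative numbers rather than the paper's measurements:

```python
import math

def buildup_factor(I, I0, mu, x):
    """Buildup factor B = I / (I0 * exp(-mu * x)) for a shield of thickness x
    and linear attenuation coefficient mu (units of mu and x must cancel).
    Values here are illustrative, not the paper's experimental data."""
    return I / (I0 * math.exp(-mu * x))
```

When the measured intensity exactly matches the narrow-beam prediction, B = 1; scattered photons reaching the detector push B above 1, which is why B grows with shield thickness in broad-beam geometry.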
Gross domestic product (GDP) is an important measure of the size of an economy's production. Economists use this term to determine the extent of decline and growth in countries' economies; it is also used to rank countries and compare them with each other. The research aims to describe and analyze GDP during the period from 1980 to 2015, for the public and private sectors, and then forecast GDP in subsequent years until 2025. To achieve this goal, two methods were used: the first is linear and nonlinear regression; the second is time series analysis using Box-Jenkins models. The statistical packages (Minitab17) and (GRETLW32) were used to extract the results, and then comparing the two methods, T
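The Box-Jenkins step can be illustrated in miniature by fitting an AR(1) model by least squares and iterating it forward, a from-scratch sketch rather than the Minitab/GRETL analysis of the actual GDP series:

```python
def fit_ar1(series):
    """Least-squares fit of y_t = c + phi * y_{t-1}; returns (c, phi)."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    phi = (sum((a - mx) * (b - my) for a, b in zip(x, y))
           / sum((a - mx) ** 2 for a in x))
    c = my - phi * mx
    return c, phi

def forecast(series, c, phi, steps):
    """Iterate the fitted AR(1) recursion forward `steps` periods."""
    out, last = [], series[-1]
    for _ in range(steps):
        last = c + phi * last
        out.append(last)
    return out
```

Forecasting 2016-2025 from a series ending in 2015, as in the abstract, corresponds to calling `forecast(series, c, phi, 10)` with the fitted coefficients; real Box-Jenkins modelling would of course first difference the series and select the ARIMA orders.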
Researchers have shown increased interest in recent years in determining the optimum sample size needed to obtain sufficient accuracy in estimation and high-precision parameters when evaluating a large number of tests in the field of diagnosis at the same time. In this research, two methods were used to determine the optimum sample size for estimating the parameters of high-dimensional data: the Bennett inequality method and the regression method. The nonlinear logistic regression model is estimated using the sample size from each method in high-dimensional data using artificial intelligence, namely an artificial neural network (ANN), as it gives a high-precision estimate commensurate with the dat
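Bennett's inequality bounds P(|X̄ − μ| ≥ t) ≤ 2·exp(−n·(σ²/b²)·h(bt/σ²)), with h(u) = (1+u)ln(1+u) − u, for variables with |X − μ| ≤ b; inverting the bound in n yields a sample-size rule. A hedged sketch of that inversion (the paper's exact formulation may differ):

```python
import math

def bennett_sample_size(t, delta, sigma2, b):
    """Smallest n with 2*exp(-n*(sigma2/b**2)*h(b*t/sigma2)) <= delta,
    where h(u) = (1+u)*log(1+u) - u, i.e. Bennett's inequality for
    variables bounded by b with variance sigma2.  Illustrative only."""
    u = b * t / sigma2
    h = (1.0 + u) * math.log(1.0 + u) - u
    return math.ceil(math.log(2.0 / delta) / ((sigma2 / b ** 2) * h))
```

As expected, halving the tolerance t or tightening the confidence level δ drives the required n up, which is the trade-off any sample-size determination must balance.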
In this research, several estimators closely related to the hazard function are introduced, using one of the nonparametric methods, namely the kernel function, for censored data with varying bandwidths and boundary kernels. Two types of bandwidth are used: local bandwidth and global bandwidth. Moreover, four types of boundary kernel are used, namely Rectangle, Epanechnikov, Biquadratic, and Triquadratic, and the proposed function was employed with all kernel functions. Two different simulation techniques are also used in two experiments to compare these estimators. In most of the cases, the results have proved that the local bandwidth is the best for all the
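The kernels named above are standard compact-support kernels. As a minimal illustration of how a kernel and a global bandwidth combine into a smoother (without the paper's censoring adjustments, hazard formulation, or local bandwidths), an Epanechnikov kernel density estimate can be written as:

```python
def epanechnikov(u):
    """Epanechnikov kernel K(u) = 0.75*(1 - u^2) on [-1, 1], zero outside."""
    return 0.75 * (1.0 - u * u) if -1.0 <= u <= 1.0 else 0.0

def kde(x, data, h):
    """Fixed (global) bandwidth kernel density estimate at point x."""
    return sum(epanechnikov((x - xi) / h) for xi in data) / (len(data) * h)
```

A local-bandwidth variant, as in the abstract, would let h depend on x or on each data point, sharpening the estimate where observations are dense.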
In many applications, such as production planning, the decision maker is interested in optimizing an objective function that is a fuzzy ratio of two functions, which can be handled using the fuzzy fractional programming technique. A special class of optimization problems named fuzzy fractional programming problems is considered in this work, in which the coefficients of the objective function are fuzzy. A new ranking function is proposed and used to convert the data of the fuzzy fractional programming problem from fuzzy numbers to crisp numbers, so that the shortcomings of treating the original fuzzy problem directly can be avoided. Here a novel ranking-function approach for ordinary fuzzy numbers is adopted for ranking triangular fuzzy numbers with simpler an
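A common textbook choice of ranking function for a triangular fuzzy number (a, b, c) is its centroid (a + b + c)/3, shown here purely as an illustration of the fuzzy-to-crisp conversion step; the paper proposes its own ranking function, which is not reproduced:

```python
def rank_triangular(tfn):
    """Centroid ranking of a triangular fuzzy number (a, b, c) with a <= b <= c.
    This centroid rule is an illustrative stand-in, not the paper's proposal."""
    a, b, c = tfn
    return (a + b + c) / 3.0

def defuzzify_coeffs(fuzzy_coeffs):
    """Map a list of triangular fuzzy coefficients to crisp numbers,
    turning a fuzzy fractional program into an ordinary (crisp) one."""
    return [rank_triangular(t) for t in fuzzy_coeffs]
```

Once the coefficients are crisp, the fractional program can be solved with standard techniques such as the Charnes-Cooper transformation.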