The basic concepts of some near open subgraphs and of near rough, near exact and near fuzzy graphs are introduced and sufficiently illustrated. The Gm-closure space induced by closure operators is used to generalize the basic rough graph concepts. We introduce near exactness and near roughness by applying the near concepts to improve the accuracy of graph definability. We give a new definition of a membership function for finding near interior, near boundary and near exterior vertices. Moreover, proved results, examples and counterexamples are provided. The Gm-closure structure suggested in this paper opens the way for applying a rich body of topological facts and methods in the process of granular computing.
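The rough-graph idea underlying these definitions can be sketched with a simple neighborhood-based approximation (an illustrative assumption; the paper's Gm-closure operators are more general than this plain neighborhood granulation):

```python
# Lower/upper approximation of a vertex subset X of a graph, using the
# closed neighborhood of each vertex as its granule (illustrative only).

def neighborhood(adj, v):
    """Granule of v: v together with its adjacent vertices."""
    return {v} | adj.get(v, set())

def approximations(adj, subset):
    """Lower approximation: vertices whose granule lies inside the subset.
    Upper approximation: vertices whose granule meets the subset."""
    vertices = set(adj)
    lower = {v for v in vertices if neighborhood(adj, v) <= subset}
    upper = {v for v in vertices if neighborhood(adj, v) & subset}
    return lower, upper

adj = {1: {2}, 2: {1, 3}, 3: {2}, 4: set()}
X = {1, 2}
lower, upper = approximations(adj, X)
boundary = upper - lower   # X is "exact" when the boundary is empty, "rough" otherwise
```

Here `X = {1, 2}` is rough: vertex 1 is interior, vertices 2 and 3 are boundary, and vertex 4 is exterior.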
Teen-Computer Interaction (TeenCI) is in its infancy and is developing along a positive path. Compared to Human-Computer Interaction (HCI, generally dedicated to adults) and Child-Computer Interaction (CCI), TeenCI has received less interest in terms of research effort and publications. This reveals extensive prospects for researchers to explore and contribute to computer design and evaluation for teens specifically. As a subclass of HCI and a complement to CCI, TeenCI, which addresses the teen group, should receive significant attention with respect to its context, nature, development, characteristics and architecture. This paper sets out to discover teens' emotional contribution as a first attempt towards building a conceptual model for TeenCI.
Blockchain is an innovative technology that has gained interest in all sectors in the era of digital transformation, where it manages transactions and saves them in a database. With increasing financial transactions and a rapidly developing society with growing businesses, many people seeking the dream of a better, financially independent life stray from large corporations and organizations to form startups and small businesses. Recently, the increasing demand on employees or institutions to prepare and manage contracts and papers and to handle the verification process, in addition to human mistakes, has led to the emergence of the smart contract. The smart contract has been developed to save time and provide more confidence while dealing, as well a
The use of parametric models and the associated estimation methods requires that many preliminary conditions be met for those models to represent the population under study adequately; this has prompted researchers to search for models more flexible than parametric ones, namely nonparametric models.
In this manuscript, the so-called Nadaraya-Watson estimator was compared in two cases (fixed and variable bandwidth) through simulation with different models and sample sizes. The simulation experiments showed that, for the first and second models, NW with fixed bandwidth was preferred fo
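A minimal sketch of the Nadaraya-Watson estimator in both a fixed-bandwidth and a variable-bandwidth form (the Gaussian kernel and the nearest-neighbor bandwidth rule are illustrative assumptions, not the manuscript's exact settings):

```python
import numpy as np

def nw_fixed(x0, x, y, h):
    """Nadaraya-Watson estimate at x0: kernel-weighted average of the
    responses y, with a Gaussian kernel and fixed bandwidth h."""
    w = np.exp(-0.5 * ((x0 - x) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

def nw_variable(x0, x, y, k=5):
    """Variable-bandwidth variant: the bandwidth at x0 is the distance
    to the k-th nearest design point, so it adapts to the local density."""
    h = np.sort(np.abs(x0 - x))[k - 1]
    return nw_fixed(x0, x, y, h)

rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 100)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.1, 100)
fit_fixed = nw_fixed(0.25, x, y, 0.05)
fit_var = nw_variable(0.25, x, y, k=5)
```

With a constant response, both variants reproduce that constant exactly, which is a quick sanity check on the weighting.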
In this research, several estimators of the hazard function are introduced, using one of the nonparametric methods, namely the kernel function, for censored data with varying bandwidth and boundary kernels. Two types of bandwidth are used: local bandwidth and global bandwidth. Moreover, four types of boundary kernel are used, namely Rectangle, Epanechnikov, Biquadratic and Triquadratic, and the proposed function was employed with all kernel functions. Two different simulation techniques are also used in two experiments to compare these estimators. In most cases, the results show that the local bandwidth is the best for all the
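The flavor of such a kernel hazard estimator for censored data can be sketched by smoothing the Nelson-Aalen increments with an Epanechnikov kernel and a global bandwidth (an illustrative stand-in; the paper's boundary corrections and local bandwidths are not reproduced here):

```python
import numpy as np

def kernel_hazard(t, times, events, b):
    """Kernel-smoothed Nelson-Aalen hazard estimate at time t.
    times:  observed times (event or censoring);
    events: 1 = event observed, 0 = right-censored;
    b:      global bandwidth."""
    order = np.argsort(times)
    times, events = times[order], events[order]
    n = len(times)
    at_risk = n - np.arange(n)            # number still at risk at each time
    u = (t - times) / b
    k = np.where(np.abs(u) <= 1, 0.75 * (1 - u ** 2), 0.0)  # Epanechnikov kernel
    return np.sum(k / b * events / at_risk)

times = np.array([2.0, 3.0, 3.5, 4.0, 6.0])
events = np.array([1, 0, 1, 1, 0])
rate = kernel_hazard(3.5, times, events, 1.0)
```

Censored observations (events = 0) reduce the risk set but contribute no mass of their own, which is the usual treatment of right-censoring.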
In this paper, the error distribution function of the single index model is estimated by the empirical distribution function and the kernel distribution function. The refined minimum average variance estimation (RMAVE) method is used to estimate the single index model. We use simulation experiments to compare the two estimation methods for the error distribution function with different sample sizes; the results show that the kernel distribution function is better than the empirical distribution function.
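The two competing estimators can be sketched for a generic residual sample (the integrated logistic kernel below is an illustrative choice, not necessarily the kernel used in the paper):

```python
import numpy as np

def ecdf(t, residuals):
    """Empirical distribution function: fraction of residuals <= t (a step function)."""
    return np.mean(residuals <= t)

def kernel_cdf(t, residuals, h):
    """Kernel distribution function: average of integrated (logistic) kernels,
    giving a smooth estimate with bandwidth h."""
    return np.mean(1.0 / (1.0 + np.exp(-(t - residuals) / h)))

residuals = np.array([-1.2, -0.3, 0.1, 0.4, 1.0])
smooth_at_zero = kernel_cdf(0.0, residuals, 0.5)
```

The empirical version jumps at each residual, while the kernel version is smooth and monotone, which is what drives the comparison in the simulations.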
In this paper we present the theoretical foundation of forward error analysis of numerical algorithms under:
• approximations in "built-in" functions;
• rounding errors in arithmetic floating-point operations;
• perturbations of data.
The error analysis is based on the linearization method. The fundamental tools of forward error analysis are systems of linear absolute and relative a priori and a posteriori error equations and the associated condition numbers, which constitute optimal bounds on the possible cumulative round-off errors. The condition numbers enable simple, general and quantitative definitions of numerical stability. The theoretical results have been applied to Gaussian elimination and have proved to be a very effective means of both a prior
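The role of the condition number in such a priori bounds can be illustrated for a linear system solved by Gaussian elimination (a standard perturbation bound, sketched with an assumed 2x2 system):

```python
import numpy as np

# Solve A x = b, then perturb the data and compare the observed relative
# error with the classical bound  ||dx|| / ||x||  <=  kappa(A) * ||db|| / ||b||.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = np.linalg.solve(A, b)          # LU-based solve (Gaussian elimination)

kappa = np.linalg.cond(A)          # 2-norm condition number
db = 1e-8 * b                      # small relative perturbation of the data
x_pert = np.linalg.solve(A, b + db)

rel_err = np.linalg.norm(x_pert - x) / np.linalg.norm(x)
rel_pert = np.linalg.norm(db) / np.linalg.norm(b)
```

A well-conditioned matrix (kappa close to 1) keeps the forward error of the same magnitude as the data perturbation; an ill-conditioned one amplifies it by up to kappa.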
In this paper, we study the growth of solutions of second order linear complex differential equations, ensuring that any nontrivial solution is of infinite order. It is assumed that the coefficients satisfy the extremal condition for Yang's inequality and the extremal condition for Denjoy's conjecture. The other condition is that one of the coefficients is itself a solution of the differential equation.
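The setting can be written out explicitly; the following is the standard form for such growth results (the precise coefficient hypotheses are as stated above, and the order notation is the usual one, reproduced here as an assumption):

```latex
% Second order linear complex differential equation with entire coefficients:
f'' + A(z)\, f' + B(z)\, f = 0 .
% "Infinite order" means \sigma(f) = \infty, where the order of growth is
\sigma(f) = \limsup_{r \to \infty}
            \frac{\log^{+} \log^{+} M(r, f)}{\log r},
\qquad M(r, f) = \max_{|z| = r} |f(z)| .
```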
Most of the Weibull models studied in the literature are appropriate for modelling a continuous random variable, assuming the variable takes on real values over the interval [0, ∞). One of the newer areas of study in statistics concerns variables that take on discrete values. The idea was first introduced by Nakagawa and Osaki, who introduced the discrete Weibull distribution with two parameters q and β, where 0 < q < 1 and β > 0. Weibull models for discrete random variables assume only non-negative integer values. Such models are useful for modelling, for example, the number of cycles to failure when components are subjected to cyclical loading. Discrete Weibull models can be obta
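The Nakagawa-Osaki (type I) discrete Weibull distribution has survival function S(x) = q^(x^β), which yields the probability mass function sketched below:

```python
def discrete_weibull_pmf(x, q, beta):
    """Nakagawa-Osaki discrete Weibull pmf:
    P(X = x) = q**(x**beta) - q**((x + 1)**beta),  x = 0, 1, 2, ...,
    with parameters 0 < q < 1 and beta > 0."""
    return q ** (x ** beta) - q ** ((x + 1) ** beta)

# With beta = 1 the pmf telescopes to the geometric form q**x * (1 - q).
p0 = discrete_weibull_pmf(0, 0.5, 1.0)   # = 1 - q = 0.5
```

The pmf sums to one because the survival terms telescope, and setting β = 1 recovers the geometric distribution, a convenient sanity check.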
The issue of image captioning, which comprises automatic text generation to describe an image's visual content, has become feasible with the developments in object recognition and image classification. Deep learning has received much interest from the scientific community and can be very useful in real-world applications. The proposed image captioning approach uses pre-trained Convolutional Neural Network (CNN) models combined with Long Short-Term Memory (LSTM) networks to generate image captions. The process includes two stages. The first stage entails training the CNN-LSTM models using baseline hyper-parameters, and the second stage encompasses training CNN-LSTM models by optimizing and adjusting the hyper-parameters of
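The caption-generation step of such a CNN-LSTM pipeline can be sketched as a greedy decoding loop (the `toy_step` stub below stands in for the trained encoder-decoder and is purely illustrative):

```python
def greedy_decode(step, start_token, end_token, max_len=10):
    """Greedy caption decoding: repeatedly ask the decoder `step` for
    next-token scores and append the highest-scoring token until the
    end token is produced or max_len is reached."""
    caption = [start_token]
    for _ in range(max_len):
        scores = step(caption)               # decoder's next-token scores
        next_token = max(scores, key=scores.get)
        if next_token == end_token:
            break
        caption.append(next_token)
    return caption[1:]                       # drop the start token

def toy_step(caption):
    """Stub standing in for the CNN-LSTM model's next-token distribution."""
    if caption[-1] == "<start>":
        return {"a": 0.9, "dog": 0.05, "<end>": 0.05}
    if caption[-1] == "a":
        return {"a": 0.05, "dog": 0.9, "<end>": 0.05}
    return {"a": 0.05, "dog": 0.05, "<end>": 0.9}

caption = greedy_decode(toy_step, "<start>", "<end>")
```

In the full system, the CNN encodes the image into a feature vector that conditions the LSTM, and decoding often uses beam search rather than this simplest greedy rule.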