studied, and its important properties and relationships with both closed and open nano sets were investigated. The new nano sets were linked to the concept of a nano ideal, leading to the development of the nano ideal mildly closed set, whose properties were studied. On the applied side of the research, a sample was taken from patients infected with viral hepatitis; by examining the infected people and using closed and open (nano mildly and nano ideal mildly) sets, the important symptoms that constitute the core of this dangerous disease were identified.
The basic concepts of some near open subgraphs and of near rough, near exact and near fuzzy graphs are introduced and sufficiently illustrated. The Gm-closure space induced by closure operators is used to generalize the basic rough graph concepts. We introduce near exactness and near roughness by applying the near concepts to improve the accuracy of the definability of graphs. We give a new definition of a membership function for finding near interior, near boundary and near exterior vertices. Moreover, proved results, examples and counterexamples are provided. The Gm-closure structure suggested in this paper opens the way for applying a rich body of topological facts and methods in the process of granular computing.
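The abstract does not spell out the new membership function, so the following is only a rough illustrative sketch of the general idea it builds on: classifying the vertices of a simple graph as interior, boundary, or exterior with respect to a target vertex set via neighbourhood-based lower and upper approximations. This is the classical rough-set construction, not the authors' "near" versions.

```python
# Illustrative sketch only: classify vertices of an undirected graph as
# interior, boundary, or exterior relative to a target vertex set, using
# the standard closed-neighbourhood lower/upper approximations.

def approximations(adjacency, target):
    """adjacency: dict vertex -> set of neighbours; target: set of vertices."""
    lower, upper = set(), set()
    for v, nbrs in adjacency.items():
        cell = nbrs | {v}              # closed neighbourhood of v
        if cell <= target:             # neighbourhood entirely inside the target set
            lower.add(v)
        if cell & target:              # neighbourhood meets the target set
            upper.add(v)
    interior = lower
    boundary = upper - lower
    exterior = set(adjacency) - upper
    return interior, boundary, exterior

if __name__ == "__main__":
    G = {1: {2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
    print(approximations(G, {1, 2}))   # -> interior, boundary, exterior vertices
```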
In this work we explain and discuss a new notion of fibrewise topological spaces, called fibrewise soft ideal topological spaces. We also introduce the notions of fibrewise closed soft ideal topological spaces, fibrewise open soft ideal topological spaces and fibrewise soft near ideal topological spaces.
The main idea of this research is to study fibrewise pairwise soft forms of the most important separation axioms of ordinary bitopology, named fibrewise pairwise soft
R. Vasuki [1] proved fixed point theorems for expansive mappings in Menger spaces. R. Gujetiya et al. [2] presented an extension of Vasuki's main result to four expansive mappings in a Menger space. In this article, an important lemma is given to prove that the iteration sequence is Cauchy under a suitable condition in a Menger probabilistic G-metric space (shortly, MPGM-space). This lemma is then used to obtain three common fixed point theorems for expansive type mappings.
In this paper, we prove that there exists a coupled fixed point for a set-valued contraction mapping defined on X × X, where X is an incomplete ordered G-metric space. We also prove the existence of a unique fixed point for a single-valued mapping satisfying an implicit condition defined on a complete G-metric space.
A space X is called πp-normal if for each closed set F and each π-closed set F′ in X with F ∩ F′ = ∅, there are p-open sets U and V of X with U ∩ V = ∅ such that F ⊆ U and F′ ⊆ V. Our work studies and discusses a new kind of normality in generalized topological spaces. We define ϑπp-normal, ϑ-mildly normal, ϑ-almost normal, ϑp-normal, ϑ-mildly p-normal, ϑ-almost p-normal and ϑπ-normal spaces, and we discuss some of their properties.
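The defining condition can be stated compactly in symbols; this is only a restatement of the sentence above, with π-closedness and p-openness understood as in the paper.

```latex
% pi p-normality, restating the abstract's definition
X \text{ is } \pi p\text{-normal} \iff
\forall F \text{ closed},\ \forall F'\ \pi\text{-closed with } F \cap F' = \emptyset,\
\exists\, p\text{-open } U, V \subseteq X:\ U \cap V = \emptyset,\ F \subseteq U,\ F' \subseteq V.
```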
In this paper, we introduce the concept of a fuzzy n-fold KU-ideal in KU-algebras, which is a generalization of a fuzzy KU-ideal of KU-algebras, and we obtain a few properties analogous to those of fuzzy KU-ideals in KU-algebras, see [8]. Furthermore, we construct some algorithms for folding theory applied to KU-ideals in KU-algebras.
Spelling correction is considered a challenging task for resource-scarce languages. Arabic is one of these resource-scarce languages; it suffers from the absence of a large spelling correction dataset, so datasets injected with artificial errors are used to overcome this problem. In this paper, we trained the Text-to-Text Transfer Transformer (T5) model using artificial errors to correct Arabic soft spelling mistakes. Our T5 model can correct 97.8% of the artificial errors that were injected into the test set. Additionally, our T5 model achieves a character error rate (CER) of 0.77% on a set that contains real soft spelling mistakes. We achieved these results using a 4-layer T5 model trained with a 90% error injection rate.
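The paper does not publish an inference API; as a minimal sketch of how such a fine-tuned T5 corrector would typically be used with the Hugging Face transformers library, the checkpoint path and the example input below are placeholders, not the authors' released model.

```python
# Minimal inference sketch for a T5-based spelling corrector.
# "path/to/arabic-soft-spelling-t5" is a hypothetical fine-tuned checkpoint;
# the paper does not release a model under this name.
from transformers import AutoTokenizer, T5ForConditionalGeneration

MODEL_DIR = "path/to/arabic-soft-spelling-t5"

tokenizer = AutoTokenizer.from_pretrained(MODEL_DIR)
model = T5ForConditionalGeneration.from_pretrained(MODEL_DIR)

def correct(text: str) -> str:
    """Return the model's corrected version of a possibly misspelled sentence."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    outputs = model.generate(**inputs, max_new_tokens=128, num_beams=4)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    # Example sentence containing typical Arabic soft spelling errors
    print(correct("ذهبت الي المدرسه"))
```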