The huge number of documents on the internet has led to a rapid need for text classification (TC), which is used to organize these documents. In this research paper, a new model based on the Extreme Learning Machine (ELM) is proposed. The proposed model consists of several phases: preprocessing, feature extraction, Multiple Linear Regression (MLR), and ELM. The basic idea of the proposed model is to compute feature weights using MLR; these weights, together with the extracted features, are introduced as input to the ELM, producing a weighted Extreme Learning Machine (WELM). The results showed that the proposed WELM is highly competitive compared to the standard ELM.
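As an illustration of the idea, the following is a minimal Python sketch of a weighted ELM, assuming absolute MLR coefficients (obtained here by least squares) serve as per-feature weights; the hidden-layer size, activation, and weighting scheme are illustrative choices, not the paper's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlr_feature_weights(X, y):
    # Fit a multiple linear regression and use the absolute
    # coefficients (normalized) as per-feature importance weights.
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    w = np.abs(coef)
    return w / w.sum()

def train_welm(X, y, n_hidden=50):
    # Weighted ELM: rescale features by the MLR weights, then do the
    # usual ELM solve with random input weights and a pseudo-inverse.
    w = mlr_feature_weights(X, y)
    Xw = X * w                                   # weight the extracted features
    W = rng.normal(size=(X.shape[1], n_hidden))  # random input weights
    b = rng.normal(size=n_hidden)                # random hidden biases
    H = np.tanh(Xw @ W + b)                      # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ y                 # output weights (Moore-Penrose)
    return w, W, b, beta

def predict_welm(X, model):
    w, W, b, beta = model
    return np.tanh((X * w) @ W + b) @ beta

# Toy usage on synthetic data standing in for extracted text features.
X = rng.normal(size=(100, 8))
y = X @ rng.normal(size=8) + 0.1 * rng.normal(size=100)
model = train_welm(X, y)
pred = predict_welm(X, model)
```

In a real TC pipeline the rows of `X` would be the feature vectors produced by the preprocessing and feature-extraction phases.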
With its rapid spread, the coronavirus infection shocked the world and had a huge effect on billions of people's lives. The problem is to find a safe diagnostic method with fewer casualties. X-ray images have been shown to be an important means of identifying, quantifying, and monitoring disease, and deep learning algorithms can be utilized to help analyze potentially huge numbers of X-ray examinations. This research conducted a retrospective multi-test analysis to detect suspected COVID-19 cases and used chest X-ray features to assess the progress of the illness in each patient, producing a "corona score." The results were satisfactory compared to the benchmark techniques.
A new human-based heuristic optimization method, named the Snooker-Based Optimization Algorithm (SBOA), is introduced in this study. The method is inspired by the traits of sales elites, the qualities every salesperson aspires to possess. Typically, salespersons strive to enhance their skills through autonomous learning or by seeking guidance from others, and they engage in regular communication with customers to gain approval for their products or services. Building upon this concept, SBOA aims to find the optimal solution within a given search space, traversing all positions to obtain all possible values. To assess the feasibility and effectiveness of SBOA in comparison with other algorithms, we conducted comparative experiments.
The present paper stresses the direct effect of the situational dimension termed "reality" on an author's thoughts and attitudes. Every text is placed within a particular situation, which the translator has to identify correctly as the first and most important step toward a good translation. Hence, the content of any verbal production reflects some part of reality, and comprehending any text includes comprehending the different dimensions of reality as reflected in it, thus illuminating the connections among its features.
Abstract
The study, entitled "Understanding Reality", is a means of fully
The article considers the semantic and stylistic motivations for using obsolete lexicon (historicisms) in literary texts. The specifics of this process are presented against the background of the contemporary Russian literary language. Attention is focused on the fact that the layer of obsolete lexical units belongs to nationally specific vocabulary, the study of which shapes an understanding of the nature of the actualized language. In addition, it should be noted that the semantics of historicisms is culturally conditioned: the de-actualization of linguistic units runs parallel to sociocultural and political changes.
Fuzzy numbers are used in various fields, such as fuzzy process methods, decision and control theory, decision-making problems, and systematic reasoning within fuzzy systems and fuzzy set theory. In this paper, pentagonal fuzzy variables (PFV) are used to formulate linear programming problems (LPP), and we concentrate on an approach that addresses these problems with the simplex technique (SM). We divide these problems into two basic categories: crisp LPP and LPP with pentagonal fuzzy numbers (PFN). The focus of this paper is to find the optimal solution (OS) for LPP with PFN in the objective function (OF) and the right-hand side. A new ranking function is also introduced.
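To illustrate how a ranking function reduces a fuzzy LPP to a crisp one that the ordinary simplex method can solve, here is a simple average-based ranking of a pentagonal fuzzy number. This is one common choice from the fuzzy-programming literature, not necessarily the function proposed in the paper.

```python
# A pentagonal fuzzy number (PFN) is written as five ordered points
# (a1, a2, a3, a4, a5). Ranking maps it to a single crisp value.

def rank_pfn(a):
    a1, a2, a3, a4, a5 = a
    assert a1 <= a2 <= a3 <= a4 <= a5, "points must be ordered"
    # Simple centroid-style ranking: the mean of the five points.
    return (a1 + a2 + a3 + a4 + a5) / 5.0

# Fuzzy objective coefficients become crisp coefficients, after which
# the LPP can be handed to any standard simplex solver.
c_fuzzy = [(1, 2, 3, 4, 5), (2, 3, 4, 5, 6)]
c_crisp = [rank_pfn(c) for c in c_fuzzy]
```

The same ranking would be applied to the fuzzy right-hand-side values before solving.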
In this work, we prove that triple linear partial differential equations (PDEs) of elliptic type (TLEPDEs) with a given classical continuous boundary control vector (CCBCVr) have a unique "state" solution vector (SSV), utilizing Galerkin's method (GME). We also prove the existence of a classical continuous boundary optimal control vector (CCBOCVr) governed by the TLEPDEs, and we study the existence of solutions for the triple adjoint equations (TAJEs) associated with the triple state equations (TSEs). The Fréchet derivative (FDe) of the objective function is derived. Finally, we prove the necessary-conditions theorem (NCTh) for optimality for the problem.
In mixture experiments, the response variable depends on the proportions of the components of the mixture. In this research we compare the Scheffé model with the Kronecker model for mixture experiments, especially when the experimental region is restricted. Because mixture experiments suffer from high correlation and multicollinearity among the explanatory variables, which affects the computation of the Fisher information matrix of the regression model, we used the generalized inverse and the stepwise regression procedure to estimate the parameters of the mixture model.
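A minimal sketch of the parameter-estimation step: fitting a Scheffé quadratic model for a hypothetical three-component mixture using the Moore-Penrose generalized inverse (`numpy.linalg.pinv`), which remains usable when the mixture constraint x1 + x2 + x3 = 1 makes the design ill-conditioned. The data and coefficients below are synthetic illustrations.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical mixture proportions: each row sums to 1.
X = rng.dirichlet(alpha=[1, 1, 1], size=30)
x1, x2, x3 = X[:, 0], X[:, 1], X[:, 2]

# Scheffé quadratic model: linear blending terms plus binary
# interaction terms, with no intercept (absorbed by the constraint).
D = np.column_stack([x1, x2, x3, x1 * x2, x1 * x3, x2 * x3])

beta_true = np.array([1.0, 2.0, 3.0, 4.0, -2.0, 0.5])
y = D @ beta_true + 0.01 * rng.normal(size=30)

# Generalized-inverse estimate, robust to (near-)singular designs.
beta_hat = np.linalg.pinv(D) @ y
```

In practice the stepwise regression procedure mentioned above would then be used to drop insignificant terms from the fitted model.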
Researchers have shown increased interest in recent years in determining the optimal sample size needed to obtain sufficiently accurate, high-precision parameter estimates when evaluating a large number of tests in the field of diagnosis at the same time. In this research, two methods were used to determine the optimal sample size for estimating the parameters of high-dimensional data: the Bennett inequality method and the regression method. The nonlinear logistic regression model is then estimated, for the sample size given by each method, using an artificial neural network (ANN), which gives a high-precision estimate commensurate with the data.
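As a sketch of the Bennett-inequality approach, the snippet below computes the smallest sample size guaranteeing that the sample mean is within `eps` of its expectation with probability at least 1 - `delta`, for variables bounded by `b` with variance `sigma2`. This is a generic form of the bound; the paper's exact formulation and constants may differ.

```python
import math

def bennett_sample_size(eps, delta, sigma2, b):
    # Bennett's inequality: P(|mean - mu| >= eps)
    #   <= 2 * exp(-(n * sigma2 / b**2) * h(b * eps / sigma2)),
    # where h(u) = (1 + u) * log(1 + u) - u.
    u = b * eps / sigma2
    h = (1 + u) * math.log(1 + u) - u
    # Solve the bound <= delta for the smallest integer n.
    return math.ceil(b * b / (sigma2 * h) * math.log(2 / delta))

# Illustrative values only: 5% precision and 95% confidence for
# variables bounded by 1 with variance 0.25.
n = bennett_sample_size(eps=0.05, delta=0.05, sigma2=0.25, b=1.0)
```

Tightening the precision `eps` (or the risk `delta`) increases the required sample size, as expected.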
Selected maps of the chaotic firefly algorithm were used for variable selection on blood and vascular disease data obtained from Nasiriyah General Hospital. The data were tested and found to follow a Gamma distribution, and it was concluded, by the mean squared error criterion, that the Chebyshev map method is more efficient than the Sinusoidal map method.
Curing of concrete is the maintenance of a satisfactory moisture content and temperature for a period of time immediately following placing, so that the desired properties develop. Accelerated curing is advantageous where early strength gain in concrete is important. Exposing concrete specimens to accelerated curing conditions permits them to develop a significant portion of their ultimate strength within a short period (1-2 days), depending on the method of the curing cycle. Three accelerated curing test methods are adopted in this study: warm water, autogenous, and a proposed test method. The results of this study have shown good correlation between the accelerated strength especially for