The huge amount of documents on the internet has led to a rapidly growing need for text classification (TC), which is used to organize these text documents. In this research paper, a new model based on the Extreme Learning Machine (ELM) is proposed. The model consists of several phases: preprocessing, feature extraction, Multiple Linear Regression (MLR), and ELM. Its basic idea is to compute feature weights using MLR; these weights, together with the extracted features, are fed to the ELM, producing a Weighted Extreme Learning Machine (WELM). The results showed that the proposed WELM is highly competitive compared with the standard ELM.
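As a rough illustration of this idea only (not the authors' implementation), the sketch below weights each feature by the absolute value of its MLR coefficient before feeding it to a basic single-hidden-layer ELM; the regression target, the random-feature formulation, and all data are assumptions.

```python
import numpy as np

def mlr_feature_weights(X, y):
    """Fit ordinary least squares and use |coefficients| as feature weights (illustrative)."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])      # add intercept column
    coef, *_ = np.linalg.lstsq(Xb, y, rcond=None)       # least-squares fit
    return np.abs(coef[:-1])                            # drop intercept, keep magnitudes

def train_elm(X, Y, n_hidden=200, rng=np.random.default_rng(0)):
    """Basic ELM: random hidden layer + analytic output weights via pseudo-inverse."""
    W = rng.standard_normal((X.shape[1], n_hidden))     # random input weights
    b = rng.standard_normal(n_hidden)                   # random biases
    H = np.tanh(X @ W + b)                              # hidden-layer activations
    beta = np.linalg.pinv(H) @ Y                        # output weights
    return W, b, beta

def predict_elm(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Weighted ELM (WELM) in the spirit of the abstract: scale features by MLR weights first.
# X_train: TF-IDF-like document features, Y_train: one-hot class labels (hypothetical data).
X_train = np.random.rand(100, 50)
Y_train = np.eye(3)[np.random.randint(0, 3, 100)]
w = mlr_feature_weights(X_train, Y_train.argmax(axis=1).astype(float))
model = train_elm(X_train * w, Y_train)
pred = predict_elm(X_train * w, *model).argmax(axis=1)
```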
The approach given in this paper leads to numerical methods for finding the approximate solution of Volterra integro-differential equations (VIDEs) of the first kind. First, the VIDEs are reduced to Volterra integral equations (VIEs) of the second kind using the reduction theory; then two types of non-polynomial spline functions (linear and quadratic) are applied. Finally, programs for each method are written in MATLAB, and the two types of non-polynomial spline functions are compared in terms of least-squares error and running time. Some test examples with their exact solutions are also given.
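Since the paper's spline construction is not reproduced here, the sketch below solves a generic second-kind VIE, u(t) = f(t) + ∫₀ᵗ K(t,s)u(s) ds, with the simple trapezoidal rule instead of non-polynomial splines; the kernel and forcing term are placeholder examples, not the paper's test problems.

```python
import numpy as np

def solve_vie_trapezoid(f, K, T=1.0, n=100):
    """Solve u(t) = f(t) + int_0^t K(t,s) u(s) ds on [0, T]
    by marching forward with the trapezoidal quadrature rule."""
    t = np.linspace(0.0, T, n + 1)
    h = T / n
    u = np.zeros(n + 1)
    u[0] = f(t[0])                               # no integral contribution at t = 0
    for i in range(1, n + 1):
        s = 0.5 * K(t[i], t[0]) * u[0]           # left endpoint weight
        s += sum(K(t[i], t[j]) * u[j] for j in range(1, i))
        # u[i] also appears inside the integral with weight h/2, so solve for it
        u[i] = (f(t[i]) + h * s) / (1.0 - 0.5 * h * K(t[i], t[i]))
    return t, u

# Example with known solution u(t) = exp(t):  u(t) = 1 + int_0^t u(s) ds
t, u = solve_vie_trapezoid(lambda t: 1.0, lambda t, s: 1.0, T=1.0, n=200)
print(np.max(np.abs(u - np.exp(t))))             # error against the exact solution
```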
In this paper, the continuous classical boundary optimal control problem (CCBOCP) for triple linear partial differential equations of parabolic type (TLPDEPAR) with initial and boundary conditions (ICs & BCs) is studied. The Galerkin method (GM) is used to prove the existence and uniqueness theorem of the state vector solution (SVS) for a given continuous classical boundary control vector (CCBCV). The existence theorem of a continuous classical boundary optimal control vector (CCBOCV) associated with the TLPDEPAR is then proved, and the Fréchet derivative (FrD) of the cost function (CoF) is derived. Finally, the theorem of the necessary conditions for optimality (NCsThOP) of this problem is stated and proved.
Examination of skewness makes academics more aware of the importance of accurate statistical analysis. Undoubtedly, most phenomena contain a certain degree of skewness, which gives rise to what is called "asymmetry" and, consequently, to the importance of the skew-normal family. The epsilon skew normal distribution ESN(μ, σ, ε) is one of the probability distributions that provide a more flexible model, because the skewness parameter allows it to range from the normal to a skewed distribution. Theoretically, estimating the parameters of a linear regression model whose error term has a non-zero mean is a major challenge, since there is no explicit formula to calculate …
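The ESN density itself is not spelled out in the excerpt; the sketch below uses the Mudholkar-Hutson parameterization, in which the two half-normal branches have scales σ(1+ε) and σ(1−ε), as an assumed reference form.

```python
import numpy as np

def esn_pdf(x, mu=0.0, sigma=1.0, eps=0.0):
    """Epsilon skew normal density ESN(mu, sigma, eps), -1 < eps < 1
    (Mudholkar-Hutson form, assumed here).  eps = 0 recovers N(mu, sigma^2)."""
    x = np.asarray(x, dtype=float)
    z = (x - mu) / sigma
    scale = np.where(x < mu, 1.0 + eps, 1.0 - eps)   # different scale on each side of mu
    return np.exp(-0.5 * (z / scale) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# Sanity check: the density integrates to 1 for any admissible eps.
xs = np.linspace(-10, 10, 20001)
print(np.trapz(esn_pdf(xs, mu=0.0, sigma=1.0, eps=0.4), xs))   # ~1.0
```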
Some maps of the chaotic firefly algorithm were selected for variable selection on blood-disease and blood-vessel data obtained from Nasiriyah General Hospital. The data were tested and found to follow a Gamma distribution, and it was concluded that the Chebyshev map method is more efficient than the Sinusoidal map method according to the mean squared error criterion.
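For reference only, the sketch below generates the two chaotic sequences in commonly used forms (Chebyshev: x_{k+1} = cos(k·arccos x_k); Sinusoidal: x_{k+1} = a·x_k²·sin(πx_k)); the exact map definitions and how they drive the firefly algorithm in the paper are not given in the excerpt, so these forms are assumptions.

```python
import numpy as np

def chebyshev_map(x0=0.7, n=100):
    """Chebyshev chaotic map: x_{k+1} = cos(k * arccos(x_k)), values in [-1, 1]."""
    x = np.empty(n)
    x[0] = x0
    for k in range(1, n):
        x[k] = np.cos(k * np.arccos(np.clip(x[k - 1], -1.0, 1.0)))
    return x

def sinusoidal_map(x0=0.7, a=2.3, n=100):
    """Sinusoidal chaotic map: x_{k+1} = a * x_k^2 * sin(pi * x_k), values in (0, 1)."""
    x = np.empty(n)
    x[0] = x0
    for k in range(1, n):
        x[k] = a * x[k - 1] ** 2 * np.sin(np.pi * x[k - 1])
    return x

# Chaotic sequences like these typically replace the uniform random numbers that
# drive the firefly algorithm's attraction and randomisation steps.
print(chebyshev_map(n=5), sinusoidal_map(n=5))
```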
In mixture experiments, the response depends on the proportions of the components of the mixture. In this research we compare the Scheffé model with the Kronecker model for mixture experiments, especially when the experimental region is restricted. Mixture experiments suffer from high correlation and multicollinearity among the explanatory variables, which affects the computation of the Fisher information matrix of the regression model. To estimate the parameters of the mixture model, we used the generalized inverse and the stepwise regression procedure.
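For orientation, the sketch below fits a quadratic Scheffé mixture model, y = Σ βᵢxᵢ + Σ_{i&lt;j} β_{ij}xᵢxⱼ, using the Moore-Penrose generalized inverse; the data and number of components are hypothetical, and the stepwise selection step is omitted.

```python
import numpy as np
from itertools import combinations

def scheffe_design_matrix(X):
    """Quadratic Scheffé model terms: pure components x_i plus all pairwise products x_i * x_j.
    X has one column per mixture component; each row of proportions sums to 1."""
    pairs = [X[:, i] * X[:, j] for i, j in combinations(range(X.shape[1]), 2)]
    return np.column_stack([X] + pairs)

# Hypothetical 3-component mixture data (proportions sum to 1) and responses.
rng = np.random.default_rng(1)
X = rng.dirichlet(alpha=[1.0, 1.0, 1.0], size=30)
y = 2.0 * X[:, 0] + 1.0 * X[:, 1] + 3.0 * X[:, 2] + 4.0 * X[:, 0] * X[:, 1] + rng.normal(0, 0.05, 30)

Z = scheffe_design_matrix(X)
beta = np.linalg.pinv(Z) @ y          # generalized (Moore-Penrose) inverse handles near-singular Z'Z
print(beta)                           # estimates of beta_1..beta_3 and beta_12, beta_13, beta_23
```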
Researchers have shown increased interest in recent years in determining the optimum sample size needed to obtain sufficient accuracy of estimation and high-precision parameters, in order to evaluate a large number of tests in the field of diagnosis at the same time. In this research, two methods were used to determine the optimum sample size for estimating the parameters of high-dimensional data: the Bennett inequality method and the regression method. The nonlinear logistic regression model is then estimated with the sample size given by each method, using an artificial intelligence method, the artificial neural network (ANN), as it gives a high-precision estimate commensurate with the data.
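As an illustration of the Bennett-inequality idea only (the paper's exact formulation is not shown), the sketch below searches for the smallest n such that the two-sided Bennett bound, P(|X̄ − μ| ≥ ε) ≤ 2·exp(−n σ²/b² · h(bε/σ²)) with h(u) = (1+u)ln(1+u) − u and variables bounded by b, falls below a target risk α; all parameter values are assumptions.

```python
import numpy as np

def bennett_sample_size(eps, alpha, sigma2, b):
    """Smallest n with 2 * exp(-n * sigma2 / b**2 * h(b * eps / sigma2)) <= alpha,
    where h(u) = (1 + u) * log(1 + u) - u  (two-sided Bennett inequality)."""
    u = b * eps / sigma2
    h = (1.0 + u) * np.log(1.0 + u) - u
    n = np.log(2.0 / alpha) * b ** 2 / (sigma2 * h)
    return int(np.ceil(n))

# Hypothetical settings: variables bounded by b = 1, variance guess 0.25,
# tolerance eps = 0.05, risk alpha = 0.05.
print(bennett_sample_size(eps=0.05, alpha=0.05, sigma2=0.25, b=1.0))
```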
Curing of concrete is the maintenance of a satisfactory moisture content and temperature for a period of time immediately after placing, so that the desired properties develop. Accelerated curing is advantageous where early strength gain in concrete is important. Exposing concrete specimens to accelerated curing conditions permits them to develop a significant portion of their ultimate strength within a short period (1-2 days), depending on the curing cycle used. Three accelerated curing test methods are adopted in this study: warm water, autogenous, and a proposed test method. The results of this study have shown a good correlation between the accelerated strength, especially for …
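The excerpt stops before the correlation details; as a generic illustration only, the sketch below fits the usual linear relation between accelerated strength and later-age (e.g. 28-day) strength and reports the correlation coefficient, with entirely hypothetical data.

```python
import numpy as np

# Hypothetical measurements: accelerated strength at 1-2 days vs. 28-day strength (MPa).
accel_strength = np.array([12.5, 14.0, 16.2, 18.1, 20.4, 22.8, 25.1])
strength_28d   = np.array([27.0, 29.5, 33.1, 36.4, 40.2, 44.6, 48.9])

# Least-squares fit of strength_28d = a * accel_strength + c and its correlation coefficient.
a, c = np.polyfit(accel_strength, strength_28d, 1)
r = np.corrcoef(accel_strength, strength_28d)[0, 1]
print(f"28-day strength ~ {a:.2f} * accelerated + {c:.2f}  (r = {r:.3f})")
```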
Background/Objectives: The purpose of the current research is a modified image representation framework for Content-Based Image Retrieval (CBIR) based on a gray-scale input image, Zernike Moments (ZMs) properties, Local Binary Pattern (LBP), Y Color Space, Slantlet Transform (SLT), and Discrete Wavelet Transform (DWT). Methods/Statistical analysis: This study surveyed and analysed three standard datasets: WANG V1.0, WANG V2.0, and Caltech 101, which contains images of objects belonging to 101 classes, with approximately 40-800 images per category. The suggested infrastructure within the study seeks to describe and operationalize the CBIR system through an automated attribute extraction system premised on CN…
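As a small, self-contained illustration of one of the listed descriptors (not the full framework), the sketch below extracts a uniform LBP histogram from a gray-scale image with scikit-image and compares two images by chi-square distance; the parameter values and distance choice are assumptions.

```python
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_histogram(gray, P=8, R=1.0):
    """Uniform LBP histogram of a gray-scale image (values in [0, 255])."""
    lbp = local_binary_pattern(gray, P, R, method="uniform")
    hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
    return hist

def chi_square(h1, h2, eps=1e-10):
    """Chi-square distance between two normalised histograms (smaller = more similar)."""
    return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))

# Hypothetical query/database images; in a CBIR system these would come from WANG or Caltech 101.
query = np.random.randint(0, 256, (128, 128)).astype(np.uint8)
candidate = np.random.randint(0, 256, (128, 128)).astype(np.uint8)
print(chi_square(lbp_histogram(query), lbp_histogram(candidate)))
```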
... Show MoreDeepfake is a type of artificial intelligence used to create convincing images, audio, and video hoaxes and it concerns celebrities and everyone because they are easy to manufacture. Deepfake are hard to recognize by people and current approaches, especially high-quality ones. As a defense against Deepfake techniques, various methods to detect Deepfake in images have been suggested. Most of them had limitations, like only working with one face in an image. The face has to be facing forward, with both eyes and the mouth open, depending on what part of the face they worked on. Other than that, a few focus on the impact of pre-processing steps on the detection accuracy of the models. This paper introduces a framework design focused on this asp
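The concrete pre-processing steps are not listed in the excerpt; the sketch below shows a generic face-crop/resize/normalize pipeline of the kind such a study might vary before a detector, using OpenCV's Haar cascade. Every step here is an assumption for illustration, not the paper's method.

```python
import cv2
import numpy as np

# Haar cascade shipped with opencv-python; a DNN face detector could be swapped in.
face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def preprocess_face(bgr_image, size=224):
    """Crop the first detected face, resize it, and scale pixels to [0, 1]
    before passing it to a deepfake detector (illustrative pipeline only)."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None                              # no face found; skip this frame/image
    x, y, w, h = faces[0]
    face = bgr_image[y:y + h, x:x + w]
    face = cv2.resize(face, (size, size))
    return face.astype(np.float32) / 255.0       # normalised input for the detection model

# Hypothetical usage on a random image (in practice, a frame from a suspected video).
dummy = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
print(preprocess_face(dummy))
```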
Universal image steganalysis has become an important issue due to the curse of dimensionality of natural-image features. Deep neural networks, especially deep convolutional networks, have been widely used in the design of universal image steganalysis. This paper describes the effect of selecting a suitable value for the number of decomposition levels during image pre-processing with the Dual-Tree Complex Wavelet Transform; this value may significantly affect the detection accuracy obtained when evaluating the performance of the proposed system. The proposed system is evaluated using three content-adaptive steganography methods, namely Highly Undetectable steGO (HUGO), Wavelet Obtained Weights (WOW), and UNIversal WAvelet Relative Distortion (UNIWARD).
The obtained …
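As a minimal illustration of the pre-processing step discussed above, the sketch below decomposes an image with the dtcwt Python package at a configurable number of levels; the package choice and the level values tried are assumptions, not the paper's setup.

```python
import numpy as np
import dtcwt

def dtcwt_features(image, nlevels=3):
    """Dual-Tree Complex Wavelet Transform of a gray-scale image.
    Returns the lowpass band and the magnitudes of the complex highpass bands."""
    transform = dtcwt.Transform2d()
    pyramid = transform.forward(image.astype(float), nlevels=nlevels)
    return pyramid.lowpass, [np.abs(h) for h in pyramid.highpasses]

# Try several decomposition levels, since the choice of this value is what the study examines.
image = np.random.rand(256, 256)
for levels in (2, 3, 4):
    low, high = dtcwt_features(image, nlevels=levels)
    print(levels, low.shape, [h.shape for h in high])
```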