This book is intended as a textbook for an undergraduate course in multivariate analysis and is designed for use in a semester system. To achieve its goals, the book is divided into the following chapters (as in the first edition, 2019). Chapter One introduces matrix algebra. Chapter Two is devoted to solving systems of linear equations, quadratic forms, and characteristic roots and vectors. Chapter Three discusses partitioned matrices and how to obtain the inverse, Jacobian, and Hessian matrices. Chapter Four deals with the multivariate normal distribution (MVN). Chapter Five concerns joint, marginal, and conditional normal distributions, independence, and correlations. New chapters have been added in the revised second edition (2024). Chapter Six introduces estimation of the mean vector and the covariance matrix. Chapter Seven is devoted to tests concerning means: the one-sample mean and the two-sample mean. Chapter Eight discusses principal components analysis, a special case of factor analysis. Chapter Nine deals with discriminant analysis, while Chapter Ten deals with cluster analysis. Many solved examples are included in the book, in addition to a variety of unsolved problems at the end of each chapter to enrich the statistical knowledge of readers.
Accurate localization of the basic components of the human face (eyebrows, eyes, nose, mouth, etc.) in images is an important step in face processing techniques such as face tracking, facial expression recognition, and face recognition. However, it is a challenging task due to variations in scale, orientation, pose, facial expression, partial occlusion, and lighting conditions. In the current paper, a scheme comprising three hierarchical stages for facial component extraction is presented; it works regardless of illumination variation. Adaptive contrast enhancement methods such as gamma correction and linear contrast stretching are used to simulate the variation in lighting conditions among images. As testing material
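The gamma-correction and contrast-stretching operations mentioned above can be expressed in a few lines of array code. The following is a minimal Python sketch, not the paper's implementation; the function names, gamma values, and percentile limits are illustrative assumptions.

```python
import numpy as np

def gamma_correction(img, gamma):
    """Power-law (gamma) transform of an 8-bit grayscale image.

    gamma < 1 brightens the image, gamma > 1 darkens it, which can be used
    to mimic different lighting conditions for the same face.
    """
    normalized = img.astype(np.float64) / 255.0
    return np.clip(255.0 * normalized ** gamma, 0, 255).astype(np.uint8)

def contrast_stretch(img, low_pct=2, high_pct=98):
    """Linear contrast stretching between two intensity percentiles."""
    lo, hi = np.percentile(img, [low_pct, high_pct])
    stretched = (img.astype(np.float64) - lo) / max(hi - lo, 1e-9) * 255.0
    return np.clip(stretched, 0, 255).astype(np.uint8)

# Example: generate illumination variants of one face image (random data here).
face = (np.random.rand(128, 128) * 255).astype(np.uint8)
variants = [gamma_correction(face, g) for g in (0.5, 1.0, 2.0)]
variants.append(contrast_stretch(face))
```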
Urban land price is a primary indicator of land development in urban areas. Land prices in holy cities have increased rapidly due to tourism and religious activities, and public agencies usually face challenges in managing land prices in religious areas. They therefore require models or tools for understanding land prices within religious cities. Predicting land prices can support future management and the development of urban land within religious cities. This study proposes a new methodology to predict urban land prices within holy cities. The methodology is based on two models, Linear Regression (LR) and Support Vector Regression (SVR), and nine variables (land price, land area,
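For readers unfamiliar with the two models, the sketch below shows how LR and SVR can be fitted and compared with scikit-learn on a placeholder land-price table. The data, the number of predictors, and the error measure are illustrative assumptions; the paper's nine variables and dataset are not reproduced here.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import mean_absolute_error

# Placeholder data: rows are land parcels, columns are predictor variables
# (e.g. land area and other attributes); y is the land price to predict.
rng = np.random.default_rng(0)
X = rng.random((200, 8))                               # 8 predictors (illustrative only)
y = 100 + 50 * X[:, 0] + rng.normal(0, 5, 200)         # synthetic price

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
scaler = StandardScaler().fit(X_train)

lr = LinearRegression().fit(X_train, y_train)
svr = SVR(kernel="rbf", C=10.0).fit(scaler.transform(X_train), y_train)

print("LR  MAE:", mean_absolute_error(y_test, lr.predict(X_test)))
print("SVR MAE:", mean_absolute_error(y_test, svr.predict(scaler.transform(X_test))))
```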
Speaker identification is one of the fundamental problems in speech processing and voice modeling. Its applications include authentication in critical security systems, where the accuracy of the selection is essential, and large-scale voice recognition applications remain a major challenge. Quick search in a speaker database requires fast, modern techniques and relies on artificial intelligence to achieve the desired results from the system. Many efforts have been made to achieve this through the establishment of variable-based systems and the development of new methodologies for speaker identification. Speaker identification is the process of recognizing who is speaking using characteristics extracted from the speech waveform, like pi
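The truncated sentence above refers to characteristics extracted from the speech waveform. As a hedged illustration only, the sketch below estimates one such characteristic (pitch, via autocorrelation) and performs a naive nearest-centroid speaker match; the function names, frame length, and synthetic signals are assumptions and do not represent the system described in the paper.

```python
import numpy as np

def estimate_pitch(frame, sr, fmin=60.0, fmax=400.0):
    """Crude pitch (F0) estimate of one speech frame via autocorrelation."""
    frame = frame - frame.mean()
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo, hi = int(sr / fmax), int(sr / fmin)
    lag = lo + int(np.argmax(ac[lo:hi]))
    return sr / lag

def speaker_features(signal, sr, frame_len=1024):
    """Summarise an utterance as mean and spread of frame-level pitch estimates."""
    frames = [signal[i:i + frame_len] for i in range(0, len(signal) - frame_len, frame_len)]
    pitches = np.array([estimate_pitch(f, sr) for f in frames])
    return np.array([pitches.mean(), pitches.std()])

def identify(unknown_feat, enrolled):
    """Return the enrolled speaker whose feature vector is closest (Euclidean)."""
    return min(enrolled, key=lambda name: np.linalg.norm(unknown_feat - enrolled[name]))

# Example: enrol two synthetic "speakers" and identify an unknown utterance.
sr = 16000
t = np.arange(sr) / sr
speakers = {"A": np.sin(2 * np.pi * 120 * t), "B": np.sin(2 * np.pi * 210 * t)}
enrolled = {name: speaker_features(sig, sr) for name, sig in speakers.items()}
unknown = np.sin(2 * np.pi * 125 * t)
print(identify(speaker_features(unknown, sr), enrolled))  # expected: "A"
```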
The financial markets are one of the sectors whose data is characterized by continuous movement and constant change, which makes its trends difficult to predict. This creates a need for methods, means, and techniques for decision making, and it pushes investors and analysts in the financial markets to use a variety of methods to predict the direction of market movement. To reach the goal of making decisions about different investments, the support vector machine algorithm and the CART regression tree algorithm are used to classify the stock data in order to determine
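As a rough illustration of the two algorithms named above, the sketch below trains a support vector machine and a CART decision tree (scikit-learn's DecisionTreeClassifier) on placeholder stock features; the features, labels, and hyperparameters are assumptions, not the paper's setup.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Placeholder stock data: two daily features per observation,
# label 1 if the next close is up, 0 if down (illustrative only).
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 2))
y = (X[:, 0] + 0.3 * rng.normal(size=500) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=1)

svm = SVC(kernel="rbf", C=1.0).fit(X_train, y_train)
cart = DecisionTreeClassifier(max_depth=4, random_state=1).fit(X_train, y_train)

print("SVM  accuracy:", accuracy_score(y_test, svm.predict(X_test)))
print("CART accuracy:", accuracy_score(y_test, cart.predict(X_test)))
```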
In this paper, point estimation of the parameter of the Maxwell-Boltzmann distribution is investigated using a simulation technique. The estimators fall into two groups: the first group includes non-Bayesian estimation methods (the maximum likelihood estimator and the moment estimator), while the second group includes Bayesian estimation methods based on two different priors, the inverse chi-square prior (standard Bayes estimator) and Jeffreys' prior (Bayes estimator based on Jeffreys' prior). Comparisons among these methods were made using the mean square error measure, with simulations carried out for different sample sizes.
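A minimal simulation in the spirit of the comparison described above is sketched below for the two non-Bayesian estimators (maximum likelihood and moments). The Bayesian estimators are omitted, and the scale parameterization, true parameter value, sample sizes, and replication count are assumptions, not the paper's settings.

```python
import numpy as np

def simulate_mse(a_true=2.0, n=30, reps=5000, seed=0):
    """Compare MLE and moment estimators of the Maxwell-Boltzmann scale
    parameter a by Monte Carlo mean square error.

    A Maxwell(a) variate is the Euclidean norm of three independent
    N(0, a^2) components, which gives a simple sampler.
    """
    rng = np.random.default_rng(seed)
    mle_est, mom_est = np.empty(reps), np.empty(reps)
    for r in range(reps):
        x = np.linalg.norm(rng.normal(0.0, a_true, size=(n, 3)), axis=1)
        mle_est[r] = np.sqrt(np.sum(x**2) / (3 * n))   # maximum likelihood
        mom_est[r] = x.mean() * np.sqrt(np.pi / 8.0)   # method of moments (mean = 2a*sqrt(2/pi))
    return {"MLE MSE": np.mean((mle_est - a_true) ** 2),
            "Moment MSE": np.mean((mom_est - a_true) ** 2)}

for n in (10, 30, 100):   # different sample sizes, as in the abstract
    print(n, simulate_mse(n=n))
```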
The inhibitive action of phenyl thiourea (PTU) on the corrosion of mild steel in strong hydrochloric acid (HCl) has been investigated by weight loss and potentiostatic polarization. The effects of PTU concentration, HCl concentration, and temperature on the corrosion rate of mild steel were examined using a two-level factorial design and response surface analysis based on the weight loss approach, while the electrochemical measurements were used to study the behavior of mild steel in 5-7 N HCl at temperatures of 30, 40 and 50 °C, in the absence and presence of PTU. It was verified that all variables and their interactions were statistically significant. The adsorption of PTU was found to obey the Langmuir adsorption isotherm. The effect of temperature on th
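The Langmuir test mentioned above is commonly checked by plotting C/θ against C, where θ is the surface coverage obtained from the inhibition efficiency. The sketch below illustrates that fit on hypothetical numbers, not the paper's measurements.

```python
import numpy as np

# Hypothetical data: inhibitor concentration C (mol/L) and inhibition
# efficiency IE% computed from weight-loss measurements (not the paper's values).
C = np.array([1e-4, 5e-4, 1e-3, 5e-3, 1e-2])
IE = np.array([62.0, 74.0, 81.0, 90.0, 93.0])

theta = IE / 100.0                 # surface coverage from weight loss
# Langmuir isotherm in linear form: C/theta = 1/K_ads + C
slope, intercept = np.polyfit(C, C / theta, 1)
K_ads = 1.0 / intercept            # adsorption equilibrium constant (L/mol)

# A slope close to 1 and a high linear correlation support Langmuir adsorption.
r = np.corrcoef(C, C / theta)[0, 1]
print(f"slope = {slope:.3f}, K_ads = {K_ads:.1f} L/mol, R = {r:.4f}")
```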
Faced with the serious deterioration of the elements of the environment, new convictions arose about the need to engage with global environmental concerns as a single issue of shared responsibility. One impact of this conviction has been the evolution of environmental protection law in many countries, including Algeria. Because perceptions of the environment vary across the multiple scientific disciplines involved, a legislative concept of environmental protection emerged, encompassing prevention, rational management, conservation, restoration, and repair.
Environmental planning by the various governments and countries aims to avert disasters and achieve the
... Show MoreAn effective two-body density operator for point nucleon system
folded with the tenser force correlations( TC's), is produced and used
to derive an explicit form for ground state two-body charge density
distributions (2BCDD's) applicable for 25Mg, 27Al and 29Si nuclei. It is
found that the inclusion of the two-body TC's has the feature of
increasing the central part of the 2BCDD's significantly and reducing
the tail part of them slightly, i.e. it tends to increase the probability of
transferring the protons from the surface of the nucleus towards its
centeral region and consequently makes the nucleus to be more rigid
than the case when there is no TC's and also leads to decrease the
1/ 2
r 2 of the nucleu
The quadrupole moment of the exotic nucleus 14B has been calculated using the configuration-mixing shell model with a limited number of orbitals in the model space. The core-polarization effects are included through a microscopic theory which considers particle-hole excitations from the core and the model-space orbits into the higher orbits, with 6ħω excitations, using the M3Y interaction. The simple harmonic oscillator potential is used to generate the single-particle wave functions. A large-basis no-core shell model with (0+2)ħω truncation is used for the 14B nucleus. The effective charges for the protons and neutrons were calculated su
A substantial concern in exchanging confidential messages over the internet is transmitting information safely. For example, consumers and producers of digital products are keen to know that those products are genuine and can be distinguished from worthless ones. The science of encryption can be defined as the technique of embedding data in an image, audio, or video file in a way that meets the security requirements. Steganography is a branch of data concealment science that aims to reach a desired level of security in the exchange of private, undisclosed commercial and military data. This research offers a novel technique for steganography based on hiding data inside the clusters that result from fuzzy clustering. T
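As a hedged illustration of hiding data inside fuzzy clusters, the sketch below runs a plain fuzzy c-means on pixel intensities and writes message bits into the least significant bits of pixels that belong strongly to one cluster. The clustering parameters, membership threshold, and embedding rule are assumptions, not the technique proposed in this research.

```python
import numpy as np

def fuzzy_cmeans(x, c=3, m=2.0, iters=50, seed=0):
    """Plain fuzzy c-means on a 1-D feature vector (pixel intensities).

    Returns cluster centres and the membership matrix U of shape (len(x), c).
    """
    rng = np.random.default_rng(seed)
    u = rng.random((len(x), c))
    u /= u.sum(axis=1, keepdims=True)
    for _ in range(iters):
        um = u ** m
        centres = (um * x[:, None]).sum(axis=0) / um.sum(axis=0)
        d = np.abs(x[:, None] - centres[None, :]) + 1e-9
        u = 1.0 / (d ** (2.0 / (m - 1.0)))
        u /= u.sum(axis=1, keepdims=True)
    return centres, u

def embed_bits(image, bits, cluster=0, threshold=0.9):
    """Hide a bit string in the least significant bits of pixels that belong
    strongly (membership > threshold) to one fuzzy cluster."""
    flat = image.flatten().astype(np.uint8)
    _, u = fuzzy_cmeans(flat.astype(float))
    idx = np.where(u[:, cluster] > threshold)[0][:len(bits)]
    flat[idx] = (flat[idx] & 0xFE) | np.array(bits[:len(idx)], dtype=np.uint8)
    return flat.reshape(image.shape), idx   # idx acts as the extraction key

# Example: hide 16 random bits in a random 64x64 grayscale image.
img = (np.random.default_rng(1).random((64, 64)) * 255).astype(np.uint8)
stego, key = embed_bits(img, list(np.random.default_rng(2).integers(0, 2, 16)))
```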