This book is intended as a textbook for an undergraduate course in multivariate analysis and is designed for use in a semester system. To achieve its goals, the book is divided into the following chapters (as in the first edition, 2019). Chapter One introduces matrix algebra. Chapter Two is devoted to the solution of systems of linear equations, together with quadratic forms and characteristic roots and vectors. Chapter Three discusses partitioned matrices and how to obtain the inverse, Jacobian, and Hessian matrices. Chapter Four deals with the multivariate normal distribution (MVN). Chapter Five concerns joint, marginal, and conditional normal distributions, independence, and correlations. New chapters have been added in the current second edition (2024). Chapter Six introduces estimation of the mean vector and the covariance matrix. Chapter Seven is devoted to tests concerning means: one-sample and two-sample. Chapter Eight discusses principal components analysis, a special case of factor analysis. Chapter Nine deals with discriminant analysis, while Chapter Ten deals with cluster analysis. Many solved examples are included in this book, along with a variety of unsolved problems at the end of each chapter to enrich the reader's statistical knowledge.
In this research, we solved the Boltzmann transport equation numerically in order to calculate transport parameters such as the drift velocity W, D/μ (ratio of the diffusion coefficient to the mobility), and the momentum-transfer collision frequency νm, for the purpose of determining the magnetic drift velocity WM and the magnetic deflection coefficient ψ for low-energy electrons moving in an electric field E crossed with a magnetic field B, i.e., E×B, in nitrogen, argon, helium, and their gas mixtures, as a function of E/N (ratio of the electric field strength to the gas number density), E/P300 (ratio of the electric field strength to the gas pressure), and D/μ, covering different ranges of E/P300 at a temperature of 300 K. The results show
Grey system theory is a multidisciplinary scientific approach that deals with systems having partially unknown information (small samples and uncertain information). Grey modeling, an important component of this theory, gives successful results with a limited amount of data. Grey models are divided into two types: univariate and multivariate. The univariate grey model with a first-order differential equation, GM(1,1), is the cornerstone of the theory; it is considered the time-series prediction model, but it does not take the relevant factors into account. The traditional multivariate grey model GM(1,M) takes those factors into account, but it has a complex structure and some defects in "modeling mechanism", "parameter estimation" and "m
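The GM(1,1) model mentioned above follows a standard recipe: accumulate the raw series (AGO), estimate the development coefficient and grey input by least squares on the whitened equation, then difference the fitted accumulated series back to the original scale (IAGO). A minimal sketch of that recipe (illustrative only; function and variable names are our own, not the paper's):

```python
import numpy as np

def gm11_forecast(x0, steps=1):
    """Sketch of the univariate grey model GM(1,1).

    x0    : 1-D sequence of non-negative observations
    steps : number of future points to predict
    Returns fitted values plus `steps` predictions on the original scale.
    """
    x0 = np.asarray(x0, dtype=float)
    n = len(x0)
    x1 = np.cumsum(x0)                      # AGO: accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])           # background (mean) sequence
    # Least-squares estimate of the development coefficient a and grey input b
    B = np.column_stack([-z1, np.ones(n - 1)])
    y = x0[1:]
    a, b = np.linalg.lstsq(B, y, rcond=None)[0]
    # Time-response function of the whitened equation dx1/dt + a*x1 = b
    k = np.arange(n + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    # IAGO: first differences restore the original scale
    return np.concatenate([[x1_hat[0]], np.diff(x1_hat)])
```

On a roughly exponential series such as 100, 110, 121, 133.1 the fitted curve tracks the data closely, which is why GM(1,1) is favored for small monotone samples.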
This research deals with the use of a number of statistical methods, such as the kernel method, watershed, histogram, and cubic spline, to improve the contrast of digital images. The results obtained according to the RMSE and NCC measures have proven that the spline method is the most accurate compared with the other statistical methods.
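The two quality measures named above can be computed directly. A minimal sketch, using one common definition of each (mean-centered NCC; the paper's exact formulas may differ):

```python
import numpy as np

def rmse(original, enhanced):
    """Root-mean-square error between two equally sized images (lower is better)."""
    a = np.asarray(original, dtype=float)
    b = np.asarray(enhanced, dtype=float)
    return float(np.sqrt(np.mean((a - b) ** 2)))

def ncc(original, enhanced):
    """Normalized cross-correlation of two images (1.0 means identical up to scale)."""
    a = np.asarray(original, dtype=float).ravel()
    b = np.asarray(enhanced, dtype=float).ravel()
    a = a - a.mean()                         # mean-centering makes NCC brightness-invariant
    b = b - b.mean()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
```

An enhanced image is compared against a reference: a method scoring low RMSE and NCC near 1 preserves the content while changing contrast.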
The demand for single photon sources in quantum key distribution (QKD) systems has necessitated the use of weak coherent pulses (WCPs) characterized by a Poissonian distribution. Ensuring security against eavesdropping attacks requires keeping the mean photon number (µ) small and known to legitimate partners. However, accurately determining µ poses challenges due to discrepancies between theoretical calculations and practical implementation. This paper introduces two experiments. The first experiment involves theoretical calculations of µ using several filters to generate the WCPs. The second experiment utilizes a variable attenuator to generate the WCPs, and the value of µ was estimated from the photons detected by the BB
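For a Poissonian WCP source, the photon-number statistics are P(n) = e^(-µ) µ^n / n!, which is what makes keeping µ small security-relevant: the multiphoton fraction grows with µ. A minimal sketch of those probabilities (generic formula only, not the paper's experimental estimation procedure):

```python
import math

def wcp_photon_stats(mu):
    """Photon-number statistics of a weak coherent pulse with mean photon number mu.

    Returns (P0, P1, P_multi): probability of an empty pulse, of exactly one
    photon, and of two or more photons (the fraction exploitable by an eavesdropper).
    """
    p0 = math.exp(-mu)            # P(n = 0)
    p1 = mu * math.exp(-mu)       # P(n = 1)
    p_multi = 1.0 - p0 - p1       # P(n >= 2)
    return p0, p1, p_multi
```

At µ = 0.1, roughly 90% of pulses are empty and under 0.5% carry more than one photon, illustrating the trade-off between key rate and security.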
This paper is specifically a detailed review of the Spatial Quantile Autoregressive (SARQR) model, which incorporates quantile regression models into spatial autoregressive models to facilitate an improved analysis of the characteristics of spatially dependent data. The relevance of SARQR is emphasized in most applications, including but not limited to fields that require the study of spatial variation and dependence. In particular, the review covers literature dated from 1971 to 2024 and shows the extent to which SARQR has already been applied in disciplines such as economics, real estate, environmental science, and epidemiology. Accordingly, evidence indicates SARQR has numerous benefits compar
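The quantile-regression component that SARQR adds to the spatial model rests on the check (pinball) loss, whose minimizing constant is the sample quantile. A minimal illustration of that building block alone (hypothetical data; no spatial lag term):

```python
import numpy as np

def pinball_loss(u, tau):
    """Check (pinball) loss rho_tau(u) = u * (tau - 1{u < 0})."""
    u = np.asarray(u, dtype=float)
    return np.where(u >= 0, tau * u, (tau - 1.0) * u)

def best_constant(y, tau, grid):
    """The constant minimizing mean pinball loss approximates the tau-quantile of y."""
    losses = [pinball_loss(y - q, tau).mean() for q in grid]
    return grid[int(np.argmin(losses))]
```

Minimizing this loss at several values of tau is what lets SARQR describe the whole conditional distribution rather than only its mean.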
In recent years, the number of applications utilizing mobile wireless sensor networks (WSNs) has increased, with the intent of localization for the purposes of monitoring and obtaining data from hazardous areas. The location of an event is very critical in a WSN, as sensing data is almost meaningless without location information. In this paper, two Monte Carlo based localization schemes, termed MCL and MSL*, are studied. MCL obtains its location through anchor nodes, whereas MSL* uses both anchor nodes and normal nodes. The use of normal nodes increases accuracy and reduces dependency on anchor nodes, but increases communication costs. For this reason, we introduce a new approach called low communication cost schemes to reduce communication
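The sampling-and-filtering step that Monte Carlo localization schemes share can be sketched as follows. This is a deliberately simplified, hypothetical version: candidate positions are drawn uniformly and kept only if every heard anchor lies within radio range; the actual MCL and MSL* schemes add motion models and, for MSL*, constraints from normal nodes.

```python
import math
import random

def mcl_filter_step(anchors, radio_range, n_samples=2000, area=100.0, seed=0):
    """One sampling/filtering step of Monte Carlo localization.

    anchors     : (x, y) positions of anchors the node can currently hear
    radio_range : maximum distance at which an anchor can be heard
    Returns the surviving candidate positions and their centroid as the estimate.
    """
    rng = random.Random(seed)
    survivors = []
    for _ in range(n_samples):
        x, y = rng.uniform(0, area), rng.uniform(0, area)
        # Keep the sample only if it is consistent with hearing every anchor
        if all(math.hypot(x - ax, y - ay) <= radio_range for ax, ay in anchors):
            survivors.append((x, y))
    if not survivors:
        return [], None
    cx = sum(p[0] for p in survivors) / len(survivors)
    cy = sum(p[1] for p in survivors) / len(survivors)
    return survivors, (cx, cy)
```

More heard constraints shrink the feasible region and tighten the estimate, which is exactly why MSL*'s extra normal-node constraints improve accuracy at the price of more communication.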
There is great operational risk in the day-to-day management of water treatment plants, so water companies are looking for solutions to predict how the treatment processes may be improved under the increased pressure to remain competitive. This study focused on the mathematical modeling of water treatment processes, with the primary motivation of providing tools that can be used to predict treatment performance and enable better control of uncertainty and risk. This research included choosing the most important variables affecting quality standards using the correlation test. According to this test, it was found that the important parameters of raw water are: Total Hardn
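Variable screening by correlation, as described, can be sketched with the sample Pearson coefficient (illustrative data and threshold only; the study's actual parameters and cut-off are not reproduced here):

```python
import numpy as np

def pearson_r(x, y):
    """Sample Pearson correlation coefficient between two variables."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xc, yc = x - x.mean(), y - y.mean()
    return float(np.dot(xc, yc) / (np.linalg.norm(xc) * np.linalg.norm(yc)))

def screen_variables(candidates, target, threshold=0.7):
    """Keep candidate raw-water parameters whose |r| with the target exceeds threshold."""
    return {name: pearson_r(vals, target)
            for name, vals in candidates.items()
            if abs(pearson_r(vals, target)) > threshold}
```

Parameters surviving the screen are the ones worth carrying into the predictive model; the rest add noise without explanatory power.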
In this paper, an algorithm for binary codebook design is used in the vector quantization technique to improve the acceptability of the absolute moment block truncation coding (AMBTC) method. Vector quantization (VQ) is used to compress the bitmap (the output of the first method, AMBTC). The binary codebook can be generated for many images by randomly choosing code vectors from a set of binary image vectors; this codebook is then used to compress all the bitmaps of these images. The bitmap of an image is chosen for compression with this codebook based on the criterion of the average bitmap replacement error (ABPRE). This approach is suitable for reducing bit rates
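The AMBTC stage whose bitmap output is then vector-quantized works per block: compute the block mean, threshold pixels against it to form the bitmap, and keep two quantization levels (the means of the pixels above and below the threshold). A minimal sketch of that stage alone, without the paper's VQ codebook step:

```python
import numpy as np

def ambtc_encode(block):
    """Encode one image block with absolute moment block truncation coding.

    Returns (low_mean, high_mean, bitmap): pixels at or above the block mean are
    reconstructed with high_mean, the rest with low_mean.
    """
    block = np.asarray(block, dtype=float)
    bitmap = block >= block.mean()
    high = block[bitmap].mean()
    # A uniform block has an all-ones bitmap; reuse high as the low level then
    low = block[~bitmap].mean() if (~bitmap).any() else high
    return low, high, bitmap

def ambtc_decode(low, high, bitmap):
    """Rebuild the block from its two quantization levels and bitmap."""
    return np.where(bitmap, high, low)
```

The bitmap is the bulk of the encoded output, which is why replacing it with a close codebook vector, as the paper proposes, cuts the bit rate further.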