This book is intended as a textbook for an undergraduate course in financial statistics in the Department of Financial Sciences and Banking, and it is designed for use in a semester system. To achieve the goals of the book, it is divided into the following chapters. Chapter One introduces basic concepts. Chapter Two is devoted to frequency distributions and data representation. Chapter Three discusses measures of central tendency (all types of means, the mode, and the median). Chapter Four deals with measures of dispersion (standard deviation, variance, and the coefficient of variation). Chapter Five is concerned with correlation and regression analysis, while Chapter Six is concerned with testing hypotheses (the one-population mean test, the two independent-population means test, the proportions test, and the chi-square test of independence). Many solved examples are included in this book, in addition to a variety of unsolved exercises at the end of each chapter to enrich the statistical knowledge of our students.
Logistic regression is one of the statistical methods used to describe and estimate the relationship between a random response variable (Y) and explanatory variables (X). Unlike the linear model, which assumes homogeneity of variance, the dependent variable here is a binary response that takes two values (one when a specific event occurs and zero when it does not), such as (injured and uninjured, married and unmarried). A large number of explanatory variables leads to the problem of multicollinearity, which makes the estimates inaccurate. The maximum likelihood method and the ridge regression method were used in estimating a binary-response logistic regression model by adopting the Jackknife ...
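As a minimal sketch of the estimation idea described above (hypothetical data and penalty value, not the study's actual model or dataset), a binary-response logistic regression can be fitted by maximum likelihood with an L2 (ridge) penalty and its coefficients assessed with delete-one Jackknife estimates:

```python
# Hypothetical sketch: ridge-penalised (L2) logistic regression fitted by
# maximum likelihood, with delete-one Jackknife estimates of the coefficients.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))                       # hypothetical explanatory variables
y = (X @ [1.0, -0.5, 0.8, 0.0] + rng.normal(size=100) > 0).astype(int)  # binary response

def fit_coefs(X, y, C=1.0):
    """Maximum-likelihood fit with an L2 (ridge) penalty; returns intercept and slopes."""
    model = LogisticRegression(penalty="l2", C=C, max_iter=1000).fit(X, y)
    return np.r_[model.intercept_, model.coef_.ravel()]

full = fit_coefs(X, y)

# Delete-one Jackknife: refit the model with each observation left out in turn.
n = len(y)
jack = np.array([fit_coefs(np.delete(X, i, 0), np.delete(y, i, 0)) for i in range(n)])
jack_mean = jack.mean(axis=0)
bias = (n - 1) * (jack_mean - full)                                 # Jackknife bias estimate
se = np.sqrt((n - 1) / n * ((jack - jack_mean) ** 2).sum(axis=0))   # Jackknife standard errors

print("coefficients:", full, "\nbias:", bias, "\nstd. errors:", se)
```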
This research deals with a shrinkage method for the principal components, similar to the one used in multiple regression, "Least Absolute Shrinkage and Selection Operator (LASSO)". The goal here is to build uncorrelated linear combinations from only a subset of the explanatory variables, which may suffer from a multicollinearity problem, instead of taking the whole number, say (K), of them. This shrinkage forces some coefficients to equal zero after imposing a restriction on them through a "tuning parameter", say (t), which balances the amounts of bias and variance on one side and does not exceed the acceptable percent of explained variance of these components. This is shown by the MSE criterion in the regression case and the percent of explained variance ...
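The LASSO mechanism the method builds on can be illustrated with a minimal sketch (hypothetical data; this is not the paper's procedure): an L1 penalty, controlled by a tuning parameter, forces some coefficients of the principal components exactly to zero.

```python
# Hypothetical illustration of the LASSO idea: an L1 penalty (tuning parameter
# alpha, playing the role of the constraint t) shrinks some coefficients to zero.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
X = rng.normal(size=(80, 6))
X[:, 5] = X[:, 0] + 0.01 * rng.normal(size=80)      # induce multicollinearity
y = X[:, 0] - 2 * X[:, 1] + rng.normal(size=80)

# Uncorrelated linear combinations of the explanatory variables (principal components).
Z = PCA(n_components=6).fit_transform(X)

# LASSO on the components: a larger alpha means stronger shrinkage and more zero coefficients.
for alpha in (0.01, 0.1, 1.0):
    coefs = Lasso(alpha=alpha).fit(Z, y).coef_
    print(f"alpha={alpha}: nonzero coefficients = {np.sum(coefs != 0)}")
```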
In this paper, a two-element multiple-input multiple-output (MIMO) antenna is used to study the five 5G smartphone-application bands (3.1-3.55 GHz and 3.7-4.2 GHz), (3.4-4.7 GHz), (3.4-3.8 GHz), and (3.6-4.2 GHz) to be introduced to the US, Korean, (European and Chinese), and Japanese markets, respectively. With a proposed dimension of 26 × 46 × 0.8 mm³, the medium-structured, small-sized MIMO antenna not only demonstrated a high degree of isolation and efficiency, but also exhibited a low envelope correlation coefficient and return loss, which are well suited for the 5G band applications. The antenna was fabricated on an inexpensive FR4 substrate with a thickness of 0.8 mm and a loss tangent ...
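For reference, a commonly used two-port S-parameter approximation of the envelope correlation coefficient is given below; the paper's ECC may instead be computed from far-field patterns, so this is only a standard textbook relation:

```latex
% Two-port S-parameter approximation of the envelope correlation coefficient (ECC)
\rho_e \;=\;
\frac{\left| S_{11}^{*} S_{12} + S_{21}^{*} S_{22} \right|^{2}}
     {\left(1 - |S_{11}|^{2} - |S_{21}|^{2}\right)\left(1 - |S_{12}|^{2} - |S_{22}|^{2}\right)}
```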
In this research, Kernel estimation (nonparametric density estimation) methods were relied upon in estimating the two-response (binary) logistic regression, comparing the Nadaraya-Watson method with the Local Scoring algorithm. The optimal smoothing parameter (bandwidth) λ was estimated by the Cross-validation and Generalized Cross-validation methods; the optimal bandwidth λ has a clear effect on the estimation process and plays a key role in smoothing the curve so that it approaches the real curve. The goal of using the Kernel estimator is to modify the observations so that we can obtain estimators with characteristics close to the properties of the real parameters, and, based on medical data for patients with chronic ...
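A minimal sketch of the Nadaraya-Watson estimator with the bandwidth λ chosen by leave-one-out cross-validation follows (the Gaussian kernel and simulated data are illustrative assumptions, not the paper's implementation):

```python
# Hypothetical sketch: Nadaraya-Watson kernel regression with the smoothing
# parameter (bandwidth) chosen by leave-one-out cross-validation.
import numpy as np

rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0, 10, 120))
y = np.sin(x) + 0.3 * rng.normal(size=x.size)       # hypothetical response

def nw_estimate(x0, x, y, lam):
    """Nadaraya-Watson estimate at points x0 using a Gaussian kernel of bandwidth lam."""
    w = np.exp(-0.5 * ((x0[:, None] - x[None, :]) / lam) ** 2)
    return (w * y).sum(axis=1) / w.sum(axis=1)

def loo_cv_score(lam, x, y):
    """Leave-one-out cross-validation score for a given bandwidth."""
    errs = []
    for i in range(x.size):
        xi, yi = np.delete(x, i), np.delete(y, i)
        errs.append((y[i] - nw_estimate(x[i:i + 1], xi, yi, lam)[0]) ** 2)
    return np.mean(errs)

# Pick the bandwidth on a grid by minimising the cross-validation score.
grid = np.linspace(0.1, 2.0, 20)
best_lam = grid[np.argmin([loo_cv_score(lam, x, y) for lam in grid])]
print("CV-selected bandwidth:", best_lam)
```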
It is well known that the existence of outliers in the data adversely affects the efficiency of estimation and the results of the study. In this paper, four methods of detecting outliers in the multiple linear regression model are studied in two cases: first, on real data; and second, after adding outliers to the data and attempting to detect them. The study is conducted for samples of different sizes and uses three measures for comparing these methods: masking, swamping, and the standard error of the estimate.
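The excerpt does not name the four detection methods, so the sketch below is only a generic illustration of outlier detection in multiple linear regression, flagging observations by studentized residuals and Cook's distance on hypothetical data:

```python
# Generic illustration (not the paper's four methods): flag potential outliers
# in a multiple linear regression via studentized residuals and Cook's distance.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
X = rng.normal(size=(60, 3))
y = X @ [2.0, -1.0, 0.5] + rng.normal(size=60)
y[5] += 8.0                                    # inject an artificial outlier

model = sm.OLS(y, sm.add_constant(X)).fit()
influence = model.get_influence()

stud_resid = influence.resid_studentized_external   # externally studentized residuals
cooks_d, _ = influence.cooks_distance                # Cook's distance per observation

# Common rule-of-thumb cut-offs: |residual| > 3 or Cook's distance > 4/n.
flagged = np.where((np.abs(stud_resid) > 3) | (cooks_d > 4 / len(y)))[0]
print("flagged observations:", flagged)
```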
The present work investigates the effect of superficial air velocities of 1, 3, and 6 cm/s, for two types of perforated distributor, on the hydrodynamic characteristics of a gas-liquid dispersion column containing air-water and air-aqueous n-propanol solution. Bubble distribution, gas holdup, and power consumption are the parameters taken into consideration. The experimental work was carried out in a Perspex column of 8.5 cm inside diameter and 1.5 m height. Two types of bubble generator (perforated plate) were fixed at the bottom of the column: plate A (99 holes of 0.5 mm diameter and a free area of 0.34%) and plate B (20 holes of 1.5 mm diameter and a free area of 0.62%). A photographic technique was used to measure the bubble parameters. The experimental results were ...
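For context, the overall gas holdup in such columns is commonly defined from the clear-liquid and aerated dispersion heights; the relation below is the standard definition, not necessarily the measurement procedure used in this work:

```latex
% Overall gas holdup from the clear-liquid height H_L and the aerated dispersion height H_D
\varepsilon_G = \frac{H_D - H_L}{H_D}
```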
This work focuses on the use of biologically produced activated carbon for improving the physico-chemical properties of water samples obtained from the Tigris River. An eco-friendly and low-cost activated carbon was prepared from the Alhagi plant using potassium hydroxide (KOH) as an impregnation agent. The prepared activated carbon was characterised using Fourier-transform infrared spectroscopy to determine the functional groups present on the raw material (Alhagi plant) and on the Alhagi activated carbon (AAC). Scanning electron microscopy with energy-dispersive X-ray spectroscopy was also used to investigate the surface morphology and the elements that compose the powder. Brunauer–Emmett–Teller surface area analysis was used to evaluate the specific surface area ...
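For reference, BET analysis estimates the monolayer capacity (and hence the specific surface area) from the linearised BET isotherm; the relation below is the standard form, not reproduced from the paper:

```latex
% Linearised BET isotherm: v is the quantity of gas adsorbed at relative pressure p/p_0,
% v_m the monolayer capacity, and c the BET constant.
\frac{1}{v\left[(p_0/p) - 1\right]}
  = \frac{c - 1}{v_m c}\,\frac{p}{p_0} + \frac{1}{v_m c}
```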
Enhanced-quality image fusion is proposed using new algorithms for auto-focus image fusion. The first algorithm is based on the standard deviation for combining two images. The second algorithm concentrates on the contrast at edge points and uses a correlation method as the criterion for the quality of the resulting image. This algorithm considers three blocks of different sizes in a homogeneous region and moves each block 10 pixels within the same homogeneous region; it examines the statistical properties of the block and automatically decides the next step. The resulting combined image is better in contrast ...
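A minimal sketch of the standard-deviation idea behind the first algorithm (hypothetical images and a fixed block size; not the authors' implementation): for each block, the fused image keeps the block from whichever source image is locally sharper, i.e. has the larger standard deviation.

```python
# Hypothetical sketch of standard-deviation-based block fusion: for each block,
# keep the block from the source image with the larger local standard deviation.
import numpy as np

def fuse_by_std(img_a, img_b, block=16):
    """Fuse two equally sized grayscale images block by block."""
    fused = np.empty_like(img_a)
    h, w = img_a.shape
    for i in range(0, h, block):
        for j in range(0, w, block):
            a = img_a[i:i + block, j:j + block]
            b = img_b[i:i + block, j:j + block]
            fused[i:i + block, j:j + block] = a if a.std() >= b.std() else b
    return fused

# Usage with random stand-in arrays (real inputs would be two differently focused photographs).
rng = np.random.default_rng(4)
img_a = rng.random((128, 128))
img_b = rng.random((128, 128))
print(fuse_by_std(img_a, img_b).shape)
```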
Clothes are considered a means of aesthetic and artistic expression that helps to hide the flaws of the body and highlight its merits. They are important in people's lives, as they reflect an individual's idea of himself and of his personality, and taste in clothing reflects a person's sense of artistic components and the application of this sense to the clothes he chooses. Regarding the differences in clothing tastes among university students according to the variables (gender, specialization, stage of study, age, and monthly income), the current research is quantitative descriptive research concerned with studying a phenomenon that exists in reality and measuring it ...