This research deals with a shrinkage method for principal components similar to the Least Absolute Shrinkage and Selection Operator (LASSO) used in multiple regression. The goal here is to construct uncorrelated linear combinations from only a subset of explanatory variables that may suffer from multicollinearity, instead of taking the whole number, say (K), of them. The shrinkage forces some coefficients to equal zero by restricting them with a "tuning parameter", say (t), which balances bias against variance on one side and keeps the percent of explained variance of these components within an acceptable level on the other. This is shown by the MSE criterion in the regression case and by the percent of explained variance in the principal component case.
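As an illustration only (not the paper's own procedure), LASSO-style soft-thresholding applied to principal-component loadings can be sketched in a few lines of numpy; the function name, the use of the threshold t, and the renormalization step are assumptions made for this sketch:

```python
import numpy as np

def sparse_pca_loadings(X, t=0.5, n_components=2):
    """Soft-threshold PCA loadings: entries smaller than the tuning
    parameter t are forced to exactly zero, as in the LASSO penalty."""
    Xc = X - X.mean(axis=0)                      # center the variables
    cov = np.cov(Xc, rowvar=False)               # sample covariance
    vals, vecs = np.linalg.eigh(cov)             # ascending eigenvalues
    order = np.argsort(vals)[::-1][:n_components]
    V = vecs[:, order]                           # leading loadings
    V = np.sign(V) * np.maximum(np.abs(V) - t, 0.0)  # soft threshold
    norms = np.linalg.norm(V, axis=0)
    norms[norms == 0.0] = 1.0                    # avoid divide-by-zero
    return V / norms                             # renormalize columns
```

A larger t zeroes more loadings, trading explained variance for sparsity, which mirrors the bias-variance balance the abstract describes.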
The Internet of Things (IoT) is a network of interconnected devices that exchange data. IoT attacks have increased dramatically due to the lack of security mechanisms, and those mechanisms can be enhanced through the analysis and classification of the attacks. The multi-class classification of IoT botnet attacks (IBA) applied here uses a high-dimensional data set, which challenges the classification process because it demands a large amount of computational resources. Dimensionality reduction (DR) discards irrelevant information while retaining the imperative bits of the high-dimensional data set. The DR technique proposed here is a classifier-based fe
This paper deals with the estimation of the stress-strength reliability for a component whose strength is independent of opposite lower and upper bound stresses, when the stresses and the strength follow the Inverse Kumaraswamy distribution. Three estimation approaches were applied, namely the maximum likelihood, moment, and shrinkage methods. Monte Carlo simulation experiments were performed to compare the estimation methods based on the mean squared error criterion.
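The Monte Carlo setup described above can be sketched as follows, assuming the commonly used Inverse Kumaraswamy CDF F(x) = [1 - (1 + x)^(-a)]^b for x > 0; the sampler, the parameter names, and the direct empirical estimate of R = P(X > Y) are illustrative assumptions, not the paper's estimators:

```python
import numpy as np

def rinv_kum(n, a, b, rng):
    """Inverse-CDF sampler, assuming F(x) = (1 - (1 + x)**-a)**b, x > 0."""
    u = rng.random(n)
    return (1.0 - u ** (1.0 / b)) ** (-1.0 / a) - 1.0

def reliability_mc(a_x, b_x, a_y, b_y, n=100_000, seed=0):
    """Monte Carlo estimate of R = P(X > Y) for independent strength X
    and stress Y, both Inverse Kumaraswamy distributed."""
    rng = np.random.default_rng(seed)
    x = rinv_kum(n, a_x, b_x, rng)   # strength sample
    y = rinv_kum(n, a_y, b_y, rng)   # stress sample
    return float((x > y).mean())     # empirical P(X > Y)
```

When strength and stress share the same parameters, symmetry gives R close to 0.5, which is a convenient sanity check for the sampler.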
In this paper, the use of a circular array antenna with an adaptive system, in conjunction with a modified Linearly Constrained Minimum Variance Beamforming (LCMVB) algorithm, is proposed to meet the requirement of two-dimensional (2-D) Angle of Arrival (AOA) estimation together with the Signal to Noise Ratio (SNR) of the estimated sources (three-dimensional, 3-D, estimation), rather than the interference cancellation the algorithm is conventionally used for. The proposed system was simulated, tested, and compared with the modified Multiple Signal Classification (MUSIC) technique for 2-D estimation. The results show the system simultaneously estimates the 3-D parameters with accuracy approximately equivalent to the MUSIC technique (for estimating elevation and a
The depth of the causative source of gravity is one of the most important parameters in gravity investigation. The present study introduces a theoretical solution for the intersection point of the horizontal and vertical gradients of a gravity anomaly. Two constants are obtained to estimate the depth of the causative source of the gravity anomaly: the first is 1.7807 for a spherical body and the second is 2.4142 for a horizontal cylinder body. These constants are tested by estimating the depth in three actual cases, and good results are obtained. It is believed that constants derived on theoretical bases are better than those obtained by empirical experimental studies.
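If, as assumed here, the depth equals the tabulated constant times the horizontal coordinate of the gradient-intersection point (the abstract does not spell out the exact relation, so this proportionality is an assumption of the sketch), the estimate reduces to a one-line computation:

```python
# Constants from the study: 1.7807 for a spherical body, 2.4142 for a
# horizontal cylinder (note 2.4142 is approximately 1 + sqrt(2)).
SPHERE_CONST = 1.7807
CYLINDER_CONST = 2.4142

def depth_from_intersection(x_intersect, body="sphere"):
    """Assumed relation: depth = constant * horizontal distance of the
    intersection of the horizontal and vertical gradients."""
    const = SPHERE_CONST if body == "sphere" else CYLINDER_CONST
    return const * x_intersect
```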
In this work, the Moments, Modified Moments, L-Moments, Percentile, Ranked Set Sampling, and Maximum Likelihood methods were used to estimate the reliability function and the two parameters of the Transmuted Pareto (TP) distribution. We use simulation to generate the required data for three cases of sample size , replicated for the real values of the parameters; for the reliability time values we take .
Results were compared using the mean squared error (MSE) and appear as follows: the best methods are Modified Moments, Maximum Likelihood, and L-Moments in the first, second, and third cases respectively.
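The MSE comparison loop used to rank estimation methods can be sketched generically; the estimators and sampling distribution used below (sample mean versus sample median for a normal mean) are placeholders for illustration, not the TP-distribution estimators of the study:

```python
import numpy as np

def mc_mse(estimator, true_theta, sampler, n, reps=2000, seed=0):
    """Monte Carlo MSE: average squared error of an estimator over
    repeated samples of size n, the criterion used to rank methods."""
    rng = np.random.default_rng(seed)
    errs = np.empty(reps)
    for r in range(reps):
        errs[r] = (estimator(sampler(n, rng)) - true_theta) ** 2
    return float(errs.mean())
```

For example, calling `mc_mse(np.mean, 0.0, lambda n, rng: rng.normal(0.0, 1.0, n), 50)` and the same with `np.median` reproduces the classical result that the mean has the lower MSE for normal data.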
A new procedure for estimating the depth to the apex of dyke-like sources from magnetic data has been achieved through the application of a derived equation. The procedure consists of applying a simple filtering technique to total magnetic intensity data profiles resulting from dyke-like bodies having various depths, widths, and inclination angles. A background trend line is drawn for the filtered profile, and the output profile is considered for further calculations.
Two straight lines are drawn along the maximum slopes of the filtered profile flanks. Then, the horizontal distances between the two lines at various amplitude levels are measured and plotted against the amplitudes, and the resulting relation is a
In this research, we built a program to assess the Weibull parameters and the wind power of three separate locations in Iraq (Baghdad, Basrah, and Dhi-Qar) for the two years 2009 and 2010, after collecting and organizing the data available from the website "Weather Underground" for each of the Baghdad, Basrah, and Dhi-Qar stations. The Weibull parameters (shape and scale) were estimated using the maximum likelihood estimation method (MLE) and the least squares method (LSM). The annual wind speed frequencies were also calculated, noting the most readily available speed over the two years. Then, we plotted the Weibull distribution function and calculated the most significant quantities, represented by the mean wind speed, the standard deviation of the value
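The MLE step can be sketched in numpy with the standard damped fixed-point iteration for the Weibull shape parameter followed by the closed-form scale estimate; this is not the paper's program, and the simulated speeds below stand in for the Weather Underground data:

```python
import numpy as np

def weibull_mle(v, iters=100):
    """Estimate the Weibull shape k and scale c from wind speeds v.
    Shape: damped fixed-point iteration on the MLE score equation.
    Scale: closed form c = (mean(v**k))**(1/k) given the shape."""
    v = np.asarray(v, dtype=float)
    lnv = np.log(v)
    k = 1.0                                    # initial guess
    for _ in range(iters):
        vk = v ** k
        k_new = 1.0 / ((vk * lnv).sum() / vk.sum() - lnv.mean())
        k = 0.5 * (k + k_new)                  # damping for stability
    c = (v ** k).mean() ** (1.0 / k)
    return k, c
```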
Introduction
Tax authorities in the various countries of the world use many methods to collect taxes from taxpayers, regardless of the taxpayers' categories and classes. In Iraq, many methods of tax collection have been adopted over successive periods of time, and the self-assessment method, one of those methods, found room for application during a certain period, where it was applied to particular economic units. Despite the disadvantages that may accompany the application
Abstract
The current research aims to reveal the extent to which the scoring rubric data for the electronic work file conform to the partial estimation model according to the number of assumed dimensions. The study sample consisted of 356 female students. The study concluded that the rubric under the one-dimensional assumption is more appropriate than under the multi-dimensional assumption. The current research recommends preparing unified correction rules for the different methods of performance evaluation in the basic courses. It also suggests the importance of conducting studies aimed at examining the appropriateness of different evaluation methods for models of response theory to the
A new distribution, the Epsilon Skew Gamma (ESΓ) distribution, first introduced by Abdulah [1], is used on near-Gamma data. We first redefine the ESΓ distribution, its properties, and its characteristics; we then estimate its parameters using the maximum likelihood and moment estimators. Finally, we use these estimators to fit the data with the ESΓ distribution.