This research deals with a shrinkage method for principal components similar to the one used in multiple regression, the Least Absolute Shrinkage and Selection Operator (LASSO). The goal is to form uncorrelated linear combinations from only a subset of the explanatory variables, which may suffer from multicollinearity, instead of using all K of them. The shrinkage forces some coefficients to equal zero by constraining them with a tuning parameter t, which balances the amounts of bias and variance on one side while keeping the percentage of variance explained by the components acceptable on the other. This is demonstrated by the MSE criterion in the regression case and by the percentage of explained variance in the principal-component case.
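A minimal sketch of the kind of constrained criterion this describes, written in the style of LASSO-constrained PCA (e.g., the SCoTLASS formulation); the symbols a (loading vector), S (sample covariance matrix of the K explanatory variables), and t are assumptions, not notation taken from the paper:

\[
\max_{a}\; a^{\top} S a
\quad \text{subject to} \quad
a^{\top} a = 1,
\qquad
\sum_{j=1}^{K} \lvert a_j \rvert \le t .
\]

For t \ge \sqrt{K} the L1 constraint is inactive and ordinary principal components are recovered; as t shrinks toward 1, more loadings a_j are forced to exactly zero, trading explained variance for sparsity.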
We use Bayes estimators for the unknown scale parameter of the Erlang distribution when the shape parameter is known, assuming different informative priors for the scale parameter. We derive the posterior density, together with the posterior mean and posterior variance, under four informative priors for the unknown scale parameter: the inverse exponential distribution, the inverse chi-square distribution, the inverse gamma distribution, and the standard Lévy distribution. We then derive the Bayes estimators based on the general entropy loss function (GELF) and use simulation to obtain the results, generating different cases of the Erlang model parameters for different sample sizes. The estimates have been compared.
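As a point of reference for the loss function named above (a standard form, with c an assumed shape constant of the loss, not a value from the paper), the general entropy loss and its Bayes estimator are commonly written as

\[
L(\hat{\theta}, \theta) \propto \Big(\frac{\hat{\theta}}{\theta}\Big)^{c} - c \ln\Big(\frac{\hat{\theta}}{\theta}\Big) - 1,
\qquad
\hat{\theta}_{GE} = \big[ E_{\pi}(\theta^{-c} \mid \underline{x}) \big]^{-1/c},
\]

provided the posterior expectation E_{\pi}(\theta^{-c} \mid \underline{x}) exists and is finite; c > 0 penalizes overestimation more heavily, while c < 0 penalizes underestimation.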
A reliability system under the multi-component stress-strength model R(s,k) is considered in the present paper, where the stress and strength are independent and non-identically distributed, following the exponentiated family of distributions (FED) with unknown shape parameter α, the scale parameter λ known and equal to two, and the parameter θ equal to three. Different estimation methods for R(s,k) were introduced, corresponding to the maximum likelihood and shrinkage estimators. Comparisons among the suggested estimators were made by a simulation study based on the mean squared error (MSE) criterion.
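For orientation, a standard form of the multi-component stress-strength reliability (the Bhattacharyya-Johnson formulation; F and G here are assumed to denote the strength and stress distribution functions, respectively) is

\[
R_{s,k} = P(\text{at least } s \text{ of the } k \text{ strengths exceed the stress})
= \sum_{p=s}^{k} \binom{k}{p} \int_{0}^{\infty} \big[1 - F(y)\big]^{p} \big[F(y)\big]^{k-p} \, dG(y).
\]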
In this paper we introduce for the first time a new four-parameter model, called the Gumbel-Pareto distribution, constructed using the T-X method. We obtain some of its mathematical properties and study some of its structural properties. The method of maximum likelihood is used to estimate the model parameters. A numerical illustration and an application to a real data set are given to show the flexibility and potential of the new model.
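As background, the T-X construction (Alzaatreh, Lee and Famoye) builds a new CDF by composing a transformer density r(t) with a baseline CDF G(x) through a link W:

\[
F(x) = \int_{a}^{W(G(x))} r(t)\, dt = R\big(W(G(x))\big),
\qquad
f(x) = r\big(W(G(x))\big)\, \frac{d}{dx} W(G(x)),
\]

where R is the CDF of T. When T has support on the whole real line, as the Gumbel distribution does, a common choice is the logit link W(G(x)) = \ln\big[G(x)/(1-G(x))\big]; whether this is the exact link used for the Gumbel-Pareto model here is an assumption on our part.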
A seemingly unrelated regression (SUR) model is a special case of multivariate models in which the error terms of the equations are contemporaneously correlated. The generalized least squares (GLS) estimator is efficient because it takes the covariance structure of the errors into account, but it is also very sensitive to outliers. Robust SUR estimators can deal with outliers. We propose two robust methods for computing the estimator, S-estimation and FastSUR, and find that they significantly improve the quality of the SUR model estimates. In addition, the results show that the FastSUR method is superior to the S method in dealing with outliers in the data set, as it has lower MSE and RMSE and higher R-squared and adjusted R-squared.
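For reference, with the M equations stacked as y = X\beta + \varepsilon and contemporaneous error covariance E(\varepsilon \varepsilon^{\top}) = \Sigma \otimes I_n, the (feasible) GLS estimator referred to above is

\[
\hat{\beta}_{GLS} = \big( X^{\top} (\hat{\Sigma}^{-1} \otimes I_n) X \big)^{-1} X^{\top} (\hat{\Sigma}^{-1} \otimes I_n)\, y .
\]

The robust variants discussed here replace \hat{\Sigma}, and the residuals that define it, with outlier-resistant estimates (e.g., S-estimates), so that a few aberrant observations cannot dominate the estimated covariance structure.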
Portable devices such as smartphones, tablet PCs, and PDAs are a useful combination of hardware and software aimed at mobile workers. While they offer the ability to review documents, communicate via electronic mail, and manage appointments and meetings, they usually lack a variety of essential security features. To address the security concerns over sensitive data, many individuals and organizations, aware of the associated threats, mitigate them by improving user authentication, encrypting content, protecting against malware, and deploying firewalls, intrusion prevention, etc. However, no standards have yet been developed to determine whether such mobile data management systems adequately provide the fu
Survival analysis is widely applied to data describing the lifetime of an item until the occurrence of an event of interest, such as death or another event under study. The purpose of this paper is to use a dynamic approach within deep-learning neural network methods: a dynamic neural network suited to the nature of discrete survival data with time-varying effects. This neural network is trained with the Levenberg-Marquardt (L-M) algorithm, and the method is called the Proposed Dynamic Artificial Neural Network (PDANN). A comparison was then made with another method that depends entirely on the Bayesian methodology, called the Maximum A Posteriori (MAP) method. This method was carried out using numerical algorithms re
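Since the abstract names it, the Levenberg-Marquardt weight update used in training can be stated briefly (standard form; w are the network weights, e the vector of training errors, J its Jacobian with respect to w, and \mu the damping factor):

\[
\Delta w = - \big( J^{\top} J + \mu I \big)^{-1} J^{\top} e .
\]

Small \mu makes the step approach the Gauss-Newton direction; large \mu makes it approach a short gradient-descent step, which is what gives the algorithm its robustness during training.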
In this paper, an integrated quantum neural network (QNN), a class of feedforward neural networks (FFNNs), is constructed by merging quantum computing (QC) with an artificial neural network (ANN) classifier. It is used as a data classification technique, and here the iris flower data are used as the classification signals. For this purpose, independent component analysis (ICA) is used as a feature-extraction technique after normalization of these signals. The architecture of QNNs has inherently built-in fuzziness: the hidden units of these networks develop quantized representations of the sample information provided by the training data at various graded levels of certainty. Experimental results presented here show that
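A minimal sketch of the classification pipeline the abstract describes, normalization followed by ICA feature extraction on the iris data; the quantum neural network itself is the authors' architecture, so a plain scikit-learn MLP stands in for the classifier here purely for illustration, and all parameter values are assumptions:

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import FastICA
from sklearn.neural_network import MLPClassifier

# Load the iris classification signals.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

clf = make_pipeline(
    StandardScaler(),                          # normalize the signals
    FastICA(n_components=3, random_state=0),   # ICA feature extraction
    MLPClassifier(hidden_layer_sizes=(10,),    # stand-in for the QNN classifier
                  max_iter=2000, random_state=0),
)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))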
This study aims to explain the concept of web-based information systems, one of the important topics usually neglected by our organizations, and, in addition, to design a web-based information system for managing the customer data of Al-Rasheed Bank, as a unified information system specialized for the customers' banking dealings with the bank, providing a suggested model for applying a virtual private network as a tool to protect the data transmitted through the web-based information system. This study is considered important because it deals with one of the vital topics nowadays, namely how to make it possible to use a distributed information system.
This study presents a theoretical and experimental treatment of the high-loading stumbling condition for a hip prosthesis. The model studied is the Charnley design. It was modeled with the finite element method using the ANSYS software, and the effect of changing the design parameters (head diameter, neck length, neck ratio, stem length) on the Charnley design was examined for the stumbling case, an impact load reaching 8.7 times body weight over an impact duration of 0.005 s. An experimental rig was constructed to test the hip model; this rig consists of a wooden box with a smooth sliding shaft along which a load of 1 pound is dropped from three heights. The strain produced by this impact is measured using a rosette strain gauge connected to a Wheatstone bridge.
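Assuming a standard 45° rectangular rosette (the gauge type is not stated in the abstract), the measured strains \varepsilon_A, \varepsilon_B, \varepsilon_C along the three grid directions yield the principal strains through the usual reduction formula

\[
\varepsilon_{1,2} = \frac{\varepsilon_A + \varepsilon_C}{2}
\pm \frac{1}{\sqrt{2}} \sqrt{ (\varepsilon_A - \varepsilon_B)^2 + (\varepsilon_B - \varepsilon_C)^2 } ,
\]

from which the impact stresses follow via Hooke's law once the elastic constants of the model material are known.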
Energy and memory limitations are considerable constraints on sensor nodes in wireless sensor networks (WSNs). The limited energy supplied to network nodes causes WSNs to face crucial functional limitations, so the problem of limited energy resources on sensor nodes can only be addressed by using them efficiently. In this research work, an energy-balancing routing scheme for in-network data aggregation is presented, referred to as the Energy-aware and load-Balancing Routing scheme for Data Aggregation (EBR-DA). EBR-DA aims to provide energy-efficient multi-hop routing to the destination on the basis of the quality of the links between the source and destination; a sketch of this kind of next-hop selection follows.
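A hedged sketch of the energy-aware, load-balancing next-hop choice just described; the field names, the weighting scheme, and the example values below are illustrative assumptions, not the EBR-DA algorithm itself:

from dataclasses import dataclass

@dataclass
class Neighbor:
    node_id: int
    residual_energy: float  # fraction of initial energy remaining, in [0, 1]
    link_quality: float     # e.g., packet delivery ratio, in (0, 1]
    hops_to_sink: int       # hop distance toward the destination

def next_hop(neighbors, alpha=0.5, beta=0.3, gamma=0.1):
    """Pick the neighbor maximizing a weighted score of residual energy,
    link quality, and progress toward the sink (fewer hops score higher)."""
    def score(n):
        return (alpha * n.residual_energy
                + beta * n.link_quality
                - gamma * n.hops_to_sink)
    return max(neighbors, key=score)

candidates = [Neighbor(1, 0.8, 0.90, 3), Neighbor(2, 0.4, 0.95, 2)]
print(next_hop(candidates).node_id)  # favors the energy-richer node 1 here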