This research deals with a shrinkage method for principal components analogous to the one used in multiple regression, the Least Absolute Shrinkage and Selection Operator (LASSO). The goal is to construct uncorrelated linear combinations from only a subset of the explanatory variables, which may suffer from multicollinearity, instead of using all K of them. The shrinkage forces some coefficients to be exactly zero by imposing a constraint governed by a tuning parameter t, which balances the amounts of bias and variance on one side while keeping the percentage of variance explained by the components at an acceptable level on the other. This is demonstrated by the MSE criterion in the regression case and by the percentage of explained variance in the principal component case.
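A minimal numerical sketch of the idea, assuming LASSO-style soft-thresholding of ordinary PCA loadings; the data, function names, and the way explained variance is measured are illustrative, not the paper's actual procedure:

```python
import numpy as np

def soft_threshold(v, t):
    """Soft-thresholding operator used in LASSO-type shrinkage."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def shrunken_pca(X, t):
    """PCA loadings shrunk toward zero by tuning parameter t.

    Returns sparse loadings and the share of total variance the
    shrunken components still explain.  Note that after thresholding
    the components are only approximately uncorrelated.
    """
    Xc = X - X.mean(axis=0)                      # center the data
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = soft_threshold(Vt.T, t)                  # shrink loadings
    norms = np.linalg.norm(V, axis=0)
    V = V[:, norms > 0] / norms[norms > 0]       # drop dead components
    scores = Xc @ V
    explained = scores.var(axis=0).sum() / Xc.var(axis=0).sum()
    return V, explained

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
X[:, 3] = X[:, 0] + 0.01 * rng.normal(size=200)  # induce multicollinearity
for t in (0.0, 0.2, 0.4):
    V, ev = shrunken_pca(X, t)
    print(f"t={t}: nonzero loadings={np.sum(V != 0)}, explained={ev:.2%}")
```

Increasing t zeroes more loadings, trading a lower explained-variance share for sparser, more interpretable components.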
Servo motors are important components of industrial automation owing to several advantages such as cost and energy efficiency, simple design, and flexibility. However, position control of a servo motor is a difficult task because of external disturbances, nonlinearities, and uncertainties. To tackle these challenges, an adaptive integral sliding mode control (AISMC) is proposed, in which a novel bidirectional adaptive law is constructed to reduce control chattering. The proposed controller is designed in three steps. Firstly, a full-order integral sliding manifold is designed to improve the position tracking performance of the servo motor, in which the reaching phase is eliminated to achieve the invariance of …
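A minimal simulation sketch of integral sliding mode position control for a servo motor modeled as J·θ'' + B·θ' = u + d. The model parameters, gains, and boundary-layer smoothing are illustrative assumptions; the paper's full-order manifold and bidirectional adaptive law are more elaborate than this fixed-gain version:

```python
import numpy as np

J, B = 0.01, 0.1          # inertia and friction (assumed values)
k1, k2 = 20.0, 100.0      # sliding-surface gains (assumed)
eta, phi = 2.0, 0.05      # switching gain and boundary-layer width

dt, T = 1e-4, 2.0
th, om = 0.0, 0.0                                   # angle and speed
ie = -((om - 1.0) + k1 * (th - 0.0)) / k2           # s(0)=0: no reaching phase
for i in range(int(T / dt)):
    t = i * dt
    thd, omd, ald = np.sin(t), np.cos(t), -np.sin(t)  # reference trajectory
    e, de = th - thd, om - omd
    ie += e * dt
    s = de + k1 * e + k2 * ie          # integral sliding surface
    sat = np.clip(s / phi, -1.0, 1.0)  # boundary layer instead of sign()
    u = J * (ald - k1 * de - k2 * e) + B * om - eta * sat
    d = 0.2 * np.sin(5 * t)            # external disturbance
    om += dt * (u + d - B * om) / J
    th += dt * om
print(f"final tracking error: {th - np.sin(T):.2e}")
```

Initializing the integral term so that s(0) = 0 is one standard way to remove the reaching phase, making the closed loop insensitive to matched disturbances from the start.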
The seemingly unrelated regression (SUR) model is a special case of multivariate models in which the error terms of the equations are contemporaneously correlated. The generalized least squares (GLS) estimator is efficient because it takes the covariance structure of the errors into account, but it is also very sensitive to outliers. Robust SUR estimators can deal with outliers. We propose two robust methods for computing the estimator, S-estimation and FastSUR, and find that they significantly improve the quality of the SUR model estimates. In addition, the results show the superiority of FastSUR over the S method in dealing with outliers in the data set, as it has lower MSE and RMSE and higher R-squared and adjusted R-squared.
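For context, a minimal sketch of the classical (non-robust) feasible GLS estimator for a two-equation SUR system, which the robust S and FastSUR estimators harden against outliers; the data and dimensions are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
X1 = np.column_stack([np.ones(n), rng.normal(size=n)])
X2 = np.column_stack([np.ones(n), rng.normal(size=n)])
# contemporaneously correlated errors across the two equations
E = rng.multivariate_normal([0, 0], [[1.0, 0.7], [0.7, 1.0]], size=n)
y1 = X1 @ [1.0, 2.0] + E[:, 0]
y2 = X2 @ [0.5, -1.0] + E[:, 1]

# Step 1: equation-by-equation OLS to estimate the error covariance
b1 = np.linalg.lstsq(X1, y1, rcond=None)[0]
b2 = np.linalg.lstsq(X2, y2, rcond=None)[0]
R = np.column_stack([y1 - X1 @ b1, y2 - X2 @ b2])
Sigma = R.T @ R / n

# Step 2: stack the system and apply GLS with Omega = Sigma kron I_n
X = np.block([[X1, np.zeros_like(X2)], [np.zeros_like(X1), X2]])
y = np.concatenate([y1, y2])
Omega_inv = np.kron(np.linalg.inv(Sigma), np.eye(n))
beta = np.linalg.solve(X.T @ Omega_inv @ X, X.T @ Omega_inv @ y)
print(beta)   # estimates of [1.0, 2.0, 0.5, -1.0]
```

The robust variants replace the OLS residual step and the covariance estimate with outlier-resistant counterparts so that a few contaminated observations cannot dominate Sigma.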
In this paper, we introduce for the first time a new four-parameter model, called the Gumbel-Pareto distribution, constructed using the T-X method. Some of its mathematical and structural properties are studied. The method of maximum likelihood is used for estimating the model parameters. A numerical illustration and an application to a real data set are given to show the flexibility and potentiality of the new model.
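A minimal numerical sketch of the T-X construction G(x) = F_T(W(F_X(x))) with a Gumbel generator T and a Pareto baseline X. The logit link W(F) = log(F/(1-F)), commonly used for generators supported on the whole real line, and all parameter values below are assumptions; the paper's exact parameterization may differ:

```python
import numpy as np
from scipy import stats

def gumbel_pareto_cdf(x, mu, sigma, alpha, xm):
    """T-X composition of a Gumbel cdf with a Pareto baseline (illustrative)."""
    F = stats.pareto.cdf(x, b=alpha, scale=xm)       # baseline Pareto cdf
    F = np.clip(F, 1e-12, 1 - 1e-12)                 # keep the link finite
    W = np.log(F / (1 - F))                          # logit link into R
    return stats.gumbel_r.cdf(W, loc=mu, scale=sigma)

x = np.linspace(1.0, 20.0, 200)
G = gumbel_pareto_cdf(x, mu=0.0, sigma=1.0, alpha=2.0, xm=1.0)
pdf = np.gradient(G, x)                              # numerical density
print(G[0], G[-1])                                   # ~0, rising toward 1
```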
Survival analysis is widely applied to data describing the lifetime of an item until the occurrence of an event of interest, such as death or another event under study. The purpose of this paper is to use a dynamic approach within the deep learning neural network method, in which a dynamic neural network is built to suit the nature of discrete survival data and time-varying effects. This network is trained with the Levenberg-Marquardt (L-M) algorithm, and the method is called the Proposed Dynamic Artificial Neural Network (PDANN). A comparison was then made with another method that depends entirely on the Bayesian methodology, called the maximum a posteriori (MAP) method. This method was carried out using numerical algorithms …
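A minimal sketch of the two ingredients named above: discrete-time survival data expanded into person-period form, and a small network for the discrete hazard trained with the Levenberg-Marquardt algorithm via SciPy's least-squares interface. The architecture, synthetic data, and squared-error loss are illustrative assumptions, not the paper's PDANN:

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(2)
n, H = 300, 4
X = rng.normal(size=(n, 2))
t = rng.integers(1, 6, size=n)               # discrete event/censor time
event = rng.integers(0, 2, size=n)           # 1 = event, 0 = censored

# person-period expansion: one row per subject per period at risk
rows, target = [], []
for xi, ti, ei in zip(X, t, event):
    for period in range(1, ti + 1):
        rows.append(np.r_[xi, period])       # time enters as an input
        target.append(1.0 if (ei and period == ti) else 0.0)
rows, target = np.array(rows), np.array(target)

def hazard(w, Z):
    """One-hidden-layer network for the discrete-time hazard h(t|x)."""
    W1 = w[:3 * H].reshape(3, H)             # input -> hidden weights
    b1, W2, b2 = w[3*H:4*H], w[4*H:5*H], w[5*H]
    hdn = np.tanh(Z @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(hdn @ W2 + b2)))   # sigmoid hazard

res = least_squares(lambda w: hazard(w, rows) - target,
                    x0=0.1 * rng.normal(size=5 * H + 1),
                    method="lm")             # Levenberg-Marquardt
print("converged:", res.success, "residual SS:", np.sum(res.fun**2))
```

Feeding the period index in as an input is what lets the network capture time-varying effects of the covariates on the hazard.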
The last two decades have seen a marked increase in illegal activities on the Dark Web. Rapid evolution and the use of sophisticated protocols make it difficult for security agencies to identify and investigate these activities by conventional methods. Moreover, tracing criminals and terrorists poses a great challenge, bearing in mind that cybercrimes are no less serious than real-life crimes. At the same time, computer security communities and law enforcement pay a great deal of attention to detecting and monitoring illegal sites on the Dark Web. Retrieval of relevant information is not an easy task because of the vastness and ever-changing nature of the Dark Web; as a result, web crawlers play a vital role in achieving this task. The …
The dependable and efficient identification of Qin seal script characters is pivotal to discovering, preserving, and passing on the distinctive cultural values embodied by these artifacts. This paper uses histogram of oriented gradients (HOG) image features and an SVM model to build a character recognition model for identifying partial and blurred Qin seal script characters. The model achieves accurate recognition on a small, imbalanced dataset. Firstly, a dataset of Qin seal script image samples is established, and Gaussian filtering is employed to remove image noise. Subsequently, a gamma transformation adjusts the image brightness and enhances the contrast between font structures and the image background. After a s…
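A minimal sketch of the pipeline described above: Gaussian denoising, gamma adjustment, HOG features, and an SVM classifier. The synthetic images standing in for seal script characters and all parameter values are illustrative:

```python
import numpy as np
from skimage.feature import hog
from skimage.filters import gaussian
from sklearn.svm import SVC

rng = np.random.default_rng(3)

def features(img, gamma=0.8):
    img = gaussian(img, sigma=1.0)                    # denoise
    img = np.clip(img, 0, 1) ** gamma                 # gamma transform
    return hog(img, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2))

def sample(cls):
    """Two fake 'character classes' as noisy 64x64 grayscale patterns."""
    img = rng.random((64, 64)) * 0.2
    if cls == 0:
        img[16:48, 16:48] += 0.6                      # square stroke block
    else:
        np.fill_diagonal(img, 0.9)                    # diagonal stroke
    return img

labels = [0] * 60 + [1] * 20                          # imbalanced classes
X = np.array([features(sample(c)) for c in labels])
y = np.array(labels)
clf = SVC(kernel="rbf", class_weight="balanced").fit(X, y)
print("train accuracy:", clf.score(X, y))
```

Setting class_weight="balanced" is one common way to keep the SVM from being dominated by the majority class on a small, imbalanced dataset.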
This study aims to introduce the concept of web-based information systems, one of the important topics usually overlooked by our organizations, and to design a web-based information system for managing the customer data of Al-Rasheed Bank: a unified information system specialized in the customers' banking transactions with the bank. It also provides a suggested model for applying a virtual private network (VPN) as a tool to protect the data transmitted through the web-based information system.
This study is considered important because it deals with one of the vital topics of our time, namely how to make it possible to use a distributed information …
... Show MoreDerivative spectrophotometry is one of the analytical chemistry techniques used
in the analysis and determination of chemicals and pharmaceuticals. This method is
characterized by simplicity, sensitivity and speed. Derivative of Spectra conducted
in several ways, including optical, electronic and mathematical. This operation
usually be done within spectrophotometer. The paper is based on form of a new
program. The program construction is written in Visual Basic language within
Microsoft Excel. The program is able to transform the first, second, third and fourth
derivatives of data and the return of these derivatives to zero order (normal plot).
The program was applied on experimental (trial) and reals values of su
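The paper's program is written in Visual Basic for Excel; as a rough illustration of the underlying transform, here is a minimal Python sketch that differentiates a spectrum with a Savitzky-Golay filter (deriv can be set to 1 through 4) and integrates a derivative back toward the zero-order spectrum. The synthetic Gaussian absorbance band and the window/order settings are illustrative:

```python
import numpy as np
from scipy.signal import savgol_filter
from scipy.integrate import cumulative_trapezoid

wl = np.linspace(400, 500, 501)                       # wavelength, nm
A = np.exp(-((wl - 450) / 10) ** 2)                   # absorbance band
dwl = wl[1] - wl[0]

# first and second derivative spectra
d1 = savgol_filter(A, window_length=11, polyorder=3, deriv=1, delta=dwl)
d2 = savgol_filter(A, window_length=11, polyorder=3, deriv=2, delta=dwl)

# return the first derivative to zero order by cumulative integration
A_back = cumulative_trapezoid(d1, wl, initial=0.0) + A[0]
print("max reconstruction error:", np.max(np.abs(A_back - A)))
```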
The rapid evolution of information technology, especially over the last few decades, has produced an increase in the volume of data on the World Wide Web, which is still growing significantly. Retrieving relevant information from the Internet, or from any data source, with a query of only a few words has become a big challenge. To overcome this, query expansion (QE) plays an important role in improving information retrieval (IR): the user's original query is rebuilt into a new query by appending related terms of equal importance. One problem in query expansion is the choice of suitable terms. This leads to a further challenge: how to retrieve the important documents with high precision and high recall …
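A minimal sketch of one common query expansion scheme, pseudo-relevance feedback: retrieve the top documents for the original query, then append the highest-weighted terms from them. The toy corpus and parameters are illustrative, and the paper's term-selection method may differ:

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = ["solar power plants generate renewable energy",
        "wind turbines convert wind energy to power",
        "fossil fuel plants emit carbon dioxide",
        "renewable energy reduces carbon emissions"]
query = "renewable energy"

vec = TfidfVectorizer()
D = vec.fit_transform(docs)
q = vec.transform([query])
top = cosine_similarity(q, D)[0].argsort()[::-1][:2]    # top-2 documents

weights = np.asarray(D[top].sum(axis=0)).ravel()        # term weights
terms = np.array(vec.get_feature_names_out())
new_terms = [t for t in terms[weights.argsort()[::-1]]
             if t not in query.split()][:3]
print("expanded query:", query, "+", new_terms)
```

The choice of how many feedback documents and terms to use is exactly the term-selection problem the abstract points to: too few terms add little recall, too many drift the query and hurt precision.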
Portable devices such as smartphones, tablet PCs, and PDAs are a useful combination of hardware and software aimed at mobile workers. While they offer the ability to review documents, communicate via electronic mail, and manage appointments and meetings, they usually lack a variety of essential security features. To address the security concerns over sensitive data, many individuals and organizations, aware of the associated threats, mitigate them by improving user authentication, encrypting content, and deploying malware protection, firewalls, intrusion prevention, and the like. However, no standards have been developed yet to determine whether such mobile data management systems adequately provide the fu…