Penalized regression models have received considerable attention for variable selection, which plays an essential role in handling high-dimensional data. The arctangent (Atan) penalty has recently been used as an efficient method for both estimation and variable selection. However, the Atan penalty is very sensitive to outliers in the response variable and to heavy-tailed error distributions, whereas least absolute deviation (LAD) regression is a well-established route to robust estimation. The objective of this research is to propose a robust Atan estimator that combines these two ideas. Simulation experiments and real-data applications show that the proposed LAD-Atan estimator outperforms competing estimators.
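As an illustrative sketch (the paper's exact tuning conventions may differ), the LAD-Atan estimator minimizes an absolute-deviation loss plus the Atan penalty, with λ > 0 controlling the penalty strength and γ > 0 its concavity:

\[
\hat{\beta}_{\mathrm{LAD\text{-}Atan}} \;=\; \arg\min_{\beta}\; \sum_{i=1}^{n}\bigl|y_i - x_i^{\top}\beta\bigr| \;+\; \sum_{j=1}^{p}\lambda\Bigl(\gamma + \tfrac{2}{\pi}\Bigr)\arctan\!\Bigl(\tfrac{|\beta_j|}{\gamma}\Bigr).
\]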
The logistic regression model is one of the oldest and most widely used regression models; it is a statistical method for describing and estimating the relationship between a dependent random variable and explanatory random variables. Several methods are used to estimate this model, including the bootstrap, an estimation method based on resampling with replacement: a resample of n elements is drawn randomly, with replacement, from the N observations of the original data. It is a computational method for assessing the accuracy of statistical estimates, and for this reason it was used here to obtain more accurate estimates. The ma…
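A minimal sketch of that bootstrap procedure, assuming a generic design matrix X and response y and using scikit-learn's LogisticRegression (the study's own data and implementation are not shown here):

import numpy as np
from sklearn.linear_model import LogisticRegression

def bootstrap_logit(X, y, n_boot=1000, seed=0):
    # Refit the model on resamples drawn with replacement and
    # collect the coefficient estimates from each refit.
    rng = np.random.default_rng(seed)
    n = len(y)
    coefs = np.empty((n_boot, X.shape[1]))
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)  # n row indices, drawn with replacement
        coefs[b] = LogisticRegression().fit(X[idx], y[idx]).coef_.ravel()
    # Bootstrap point estimate and standard error for each coefficient.
    return coefs.mean(axis=0), coefs.std(axis=0, ddof=1)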
Conditional logistic regression is often used to study the relationship between event outcomes and specific prognostic factors, with a view to applying logistic regression and its predictive capabilities in environmental studies. This research seeks to demonstrate a novel approach to implementing conditional logistic regression in environmental research through inference methods based on longitudinal data. Statistical analysis of longitudinal data requires methods that properly account for the within-subject interdependence of the response measurements: if this correlation is ignored, inferences such as statistical tests and confidence intervals can be largely invalid.
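As an illustration of accounting for within-subject correlation in longitudinal binary outcomes, a sketch using generalized estimating equations (GEE) with an exchangeable working correlation follows; GEE is one standard approach to this problem, not necessarily the method of the paper, and the data below are synthetic:

import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical longitudinal data: 50 subjects, 4 repeated binary measurements each.
rng = np.random.default_rng(0)
n_subj, n_rep = 50, 4
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n_subj), n_rep),
    "x1": rng.normal(size=n_subj * n_rep),
})
df["event"] = (rng.random(len(df)) < 1 / (1 + np.exp(-df["x1"]))).astype(int)

# An exchangeable working correlation models the within-subject dependence
# that an ordinary logistic regression would ignore.
X = sm.add_constant(df[["x1"]])
result = sm.GEE(df["event"], X, groups=df["subject"],
                family=sm.families.Binomial(),
                cov_struct=sm.cov_struct.Exchangeable()).fit()
print(result.summary())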
Human Interactive Proofs (HIPs) are automatic inverse Turing tests intended to differentiate between people and malicious computer programs. Building a good HIP system is a challenging task, since the resulting HIP must be secure against attacks and at the same time practical for humans. Text-based HIPs are one of the most popular HIP types; they exploit the superior ability of humans, compared with Optical Character Recognition (OCR), to read text images. However, current text-based HIPs have not kept pace with the rapid development of computer vision techniques: they are either very easily passed or very hard to solve, and this motivates…
This study represents an attempt to develop a model that demonstrates the relationship between HRM practices, governmental support, and the organizational performance of small businesses. Furthermore, this study attempts to unfold the so-called "Black Box" and clarify the ambiguous relationship between HRM practices and organizational performance by considering the pathway of logical sequential influence. The model of this study consists of two parts: the first part is devoted to examining the causal relationships among HRM practices, employees' outcomes, and organizational performance; the second part assesses the direct relationship between governmental support and organizational performance. It is hypothesized that HRM practices positively influ…
In this paper, we discuss the performance of Bayesian computational approaches for estimating the parameters of a logistic regression model, with Markov Chain Monte Carlo (MCMC) algorithms as the base estimation procedure. We present two algorithms, Random Walk Metropolis (RWM) and Hamiltonian Monte Carlo (HMC), and apply both to a real data set.
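A minimal sketch of a Random Walk Metropolis sampler for the logistic regression posterior (a flat prior is assumed here, and the step size and iteration count are illustrative, not the paper's settings):

import numpy as np

def log_posterior(beta, X, y):
    # Log-likelihood of logistic regression under a flat prior.
    eta = X @ beta
    return np.sum(y * eta - np.logaddexp(0.0, eta))

def rwm(X, y, n_iter=5000, step=0.1, seed=0):
    rng = np.random.default_rng(seed)
    beta = np.zeros(X.shape[1])
    lp = log_posterior(beta, X, y)
    draws = np.empty((n_iter, beta.size))
    for t in range(n_iter):
        prop = beta + step * rng.normal(size=beta.size)  # Gaussian random-walk proposal
        lp_prop = log_posterior(prop, X, y)
        if np.log(rng.random()) < lp_prop - lp:          # Metropolis accept/reject step
            beta, lp = prop, lp_prop
        draws[t] = beta
    return draws  # posterior draws; discard an initial burn-in before summarizing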
The problem of missing data represents a major obstacle for researchers in data analysis, since it recurs in all fields of study, including social, medical, astronomical, and clinical experiments.
The presence of such a problem in the data under study may negatively affect the analysis and lead to misleading conclusions, since those conclusions carry the large bias the problem causes. Despite the efficiency of wavelet methods, they too are affected by missing data, in addition to the accompanying loss of estimation accuracy…
Artificial intelligence algorithms have been used in many scientific fields in recent years. We suggest employing the flower pollination algorithm in the environmental field to find the best estimate of the semi-parametric regression function with measurement errors in both the explanatory variables and the dependent variable; such measurement errors, rather than exact measurements, appear frequently in fields such as chemistry, the biological sciences, medicine, and epidemiological studies. We estimate the regression function of the semi-parametric model by estimating its parametric and non-parametric components; the parametric component is estimated using instrumental-variables methods (the Wald method, Bartlett's method, and Durbin's method)…
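As background, a minimal sketch of the flower pollination algorithm itself (Yang's formulation: global Lévy-flight pollination toward the best solution, plus local mixing), applied to a generic objective function; in the study the objective would be the semi-parametric estimation criterion, which is not reproduced here:

import numpy as np
from math import gamma, pi, sin

def levy(dim, beta=1.5, rng=None):
    # Levy-stable step via Mantegna's algorithm.
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

def fpa(objective, dim, n_flowers=25, n_iter=500, p=0.8, lb=-5.0, ub=5.0, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.uniform(lb, ub, (n_flowers, dim))
    fit = np.array([objective(x) for x in pop])
    i_best = fit.argmin()
    best, best_fit = pop[i_best].copy(), fit[i_best]
    for _ in range(n_iter):
        for i in range(n_flowers):
            if rng.random() < p:
                # Global pollination: Levy flight toward the current best flower.
                cand = pop[i] + levy(dim, rng=rng) * (best - pop[i])
            else:
                # Local pollination: random mix of two other flowers.
                j, k = rng.choice(n_flowers, size=2, replace=False)
                cand = pop[i] + rng.random() * (pop[j] - pop[k])
            cand = np.clip(cand, lb, ub)
            f = objective(cand)
            if f < fit[i]:  # keep the candidate only if it improves
                pop[i], fit[i] = cand, f
                if f < best_fit:
                    best, best_fit = cand.copy(), f
    return best, best_fit

# e.g. fpa(lambda x: float(np.sum(x ** 2)), dim=5) should approach the zero vector.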
Information systems and data exchange between government institutions are growing rapidly around the world, and with them the threats to information within government departments. In recent years, research into the development and construction of secure information systems in government institutions has been very active. Based on information-system principles, this study proposes a model for providing and evaluating security across all departments of government institutions. The requirements of any information system begin with the organization's environment and objectives. Most prior techniques did not take into account the organizational component on which the information system runs, despite the relevance of…
Permeability data is of major importance in all reservoir simulation studies. Its importance increases in mature oil and gas fields because of its sensitivity to the requirements of certain improved-recovery methods. However, the industry holds a huge store of air permeability measurements against only a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study suggests a correlation for converting the air permeability data conventionally measured during laboratory core analysis into liquid permeability. This correlation provides a feasible estimate in cases of data loss, for loose and poorly consolidated formations, or in cases…
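The paper's own correlation is not reproduced in this excerpt; as background, the classical Klinkenberg (1941) relation between gas-measured and equivalent liquid permeability, which such correlations typically refine, has the form

\[
k_g \;=\; k_l\left(1 + \frac{b}{\bar{p}}\right) \quad\Longrightarrow\quad k_l \;=\; \frac{k_g}{1 + b/\bar{p}},
\]

where $k_g$ is the air permeability measured at mean pore pressure $\bar{p}$, $k_l$ is the equivalent liquid permeability, and $b$ is the gas-slippage factor.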