Sparse Bayesian binary logistic regression using the split-and-augmented Gibbs sampler


Academic year: 2021


Figures and tables

Table 1. Datasets considered in the experiments.
Table 3. Average number of iterations and computational time over the cross-validation procedure for the different algorithms (for the binary classification problem associated with MNIST only).
Fig. 2. MNIST one-vs-all experiment: example of 8 handwritten digits identified as possibly misclassified by SPA (under 90% credibility intervals).
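The Fig. 2 caption refers to digits flagged as possibly misclassified on the basis of 90% credibility intervals. Below is a minimal sketch of how such a check could be carried out once posterior draws of the coefficient vector are available (e.g., from an MCMC sampler such as the split-and-augmented Gibbs sampler); the synthetic data and the rule of flagging points whose 90% credible interval on the class probability straddles the 0.5 decision threshold are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: 100 test images with 784 features (MNIST-sized) and
# 500 posterior draws of the coefficient vector. In practice the draws would
# come from the sampler, and X_test would hold the actual test digits.
n_test, n_features, n_draws = 100, 784, 500
X_test = rng.normal(size=(n_test, n_features))
beta_draws = rng.normal(scale=0.05, size=(n_draws, n_features))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Posterior samples of P(y = 1 | x) for every test point: shape (n_test, n_draws).
prob_draws = sigmoid(X_test @ beta_draws.T)

# 90% credible interval of the class probability for each test point.
lower = np.percentile(prob_draws, 5, axis=1)
upper = np.percentile(prob_draws, 95, axis=1)

# Hypothetical flagging rule: a digit is "possibly misclassified" when its
# 90% interval straddles the 0.5 decision threshold, i.e. the posterior does
# not clearly favour either class.
ambiguous = (lower < 0.5) & (upper > 0.5)
print(f"{ambiguous.sum()} of {n_test} test points flagged as ambiguous")
```

Points whose interval lies entirely on one side of 0.5 are classified with high posterior confidence; the flagged ones are the natural candidates for inspection, in the spirit of Fig. 2.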

Related documents

Ridge logistic regression has been used successfully in text categorization problems and has been shown to reach the same performance as the Support Vector Machine but with …

The non-linear Bayesian kernel regression can therefore be considered as being performed online by a Sigma Point Kalman Filter. First experiments on a cardinal sine regression show …

But model selection procedures based on penalized maximum likelihood estimators in logistic regression are less studied in the literature. In this paper we focus on model …

The idea of logistic regression consists in replacing the binary loss with another, similar loss function which is convex in … (see the sketch after this list).

Keywords: linear regression, Bayesian inference, generalized least-squares, ridge regression, LASSO, model selection criterion, …

To this end we propose a parsimonious and adaptive decomposition of the coefficient function as a step function, and a model including a prior distribution that we name …

The use of statistical quality control on the basis of operating characteristics would therefore be greatly promoted if it were possible to include, for the required number …

The classification process through logistic regression measures the relationship between the categorical dependent variable (tweet label) and a fixed-length word …
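Two of the excerpts above touch on mechanics worth illustrating: ridge (L2-penalized) logistic regression and the replacement of the binary 0-1 loss by a convex surrogate. The sketch below combines the two, minimizing the convex logistic loss plus a ridge penalty with a gradient-based optimizer; the toy data, the penalty weight lam, and the use of scipy.optimize.minimize are assumptions chosen for illustration and are not taken from any of the documents listed.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Toy data with labels in {-1, +1}, the convenient encoding for the logistic loss.
n, d = 200, 20
X = rng.normal(size=(n, d))
beta_true = np.zeros(d)
beta_true[:3] = [2.0, -1.5, 1.0]
y = np.where(rng.random(n) < 1.0 / (1.0 + np.exp(-X @ beta_true)), 1.0, -1.0)

lam = 1.0  # ridge penalty weight (hypothetical value)

def objective(beta):
    # Convex surrogate: sum_i log(1 + exp(-y_i * x_i^T beta)) + (lam / 2) * ||beta||^2.
    margins = y * (X @ beta)
    return np.sum(np.logaddexp(0.0, -margins)) + 0.5 * lam * beta @ beta

def gradient(beta):
    margins = y * (X @ beta)
    # d/dbeta log(1 + exp(-m_i)) = -y_i * x_i / (1 + exp(m_i))
    weights = -y / (1.0 + np.exp(margins))
    return X.T @ weights + lam * beta

result = minimize(objective, np.zeros(d), jac=gradient, method="L-BFGS-B")
beta_hat = result.x
print(f"training accuracy: {np.mean(np.sign(X @ beta_hat) == y):.2f}")
```

Unlike the 0-1 loss, this objective is convex and differentiable, so a standard quasi-Newton solver reaches the global minimum (up to tolerance); the ridge term keeps the coefficients bounded when the classes are nearly separable.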