
Forward and Backward Feature Selection for Query Performance Prediction


Academic year: 2021


Full text


Figure 1: The first four steps of AIC stepwise model selection when starting with 8 variables.
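
The figure refers to AIC-driven stepwise selection. As a minimal illustrative sketch only, here is one way the forward direction could look in Python with statsmodels, assuming a hypothetical feature matrix X (a pandas DataFrame of candidate predictors) and a target y of ground-truth query effectiveness; the function name forward_aic and the use of ordinary least squares are assumptions for illustration, not the paper's actual implementation. The backward direction mirrors it: start from the full variable set and, at each step, drop the variable whose removal lowers AIC the most.

import numpy as np
import pandas as pd
import statsmodels.api as sm

def forward_aic(X: pd.DataFrame, y: pd.Series) -> list:
    """Greedily add the variable that lowers AIC the most; stop when none does."""
    selected, remaining = [], list(X.columns)
    # AIC of the intercept-only model is the starting baseline.
    best_aic = sm.OLS(y, np.ones((len(y), 1))).fit().aic
    while remaining:
        # Fit one candidate model per remaining variable and keep the best AIC.
        trials = [(sm.OLS(y, sm.add_constant(X[selected + [c]])).fit().aic, c)
                  for c in remaining]
        aic, col = min(trials)
        if aic >= best_aic:
            break  # no remaining candidate improves AIC
        best_aic = aic
        selected.append(col)
        remaining.remove(col)
    return selected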
Table 2: Correlation of predicted and ground-truth effectiveness for scores with Pearson (r) and for ranks with Spearman (ρ).
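
The kind of evaluation reported in Table 2 can be computed with SciPy. The sketch below is illustrative only: the arrays predicted and ground_truth stand for hypothetical per-query effectiveness values (e.g. predicted versus measured average precision) and the numbers are made up for the example, not taken from the paper.

from scipy.stats import pearsonr, spearmanr

predicted = [0.31, 0.42, 0.18, 0.55, 0.27]     # hypothetical predicted effectiveness per query
ground_truth = [0.28, 0.47, 0.20, 0.50, 0.33]  # hypothetical measured effectiveness per query

r, _ = pearsonr(predicted, ground_truth)     # linear agreement on raw scores
rho, _ = spearmanr(predicted, ground_truth)  # agreement on the induced query ranking
print(f"Pearson r = {r:.3f}  Spearman rho = {rho:.3f}")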

