How can you get the feature importance of a multi-label classification problem?

I would say that the solution in the referenced post is not outdated; rather, you have a slightly different setting to take care of.

The estimator that you’re passing to OneVsRestClassifier is a Pipeline; in the referenced post it was a RandomForestClassifier directly.

Therefore, you'll have to access one of the pipeline's steps to get to the RandomForestClassifier instance, on which you'll finally be able to access the feature_importances_ attribute. That's one way of proceeding:
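For instance, something along these lines (a sketch, assuming the pipeline was built with make_pipeline, which auto-names the step "randomforestclassifier" after the lowercased class name):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.multiclass import OneVsRestClassifier
from sklearn.pipeline import make_pipeline

X, y = load_iris(return_X_y=True)

clf = OneVsRestClassifier(
    make_pipeline(RandomForestClassifier(random_state=0))
).fit(X, y)

# OneVsRestClassifier fits one pipeline per class; reach into each
# pipeline's named_steps to get the forest and its importances
for est in clf.estimators_:
    rf = est.named_steps["randomforestclassifier"]  # name assigned by make_pipeline
    print(rf.feature_importances_)
```

If you had built the pipeline with Pipeline([("rf", RandomForestClassifier())]) instead, you'd use the step name you chose ("rf") rather than the auto-generated one.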


Finally, be aware that you'll have to fit your OneVsRestClassifier instance to be able to access its estimators_ attribute. Indeed, although cross_val_predict already takes care of fitting the estimator, as you might see here, it fits internal clones and does not return the fitted estimator instance, as the .fit() method does. Therefore, outside of cross_val_predict the classifier is never actually fit, which is why you're not able to access the estimators_ attribute.

Here is a toy example:

from sklearn import datasets
from sklearn.ensemble import RandomForestClassifier
from sklearn.multiclass import OneVsRestClassifier
from sklearn.model_selection import train_test_split, cross_val_predict
from sklearn.pipeline import make_pipeline

iris = datasets.load_iris()
X = iris.data
y = iris.target

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

classifier = OneVsRestClassifier(
    make_pipeline(RandomForestClassifier())
).fit(X_train, y_train)
y_train_pred = cross_val_predict(classifier, X_train, y_train, cv=3)
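The point about fitting can be checked directly: after cross_val_predict alone, the original object has no estimators_ attribute, because only internal clones were ever fit. A minimal sketch (hypothetical variable names):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.multiclass import OneVsRestClassifier
from sklearn.model_selection import cross_val_predict

X, y = load_iris(return_X_y=True)
clf = OneVsRestClassifier(RandomForestClassifier(random_state=0))

# cross_val_predict fits clones of `clf` internally, not `clf` itself
y_pred = cross_val_predict(clf, X, y, cv=3)
print(hasattr(clf, "estimators_"))  # → False: `clf` is still unfitted

clf.fit(X, y)
print(hasattr(clf, "estimators_"))  # → True: the attribute exists after .fit()
```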

