The relevant code is:

```python
def _prediction_feature_weights(booster, dmatrix, n_targets,
                                feature_names, xgb_feature_names):
    """ For each target, return score and numpy array with feature weights
    on this prediction, following an idea from
    http://blog.datadive.net/interpreting-random-forests/
    """
    # XGBClassifier does not have pred_leaf argument, so use booster
    leaf_ids, = booster.predict(dmatrix, pred_leaf=True)
```
`booster.predict(dmatrix, pred_leaf=True)` returns a 1-d array when the xgboost model contains only one tree, so `leaf_ids` ends up a scalar instead of an array of per-tree leaf indices.
This may be modified to:

```python
def _prediction_feature_weights(booster, dmatrix, n_targets,
                                feature_names, xgb_feature_names):
    """ For each target, return score and numpy array with feature weights
    on this prediction, following an idea from
    http://blog.datadive.net/interpreting-random-forests/
    """
    # XGBClassifier does not have pred_leaf argument, so use booster
    leaf_ids, = booster.predict(dmatrix, pred_leaf=True).reshape(1, -1)
```
Description: the XGBClassifier explainer fails when `n_estimators=1` (an extreme case).
E.g.

```python
model_t = XGBClassifier(random_state=1111, max_depth=4, n_estimators=1)
show_prediction(model_t, test_input[tougue_correct_q[0]])  # fails to run
```
The error message is as follows: