[Misc] Improve the readability of BNB error messages (#12320)
Signed-off-by: Jee Jee Li <[email protected]>
jeejeelee authored Jan 22, 2025
1 parent fc66dee commit 84bee4b
Showing 1 changed file with 2 additions and 2 deletions: vllm/model_executor/model_loader/loader.py
@@ -1076,8 +1076,8 @@ def _load_weights(self, model_config: ModelConfig,
         # weight tensor. So TP does not work with pre_quantized bnb models.
         if pre_quant and get_tensor_model_parallel_world_size() > 1:
             raise ValueError(
-                "Prequant BitsAndBytes models with TP is not supported."
-                "Please try with PP.")
+                "Prequant BitsAndBytes models with tensor parallelism is not "
+                "supported. Please try with pipeline parallelism.")
 
         load_8bit = False
         if pre_quant:
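The diff above only rewords the error string; the guard itself is unchanged. As a minimal, self-contained sketch of that guard (the stand-in helper below is an assumption for illustration, not vLLM's actual `get_tensor_model_parallel_world_size` from its distributed module):

```python
# Hypothetical stand-in for vLLM's distributed helper: pretend the tensor
# parallel (TP) world size is 2, i.e. the model is sharded across 2 GPUs.
def get_tensor_model_parallel_world_size() -> int:
    return 2


def check_bnb_parallelism(pre_quant: bool) -> None:
    """Reject pre-quantized bitsandbytes checkpoints when TP > 1.

    Pre-quantized BNB weights cannot be split across tensor-parallel
    ranks, so the loader fails fast with a readable message instead.
    """
    if pre_quant and get_tensor_model_parallel_world_size() > 1:
        raise ValueError(
            "Prequant BitsAndBytes models with tensor parallelism is not "
            "supported. Please try with pipeline parallelism.")


if __name__ == "__main__":
    try:
        check_bnb_parallelism(pre_quant=True)
    except ValueError as e:
        print(e)
```

The point of the commit is exactly this readability fix: spelling out "tensor parallelism" and "pipeline parallelism" instead of the abbreviations "TP" and "PP", and adding the missing space between the two concatenated string literals.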
