[Documentation][AMD] Add information about prebuilt ROCm vLLM docker for perf validation purpose (#12281)

Signed-off-by: Hongxia Yang <[email protected]>
hongxiayang authored Jan 21, 2025
1 parent 69196a9 commit 09ccc9c
Showing 1 changed file with 8 additions and 0 deletions.
8 changes: 8 additions & 0 deletions docs/source/getting_started/installation/gpu/rocm.inc.md
@@ -13,6 +13,14 @@ vLLM supports AMD GPUs with ROCm 6.2.

Currently, there are no pre-built ROCm wheels.

However, the [AMD Infinity hub for vLLM](https://hub.docker.com/r/rocm/vllm/tags) offers a prebuilt, optimized
Docker image designed for validating inference performance on the AMD Instinct™ MI300X accelerator.

```{tip}
Please check [LLM inference performance validation on AMD Instinct MI300X](https://rocm.docs.amd.com/en/latest/how-to/performance-validation/mi300x/vllm-benchmark.html)
for instructions on how to use this prebuilt Docker image.
```
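As a quick sketch of how the prebuilt image might be used (the `latest` tag and the device/permission flags below are assumptions based on common ROCm container usage — consult the Docker Hub tags page and the linked validation guide for the exact invocation):

```shell
# Pull the prebuilt ROCm vLLM image; the tag is an assumption, check
# https://hub.docker.com/r/rocm/vllm/tags for currently published tags.
docker pull rocm/vllm:latest

# Launch an interactive container with GPU access on a ROCm host.
# The --device and --group-add flags expose the AMD GPU devices to the
# container and may need adjusting for your environment.
docker run -it --rm \
  --network=host \
  --ipc=host \
  --device=/dev/kfd \
  --device=/dev/dri \
  --group-add video \
  rocm/vllm:latest
```

From inside the container, the benchmark steps in the linked guide can then be run directly against the bundled vLLM installation.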

### Build wheel from source

0. Install prerequisites (skip if you are already in an environment/docker with the following installed):
