
use_existing_torch.py: filter out comments #66

Merged: 2 commits into main from tpa-fix-existing-torch on Jan 21, 2025

Conversation

@tdoublep (Member) commented on Jan 21, 2025

See upstream PR: vllm-project/vllm#12255

I'd like to merge the fix here ASAP though, because it is breaking our images.
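
For context, here is a minimal sketch of the kind of change #66 makes. This is not the actual vllm-project/vllm script; the file handling and names are assumptions, and only the comment check illustrates the fix:

```python
# Hypothetical sketch of the use_existing_torch.py filtering loop.
# The real upstream script may differ; the point is the extra
# startswith("#") check, which keeps comment lines that mention torch.
import glob

for path in glob.glob("requirements*.txt"):
    with open(path) as f:
        lines = f.readlines()
    with open(path, "w") as f:
        for line in lines:
            # Keep comments even when they contain "torch"; drop any
            # other line that mentions torch, so an existing torch
            # install is reused instead of being pinned again.
            if line.strip().startswith("#") or "torch" not in line.lower():
                f.write(line)
```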

👋 Hi! Thank you for contributing to the vLLM project.
Just a reminder: PRs do not trigger a full CI run by default. Instead, only the fastcheck CI runs, covering a small, essential subset of tests to catch errors quickly. You can run additional CI tests on top of those by going to your fastcheck build in the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to add you to our Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run full CI, PR reviewers can do one of the following:

  • Add the ready label to the PR
  • Enable auto-merge.

🚀

Signed-off-by: Thomas Parnell <[email protected]>
@jvlunteren (Contributor) left a comment

Looks good to me.

@tdoublep merged commit 30177f1 into main on Jan 21, 2025 (10 checks passed)
@tdoublep deleted the tpa-fix-existing-torch branch on January 21, 2025 at 09:21
tdoublep added a commit that referenced this pull request Jan 21, 2025
…ncies (#68)

I already tried to fix this using #66, but upstream didn't like that change (the behaviour of filtering out comments containing torch was intentional). After [some discussion](vllm-project/vllm#12255), we agreed on a different solution, implemented in this PR. Note that I reverted the changes from #66 by force-pushing main.

Note that this has already been merged upstream via vllm-project/vllm#12260, but I'm cherry-picking the fix here since it is blocking the CI builds.
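
To make the intentional upstream behaviour concrete: the filter drops any line containing the substring "torch", comments included, which is why a fix that special-cases comments was rejected. A self-contained illustration (the requirements content below is invented for this example):

```python
# Illustration of the upstream filtering behaviour: every line that
# mentions "torch" is removed, whether it is a dependency pin, a
# comment, or an unrelated dependency whose inline comment says torch.
# (The requirements content here is invented for the example.)
requirements = """\
numpy
# CUDA wheels of torch come from a separate index
torch==2.5.1
setuptools>=61  # needed to build torch extensions
""".splitlines()

kept = [line for line in requirements if "torch" not in line.lower()]
print("\n".join(kept))  # only "numpy" survives
```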