Undefined reference to `MPI::Comm::Comm()' #261
Comments
What Docker image do you use?
I don't use Docker. If it is really necessary, could you recommend an image? :)
Docker is not necessary, but it is helpful for setting up the environment. I guess the problem you encountered is that MPI is not installed, or the Makefile cannot find MPI. We have recommended some Docker images in the guides for the different models; you can choose one according to your requirements.
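To check whether an MPI toolchain is installed and discoverable at all, a minimal sketch like the following can be compiled with the MPI compiler wrapper. The file name and commands here are assumptions for illustration, not taken from the repo:

```cpp
// mpi_check.cpp -- hypothetical file name; only verifies that MPI headers
// and libraries can be found and that a basic MPI program links and runs.
#include <mpi.h>
#include <cstdio>

int main(int argc, char** argv) {
    MPI_Init(&argc, &argv);                   // C API, present in every MPI build
    int rank = 0, size = 1;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);     // this process's rank
    MPI_Comm_size(MPI_COMM_WORLD, &size);     // total number of launched processes
    std::printf("MPI OK: rank %d of %d\n", rank, size);
    MPI_Finalize();
    return 0;
}
// Assumed commands: mpicxx mpi_check.cpp -o mpi_check && mpirun -np 2 ./mpi_check
```

If this compiles and runs, MPI itself is present and the build failure is more likely a linking or configuration issue than a missing package.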
Thank you. I'll try to run it in Docker.
I got this error when I used the Docker image from the tutorial https://github.com/NVIDIA/FasterTransformer/blob/main/docs/gptj_guide.md
OK, your driver and GPU are both too old to use that Docker image. Try to build the repo in your previous environment, but add
@byshiue I need to run GPT-J inference with two Tesla K80s. Is this even possible?
You can try the suggestion in issue #69.
Besides, you can try to add
The code compiled with the Docker image. I will try running GPT-J.
Closing this bug because it is inactive. Feel free to re-open this issue if you still have any problems.
I had the same problem in a non-Docker environment too. Adding
Got the same problem; I installed openmpi==4.0.2 from Anaconda.
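For reference, `MPI::Comm::Comm()` is a symbol from the old MPI C++ bindings, so this undefined reference can also appear when the installed MPI library (for example, a conda openmpi build) was compiled without those bindings. A minimal sketch that exercises them, assuming the `mpicxx` wrapper and a hypothetical file name:

```cpp
// cxx_bindings_check.cpp -- hypothetical file name. MPI::Comm::Comm() belongs to
// the deprecated MPI C++ bindings; if the installed MPI was built without them,
// linking code that uses MPI:: symbols fails with exactly this undefined reference.
#include <mpi.h>
#include <iostream>

int main(int argc, char** argv) {
    MPI::Init(argc, argv);                      // C++ bindings entry point
    int rank = MPI::COMM_WORLD.Get_rank();      // MPI::COMM_WORLD is an MPI::Intracomm,
                                                // derived from MPI::Comm
    std::cout << "rank " << rank << std::endl;
    MPI::Finalize();
    return 0;
}
// Assumed command: mpicxx cxx_bindings_check.cpp -o cxx_bindings_check
// A link failure here ("undefined reference to MPI::Comm::Comm()") would point at
// the MPI installation itself rather than at FasterTransformer.
```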
Description
Tesla K80. CUDA 11.3. cuDNN 8.2.
Reproduced Steps