diff --git a/docs/docs/examples/chatbox.md b/docs/docs/examples/chatbox.md
index 963cfe006..2ea69e0d9 100644
--- a/docs/docs/examples/chatbox.md
+++ b/docs/docs/examples/chatbox.md
@@ -2,10 +2,50 @@
 title: Nitro with Chatbox
 ---
-:::info COMING SOON
-:::
+This guide demonstrates how to integrate Nitro with Chatbox, showcasing the compatibility of Nitro with various platforms.
- 
\ No newline at end of file
+For more information, please visit the [Chatbox official GitHub page](https://github.com/Bin-Huang/chatbox).
+
+
+## Downloading and Installing Chatbox
+
+To download and install Chatbox, follow the instructions at this [link](https://github.com/Bin-Huang/chatbox#download).
+
+## Using Nitro as a Backend
+
+1. Start the Nitro server
+
+Open your command line tool and enter:
+```
+nitro
+```
+
+> Ensure you are using the latest version of [Nitro](new/install.md).
+
+2. Run the Model
+
+To load the model, use the following command:
+
+```
+curl http://localhost:3928/inferences/llamacpp/loadmodel \
+  -H 'Content-Type: application/json' \
+  -d '{
+    "llama_model_path": "model/llama-2-7b-chat.Q5_K_M.gguf",
+    "ctx_len": 512,
+    "ngl": 100
+  }'
+```
+
+3. Configure Chatbox
+
+Adjust the settings in Chatbox to connect to Nitro. Change your settings to match the configuration shown in the image below:
+
+![Settings](img/chatbox.PNG)
+
+4. Chat with the Model
+
+Once the setup is complete, you can start chatting with the model using Chatbox. All Chatbox features are now available with Nitro as the backend.
+
+## Video demo
\ No newline at end of file
diff --git a/docs/docs/examples/img/chatbox.PNG b/docs/docs/examples/img/chatbox.PNG
new file mode 100644
index 000000000..eeae517b6
Binary files /dev/null and b/docs/docs/examples/img/chatbox.PNG differ
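
As a quick sanity check between loading the model and configuring Chatbox, you can confirm that the model answers over Nitro's OpenAI-compatible chat endpoint. The sketch below assumes Nitro's default port (`3928`) and the `/v1/chat/completions` route with a standard OpenAI-style message payload; adjust it if your setup differs.

```
# Sanity-check sketch: ask the loaded model a question directly.
# Assumes Nitro's default port and its OpenAI-compatible chat route.
curl http://localhost:3928/v1/chat/completions \
  -H 'Content-Type: application/json' \
  -d '{
    "messages": [
      {"role": "user", "content": "Hello, are you up and running?"}
    ]
  }'
```

If this returns a chat completion, Chatbox only needs to be pointed at the same host and port in its settings.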