From bfa8ac99240430b63145511b323bd5be1a31c6ca Mon Sep 17 00:00:00 2001
From: Andrew Zigler
Date: Wed, 18 Sep 2024 12:36:41 -0700
Subject: [PATCH 1/3] Improve Copilot + local AI setup instructions

---
 source/configure/enable-copilot.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/source/configure/enable-copilot.rst b/source/configure/enable-copilot.rst
index 6ad7553870f..07d8bd56671 100644
--- a/source/configure/enable-copilot.rst
+++ b/source/configure/enable-copilot.rst
@@ -82,7 +82,7 @@ Configure a large language model (LLM) for your Copilot integration by going to
 
 1. Deploy your model, for example, on `Ollama `_.
 2. Select **OpenAI Compatible** in the **AI Service** dropdown.
-3. Enter the URL to your AI service from your Mattermost deployment in the **API URL** field.
+3. Enter the URL to your AI service from your Mattermost deployment in the **API URL** field. Be sure to include the port, and append `/v1` to the end of the URL. (e.g., `http://localhost:11434/v1` for Ollama)
 4. If using Ollama, leave the **API Key** field blank.
 5. Specify your model name in the **Default Model** field.

From 1793fae296e58e618bc0e38399d1ec2c54553050 Mon Sep 17 00:00:00 2001
From: "Carrie Warner (Mattermost)" <74422101+cwarnermm@users.noreply.github.com>
Date: Wed, 2 Oct 2024 16:09:19 -0400
Subject: [PATCH 2/3] Update source/configure/enable-copilot.rst

---
 source/configure/enable-copilot.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/source/configure/enable-copilot.rst b/source/configure/enable-copilot.rst
index 07d8bd56671..e55095dfc3f 100644
--- a/source/configure/enable-copilot.rst
+++ b/source/configure/enable-copilot.rst
@@ -82,7 +82,7 @@ Configure a large language model (LLM) for your Copilot integration by going to
 
 1. Deploy your model, for example, on `Ollama `_.
 2. Select **OpenAI Compatible** in the **AI Service** dropdown.
-3. Enter the URL to your AI service from your Mattermost deployment in the **API URL** field. Be sure to include the port, and append `/v1` to the end of the URL. (e.g., `http://localhost:11434/v1` for Ollama)
+3. Enter the URL to your AI service from your Mattermost deployment in the **API URL** field. Be sure to include the port, and append ``/v1`` to the end of the URL. (e.g., ``http://localhost:11434/v1`` for Ollama)
 4. If using Ollama, leave the **API Key** field blank.
 5. Specify your model name in the **Default Model** field.

From 07ab3dc64652430f97107269a7c61c56a38ffdc3 Mon Sep 17 00:00:00 2001
From: Andrew Zigler
Date: Wed, 9 Oct 2024 12:03:31 -0700
Subject: [PATCH 3/3] 📝 (academy-copilot-setup.rst): Add new file to include
 Mattermost Academy badge 📝 (enable-copilot.rst): Include Mattermost Academy
 badge and update API URL instructions

The new file academy-copilot-setup.rst was added to include a Mattermost
Academy badge that links to a tutorial on setting up and configuring
Mattermost Copilot with multiple LLMs. This badge was then included in
enable-copilot.rst to provide users with a direct link to the tutorial.
The instructions for entering the API URL were also updated to clarify the
URL format for services other than Ollama.

MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
---
 source/_static/badges/academy-copilot-setup.rst | 12 ++++++++++++
 source/configure/enable-copilot.rst             |  5 ++++-
 2 files changed, 16 insertions(+), 1 deletion(-)
 create mode 100644 source/_static/badges/academy-copilot-setup.rst

diff --git a/source/_static/badges/academy-copilot-setup.rst b/source/_static/badges/academy-copilot-setup.rst
new file mode 100644
index 00000000000..af6d89857a5
--- /dev/null
+++ b/source/_static/badges/academy-copilot-setup.rst
@@ -0,0 +1,12 @@
+:orphan:
+:nosearch:
+
+.. raw:: html
+
+
+
+   Mattermost Academy
+   Learn about setting up and configuring Mattermost Copilot with multiple LLMs
+
+
+
\ No newline at end of file
diff --git a/source/configure/enable-copilot.rst b/source/configure/enable-copilot.rst
index e55095dfc3f..ecbaa4b31aa 100644
--- a/source/configure/enable-copilot.rst
+++ b/source/configure/enable-copilot.rst
@@ -6,6 +6,9 @@ Enable Copilot
 
 Significantly increase team productivity and decision-making speed by enhancing your real-time collaboration capabilities with instant access to AI-generated information, discussion summaries, and contextually aware action recommendations with Mattermost's Copilot. Your users can interact with AI capabilities directly within their daily communication channels without needing to switch between multiple tools or platforms.
 
+.. include:: ../_static/badges/academy-copilot-setup.rst
+  :start-after: :nosearch:
+
 Setup
 ------
 
@@ -82,7 +85,7 @@ Configure a large language model (LLM) for your Copilot integration by going to
 
 1. Deploy your model, for example, on `Ollama `_.
 2. Select **OpenAI Compatible** in the **AI Service** dropdown.
-3. Enter the URL to your AI service from your Mattermost deployment in the **API URL** field. Be sure to include the port, and append ``/v1`` to the end of the URL. (e.g., ``http://localhost:11434/v1`` for Ollama)
+3. Enter the URL to your AI service from your Mattermost deployment in the **API URL** field. Be sure to include the port. If using Ollama, append ``/v1`` to the end of the URL (e.g., ``http://localhost:11434/v1``); otherwise, use the service's base URL (e.g., ``http://localhost:11434/``).
 4. If using Ollama, leave the **API Key** field blank.
 5. Specify your model name in the **Default Model** field.
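
The API URL rule these patches converge on (include the port; append ``/v1`` only for an Ollama-style OpenAI-compatible endpoint) can be sketched as a small helper. The function name, parameters, and defaults below are hypothetical illustrations, not part of Mattermost or the Copilot plugin:

```python
# Hypothetical helper sketching the API URL format described in the patches:
# always include the port, and append /v1 only when the service is an
# Ollama-style OpenAI-compatible endpoint. Not part of any Mattermost API.
def copilot_api_url(host: str, port: int, ollama: bool = True) -> str:
    base = f"http://{host}:{port}"
    return f"{base}/v1" if ollama else f"{base}/"

print(copilot_api_url("localhost", 11434))                # http://localhost:11434/v1
print(copilot_api_url("localhost", 11434, ollama=False))  # http://localhost:11434/
```

The value returned for the Ollama case matches the example given in the patched step 3; the non-Ollama case reflects patch 3's clarification that other services take the base URL without the ``/v1`` suffix.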