new docs
n4ze3m committed Oct 20, 2023
1 parent e88c19a commit c42c7f0
Showing 2 changed files with 45 additions and 2 deletions.
8 changes: 6 additions & 2 deletions docs/.vitepress/config.ts
@@ -44,8 +44,8 @@ export default defineConfig({
       { text: "Home", link: "/" },
       { text: "Guide", link: "/guide/what-is-dialoqbase" },
       {
-        text: "AI Providers",
-        link: "/guide/ai-providers/openai",
+        text: "Use Local AI Models",
+        link: "/guide/localai-model",
       },
       { text: "Self Hosting", link: "/guide/self-hosting" },
     ],
@@ -132,6 +132,10 @@ export default defineConfig({
           text: "AI Providers",
           collapsed: false,
           items: [
+            {
+              text: "Use Local AI Models",
+              link: "/guide/localai-model",
+            },
             {
               text: "OpenAI",
               link: "/guide/ai-providers/openai",
39 changes: 39 additions & 0 deletions docs/guide/localai-model.md
@@ -0,0 +1,39 @@
# Support for Custom AI Models

## Introduction

Starting from version `1.2.0`, our platform supports local AI models that are compatible with the OpenAI API. A notable example of such a local deployment is the [LocalAI](https://github.com/go-skynet/LocalAI/) project.

This guide walks you through the steps needed to integrate a custom model deployed via LocalAI into our system.

## Prerequisites

1. Ensure you've already set up your local AI model using LocalAI or any other platform that offers OpenAI API compatibility.
2. Admin access to our platform is required to add new models.
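
To confirm the first prerequisite, you can probe your endpoint before adding it: any OpenAI-compatible server exposes its model list at `GET /v1/models`. The sketch below only builds the request (the base URL assumes LocalAI's default port on the local machine); uncomment the `urlopen` block to actually send it.

```python
import json
from urllib.request import Request, urlopen

# Assumed base URL: LocalAI's default port on the local machine.
BASE_URL = "http://localhost:8080/v1"

# An OpenAI-compatible server lists its models at GET {base}/models.
req = Request(f"{BASE_URL}/models", method="GET")

# Uncomment to query a running server and print the model list:
# with urlopen(req) as resp:
#     print(json.dumps(json.load(resp), indent=2))

print(req.full_url)  # http://localhost:8080/v1/models
```

If the request returns a JSON list of models, the endpoint is ready to be registered.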

## Integration Steps

### 1. Log in as an Administrator

Make sure you're logged in with administrator rights; the steps below require those permissions.

### 2. Navigate to Model Settings

- Go to `Settings`.
- From the side menu, select `Models`.
- Click on `Add Models`.

### 3. Add Your Local or Custom Model URL

You'll be presented with a form asking for details about the model you wish to integrate:

- **URL**: Here, you need to provide the endpoint of your locally deployed AI model.
- If you're using LocalAI and it's running on its default port (8080), you would enter `http://localhost:8080/v1`.
- **Model Selection**: Choose the model you want to use from your local AI instance.
- **Name**: Give your model a recognizable name.

Click `Save` to finalize adding your custom model.
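
Note that the **URL** field expects the versioned API root (ending in `/v1`), not the bare host. A small sketch of how such input could be normalized; the helper name is hypothetical and not part of the platform:

```python
def normalize_base_url(url: str) -> str:
    """Hypothetical helper: coerce a user-supplied URL into the
    http://host:port/v1 form the URL field expects."""
    url = url.rstrip("/")
    if not url.endswith("/v1"):
        url += "/v1"
    return url

print(normalize_base_url("http://localhost:8080"))     # http://localhost:8080/v1
print(normalize_base_url("http://localhost:8080/v1/")) # http://localhost:8080/v1
```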

## Conclusion

Once integrated, you can use your custom model just like any other model on the platform. This offers greater flexibility, especially in environments that benefit from localized data processing, since no data needs to leave your network.
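
Because the endpoint speaks the OpenAI API, the model can also be called directly through the standard chat-completions route. This sketch only assembles the request (the base URL and model name are placeholders for the values you entered in the form); uncomment the send block to try it against a running instance.

```python
import json
from urllib.request import Request, urlopen

BASE_URL = "http://localhost:8080/v1"  # the URL entered in the form
MODEL = "my-local-model"               # placeholder: the model name you saved

# Standard OpenAI-style chat completion payload.
payload = {
    "model": MODEL,
    "messages": [{"role": "user", "content": "Hello!"}],
}

req = Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Uncomment to send the request to a running local instance:
# with urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])

print(req.full_url)
```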
