An open-source code interpreter for terminal users
Prerequisites
- Python >= 3.10
- pip
- Docker (required only for the Docker installation)
First, install Octopus with the octopus_up script, which guides you through the setup process, including choosing the model API service, the installation directory, and the kernel workspace directory.
curl --proto '=https' --tlsv1.2 -sSf https://up.dbpunk.xyz | sh
To install Octopus with Docker, you must have Docker installed on your local machine. Octopus uses Docker Compose to manage the kernel and agent. The octopus_up script initializes the Octopus CLI with the API key generated by the agent.
octopus_up docker-local
If you install Octopus without Docker, the kernel and agent are installed directly on your host. This option is less secure and should only be used for testing or development.
octopus_up local
Open your terminal and run the command octopus; you will see the following output:
Welcome to use octopus❤️ . To ask a programming question, simply type your question and press esc + enter
You can use /help to look for help
[1]🎧>
- Octopus Kernel: The code execution engine, based on notebook kernels.
- Octopus Agent: Manages client requests, uses ReAct to process complex tasks (see the sketch below), and stores user-assembled applications.
- Octopus Terminal CLI: Accepts user requests, sends them to the agent, and renders rich results. Currently supports Discord, iTerm2, and Kitty terminals.
For security, it is recommended to run the kernel and agent as Docker containers.
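The agent's ReAct loop is easier to picture with a toy example. The sketch below is not Octopus's actual implementation; it is a minimal, self-contained illustration of the pattern, with a hard-coded stand-in for the model and an in-process stand-in for the notebook kernel:

```python
# Hypothetical illustration only -- not Octopus source code.
import contextlib
import io


def fake_model(question: str, observations: list[str]) -> dict:
    """Stand-in for the LLM call; a real agent would query a model API here."""
    if not observations:
        # First ReAct step: reason, then propose code for the kernel to run.
        return {
            "thought": "I should compute this with Python code.",
            "action": "execute",
            "code": "print(sum(range(1, 101)))",
        }
    # A kernel observation exists, so produce the final answer.
    return {
        "thought": "The kernel output answers the question.",
        "action": "final",
        "answer": observations[-1].strip(),
    }


def run_in_kernel(code: str) -> str:
    """Toy stand-in for a notebook kernel: run Python and capture stdout."""
    buf = io.StringIO()
    with contextlib.redirect_stdout(buf):
        exec(code, {})
    return buf.getvalue()


def react_loop(question: str, max_steps: int = 5) -> str:
    """Alternate between model 'thoughts' and kernel executions until done."""
    observations: list[str] = []
    for _ in range(max_steps):
        step = fake_model(question, observations)
        if step["action"] == "final":
            return step["answer"]
        observations.append(run_in_kernel(step["code"]))
    return "no answer within the step budget"


print(react_loop("What is the sum of 1..100?"))  # prints 5050
```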
- Automatically execute AI-generated code in a Docker environment.
- Experimental: render images in iTerm2 and Kitty.
- Upload files with the `/up` command and reference them in your prompt (see the example after this list).
- Experimental: assemble code blocks into an application and run the code directly with the `/run` command.
- Copy output to the clipboard with the `/cc` command.
- Prompt histories are stored in the Octopus CLI.
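For example, a session that uses an uploaded file in a prompt might look like this (the file name is purely illustrative):
[1]🎧> /up ./report.csv
[2]🎧> summarize the columns of the uploaded file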
If you have any feature suggestions, please open a discussion to talk about them.
- Improve the stability and security of Octopus
- Support an external CodeLlama API service
- Support a memory system
- Enhance the agent's programming capability
- Enhance the kernel's capabilities
- Support GPU acceleration for video processing
If you have any advice on the roadmap, please open a discussion to talk about it.
Demo video: octopus_demo.mp4
Supported LLMs

Name | Status | Note
---|---|---
OpenAI GPT 3.5/4 | ✅ fully supported | See the detailed installation steps
Azure OpenAI GPT 3.5/4 | ✅ fully supported | See the detailed installation steps
llama.cpp server | ✔️ supported | You must start the llama.cpp server yourself (example command below)
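For the llama.cpp option, the server must already be running before Octopus can connect to it. A hypothetical invocation (the binary name, model path, and flags depend on your llama.cpp build):

./server -m ./models/your-model.gguf --host 127.0.0.1 --port 8080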
Supported operating systems

Name | Status | Note
---|---|---
Ubuntu 22.04 | ✅ fully supported | See the detailed installation steps
macOS | ✅ fully supported | See the detailed installation steps
python & bash: requirements.txt typescripts: tslab , tslab install