How to use AutoGPT using Docker: A step-by-step guide

Peter Prins
3 min read · Apr 20, 2023


Photo by Markus Winkler on Unsplash

AutoGPT is an autonomous GPT-4 agent. In essence, it is ChatGPT talking to itself: it can write code, execute code, and access the internet. By prompting itself in a loop it can, amongst other things, verify sources, create programs and debug them on its own. It’s the latest big thing in AI. In this article I walk you step by step through running AutoGPT using Docker.

I assume you know how to work with the terminal and that you have Git and Docker installed.
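You can quickly verify these prerequisites with a few version checks in your terminal (the exact version numbers don’t matter much, as long as each command resolves):

git --version
docker --version
docker-compose --version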

Step 1: Open your terminal

Step 2: Clone the AutoGPT git repository.
The command below creates an Auto-GPT folder inside the folder your terminal is currently in.

git clone https://github.com/Significant-Gravitas/Auto-GPT.git Auto-GPT

Step 3: Check out the stable branch
AutoGPT is under active development, so the master branch may often be in a broken state. If you just want to use AutoGPT, check out the stable branch instead; it contains the latest stable version of AutoGPT.

cd ./Auto-GPT #move to the newly created folder
git fetch
git checkout stable
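To confirm you are now on the right branch, you can print the currently checked-out branch (this requires Git 2.22 or newer):

git branch --show-current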

Step 4: Create a .env file
The repository ships with a template; copy it to create your own configuration file.

cp .env.template .env

Step 5: Get an OpenAI API key
5.1 Go to https://platform.openai.com/.
5.2 Create an account or log in with your OpenAI account.
5.3 Click on your account name (top right).
5.4 Go to “View API Keys”.

OpenAI create API keys, step 1

5.5 Go to “Billing” and configure billing by following the steps under “Payment methods”.

OpenAI create API keys, step 2 (Billing)

5.6 After setting up payments, go to “API Keys” and click “Create new secret key”.

OpenAI create API keys, step 3 (Creating the actual keys)

5.7 Give the key a name, click “Create secret key”, and copy the key.
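If you want to verify the key before wiring it into AutoGPT, you can list the models your account has access to with a quick curl call (assuming you have curl installed; replace sk-... with your own key):

curl https://api.openai.com/v1/models -H "Authorization: Bearer sk-..."

A valid key returns a JSON list of models; an invalid one returns an authentication error.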

Step 6: Save the key in the .env file
Open the .env file in the Auto-GPT directory. On the line that says OPENAI_API_KEY=, paste your API key:

OPENAI_API_KEY=#paste your API key here

Step 7: Start Docker
Make sure the Docker daemon is running (on macOS and Windows this typically means starting Docker Desktop).
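You can check that the Docker daemon is actually running before continuing; if it is not, the command below will report that it cannot connect:

docker info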

Step 8: Run Auto-GPT with docker-compose.
Execute the following command in your terminal:

docker-compose run --build --rm auto-gpt

Step 9: Enjoy AutoGPT!
Tip: Run AutoGPT in continuous mode:

docker-compose run --build --rm auto-gpt --continuous
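Be aware that continuous mode lets the agent act without asking for your confirmation at each step, so keep an eye on it (and on your API usage). Your checkout may support additional flags; assuming the CLI exposes a help option, you can list them with:

docker-compose run --rm auto-gpt --help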

Step 10: If you want to stay up to date on how AI can help you develop software, follow me.

Update: use local memory with Auto-GPT

If you want to use local memory with Auto-GPT (instead of Redis), do the following:

Step 1: Create a file named “auto-gpt.json” in the “Auto-GPT” directory.
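On macOS or Linux you can create the empty file straight from the terminal:

touch auto-gpt.json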

Step 2: Open the file “docker-compose.yml” in your editor.

Step 3: Bind auto-gpt.json to the Docker container
In the “docker-compose.yml” file, add the following line under “volumes:”:

- "./auto-gpt.json:/home/appuser/auto-gpt.json"

You end up with a “volumes” section that looks like this:

volumes:
  - "./autogpt:/app"
  - ".env:/app/.env"
  - "./auto-gpt.json:/home/appuser/auto-gpt.json"



Written by Peter Prins

Nerd Entrepreneur who loves to improve people’s lives with software. Passionate about software architecture and usability. Shares knowledge to help others.
