Julia response to Auto-GPT

Now that we have a wrapper for the OpenAI API, it would be pretty cool to have an Auto-GPT-like package that the Julia community could contribute to. It's pretty amazing how simply having the LLM talk to itself can greatly increase the skill and complexity of the tasks it can tackle. Maybe someone is already working on such a package.
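To make the "LLM talking to itself" idea concrete, here is a minimal sketch of the self-dialogue loop in Julia. The `ask_llm` function is a hypothetical stub standing in for a real OpenAI API call; a real agent would also parse the reply into tool calls, which is omitted here.

```julia
# Hypothetical stub: a real implementation would call the OpenAI chat API here.
function ask_llm(prompt::AbstractString)
    return "FINISH: done"
end

# Sketch of an Auto-GPT-style loop: feed the goal plus the conversation so far
# back to the model, append its reply, and stop when it declares it is finished.
function autogpt_loop(goal::AbstractString; max_steps::Int=5)
    history = String[]
    for step in 1:max_steps
        prompt = join([["Goal: $goal"]; history; ["Next action?"]], "\n")
        reply = ask_llm(prompt)
        push!(history, reply)
        startswith(reply, "FINISH") && break
    end
    return history
end
```

With the stub above, the loop terminates on the first step; swapping in a real model call is what turns this into an agent.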

Actually, you don't need a package for that: you can just use LangChain with PyCall.
The LangChain website has an implementation example of Auto-GPT built from LangChain primitives: AutoGPT — 🦜🔗 LangChain 0.0.161

Python code “autogpt.py”:

from langchain.utilities import SerpAPIWrapper
from langchain.agents import Tool
from langchain.tools.file_management.write import WriteFileTool
from langchain.tools.file_management.read import ReadFileTool
from langchain.vectorstores import FAISS
from langchain.docstore import InMemoryDocstore
from langchain.embeddings import OpenAIEmbeddings
from langchain.experimental import AutoGPT
from langchain.chat_models import ChatOpenAI
import faiss

# A web-search tool plus simple file read/write tools
search = SerpAPIWrapper()
tools = [
    Tool(
        name="search",
        func=search.run,
        description="useful for when you need to answer questions about current events. You should ask targeted questions",
    ),
    WriteFileTool(),
    ReadFileTool(),
]

# Define the embedding model and initialize an empty FAISS vectorstore
embeddings_model = OpenAIEmbeddings()
embedding_size = 1536  # dimension of OpenAI's embeddings
index = faiss.IndexFlatL2(embedding_size)
vectorstore = FAISS(embeddings_model.embed_query, index, InMemoryDocstore({}), {})

agent = AutoGPT.from_llm_and_tools(
    ai_name="Tom",
    ai_role="Assistant",
    tools=tools,
    llm=ChatOpenAI(temperature=0),
    memory=vectorstore.as_retriever(),
)
# Set verbose to be true
agent.chain.verbose = True

agent.run(["write a weather report for SF today"])

Julia equivalent with PyCall “autogpt.jl”:

using PyCall

utilities = pyimport("langchain.utilities")
agents = pyimport("langchain.agents")
write = pyimport("langchain.tools.file_management.write")
read = pyimport("langchain.tools.file_management.read")
vectorstores = pyimport("langchain.vectorstores")
docstore = pyimport("langchain.docstore")
embeddings = pyimport("langchain.embeddings")
experimental = pyimport("langchain.experimental")
chat_models = pyimport("langchain.chat_models")
faiss = pyimport("faiss")

# A web-search tool plus simple file read/write tools
search = utilities.SerpAPIWrapper()
tools = [
    agents.Tool(
        name="search",
        func=search.run,
        description="useful for when you need to answer questions about current events. You should ask targeted questions"
    ),
    write.WriteFileTool(),
    read.ReadFileTool(),
]

# Embedding model and an empty FAISS vectorstore used as the agent's memory
embeddings_model = embeddings.OpenAIEmbeddings()
embedding_size = 1536  # dimension of OpenAI's embeddings
index = faiss.IndexFlatL2(embedding_size)
vectorstore = vectorstores.FAISS(embeddings_model.embed_query, index, docstore.InMemoryDocstore(Dict()), Dict())

agent = experimental.AutoGPT.from_llm_and_tools(
    ai_name="Tom",
    ai_role="Assistant",
    tools=tools,
    llm=chat_models.ChatOpenAI(temperature=0),
    memory=vectorstore.as_retriever()
)

agent.chain.verbose = true

agent.run(["write a weather report for SF today"])
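For this to work, the Python environment that PyCall points at needs the Python-side dependencies installed. A possible setup sketch (the pip package names are assumptions for the libraries the example imports: `faiss-cpu` for FAISS and `google-search-results` for the SerpAPI wrapper):

```shell
# Install the Python dependencies into the environment PyCall uses
pip install langchain openai faiss-cpu google-search-results
```

If PyCall uses its own private Conda environment, `Conda.pip` from Conda.jl can be used instead.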

To use this code you need to set the OPENAI_API_KEY and SERPAPI_API_KEY environment variables.
Example in a shell:

OPENAI_API_KEY="YOUR_OPENAI_API_KEY" SERPAPI_API_KEY="YOUR_SERPAPI_API_KEY" julia autogpt.jl
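Alternatively, the same variables can be set from within Julia, as long as it happens before the `pyimport` calls so the Python side picks them up. The key names are the ones the example needs; the values are placeholders:

```julia
# Set the API keys in the process environment before importing langchain.
# Replace the placeholder values with your real keys.
ENV["OPENAI_API_KEY"] = "YOUR_OPENAI_API_KEY"
ENV["SERPAPI_API_KEY"] = "YOUR_SERPAPI_API_KEY"
```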

Have fun !


This is awesome, thanks!
