[ANN] List of Awesome Generative AI in/with/for Julia Language

Generative AI and Julia, A Perfect Combination. Don’t you think?

Generative AI is moving at the speed of light and it speaks Python.
In my opinion, Julia is awesome. Wouldn’t it be a shame to miss out on the GenAI revolution?

However, there are very few GenAI projects in Julia, which makes them all the harder to find.

To address this, we have created the list “Generative AI meets Julia” (following the template of Awesome Lists).

We invite all Julia developers and enthusiasts to join us in expanding the horizons of Generative AI in Julia. Whether you have an exciting project to share or are simply looking to collaborate and learn from others, this list should be the perfect gateway to all things Julia+GenAI.

Next steps?
We believe that generative AI allows us to do more, faster, and cheaper (if used well). So the hope is to add tutorials and blog posts to the list showcasing what you can do already - who wouldn’t want to buy more free time?


Great idea. I did not know about the REPL integrations, and I plan to try those out in the future.

Under applications/products, I’d add Codeium (which is different from Codium). It’s a free alternative to GitHub Copilot.


I have found it more practical not to have a separate REPL mode (you keep variable interpolation, return types, etc.).

I folded those learnings into PromptingTools.jl, so perhaps try that as well? I’m keen for any feedback! My hope is to make it super easy to save 20 minutes every day on the little tasks…
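
For example, the ai"" string macro interpolates normal Julia variables directly (a quick sketch; it assumes your OpenAI API key is set and a recent PromptingTools version):

```julia
using PromptingTools

country = "New Zealand"
# `ai"..."` sends the prompt, interpolating `$country` like a regular Julia string
msg = ai"What is the capital of $country?"
msg.content    # the reply text
typeof(msg)    # an AIMessage, i.e. a typed result rather than a raw string
```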


Ah, cool! I totally thought it was the same as Codium!

Do you use it with Julia? Any comparison with Copilot?

I’d add it but have no experience myself / don’t know what it’s good at or bad at. What would you say?

I think it’s identical to GitHub Copilot. I have both because Copilot is free for professors, and I can’t see any major difference.


Can I suggest adding Open Interpreter to the must-know Python projects?


Thanks everyone for the tips!

Also, Codeium is impressive – especially given it’s free!


I’m looking to build an LLM agent. Does anyone know if such a project exists in Julia?

Not to my knowledge, but let me know if you find one!

Some pointers below, but all my comments are for experimenting. For any production use case, I’d probably use PythonCall and some of the excellent Python libraries (LangChain, LlamaIndex, etc.).


What’s your use case / goal? Are you thinking of agents as “routers”/mini-decision engines OR as actors in a large simulated environment?

For the former, you could easily bootstrap an agent struct with some memory, build some reusable templates for prompting (see aitemplates), and then simply call the LLM with aigenerate (with return_all=true to keep the memory) - see the rough sketch after the next paragraph.

For the latter, I’d probably leverage the infrastructure in Agents.jl - it’s an awesome design and I think it would work very well. The primitives will save you a ton of work.
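
To make the “router” idea concrete, here’s a rough, untested sketch (the MiniAgent type is made up for illustration, and it assumes aigenerate’s conversation and return_all keywords - check the docs for your version):

```julia
using PromptingTools
const PT = PromptingTools

# A toy agent: just a container for the conversation so far
mutable struct MiniAgent
    memory::Vector{PT.AbstractMessage}
end
MiniAgent() = MiniAgent(PT.AbstractMessage[])

# Each call passes the stored history back in and keeps the full conversation
function run!(agent::MiniAgent, user_msg::AbstractString)
    conv = aigenerate(user_msg; conversation = agent.memory, return_all = true)
    agent.memory = conv            # keep the whole exchange as memory
    return last(conv).content      # latest assistant reply
end

agent = MiniAgent()
run!(agent, "Act as a sales engineer. Ask me one question about my problem.")
run!(agent, "I'm choosing a birthday gift for my dad, who loves fishing.")
```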


Btw, I wanted to add some experimental agent primitives to PromptingTools on Thursday, e.g., lazy LLM calls, so we can start chaining operations, etc.

Would be keen to learn about your use case - you can find me on Slack under the same handle!


If you’re designing an interface for some library, you should check out https://agentprotocol.ai/.

Thank you for the information, especially about Agents.jl. I need to check it out.

To be honest, I thought no one in the Julia community wanted to work on LLM agents. There is not as much information about LLMs as I had hoped. Julia is a fantastic language with minimal barriers between package users and developers, which reduces package maintenance hurdles a lot. It would be nice if people leveraged Julia’s strengths to build something new. I’m happy to know you are in the same boat.

For my use case, I would like to build an AI sales agent, well…, more like a sales engineer I guess. Something capable of exploring a customer’s problem space and preparing possible solutions for them with whatever it has at hand. A customer’s problem could be anything from wanting to buy a gift for their dad’s birthday to a CTO looking to invest in a new IT infrastructure.

Regarding PromptingTools.jl, what do you have in mind for the package? Do you intend to build an autonomous LLM agent with it? Also, is it possible to use your package with a local model like Mistral 7B? That would be awesome!

Haha, love it! That’s exactly what I wanted to explore as well.

The only other agent I want is a “fixer”: e.g., the LLM gives me code that fails, and AIFixer diagnoses and corrects it until it runs as expected.
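
Conceptually something like this naive loop (purely illustrative and untested - naive_fix_loop is made up, and real code would also need to strip markdown fences from the replies):

```julia
using PromptingTools

function naive_fix_loop(task::AbstractString; max_iters::Integer = 3)
    code = aigenerate("Write Julia code for this task; return only code, no fences: $task").content
    for _ in 1:max_iters
        try
            include_string(Main, code)   # try to run the generated code
            return code                  # no exception thrown -> good enough
        catch err
            # feed the failing code and the error back, ask for a corrected version
            prompt = "This Julia code failed with `$(sprint(showerror, err))`.\n" *
                     "Fix it and return only code, no fences:\n$code"
            code = aigenerate(prompt).content
        end
    end
    return code   # best effort after max_iters attempts
end
```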

As for PTools, in the long run it should be the “core” for talking to LLM interfaces and creating templates, with all applications built downstream of it.

For now, I am adding some agent and RAG primitives inside it for convenience until I know what I want. It will all sit in an “Experimental” submodule to make it clear that it will change.

Yes, you can use Mistral models via the Mistral API, or locally via Ollama or a llama.cpp server - it should be in the README/docs.
Use the master version of PTools though; there was a bug that I fixed but haven’t released in a new version yet.
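
For reference, the local-model call looks roughly like this (written from memory, so double-check the schema and model names against the docs; it assumes ollama serve is running and you’ve pulled a Mistral model):

```julia
using PromptingTools
const PT = PromptingTools

# Point aigenerate at the local Ollama server instead of the OpenAI API
msg = aigenerate(PT.OllamaManagedSchema(),
                 "Summarize in one sentence why Julia is great.";
                 model = "mistral")
println(msg.content)
```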