[ANN] Introducing LLMAccess: A Simple Julia Wrapper for LLM REST APIs

I would like to share a little project I’ve been working on called LLMAccess. It’s a simple Julia package designed to simplify interactions with various Large Language Model (LLM) APIs by wrapping around their existing REST endpoints.

LLMAccess aims to provide a straightforward, unified interface for calling large language models from OpenAI, Anthropic, Google, Ollama, Mistral, OpenRouter, and Groq in your Julia scripts.

You can check out the project and its documentation here: https://gitlab.com/newptcai/llmaccess.jl

Features:

  • Multi-Provider Support: Seamlessly connect to multiple LLM providers using a consistent interface.
  • Image Attachments: Attach image files to your API requests to use vision models.
  • Command-Line Integration: Utilize built-in command-line argument parsing for flexible script execution.
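To illustrate the command-line integration idea, here is a minimal sketch of the kind of argument handling such a script might do. This is a hypothetical helper written for this post, not LLMAccess's actual parser; the flag names `--provider` and `--model` are assumptions.

```julia
# Hypothetical sketch (not LLMAccess's actual implementation):
# pull --provider/--model flags out of ARGS, treat the rest as the prompt,
# and default the provider from the DEFAULT_LLM environment variable.
function parse_llm_args(args::Vector{String})
    opts = Dict{String,String}("provider" => get(ENV, "DEFAULT_LLM", "ollama"))
    rest = String[]
    i = 1
    while i <= length(args)
        if args[i] == "--provider" && i < length(args)
            opts["provider"] = args[i+1]; i += 2
        elseif args[i] == "--model" && i < length(args)
            opts["model"] = args[i+1]; i += 2
        else
            push!(rest, args[i]); i += 1
        end
    end
    return opts, join(rest, " ")
end

opts, prompt = parse_llm_args(["--provider", "openai", "Hello", "there"])
# opts["provider"] == "openai", prompt == "Hello there"
```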

Getting Started:

If you’d like to give it a try, install it with:

using Pkg
Pkg.add(url = "https://gitlab.com/newptcai/llmaccess.jl")

I cleaned up the code a bit. The package now picks the default LLM provider, and the default model for each provider, according to the following environment variables:

export DEFAULT_LLM="ollama"
export DEFAULT_OPENAI_MODEL="gpt-4o-mini"
export DEFAULT_OPENROUTER_MODEL="amazon/nova-micro-v1"
export DEFAULT_ANTHROPIC_MODEL="claude-3-5-haiku-latest"
export DEFAULT_GOOGLE_MODEL="gemini-2.0-flash"
export DEFAULT_OLLAMA_MODEL="llama3.2"
export DEFAULT_MISTRAL_MODEL="mistral-small-latest"
export DEFAULT_GROQ_MODEL="llama-3.3-70b-versatile"

If these variables are not set, the package falls back to hard-coded defaults.
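The lookup described above can be sketched in a few lines. This is a hypothetical helper written for this post, not LLMAccess's actual code; the `FALLBACK_MODELS` table just reuses two of the defaults listed above.

```julia
# Hypothetical sketch (not LLMAccess's actual implementation): resolve
# the model for a provider from DEFAULT_<PROVIDER>_MODEL, falling back
# to a hard-coded choice when the variable is unset.
const FALLBACK_MODELS = Dict(
    "openai" => "gpt-4o-mini",
    "ollama" => "llama3.2",
)

function default_model(provider::AbstractString)
    var = "DEFAULT_$(uppercase(provider))_MODEL"
    # get(f, ENV, key) calls f() only when the variable is absent
    return get(() -> get(FALLBACK_MODELS, lowercase(provider), ""), ENV, var)
end

ENV["DEFAULT_OLLAMA_MODEL"] = "llama3.2"
default_model("ollama")            # the environment variable wins
delete!(ENV, "DEFAULT_OPENAI_MODEL")
default_model("openai")            # falls back to "gpt-4o-mini"
```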


Hello! I’m glad to see the ecosystem grow and users get more choices! :slight_smile:

Once you stabilise it, register it, and collect some stars (10+), it would be great to add it to our community tracker: GitHub - svilupp/awesome-generative-ai-meets-julia-language: Comprehensive guide to generative AI projects and resources in Julia.

Btw. I’m the author of a similar package PromptingTools.jl – I’ve been thinking that maybe I should carve out some primitives (core types), so we can standardize the ecosystem. Any thoughts on that?


I think that would be good practice in software design. But to be honest, there is no intrinsic reason to use Julia for accessing an LLM; I am only doing this because I happen to know Julia quite well. So for me, as a hobby project, I would rather keep it as simple as possible, adding only the features I myself find useful, rather than trying to organize things.


That makes sense! I was thinking about standardizing Message & schema (OpenAI etc) primitives, but let’s not worry about it for now.
