Hi again! I’m excited to announce OpenGithubModelsApi.jl, a new Julia client package for GitHub’s AI Models API, giving Julia developers direct access to the hosted model catalog through a type-safe, idiomatic Julia interface.
Why OpenGithubModelsApi.jl?
While many AI frameworks exist in the Julia ecosystem, direct access to hosted AI models—especially GitHub’s growing catalog—has been challenging. OpenGithubModelsApi.jl bridges this gap by providing:
Native Julia integration with GitHub’s AI infrastructure
Typed requests and responses that catch malformed inputs before they reach the API
Seamless authentication with GitHub tokens
Production-ready implementation with comprehensive validation
Key Features
Chat Completion API
Execute chat-based inference with any model available through GitHub’s AI Models API:
using OpenGithubModelsApi
# Initialize client with your GitHub token
client = GithubModelsClient("YOUR_GITHUB_TOKEN")
# Create conversation history
messages = [
    Message(role="system", content="You are a helpful assistant"),
    Message(role="user", content="What is Julia programming language?")
]
# Create inference request
request = InferenceRequest(
    model="openai/gpt-4.1",  # Format: {publisher}/{model_name}
    messages=messages,
    temperature=0.7
)
# Get response
response = create_chat_completion(client, request)
println(response)
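The exact shape of the returned value depends on the generated response structs; as a sketch, assuming an OpenAI-style schema (the `choices`/`message`/`content` field names here are assumptions, not confirmed accessors), pulling out just the assistant's reply might look like:

```julia
# Hypothetical sketch: these field names mirror the OpenAI-style chat
# completion schema and are assumptions -- check the generated response
# structs in OpenGithubModelsApi.jl for the actual accessors.
reply = response.choices[1].message.content
println(reply)
```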
Model Catalog Discovery
Explore available models programmatically:
# List all available models
models = list_models(client)
# Filter models by capability
text_models = filter(m -> "text" in m.supported_input_modalities, models)
openai_models = filter(m -> m.publisher == "openai", models)
# Print model details
for model in text_models
    println("Model: $(model.name) (ID: $(model.id))")
    println("Publisher: $(model.publisher)")
    println("Capabilities: $(model.supported_input_modalities)")
    println("-" ^ 30)
end
Organization-Specific Access
Access organization-scoped models with dedicated endpoints:
# Create organization-specific request
org_request = OrgInferenceRequest(
    model="your-org/custom-model",
    messages=messages,
    temperature=0.5
)
# Execute in organization context
org_response = org_create_chat_completion(client, org_request, "your-org")
Advanced Configuration
Fine-tune your AI interactions with parameters like:
request = InferenceRequest(
    model="meta/llama-3-70b",
    messages=messages,
    temperature=0.3,  # More deterministic (0.0-1.0)
    max_tokens=100,   # Limit response length
    top_p=0.9         # Nucleus sampling parameter
)
Installation
using Pkg
Pkg.add("OpenGithubModelsApi")
Why This Is Useful for You
- AI Integration Made Simple: Incorporate state-of-the-art AI models directly into your Julia workflows without leaving the ecosystem.
- Type Safety: Unlike generic HTTP clients, this package validates API parameters and responses against typed Julia structs.
- Future-Proof: The client is generated from GitHub’s OpenAPI specification, so it can track new models and capabilities as the catalog grows.
- Julia Ecosystem Alignment: Follows Julia’s conventions for packages, documentation, and testing.
Limitations & Considerations
Token Management: You’ll need to create a GitHub personal access token with appropriate permissions.
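Rather than hard-coding the token in source, a common pattern is to read it from an environment variable. A minimal sketch (the variable name GITHUB_TOKEN is a convention, not something the package requires):

```julia
# Read the token from the environment instead of embedding it in code.
# GITHUB_TOKEN is a conventional variable name; any name works.
token = get(ENV, "GITHUB_TOKEN", "")
isempty(token) && error("Set the GITHUB_TOKEN environment variable first.")

client = GithubModelsClient(token)
```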
Streaming Not Supported: The current implementation does not support streaming responses; setting stream=true results in an error.
Model Availability: Access to specific models depends on GitHub’s catalog and your organization’s permissions.
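Because availability varies by token and organization, it can be worth confirming a model appears in your catalog before requesting it. A sketch using the list_models call and the id field shown earlier (the specific model id is just an example):

```julia
# Confirm a model is visible to your token before issuing a request.
# `list_models` and the `id` field are the accessors shown above.
target = "openai/gpt-4.1"
available = any(m -> m.id == target, list_models(client))
available || @warn "Model $target is not in your catalog; check permissions."
```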
Getting Started
- Create a GitHub token with appropriate permissions
- Install the package:
Pkg.add("OpenGithubModelsApi")
- Initialize the client:
client = GithubModelsClient("YOUR_TOKEN")
- Explore available models:
models = list_models(client)
- Make your first request using the examples above
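Putting those steps together, a minimal first session might look like this (token handling and the model id follow the earlier examples and are placeholders for your own setup):

```julia
using OpenGithubModelsApi

# Token from the environment, as suggested above.
client = GithubModelsClient(get(ENV, "GITHUB_TOKEN", ""))

# See what is available to your token.
models = list_models(client)
println("Available models: ", length(models))

# Send one request.
request = InferenceRequest(
    model="openai/gpt-4.1",
    messages=[Message(role="user", content="Hello from Julia!")]
)
println(create_chat_completion(client, request))
```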
Documentation & Resources
Contributing
This package was automatically generated from GitHub’s OpenAPI specification, but community contributions are welcome for:
- Improving documentation
- Adding usage examples
- Creating higher-level abstractions
- Reporting and fixing issues
Final Thoughts
OpenGithubModelsApi.jl represents an important step in connecting Julia’s high-performance computing ecosystem with the rapidly evolving world of AI models. Whether you’re building data analysis pipelines, scientific applications, or production systems, this package provides the bridge to leverage GitHub’s AI infrastructure directly from Julia.
I’d love to hear your feedback, suggestions, and use cases!
Happy hacking!