Hey Julia enthusiasts and time hackers!
Introducing PromptingTools.jl – your new best friend for working with Large Language Models (LLMs).
Imagine saving 20 minutes every day and paying just a few pennies for it. That’s right - LLMs allow you to buy back time!
With PromptingTools.jl, you can use OpenAI models (and your other favorites) with no hassle. Success with LLMs is all about writing high-quality prompts, so this package aims to make it easy for the Julia community to discover, share, and quickly re-use great prompts (templating is planned for v0.2).
Key perks:

- an easy-to-use `aigenerate` function that substitutes the provided kwargs into your prompt's placeholders,
- an `@ai_str` macro to slash keystrokes when you have a burning question, and
- light wrappers around the AI outputs to enable downstream multiple dispatch.

Plus, it's super light on dependencies, so you can keep it in your global environment.
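Here is a minimal sketch of what that looks like in the REPL (it assumes you have an OpenAI API key set in `ENV["OPENAI_API_KEY"]`; the exact reply will vary):

```julia
using PromptingTools

# `aigenerate` fills {{placeholders}} in the prompt from the provided kwargs
# and returns a light wrapper (an AIMessage) around the model's reply
msg = aigenerate("What is the capital of {{country}}?"; country = "France")
println(msg.content)

# Or fire off a one-liner question with the string macro
ai"In Julia, how do I reverse a String?"
```

The returned wrapper type (rather than a bare `String`) is what enables the downstream multiple dispatch mentioned above.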
So, why juggle tabs and lose focus? Keep your flow state sacred with PromptingTools.jl and never leave the REPL. It’s not about building empires; it’s about giving you back time every single day.
Next steps: More tutorials to show you how to save time!
Pssst… If you want to use the new GPT-4 Turbo, which is cheaper and faster, you can just add its alias (`gpt4t`) after the closing quote of the `ai` macro:
`ai"What is the capital of France?"gpt4t`. There is also an asynchronous version of this macro (see the README), so you can keep crunching while GPT-4 works on your task.
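For example (a sketch; it assumes an OpenAI API key is configured, and that the async macro accepts the same model-alias suffix as described in the README):

```julia
using PromptingTools

# Model alias appended after the closing quote selects GPT-4 Turbo
ai"What is the capital of France?"gpt4t

# Asynchronous variant: returns immediately so the REPL stays free,
# and the answer is displayed once the model responds
aai"Write a haiku about the Julia REPL."gpt4t
```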
EDIT: The package is not registered yet. I’ll do so only if there is interest.