[ANN] PromptingTools.jl - Your Daily Dose of AI Efficiency!

Hey Julia enthusiasts and time hackers! :fire:

Introducing PromptingTools.jl – your new best friend for working with Large Language Models (LLMs).
Imagine saving 20 minutes every day and paying just a few pennies for it. That’s right - LLMs allow you to buy back time!

With PromptingTools.jl, you can use the OpenAI models (but also your other favorites) with no hassle. Success with LLMs is all about writing high-quality prompts. This package hopes to allow the Julia community to easily discover, share, and quickly re-use great prompts (templating is aimed for v0.2).

Key perks? Think easy-to-use aigenerate function that replaces all prompt variables through the provided kwargs, an @ai_str macro to slash those keystrokes when you have some burning questions, and light wrappers for the AI outputs to enable downstream multiple dispatch. Plus, it’s super light on dependencies, so you can put it in your global env.
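To make that concrete, here is a minimal sketch of the two entry points mentioned above (the prompt text and variable names are my own examples, and an OpenAI API key is assumed to be set as `OPENAI_API_KEY`):

```julia
using PromptingTools

# aigenerate replaces {{handlebar}} variables with the provided kwargs:
msg = aigenerate("What is the capital of {{country}}?"; country = "France")
println(msg.content)  # the light AIMessage wrapper enables downstream dispatch

# The @ai_str macro is the keystroke-saving shortcut for one-off questions:
answer = ai"What is the capital of France?"
```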

So, why juggle tabs and lose focus? Keep your flow state sacred with PromptingTools.jl and never leave the REPL. It’s not about building empires; it’s about giving you back time every single day.

Next steps: More tutorials to show you how to save time!

Pssst… If you want to use the new GPT-4 Turbo that’s cheaper and faster, you can just add its alias (“gpt4t”) after the closing quote of the ai macro: ai"What is the capital of France?"gpt4t. There is also an asynchronous version of this macro (see the README), so you can keep crunching while GPT-4 is working on your task.
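In code, the alias and the asynchronous variant look like this (a sketch; the second question is my own example, and an API key is assumed):

```julia
using PromptingTools

# Append a model alias after the closing quote to pick a model:
msg = ai"What is the capital of France?"gpt4t

# The asynchronous variant returns immediately, so the REPL stays free
# while the model works on your task (see the README for details):
task = aai"Summarize the Julia type system in one paragraph."gpt4t
```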

EDIT: The package is not registered yet. I’ll do so only if there is interest.


PromptingTools.jl v0.2.0 Released!

Exciting news! :tada: PromptingTools.jl has hit version 0.2.0 and has been registered.

Easily add it with using Pkg; Pkg.activate(); Pkg.add("PromptingTools").

Or try it in a temporary REPL environment: using Pkg; Pkg.activate(temp=true); Pkg.add("PromptingTools")

New features include:

  • Extended ai* functions: aiscan (image handling) and aiextract (structured data extraction).
  • aitemplates for prompt reusability (e.g., aitemplates("OCR")).
  • Support for Ollama.ai models.
  • Enhanced documentation.
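A quick sketch of the new functions (the struct, image path, and prompt text below are illustrative assumptions, not part of the package):

```julia
using PromptingTools

# aiextract: pull structured data into a user-defined type
struct CurrencyPair
    base::String
    quote_currency::String
end
msg = aiextract("EUR/USD closed higher today."; return_type = CurrencyPair)

# aitemplates: discover reusable prompt templates by keyword
aitemplates("OCR")

# aiscan: ask questions about an image ("receipt.png" is a placeholder)
msg = aiscan("Transcribe the text in the image."; image_path = "receipt.png")
```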

Plus, check out my “GenAI Mini-Tasks” blog series for productivity tips: Forem: Buy More Time by Mastering Mini-Tasks.

Remember, use ai"<some_question>" for quick inquiries. For instance, I can never remember how to activate the temp environment, so I did: ai"In Julia, how do you activate the temp environment?"

Happy coding! :rocket::woman_technologist:

PS: What I actually used was an advanced version of the above syntax:
aai"In Julia, how do you activate the temp environment?"gpt4t which taps into the latest GPT-4 Turbo and runs asynchronously.