Discover AIHelpMe, the new Julia package that turns your docstrings into a treasure trove of insights!
Why AIHelpMe?
Bridging the gap between local knowledge and AI smarts, AIHelpMe brings personalized, AI-enhanced guidance straight from your code’s docs, and it enables an unprecedented level of control and insight into each answer.
Easy Start
Add AIHelpMe (it’s not registered yet), grab your Cohere and OpenAI keys (tiny costs!), and dive into a smarter coding experience.
using AIHelpMe
aihelp"How to create a named tuple from a dictionary?"
Add it with:
using Pkg; Pkg.add(url="https://github.com/svilupp/AIHelpMe.jl")
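If the keys aren’t already set, you can export them as environment variables before loading the package (these variable names follow the PromptingTools.jl convention; adjust to your own setup):

```julia
# Provide API keys via environment variables before `using AIHelpMe`
ENV["OPENAI_API_KEY"] = "sk-..."   # used for chat and embeddings
ENV["COHERE_API_KEY"] = "..."      # only needed for the more advanced (reranking) pipelines

using AIHelpMe
aihelp"How to create a named tuple from a dictionary?"
```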
Help us fine-tune AIHelpMe in its early stages - your feedback is the key to our shared success!
(And it will determine whether we develop it further…)
I will note that in the first example from the Forem blog post, the AI gets the answer wrong:
[ Info: Done generating response. Total cost: $0.001
AIMessage("To create a named tuple from a dictionary, you can use the `NamedTuple` constructor and provide the dictionary's key-value pairs as arguments using the `name=value` syntax. Here's an example:
d = Dict("a" => 1, "b" => 2)
nt = NamedTuple(d)
In this example, the dictionary `d` is converted into a named tuple `nt` using the `NamedTuple` constructor. Each key-value pair in the dictionary becomes a named field in the named tuple.")
If you run that code you will get an error:
julia> d = Dict("a" => 1, "b" => 2)
Dict{String, Int64} with 2 entries:
"b" => 2
"a" => 1
julia> nt = NamedTuple(d)
ERROR: TypeError: in typeassert, expected Symbol, got a value of type String
Stacktrace:
[1] merge(a::@NamedTuple{}, itr::Dict{String, Int64})
@ Base ./namedtuple.jl:365
[2] NamedTuple(itr::Dict{String, Int64})
@ Base ./namedtuple.jl:151
[3] top-level scope
@ REPL[2]:1
Good catch. It’s a perfect example of what working with GenAI is like: working with a new intern.
It’s because the default model (chosen for cost and speed) is GPT-3.5 Turbo from June, which is much weaker than the November version (1106, aliased as “gpt3t”).
That version would get it right (notice the “gpt3t” at the end):
aihelp"In Julia, how to create a named tuple from a dictionary? Give me an example"gpt3t
[ Info: Done generating response. Total cost: $0.002
AIMessage("You can use the splatting operator to create a named tuple from a dictionary in Julia. Here's an example:
d = Dict(:a => 1, :b => 2)
nt = (; d...)
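For reference, you can sanity-check that snippet yourself; the splatting works because iterating a Dict with Symbol keys yields `key => value` pairs (note that the field order follows the Dict’s iteration order, which is unspecified):

```julia
d = Dict(:a => 1, :b => 2)
nt = (; d...)                    # splat the pairs into a named tuple
@assert nt.a == 1 && nt.b == 2   # both fields are present, regardless of order
```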
But even GPT-3.5 can recover if we let it fix its mistakes, e.g.:
julia> aihelp"Help me fix this error when creating a named tuple from Dictionary: $err. Give me example that works"
[ Info: Done generating response. Total cost: $0.002
AIMessage("The error message indicates that the `Dict` contains keys of type `String`, but `NamedTuple` expects keys of type `Symbol`. You can fix this error by converting the keys of the `Dict` to symbols using the `Symbol()` function.
Here's an example that demonstrates how to create a `NamedTuple` from a `Dict`:
```julia
dict = Dict("a" => 1, "b" => 2, "c" => 3)
named_tuple = NamedTuple(Symbol(k) => v for (k, v) in dict)
```
This code converts the keys of the Dict to symbols and then constructs the NamedTuple using a comprehension.
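For context, the `$err` interpolated into the question above can be captured beforehand with a simple try/catch (a sketch; this setup step isn’t shown in the original example):

```julia
# Capture the error text so it can be interpolated into the aihelp query
err = try
    NamedTuple(Dict("a" => 1, "b" => 2))   # reproduces the TypeError shown earlier
catch e
    sprint(showerror, e)                   # render the exception as a String
end
```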
Or just try it out yourself: install with `]add AIHelpMe` and then ask `aihelp"<some question you have>"`.
Or, for more advanced usage with highlighting of the answer:
using AIHelpMe
using AIHelpMe: pprint, last_result
# ideally, switch to better pipeline for proper results, requires setting up Cohere API key
AIHelpMe.update_pipeline!(:silver)
# load tidier index, others available: :julia, :makie
AIHelpMe.load_index!(:tidier);
# Ask a question
aihelp"How do you add a regression line to a plot in TidierPlots?"
It should work without a problem for model_chat, but it will NOT work if you change the model_embedding used for the preprocessed knowledge (see the guide here: Advanced | AIHelpMe.jl).
The magic of AIHelpMe requires you to either:
- use the pre-processed knowledge bundles (julia, makie, tidier, …), or
- preprocess the knowledge bundles yourself.
The reason is that we MUST use the same EMBEDDING model during knowledge preparation and during answering questions (see Introduction | AIHelpMe.jl).
If you want to use only Mistral models, you can simply embed all the knowledge accessible in your current Julia session (see Advanced | AIHelpMe.jl and notice the update_index() call).
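As a rough sketch of that Mistral-only route (the keyword names and model aliases below are my assumptions; double-check them against the Advanced guide):

```julia
using AIHelpMe

# Assumed keywords; the Advanced guide documents the exact signature
AIHelpMe.update_pipeline!(:bronze; model_chat = "mistral-medium", model_embedding = "mistral-embed")

# Re-embed the docstrings of the modules loaded in this session with the SAME embedding model,
# then load the fresh index so questions are answered against it
AIHelpMe.update_index() |> AIHelpMe.load_index!

aihelp"How to create a named tuple from a dictionary?"
```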
If you’re having problems with changing only the model_chat to Mistral, please open an issue with a reproducible example and the error you’re getting.