# Taylor expansion in Symbolics

I found the very useful TaylorSeries.jl library and was able to use it to generate the Taylor series coefficients for a problem I was working on.

I thought to myself this would be a really cool thing to implement in Symbolics, just for fun, and immediately got into trouble. The goal: create a function that generates a Taylor series by taking the derivatives of a function and returning the coefficients of that Taylor series, and, for bonus points, returns a function which can be evaluated in code.

As I pointed out, this can all be done with the TaylorSeries package; this is a learning experience for me, and I thought it might be useful to others.
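For reference, here is a minimal sketch of how the same computation looks in TaylorSeries.jl (assuming the standard `Taylor1` constructor; check the package docs for details):

```julia
using TaylorSeries

# Independent Taylor1 variable of order 4, with Float64 coefficients
t = Taylor1(Float64, 4)

# Composing an ordinary function with t expands it as a series around 0
s = log(1 + t)

# Coefficients of log(1 + x) around 0 should be 0, 1, -1/2, 1/3, -1/4
println(s.coeffs)
```

The package overloads the elementary functions on `Taylor1`, so no explicit differentiation is needed.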

I did the obvious thing, trying to generate the 1st derivative:

```julia
using Symbolics

function main()
    @variables x
    f(x) = log(1 + x)
    df = Symbolics.derivative(f(x), x)
    println(df)
    # @syms df(x)
    println(df(2))
    println(df(3))
end

main()
```

and saw:

```
(1 + x)^-1
ERROR: LoadError: Sym (1 + x)^-1 is not callable. Use @syms (1 + x)^-1(var1, var2,...) to create it as a callable.
```

I then tried a bunch of stuff based on what I was reading in the documentation to try to get `df(2)` and `df(3)` to yield a numeric value, but the best I could ever do was get “df(2)” and “df(3)” to print.

Hoping I could get some advice on how to move this further along.

Thank you!

Instead of `df(2)` do `substitute(df,2)`

No success.

```julia
println(SymbolicUtils.substitute(df, 2))
```

That throws a `MethodError`, but its “Closest candidates are:” hint points at the right signature — `substitute` wants a `Dict` mapping the variable to its value:
`println(SymbolicUtils.substitute(df, Dict(x => 2.0)))`
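For what it's worth, here is a minimal self-contained version of that fix; `substitute` takes a `Dict` mapping each variable to a value, and `Symbolics.value` unwraps the resulting `Num` to a plain number:

```julia
using Symbolics

@variables x

f(x) = log(1 + x)
df = Symbolics.derivative(f(x), x)   # symbolic expression (1 + x)^-1

# Evaluate the derivative at x = 2 and x = 3 by substitution
df2 = Symbolics.value(substitute(df, Dict(x => 2.0)))
df3 = Symbolics.value(substitute(df, Dict(x => 3.0)))

println(df2)   # 1/(1 + 2)
println(df3)   # 1/(1 + 3)
```

The error in the original snippet comes from `df` being a symbolic expression rather than a callable, so `df(2)` has no meaning until the value is substituted in.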
My goal is, of course, to generate a function which is compiled as code, but give me some time to get through the documentation.
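A sketch of where this could go, assuming `Symbolics.build_function` is the right tool for the compiled-function part. The helper `taylor_coefficients` below is my own name, not a library function; it just repeats derivative-then-substitute:

```julia
using Symbolics

# Hypothetical helper: Taylor coefficients f⁽ⁿ⁾(x0)/n! up to `order`,
# computed by repeated symbolic differentiation and substitution
function taylor_coefficients(expr, x, x0, order)
    coeffs = Float64[]
    d = expr
    for n in 0:order
        c = Symbolics.value(substitute(d, Dict(x => x0)))
        push!(coeffs, c / factorial(n))
        d = Symbolics.derivative(d, x)
    end
    return coeffs
end

@variables x
expr = log(1 + x)

# Coefficients of log(1 + x) around 0 should come out as 0, 1, -1/2, 1/3, -1/4
println(taylor_coefficients(expr, x, 0.0, 4))

# Bonus goal: compile the symbolic derivative into a callable Julia function
df = Symbolics.derivative(expr, x)
df_fun = Symbolics.build_function(df, x; expression = Val{false})
println(df_fun(2.0))   # same value as substituting x => 2.0
```

With `expression = Val{false}`, `build_function` returns a compiled function directly instead of a Julia `Expr`, which sidesteps `eval`.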