I’m new to Flux and I’m trying to compute the Hessian.

Here’s the code:

```
grad = Flux.gradient(() -> loss(x, y), Flux.params(model))
hess = Flux.gradient(grad, Flux.params(model))
```

But when I run the line that computes the Hessian, it throws the following error:

**MethodError: objects of type Tracker.Grads are not callable**

Have you seen the keyword `nest` or `allownest = true`?


Nope. But I just tried adding `nest = true` to the gradient call, and it still shows the same error.

You are trying to take the gradient of an object, not of a function. Create a function that computes the gradient, and take the gradient of that:

```
grad = () -> Flux.gradient(() -> loss(x, y), Flux.params(model))
hess = Flux.gradient(grad, Flux.params(model))
```


Your suggestion solved the issue partially, but after that I got the error **outputs are not scalar**.

The code below is working, but I’m not sure whether what I’m doing is correct.

```
using Flux, Tracker

grad() = Tracker.gradient(() -> loss(x, y), Flux.params(model), nest = true)
hess = Tracker.forward(grad, Flux.params(model))
```

Thanks


You can also try the built-in function.

Tried that and got the following error:

**ERROR:** MethodError: no method matching jacobian(::TrackedArray{…,Array{Float64,2}}, ::Array{Float64,2}, ::Array{Float64,2})

The Hessian is the Jacobian of the gradient, and it’s already in the library:
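As a quick sanity check of that identity (a sketch using ForwardDiff rather than the Tracker API discussed in this thread, so that package choice is an assumption here): the Hessian of a scalar function is exactly the Jacobian of its gradient.

```
using ForwardDiff

# Scalar toy function: f(x) = x1^2 + x2^2 + x1*x2
f(x) = x[1]^2 + x[2]^2 + x[1] * x[2]

# Hessian as the Jacobian of the gradient
g(x) = ForwardDiff.gradient(f, x)
H = ForwardDiff.jacobian(g, [1.0, 2.0])

# For comparison, the built-in Hessian
H2 = ForwardDiff.hessian(f, [1.0, 2.0])
```

Both give the constant matrix `[2 1; 1 2]` for this quadratic.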

Got this error while trying that…

**ERROR:** MethodError: no method matching jacobian(::TrackedArray{…,Array{Float64,2}}, ::Array{Float64,2}, ::Array{Float64,2})


You need to be computing the Jacobian of a vector-valued function of a vector argument.
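One way to get into that shape with current Flux (a sketch assuming the modern Zygote-based Flux, where `Flux.destructure` and `Zygote.hessian` exist — not the Tracker-era API used elsewhere in this thread): flatten the parameters into one vector, write the loss as a scalar function of that vector, and take its Hessian.

```
using Flux, Zygote

model = Dense(2 => 1, σ)          # tiny model: 2 weights + 1 bias
x = rand(Float32, 2, 4)           # 4 samples, 2 features each
y = rand(Float32, 1, 4)

# Flatten all parameters into a single vector θ; `re` rebuilds the model
θ, re = Flux.destructure(model)

# Scalar loss as a function of the flat parameter vector
flatloss(θ) = Flux.mse(re(θ)(x), y)

# length(θ) × length(θ) Hessian (here 3 × 3)
H = Zygote.hessian(flatloss, θ)
```

The gradient of `flatloss` is a vector-valued function of the vector `θ`, so its Jacobian — the Hessian — is well defined.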

But I think I have defined a generic function

```
# binary cross-entropy, averaged over the batch size
loss(x, y) = -sum(y .* log.(model(x)) .+ (1 .- y) .* log.(1 .- model(x))) / size(y, 2)
```

This is my loss function.

I also tried the loss functions that came with Flux and the ones defined in the Model zoo. No luck!

@denizyuret Is it possible to compute the Hessian in Knet? If so, how?

Thanks in advance