Hello all,

I’ve run into an issue with the JuMP function `node_count()`. It seems to assert that the value returned by the solver is an `Int64`, but some solvers return a different type: Gurobi, for example, returns a `Float64`, and Cbc an `Int32`. Below is a small knapsack example that should reproduce the error.

```julia
using JuMP
using Gurobi
m = Model(Gurobi.Optimizer)
@variable(m, x[1:4], Bin)
@constraint(m, 3*x[1] + 4*x[2] + 6*x[3] + 5*x[4] <= 8)
@objective(m, Max, 2*x[1] + 3*x[2] + x[3] + 4*x[4])
optimize!(m)
node_count(m)
```

This should yield the following error:

```
ERROR: TypeError: in typeassert, expected Int64, got a value of type Float64
Stacktrace:
[1] get(model::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.Bridges.LazyBridgeOptimizer{Gurobi.Optimizer}, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}, attr::MathOptInterface.NodeCount)
@ MathOptInterface.Utilities C:\Users\Verchere\.julia\packages\MathOptInterface\QxT5e\src\Utilities\cachingoptimizer.jl:803
[2] _moi_get_result(model::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.Bridges.LazyBridgeOptimizer{Gurobi.Optimizer}, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}, args::MathOptInterface.NodeCount)
@ JuMP C:\Users\Verchere\.julia\packages\JuMP\lnUbA\src\JuMP.jl:1226
[3] get(model::Model, attr::MathOptInterface.NodeCount)
@ JuMP C:\Users\Verchere\.julia\packages\JuMP\lnUbA\src\JuMP.jl:1251
[4] node_count(model::Model)
@ JuMP C:\Users\Verchere\.julia\packages\JuMP\lnUbA\src\JuMP.jl:1071
[5] top-level scope
@ none:1
```

As far as I know, I am up to date in terms of versions: Julia 1.7.1, JuMP 0.22.2, Gurobi.jl 0.10.1.

I am not very familiar with the architecture and implementation of JuMP and MOI, but in case it helps: calling `MOI.get(m, MOI.NodeCount())` directly yields the same error.
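In the meantime, a possible workaround occurred to me, though I haven't verified it. Since the stacktrace places the failing typeassert in the `CachingOptimizer` layer, querying the inner solver object directly via `unsafe_backend` (which I assume bypasses that layer and its type assertion) and converting the result might work:

```julia
# Untested sketch of a possible workaround: skip the CachingOptimizer
# (where the typeassert in the stacktrace lives) by querying the inner
# Gurobi.Optimizer directly, then convert whatever numeric type the
# solver returns (Float64 for Gurobi, Int32 for Cbc) to an Int64.
using JuMP
using Gurobi
const MOI = JuMP.MOI

m = Model(Gurobi.Optimizer)
@variable(m, x[1:4], Bin)
@constraint(m, 3x[1] + 4x[2] + 6x[3] + 5x[4] <= 8)
@objective(m, Max, 2x[1] + 3x[2] + x[3] + 4x[4])
optimize!(m)

raw = MOI.get(unsafe_backend(m), MOI.NodeCount())  # no typeassert here?
nodes = round(Int64, raw)
```

That said, this relies on the internals staying as they are, so a proper fix in the `node_count` / `MOI.NodeCount` type handling would obviously be preferable.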

Thank you for reading!