Constant changes when optimising ODE parameters

Hi! I’m trying to use Julia to optimise the parameters of an ODE system, but I’m running into an issue: the vector C1, which should stay constant, changes during the optimisation. I currently have the following code (for a toy problem):

using Zygote, DifferentialEquations, SciMLSensitivity

N=10

# included state variables (dummy)
const C1 = float.(rand(N) .> 0.2)

# activation function
act(x) = x./(1 .+ x)

# ODE definition
function ode_fun(du, u, p, t)
    K = zeros(N, N)
    K[1:end] .= p

    du .= C1 .* act(K * u) .- u
end

# loss function
function loss(x)
    prob = ODEProblem(ode_fun, x0, (0.0, 1.0), abs.(x))
    sol = solve(prob)
    u = reduce(hcat, sol.u)
    return sum(abs2, u[:, end] - y) + sum(abs, x)
end

# number of iterations for optimisation
iters = 100

# set initial condition, initial guess and desired output after simulation
global x0 = rand(N)
global x = rand(N * N)
y = rand(N)

loss_vec = Float64[]

# gradient descent
for _ in 1:iters
    grd = Zygote.withgradient(x) do x
        loss(x)
    end
    push!(loss_vec, grd.val)
    global x = x - 0.01 * grd.grad[1]
end

When I run the code and print C1 inside ode_fun, I find that C1 changes to a new value after the first call to ode_fun. Does anybody know why this is happening? I cannot find anywhere in the code where C1 is modified.
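
To be concrete, the check I mean is roughly the following (a sketch; C1_ref is just a saved copy of the initial value, added for the comparison):

const C1_ref = copy(C1)

function ode_fun(du, u, p, t)
    C1 == C1_ref || @show C1   # report whenever C1 differs from its initial value
    K = zeros(N, N)
    K[1:end] .= p
    du .= C1 .* act(K * u) .- u
end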

I’m running your code and C1 looks constant. Can you remove the randomness and show a case with constant values where this occurs?

Thanks for taking the time to run my code.

Upon rerunning the code, C1 now stays constant, both with the randomness removed and with it left in, each in a clean workspace. I am very confused: I tested the code on two machines before posting, and the behavior persisted on both. It’s good that the code now runs as intended, but I cannot say what changed between yesterday and today.
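
By "removing randomness" I mean replacing the rand calls with fixed values along these lines (the particular numbers are arbitrary):

const C1 = [1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0]
global x0 = fill(0.5, N)
global x = fill(0.1, N * N)
y = fill(0.3, N)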

I did find a workaround: passing C1 as an argument to the loss function.
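
Roughly, that looks like the sketch below (a minimal example; closing over C1 inside loss is just one way to get it into the ODE function):

function loss(x, C1)
    # the ODE right-hand side closes over the C1 that was passed in,
    # so it no longer reads the global constant
    f!(du, u, p, t) = (du .= C1 .* act(reshape(p, N, N) * u) .- u)
    prob = ODEProblem(f!, x0, (0.0, 1.0), abs.(x))
    sol = solve(prob)
    u = reduce(hcat, sol.u)
    return sum(abs2, u[:, end] - y) + sum(abs, x)
end

# inside the optimisation loop the call then becomes
grd = Zygote.withgradient(x -> loss(x, C1), x)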