Help building a PID example

Hi guys!

I am trying to use Flux.jl to build a PIDNN as shown here: https://molefrog.github.io/pidnn-talk/

I started to code using only the proportional part. Thus, I need an input layer with 2 inputs and 1 output, and an output layer with 1 input and 1 output. This is my code so far:

struct Input
    W
end
 
Input() = Input(param(randn(2)))
 
(m::Input)(x) = m.W[1]*x[1] + m.W[2]*x[2]
 
struct P
    W
end
 
P() = P(param(randn()))
 
function (m::P)(x)
    # P
    if (x > 1)
        return m.W[1]
    elseif (x < -1)
        return -m.W[1]
    else
        return x*m.W[1]
    end
end
 
m = Chain(Input(), P())
 
function loss(x,y)
    # Simulate the system.
    t = 0:0.1:100
 
    o = x
 
    Δ =  0.0
 
    for k in t
        Δ = Δ + (y-o)^2
        r = m([y;o])
        o = o + r*0.1
    end
 
    return Δ/length(t)
end
 
ps = Flux.params(m)
Tracker.gradient(()->loss(0,1), ps)

Which produces the following error:

julia> Tracker.gradient(()->loss(0,1), ps)
ERROR: MethodError: Cannot `convert` an object of type Array{Float64,1} to an object of type Float64

Can anyone please help me?

Btw, if the input layer only has one input, then it works fine.

The model should only take one input, the measured data, but you are passing both the setpoint and the output. It should be something like this:

r = m(o)

I think `y` is a Float and `o` is a Vector, so this line fails:

(y-o)^2
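To make the mismatch concrete, here is a minimal illustration with hypothetical values (not the poster's actual data): in Julia, subtracting a Vector from a Float throws a `MethodError` unless you broadcast.

```julia
# Hypothetical values illustrating the Float-vs-Vector mismatch.
y = 1.0            # Float64 setpoint
o = [0.5, 0.2]     # if `o` ever becomes a Vector...
# (y - o)^2        # ...this line throws a MethodError in Julia
e = (y .- o) .^ 2  # broadcasting works element-wise instead
```

Broadcasting makes the line run, but the real fix in the code above is to keep `o` a scalar (or pass the controller the right arguments).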

In the loss function it is not clear which system is being controlled, and I think there are errors in the controller. I think it should be something like this:

for k in t
    control_output = Tracker.data(m(o))  # control output
    out = model(control_output)          # model output
    Δ = Δ + (y - out)^2                  # squared error (desired output - model output)
    o = [y, out]                         # update next input
end
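For reference, the loop above can be sketched as a self-contained closed-loop simulation. Everything here is hypothetical: `plant` is a stand-in first-order system (not the poster's actual model), and `controller` is a plain proportional law with gain 2 instead of a neural network.

```julia
# Hypothetical closed-loop sketch; `plant` is a toy first-order system,
# not the poster's actual model.
plant(out, u) = out + 0.1 * u        # out_next = out + 0.1 * control
controller(o) = 2.0 * (o[1] - o[2])  # P law: gain * (setpoint - output)

function simulate(controller, y; steps = 100)
    out = 0.0        # plant output
    Δ = 0.0          # accumulated squared error
    o = [y, out]     # controller input: setpoint and current output
    for _ in 1:steps
        u = controller(o)
        out = plant(out, u)
        Δ += (y - out)^2
        o = [y, out]  # update next input
    end
    return Δ / steps  # mean squared tracking error
end

simulate(controller, 1.0)
```

With the proportional controller in the loop, the plant output converges to the setpoint and the mean squared tracking error is small.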

P.S.: I saw that you are Brazilian :slight_smile:. There is a Telegram group for Julia users in Brazil, in case you are interested. (https://t.me/juliabrasil)

Hi @phelipe !

Thanks! With the help of some people on Slack, I managed to finally tune my PID Neural Network. I will write a tutorial on my website, something like a Hello World on machine learning using Julia for Control engineers :smiley:

P.S.: I have already joined!


Hi @Ronis_BR,

Thanks for posting this question. It doesn’t look like you’ve had time to update your blog yet (or at least, I don’t see anything at https://www.ronanarraes.com/blog/). Beyond what @phelipe suggested above, did you have to do much to get a working system? How well did it work?

Thanks,
Kevin


Hi,

Indeed, I have not had time to write the text yet, sorry :frowning: I decided to spend some time finishing the package PrettyTables.jl and trying to release an initial version of TextUserInterfaces.jl.

But it worked well after some modifications based on @phelipe's advice. I currently have two students working on it, and we managed to tune a PIDNN using Flux.jl. I will try to write the blog post about it soon :slight_smile:


Hi, I am wondering if you have written up your work yet? I am looking for something similar to tune a temperature and humidity PID controller in a greenhouse (GH). I have a full numerical model of my GH working in Julia. Thanks

Hi @Peter-Jolly

Yes, we could apply a PIDNN to control the satellite attitude. I should write a blog post about it shortly. However, the solution is very simple; a package to implement a PIDNN seems like overkill. If you need more details, please let me know.

Can I send you something by email? My email is peter_jolly21@hotmail.com

Yes, sure, I sent you an email.

Thanks. I received and sent you a reply.

Tuning a PID controller for a transfer function (that is, a single-input single-output system modeled as linear) using a neural network? Isn't that overkill?


Hi @skela96 ,

What I did was to replace the PID with a neural controller. As @zdenek_hurak said, tuning a PID using a neural network seems like overkill. What you could do (although I do not know if this is possible) is use Flux.jl's back propagation to build an optimization framework that finds the gains as if they were neural network weights.
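As a hedged sketch of that last idea (my assumed setup, not something from the thread): treat a single proportional gain `Kp` as the lone trainable parameter and descend the closed-loop tracking loss. A central finite-difference gradient stands in for Flux's autodiff here, just to show the shape of the optimization on a toy first-order plant.

```julia
# Hypothetical sketch: tune a single proportional gain Kp by gradient
# descent on a toy first-order plant. A finite-difference gradient
# stands in for Flux's autodiff to keep the example self-contained.
function tracking_loss(Kp; y = 1.0, steps = 100)
    out = 0.0
    Δ = 0.0
    for _ in 1:steps
        u = Kp * (y - out)  # proportional control law
        out += 0.1 * u      # toy plant: out_next = out + 0.1 * u
        Δ += (y - out)^2
    end
    return Δ / steps        # mean squared tracking error
end

function tune(Kp; η = 0.5, iters = 200, ϵ = 1e-6)
    for _ in 1:iters
        # central finite-difference approximation of dL/dKp
        g = (tracking_loss(Kp + ϵ) - tracking_loss(Kp - ϵ)) / (2ϵ)
        Kp -= η * g         # gradient-descent step
    end
    return Kp
end

Kp = tune(0.5)
```

With Flux, one would instead put `Kp` in `params` and let `gradient` compute `g`; the structure of the loop stays the same.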