Passing parameters and performance

Hi,

I am writing a simulator for spiking neural networks.
In the simulator I would like to use a loop like the one in the code below:


Params = Dict{String,Union{Int,Bool}}

function sim_loop!(synapse_params::Dict{String,Float64}, neuron_params::Params)
    sim_length = neuron_params["sim_length"]
    record = neuron_params["record"]
    net_size = neuron_params["net_size"]
    neurons = round.(Int, sign.(rand(net_size) .- 0.5))  # random ±1 initial states
    for time in 1:sim_length
        update_neurons!(neurons, synapse_params)
        if record && time % 1000 == 0
            # println(neurons)
        end
    end
end

function update_neurons!(neurons, synapse_params)
    weight = synapse_params["weight"]
    for i in eachindex(neurons)
        # neuron fires (1) if the weighted random input is positive
        neurons[i] = 0 < sum(transpose(neurons) * weight .* (rand(length(neurons)) .- 0.40)) ? 1 : 0
    end
end

function get_params()
    # synapse params
    synapse_params = Dict{String,Float64}(
        "weight" => 0.0001,
        "other"  => 1.0,
    )
    # neuron params
    neuron_params = Params(
        "sim_length" => 1000,
        "net_size"   => 100,
        "record"     => true,
    )
    return synapse_params, neuron_params
end

using Random
synapse_params, neuron_params = get_params()

Random.seed!(0)
@time for _ in 1:100
    sim_loop!(synapse_params,neuron_params)
end

Note that this is not the original code; it is just a proof of concept I wrote.
I am interested in these aspects:

  1. If I have parameters that must be passed into the simulation and these parameters live in a dictionary, what is the Julian way to handle them? I thought that unpacking the dictionary is the best approach, because the function can then compile with type-stable variables and does not have to look up the dictionary every time, but it is very verbose. Are there other options?

  2. Some of these parameter dictionaries mix Strings, Bools and Floats. Is there a cost to using a Union like I do, and is it still fine if the Union values get unpacked into type-stable variables before the loop?

By the way, I am new to Julia (coming from C++ and Python), so if you have any deeper suggestions on how to write this code more efficiently I am glad to hear them :slight_smile:

As for the performance of the values retrieved from the Dict: you could annotate what you retrieve in your code, as in

sim_length = neuron_params["sim_length"] :: Int64

or

sim_length :: Int64 = neuron_params["sim_length"]

That should take care of type inference. I am not sure how those two differ in performance.
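
Applied to the proof of concept above, that would look something like this (just a sketch, reusing the Params alias and update_neurons! from the first post; the ::Int and ::Bool assertions narrow the Union-valued lookups, so everything after those lines works with concretely typed locals):

function sim_loop!(synapse_params::Dict{String,Float64}, neuron_params::Params)
    # assert the concrete type of each value pulled out of the Union-valued Dict
    sim_length = neuron_params["sim_length"]::Int
    record     = neuron_params["record"]::Bool
    net_size   = neuron_params["net_size"]::Int
    neurons = round.(Int, sign.(rand(net_size) .- 0.5))
    for time in 1:sim_length
        update_neurons!(neurons, synapse_params)
        if record && time % 1000 == 0
            # println(neurons)
        end
    end
end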

As I understand it, accessing Dicts is fairly fast, but you can profile whether this is a bottleneck in your case.

Perhaps the bigger picture question is: why pass the parameters as a Dict in the first place? I don’t know the details of your application, but don’t you know from the outset what the parameters and their types will be? Then it would seem cleaner to make a parameter struct.

In general, you should not have to annotate the types of variables in your function. The compiler is pretty good about inferring the types as long as the type does not change. In fact, excessive type declarations will make your code less flexible and generic.

Depending on the requirements of your project, I also recommend structs or mutable structs. In my experience, they are often faster than dictionaries (not drastically so in this case, however). In addition, they tend to be easier to use with multiple dispatch. For structs, you will want to declare the field types, or use parametric types if the types are not always the same for each instance. Here is an example:

struct Neuron 
    sim_length::Int64 
    net_size::Int64 
    record::Bool
end 

struct Synapse
    weight::Float64 
    other::Float64
end

I recommend Parameters.jl if you need to extract lots of variables from a struct. It also provides a utility to set default values in the body of the struct declaration. You can also use standard Julia to accomplish the same thing: MyStruct(; a=1, b=2.0) = MyStruct(a, b).
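
For example (a sketch; NeuronParams is just an illustrative name):

using Parameters

# defaults declared in the struct body via @with_kw
@with_kw struct NeuronParams
    sim_length::Int64 = 1000
    net_size::Int64   = 100
    record::Bool      = true
end

function report(p::NeuronParams)
    @unpack sim_length, net_size, record = p   # extract fields into local variables
    println((sim_length, net_size, record))
end

report(NeuronParams(net_size = 200))   # override a single default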

@code_warntype is a good tool for identifying type instabilities in your code.
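
For example, with the proof-of-concept script above you could run (in a script you would need using InteractiveUtils; in the REPL it is already available):

@code_warntype sim_loop!(synapse_params, neuron_params)   # non-concrete types (Union, Any) are highlighted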


Hi Christopher,

Thanks for the answer, very clear.
For some reason I preferred to keep the parameters in dictionaries in this case.
I create them in a function that is run only once, and I wonder if I can do better.

This doubt arises from @code_warntype, and from a previous experience where a similar modification to fast code slowed it down a lot.
@code_warntype reports several ::Any nodes in the AST.

Thanks,
Alessio

FloatParams = Dict{String, Float64}

network_pars = FloatParams()
network_pars["partExc"] = 0.8   # part of Ncells
network_pars["partInh"] = 0.25  # part of Ne
network_pars["Ne"] = ceil(Int, network_pars["partExc"]*sim_pars["Ncells"])
network_pars["Ni"] = ceil(Int, network_pars["partInh"]*network_pars["Ne"])

network_pars["jee0"] = 2.86  # initial ee strength
network_pars["jei0"] = 80.0  # 48.7  # initial ei strength
network_pars["jie"]  = 1.27  # ie strength (not plastic)
network_pars["jii"]  = 16.2  # ii strength (not plastic)
network_pars["p"]    = 0.2   # connection density


function create_weights(Ncells, network_pars)::Weights   # Weights is defined elsewhere in the full code
    Ne = round(Int64, network_pars["Ne"])
    weights = zeros(Float64, Ncells, Ncells)
    weights[1:Ne, 1:Ne] .= network_pars["jee0"]
    weights[1:Ne, (1+Ne):Ncells] .= network_pars["jie"]
    weights[(1+Ne):Ncells, 1:Ne] .= network_pars["jei0"]
    weights[(1+Ne):Ncells, (1+Ne):Ncells] .= network_pars["jii"]
    weights = weights .* (rand(Ncells, Ncells) .< network_pars["p"])

    nzforEtoAll = Vector{Vector{Int64}}([findall(weights[nn,:] .!= 0) for nn = 1:Ne])        # for E neurons collect all postsynaptic neurons
    nzforItoAll = Vector{Vector{Int64}}([findall(weights[nn,:] .!= 0) for nn = Ne+1:Ncells]) # for I neurons collect all postsynaptic neurons

    W = [weights, Ne, nzforEtoAll, nzforItoAll, Array{Int64,2}(undef, 0, 0)]
    return W
end

@code_warntype create_weights(sim_pars["Ncells"], network_pars)
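
For comparison, here is a minimal sketch of the struct/NamedTuple idea suggested above, applied to these network parameters (the function name and the literal Ncells value are only placeholders for sim_pars["Ncells"]); field access on a NamedTuple is concretely typed, so these lookups cannot produce ::Any nodes:

function make_network_pars(Ncells::Int)
    partExc = 0.8              # part of Ncells
    partInh = 0.25             # part of Ne
    Ne = ceil(Int, partExc * Ncells)
    Ni = ceil(Int, partInh * Ne)
    return (partExc = partExc, partInh = partInh, Ne = Ne, Ni = Ni,
            jee0 = 2.86, jei0 = 80.0, jie = 1.27, jii = 16.2, p = 0.2)
end

network_pars = make_network_pars(1000)   # substitute your Ncells here
network_pars.Ne                          # typed Int access, no rounding needed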