Make a variable act as a global variable within a function

I’d like to have a variable that acts like a global variable, but only within a function.
For example:

function my_func(x, y)
    @log x  # will be saved as `my_dict[:x] = x`
end

function my_func2(x)
    my_dict = Dict()
    my_func(x, 2*x)  # without adding a new argument `my_dict`
end

Can such behaviour be realised?

NOTE: the main purpose of this is to conveniently save data annotated by a macro.

const SAVED = Dict{Symbol, Any}()

function savedatum(name, value)
    SAVED[name] = value
end
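
And a tiny macro on top of it, something like this sketch (assuming the macro and savedatum live in the same module):

macro log(ex)
    # `@log x` -> `savedatum(:x, x)`, with `x` evaluated in the caller's scope
    return :(savedatum($(QuoteNode(ex)), $(esc(ex))))
end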

Something like that?

Yes, but a single global variable would behave undesirably when calling savedatum from parallel code.
Is it possible to save data separately?

For example, suppose there is a macro @LOG that transforms

@LOG function my_func2(x, y)
    my_dict = Dict()
    my_func(x, 2*x)
end

into

function my_func2_LOG(x)
    my_dict[:x] = x
    return my_dict
end

The examples are rather confusing to me. But that description sounds to me as if you were looking for closures.

I don’t quite understand, I’m afraid. You don’t want a global variable, but you want different functions running in parallel saving to one central… variable?

You can use ThreadSafeDicts.jl (https://github.com/wherrera10/ThreadSafeDicts.jl) instead of Dict.
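
For example, something along these lines (a sketch, assuming ThreadSafeDict supports the usual Dict indexing, as its README describes):

using ThreadSafeDicts

const SAVED = ThreadSafeDict{Symbol, Any}()  # lock-protected, Dict-like container

savedatum(name, value) = (SAVED[name] = value)

Threads.@threads for i in 1:8
    savedatum(Symbol(:x, i), i^2)  # concurrent writes go through the internal lock
end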

Sorry for my poor explanation. Actually, I don’t exactly know how to achieve what I want, so my example may not make much sense.

I’m texting on my phone now, so I’ll leave details later.

If you don’t mind, can you explain what a closure is? Or share some references :slight_smile:

EDIT: I found an explanation of closures in the Julia documentation.
If you mean this, then yes, it’s related to what I want (actually, I often use this pattern, but I didn’t know the name “closure” :slight_smile: ).

As indicated in the manual, they are callable objects (i.e. functions) that capture variables. So they can be used to “remember” some values across function calls (as when you are using globals), but without globals, and with a separate instance of such “fake globals” for each instance of the closure.

As an example, the function makegrow here returns an anonymous function that captures the value of x, and retains the modifications that are made to it.

function makegrow(x)
    return (y) -> begin
        x = x + y
        return x
    end
end

So:

julia> f = makegrow(1) # starting value for `f`: x=1
#1 (generic function with 1 method)

julia> g = makegrow(11) # starting value for `g`: x=11
#1 (generic function with 1 method)

julia> f(2) # add 2 to 1
3

julia> g(2) # add 2 to 11
13

julia> f(2) # add further 2 to 1+2=3
5

julia> g(2) # same...
15
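
A closure can capture a Dict just as well, so each “logger” keeps its own private storage (a sketch, not tied to any package):

function makelogger()
    d = Dict{Symbol, Any}()                  # captured by both closures below
    log! = (name, value) -> (d[name] = value)
    getlog = () -> d
    return log!, getlog
end

log!, getlog = makelogger()
log!(:x, 1.0)
getlog()  # Dict(:x => 1.0)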

Very nice explanation. Thanks!
I have often used this to reduce computation load up front (that is, in the example, to precompute some values that depend on x before returning the nested function).

But I’m not sure :slight_smile: Do you think it makes sense?

Closures may be used that way, but I didn’t really understand what you wanted to do, so I can’t tell if it’s a good idea for your use case. :roll_eyes:

Hi all,
The first question and example were probably too poor to convey my case.
I have written up my situation and what I want to do in detail below:

Briefly speaking, I want to save specific variables annotated by a macro (namely, @log).
More specifically, it is about logging data when using DifferentialEquations.jl (abbrev.: DiffEq).

In DiffEq, I usually write an in-place DE (differential equation) like:

function sub_dynamics1!(dx, x, p, t)
    dx .= -1*x  # an example dynamics
end
function sub_dynamics2!(dx, x, p, t)
    dx .= -2*x  # an example dynamics
end

For nested systems, the top-level ODE function sometimes calls the subsystems’ ODE functions like:

function dynamics!(dx, x, p, t)
    sub_dynamics1!(dx.sub1, x.sub1, p.sub1, t)
    sub_dynamics2!(dx.sub2, x.sub2, p.sub2, t)
end

Note that the above “dot-access” (I mean, for example, dx.sub1) is possible via ComponentArrays.jl.
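
For example, the components can be defined and accessed like this (a minimal illustration):

using ComponentArrays

x = ComponentArray(sub1 = zeros(3), sub2 = zeros(2))
x.sub1 .= 1.0  # dot-access to the `sub1` block; `x` is still one flat vector underneath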

Now, I’d like to add a macro @log to indicate that a variable should be saved. For example, sub_dynamics1! would be changed to

function sub_dynamics1!(dx, x, p, t)
    @log x  # will be saved
    dx .= -1*x
end

This idea is based on SimulationLogs.jl. However, as discussed before, SimulationLogs.jl saves data “after simulation”.
This may be undesirable when the simulation parameter (namely, p) is updated in a stochastic sense. Also, I doubt that it works properly in a thread-based parallel simulation setting.

So I want to make (possibly several) macros, namely @log and @LOG: the former annotates the variables to be saved, and the latter transforms dynamics! into a new function that contains a dictionary (which will collect the variables annotated by @log).
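
For concreteness, @log x should basically expand to __LOGGER_DICT__[:x] = x; a toy sketch of such a macro (names are placeholders) could be:

macro log(ex)
    # `@log x` -> `__LOGGER_DICT__[:x] = x`; both names resolve in the caller's scope
    return esc(:(__LOGGER_DICT__[$(QuoteNode(ex))] = $ex))
end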

I have tested this functionality on some simple examples. The difficulty arises, however, when a function (dynamics!) calls the subsystems’ dynamics (sub_dynamics1! and sub_dynamics2!) and @log is used inside sub_dynamics1! and sub_dynamics2!.

Here are my ideas:

  1. (probably the best) Make a dictionary that acts as a global variable only in dynamics! (that is, it is merely a local variable outside dynamics!, yet accessible in sub_dynamics1! and sub_dynamics2!).
  2. (probably safe for thread-based parallel simulation) Make several dictionaries, each of which acts as a global variable only within its own thread, and add data to these thread-local dictionaries (see the sketch below).
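
A rough sketch of idea 2 (one Dict per thread). Indexing by threadid() assumes the loop is scheduled statically so tasks do not migrate between threads; all names here are placeholders:

const THREAD_LOGS = [Dict{Symbol, Any}() for _ in 1:Threads.nthreads()]

savedatum(name, value) = (THREAD_LOGS[Threads.threadid()][name] = value)

Threads.@threads :static for i in 1:8
    savedatum(Symbol(:run, i), i^2)  # each thread writes only to its own Dict
end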

Do you guys have solutions for these ideas? Or do you have a better one?
Feel free to leave comments :slight_smile:

So the main reason I don’t do this in SimulationLogs.jl is that your function will be called multiple times per simulation time step and every solver algorithm works with the results of those function calls differently. So logging at the time of the call unfortunately won’t give you any useful information. I think @ChrisRackauckas wrote a post that covers this. I’ll see if I can find it.

From here:

  • Double check if you’re doing anything that violates assumptions of being an ODE. As an ODE, the right-hand side f function should always give you the same result: u' = f(u,p,t) needs to be uniquely defined. These issues are just fundamental to the mathematics: if you do these things in f, then f no longer defines an ODE, so of course it cannot be solved!
  • If you put randomness into f, then the adaptivity will think your ODE is being solved at a high error because the derivative keeps changing, and therefore it will fail and hit dtmin trying to reduce the randomness to zero (if you do need randomness, use an SDE or RODE solver).
  • If your f function is modifying u, then calling f with different stepsizes is not deterministic and the solver is likely to fail. If you need to do this, you should be using a callback.
  • If your f function is caching values from a previous step, just think about what this means. If you change dt, then you’re changing f. In that sense, u' is no longer defined since it’s now dependent on how it’s being solved! Even worse, adaptive ODE solvers do not always move forwards in time: sometimes they try t + dt1 before trying t + dt2 and choosing what to do. So if you’re assuming that the cached values are from the last step, that’s not actually the case: those values can be coming from a fake (too incorrect according to error estimates) future!

One remedy I’m trying is to simply transform dynamics! into a new function, namely log_func, which is compatible with SavingCallback. For example:

# log_func is generated by a macro, namely `@LOG`, from `dynamics!`
function log_func(_x, _t, integrator)
    __LOGGER_DICT__ = Dict()  # generated by `@LOG`
    x = copy(_x)
    t = copy(_t)
    p = copy(integrator.p)
    dx = zero.(_x)  # for the in-place method; will be overwritten below
    # contents of `dynamics!(dx, x, p, t)` with `@log x` => `__LOGGER_DICT__[:x] = x`
    return __LOGGER_DICT__
end

Note: I’m not sure, but the SavingCallback documentation recommends copying variables (x = copy(_x) in the above code), and __LOGGER_DICT__ is the dictionary, supposed to be generated via @LOG, that log_func returns.
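
A self-contained toy version of this wiring (not the actual dynamics! above; just to show how it hooks into SavingCallback — toy_dynamics! and toy_log_func are made-up placeholders):

using DifferentialEquations, DiffEqCallbacks

toy_dynamics!(dx, x, p, t) = (dx .= -x)

function toy_log_func(_x, _t, integrator)
    __LOGGER_DICT__ = Dict{Symbol, Any}()
    __LOGGER_DICT__[:x] = copy(_x)  # what `@log x` would produce
    __LOGGER_DICT__[:p] = integrator.p
    return __LOGGER_DICT__
end

saved = SavedValues(Float64, Dict{Symbol, Any})
cb = SavingCallback(toy_log_func, saved)
prob = ODEProblem(toy_dynamics!, [1.0, 2.0], (0.0, 1.0), nothing)
sol = solve(prob, Tsit5(); callback = cb)
saved.t, saved.saveval  # save times and the logged dictionaries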

This would resolve the issue you mentioned.

Maybe such a macro @log could be used outside sub_dynamics1! etc., at the same local level as the creation of the dictionary inside dynamics!, so that @log sub_dynamics1!(dx.sub1, x.sub1, p.sub1, t) wraps sub_dynamics1! into a function that first updates the dictionary and then runs the wrapped function. Would that work for you?


I don’t know how to do it, but it’s really a good approach :slight_smile:
If possible, the saving should also work in a nested sense.

Oh gotcha. Yeah that makes sense. Another thing to keep in mind is how you’re handling the stochastic side of this. For example, if you are injecting sensor noise into a discrete controller, you’ll want that to happen in the PeriodicCallback you’re using to implement your controller. If you’re injecting plant noise in your derivative function, it’s going to interact incorrectly with the ODE solvers, especially if they are variable-step. An SDE solver is the correct way to handle that. In both cases, I’m not sure that you can recover any of the random variables.


Well, I haven’t tried it, but SavingCallback may work for the cases of 1) ODE + (stochastic) update via DiscreteCallback and 2) SDE. Is that right?
As for reproducibility, I would specify an rng and some simulation settings and just run the simulation again with the predefined random seed and settings.

Umm… actually I’m not familiar with macros.
I think it would be enough to solve the following problem:

function _my_func(a, b)
    c = 2*a
    b = b + 1
    a + b
end
function my_func(a)
    b = a
    @log _my_func(a, b)
end

Then, the definition of my_func should be

function my_func(a)
    b = a
    # the content of `_my_func`. It must not change the above `b`... how?
    c = 2*a
    b = b + 1
    a+b
    #
end

Do you have any idea?

EDIT:
Ah, generating a new function _my_func__LOG__ and running it may work.
But I still don’t know how to generate _my_func__LOG__ as shown below.
This question is continued in a follow-up question.

function my_func(a)
    b = a
    # the content of `_my_func`. It must not change the above `b`... how?
    function _my_func__LOG__(a, b)
        c = 2*a
        b = b + 1
        a+b
    end
    _my_func__LOG__(a, b)
    #
end
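
Alternatively, the spliced body could be wrapped in a let block, which also leaves the outer b untouched (same idea as the nested function, just without a call):

function my_func(a)
    b = a
    # the content of `_my_func`, shadowed inside `let` so the outer `b` is untouched
    result = let a = a, b = b
        c = 2*a
        b = b + 1
        a + b
    end
    result
end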