About the implementation of SciMLOperators.FunctionOperator

I am looking into SciMLOperators to define convolution linear operators.
I was wondering why FunctionOperator requires prototypes of the input and output.
Would the sizes not be sufficient?
Also, it seems to me that these prototypes are stored. Why is that? Are they used internally? Is it dangerous to use them outside of the FunctionOperator object?

Thanks,
Sébastien

Yeah that’s bad. It seems like it’s all:

which should be allocating. Trying to make * non-allocating is doomed to fail in bad ways. Please open an issue.

For reference, the docs say:

SciMLOperators is not ready for prime time use yet. Be warned that any user is testing an early and possibly buggy form. This note will be removed when the library is officially released.

and I really do mean that. I have not vetted all of the code yet, and you will still find things like this. Open issues and help with development if you want to use it in practice, or wait until v1.0; until then it's a "there be dragons" library.


Hi,
thanks for your quick reply.
I’m aware that SciMLOperators is still in early development. I’m making a long-term investment here!
I have a practical application for this library, so I could probably give you some feedback.
I do want to help, if I can. I’ll open an issue.
Also, regarding “convolution operators” (or whatever you would like to call them), I think they would be a useful addition to this library.
The (reshaped) input of the operator is a d-dimensional grid, and the output is also a d-dimensional grid: the result of convolving the input with some kernel. The kernel is defined frequency-wise, in Fourier space. FunctionOperators are created by providing a mul function; a ConvolutionOperator would similarly be created by providing a frequencywise_mul.
If you think that this might be interesting, I can elaborate.
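To make the idea concrete, here is a minimal, language-agnostic sketch in pure Python (not the SciMLOperators API; the names dft, idft, and conv_apply are all hypothetical) of what a frequencywise_mul amounts to: transform the input to Fourier space, multiply each frequency coefficient by the kernel's coefficient, and transform back.

```python
# Conceptual sketch only: circular convolution applied as a frequency-wise
# multiplication, i.e. the operation a hypothetical ConvolutionOperator
# would wrap. In practice the transforms would be FFTs (e.g. FFTW.jl).
import cmath

def dft(x):
    # naive discrete Fourier transform, O(n^2), for illustration only
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * j * k / n) for k in range(n))
            for j in range(n)]

def idft(X):
    # inverse transform, normalized by 1/n
    n = len(X)
    return [sum(X[j] * cmath.exp(2j * cmath.pi * j * k / n) for j in range(n)) / n
            for k in range(n)]

def conv_apply(kernel_hat, u):
    # kernel_hat holds the kernel's Fourier coefficients: multiply
    # frequency-wise, then transform back to real space
    u_hat = dft(u)
    v_hat = [kh * uh for kh, uh in zip(kernel_hat, u_hat)]
    return [v.real for v in idft(v_hat)]

u = [1.0, 2.0, 3.0, 4.0]
# identity kernel: every Fourier coefficient equals 1, so v ≈ u
v = conv_apply([1.0] * 4, u)
```

The same structure carries over to d-dimensional grids; only the transform changes.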

Best,
Sébastien

They would be. They should probably be an extension package since they would require a lot of vectorization tools in order to do the stenciling in a cache-friendly way, but they would be good to have. Open an issue for that. I believe LoopVectorization.jl does some loop reordering for this but we would need to double check.

Okay cool. Yeah, I always have to note that, because if I start helping without mentioning it, people say "why is this so broken? Is all of SciML broken? Do you just release broken software?" etc., even though we place a big strong warning on the first page of the docs for the packages that aren't meant to be treated as released software :sweat_smile:. We know this library isn't up to par yet, so I just have to make sure to communicate that, and share it anyway so we can get early feedback like this. Thanks for your understanding, and the understanding of anyone reading this.

Hi,
just to let you know I’ve opened an issue:

I will open a discussion about convolution operators, too.

Best,

Sébastien


I guess you are talking about convolution in real space.
The kernels I deal with do not have bounded support, and I work in Fourier space, so most of the heavy lifting is handled by FFTW.jl!

But you’re right, convolution operators in real space would also be useful (if only for n-dimensional image analysis).

@sbrisard take a look here for an example of implementing fourier type operators (in freq or real space) using SciMLOperators.


Update on this discussion from GitHub:

  1. it was clarified that the cache is needed to support the 5-argument mul! (which computes v = α*L*u + β*v in place), though we plan to allow a user to pass in their own 5-argument mul!
  2. caching has been made optional. It can be avoided by forming the FunctionOperator as follows: FunctionOperator(func, u_in, u_out; ifcache = false, other_kwargs...)
  3. we have added tests to ensure that caching doesn’t lead to dangerous behaviour, such as corrupting data returned by the FunctionOperator.

@vpuri3 Indeed, this is quite interesting. I will investigate. Thanks!