Planning to release a differentiable FDTD package for inverse design in photonics, acoustics, and RF. I'll include examples of designing stacks, gratings, and couplers, as well as metamaterials and metasurfaces. The electromagnetic FDTD update is simple, drawing on patterns from earlier packages like FDTD.jl and uFDTD.jl, which follow Schneider's uFDTD book. Autodiff uses DiffEqFlux.jl neural ODEs and fits within the SciML ecosystem. We have fully featured boundary conditions and support dispersive lossy materials (I had to pull some hacks to appease Zygote.jl here). It's meant as a differentiable, inverse-design-focused alternative to a commercial solver like Ansys Lumerical.
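For anyone curious what the core update looks like, it's essentially the 1D Yee scheme from Schneider's book. A minimal sketch in plain Julia (normalized units; grid size, step count, and source node are illustrative, not the package's actual API):

```julia
# Minimal 1D FDTD (Yee) sketch in normalized units, after Schneider's uFDTD.
# Grid size, step count, and source node are illustrative only.
imp0 = 377.0                      # free-space impedance
nx, nt = 200, 450                 # grid points, timesteps
Ez = zeros(nx)
Hy = zeros(nx)

for n in 1:nt
    for i in 1:nx-1               # H update (staggered half a cell/step)
        Hy[i] += (Ez[i+1] - Ez[i]) / imp0
    end
    for i in 2:nx                 # E update
        Ez[i] += (Hy[i] - Hy[i-1]) * imp0
    end
    Ez[50] += exp(-((n - 30.0)^2) / 100.0)   # additive Gaussian source
end
```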
Feature requests are welcome! Academic collaborators, grant cooperation, and corporate sponsorship are also welcome.
I would be interested in testing something like this. I've been meaning to set up a simulation of a pulsed laser in a cavity (possibly with an absorptive material). If this package could handle different electromagnetic sources (like a Gaussian pulse), that would be very interesting.
Yeah, side injection of a Gaussian pulse will be a standard source in our code. The update loop is fully exposed, so you can also inject whatever additive source you want; rough sketch below.
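To show what "fully exposed" means in practice (names like `update!` are stand-ins here, not the real API): you drive the single-step update yourself and add whatever you like to the fields between steps:

```julia
# Hypothetical sketch: call the per-step update yourself, then inject an
# additive source. `update!` and `gauss` are stand-ins, not the real API.
function update!(Ez, Hy; imp0 = 377.0)   # one vacuum Yee step in 1D
    @views Hy[1:end-1] .+= (Ez[2:end] .- Ez[1:end-1]) ./ imp0
    @views Ez[2:end]   .+= (Hy[2:end] .- Hy[1:end-1]) .* imp0
    return nothing
end

gauss(n; t0 = 60.0, τ = 15.0) = exp(-((n - t0) / τ)^2)

Ez, Hy = zeros(200), zeros(200)
for n in 1:450
    update!(Ez, Hy)
    Ez[50] += gauss(n)        # any additive source: just += after the step
end
```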
This is something I would be very interested in. I’ve written some toy FDTD codes in Julia and have considered putting in the work to make a good implementation for the General registry, but haven’t had the time lately to work more on it. Making it AD-compatible is something I’d considered but wasn’t even sure would be possible (or at least practical), so hats off for making that work.
Do you have a repo for it on GitHub? I’d love to check it out and see if there’s anything I could contribute.
Thanks! I'll release it around the end of this month. The AD piggybacks off DiffEqFlux.jl, which uses constant memory with respect to the number of timesteps. I'm still testing on toy inverse design problems: Google's Ceviche challenges from invrs.io, ported to Julia.
Hi Alec, adjoints are done in the time domain. The problem is treated as a neural ODE, and DiffEqFlux.jl integrates it backwards to recover the intermediates for reverse-mode AD.
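In case it helps to see the pattern, here's a toy of the same machinery (the adjoint types live in SciMLSensitivity.jl, which DiffEqFlux.jl builds on; the right-hand side here is a stand-in, not the FDTD curl equations):

```julia
# Toy of the backsolve-adjoint pattern: the semi-discretized fields become
# an ODE; BacksolveAdjoint re-integrates it backwards in time, so memory
# cost is constant in the number of timesteps. rhs! is a stand-in.
using OrdinaryDiffEq, SciMLSensitivity, Zygote

rhs!(du, u, p, t) = (du .= -p[1] .* u)

u0 = ones(4)
prob = ODEProblem(rhs!, u0, (0.0, 1.0), [0.5])

function loss(p)
    sol = solve(prob, Tsit5(); p = p, sensealg = BacksolveAdjoint(),
                save_everystep = false, save_start = false)
    sum(abs2, Array(sol))     # final fields only, e.g. power at a port
end

Zygote.gradient(loss, [0.5])[1]
```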
Everything can run on the GPU, but I haven't tested it there. It takes only a couple of lines of code to move arrays to/from the GPU with Flux.jl / CUDA.jl. I'll probably have to tweak some code, but nothing major.
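For reference, this is the kind of two-liner I mean (with CUDA.jl; array names and sizes are just illustrative):

```julia
# Move field arrays between CPU and GPU; the update code itself can stay
# unchanged as long as it's written with broadcasts / whole-array ops.
using CUDA

Ez = zeros(Float32, 200, 200)   # CPU array (illustrative size)
Ez_gpu = cu(Ez)                 # host -> device copy (a CuArray)
Ez_cpu = Array(Ez_gpu)          # device -> host copy
```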
Would you be interested in trying my AD package FastDifferentiation.jl? I’m always looking for new applications.
Not sure if it will work in your case, but for problems in its domain it can be substantially faster than other AD packages. I'll help you work through any problems you encounter.
Thanks - interesting work! I'll read up on your approach. It seems a major advantage is a more natural way of handling higher-order and mixed derivatives, which would benefit the PINN folks or loss functions involving derivatives?
Yes, I find it much simpler to specify derivatives this way. If you need a Jacobian rather than just a gradient, FastDifferentiation.jl can also be much faster, hundreds of times faster in some cases.
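A minimal example of the workflow, roughly following the README (the function and evaluation point are just illustrative):

```julia
# FastDifferentiation.jl sketch: build a symbolic Jacobian once, then
# compile it into a fast executable function.
using FastDifferentiation

@variables x y

f = [x^2 + y^2, x * y]             # small vector-valued function
J = jacobian(f, [x, y])            # symbolic 2×2 Jacobian
J_exe = make_function(J, [x, y])   # compiled, callable Jacobian

J_exe([1.0, 2.0])                  # evaluate at (x, y) = (1, 2)

H = hessian(x^2 * y, [x, y])       # higher-order/mixed derivatives, too
```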