Regarding the FDTD solver Meep (GitHub: NanoComp/meep, free finite-difference time-domain (FDTD) software for electromagnetic simulations): does anyone have any interest in either a Julia package to interface with Meep, or an actual port of the software into native Julia? I’m considering working on this, but it’s a fairly significant effort, and I thought it was worth first gauging what other users and vested interests thought.
From a quick look through the C++ code, here are some initial thoughts on a native port:
Pro: I can definitely see patterns where Julia features could make the codebase simpler, e.g. parametric types and multiple dispatch.
Pro: Some of the dependency management would be a bit simpler for end-users who can rely on Julia’s Pkg system vs dealing directly with system-installed C++ libraries.
Con: Requiring the use of Julia’s Pkg system to install and manage dependencies could make Meep more difficult to install on systems that aren’t connected to the open internet.
Pro: Having the codebase in Julia might lower the barrier to entry for potential developers who have the required education/background but no C++ experience.
Con: Without a transition of the existing user/developer community, there’s going to be some maintenance burden (possibly on both sides) to cross-port/implement continued improvements.
@stevengj I recently watched your Meepcon 2022 talk; do you have any insight into ongoing work or potential cross-interest between the Julia/Meep communities?
I’m working to open source an FDTD engine we wrote in Julia. It is heavily based on Meep and uses many of the Meep semantics (chunking, heterogeneous stencils, etc.). We’ve also written it to work on many (distributed) GPU platforms.
I’ve started the process internally, but my company requires extensive privacy and legal reviews. So it will take some time.
Many of the pros you listed (eg multiple dispatch) indeed made writing something like this much much easier in Julia.
We’re hoping to engage the community once we get approval and can open source the code.
Thanks @garrek for informing me. And @smartalecH, definitely curious to see your work. In comparison, my package is designed to be minimalist and hackable: AD-compatible for inverse design without custom adjoints. Its forward simulation and adjoints are not heavily optimized for speed, nor is it as feature-rich as Meep or a Julia port of Meep. BTW, a new differentiable FDTD package built purely for speed is fdtdz, by Stanford spin-off spinsphotonics.
@pxshen I’ve met with Jesse a few times to talk about fdtdz. He’s done a good job, and approaches things rather uniquely. Fun fact: he actually tried to implement everything in Julia at first, but couldn’t leverage all the CUDA features he wanted at the time (CUDA.jl has come a long way since then).
He’s got a good JAX wrapper around his PTX kernels if anyone’s interested.
My understanding is that well-implemented FDTD for CPU is usually constrained by memory bandwidth, so anything you can do to improve cache efficiency and alleviate memory bottlenecks will lead to big improvements.
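To make the bandwidth-bound claim concrete, here is a back-of-envelope arithmetic-intensity estimate for a single Yee-cell field update. The operand and flop counts below are rough assumptions (a simple non-dispersive update, ignoring cache reuse), not measurements of Meep:

```python
# Rough arithmetic intensity of one 3D Yee E-field component update,
# assuming single precision (4 bytes) and a simple non-dispersive material.
bytes_per_value = 4
# Assumed counts: read ~4 neighboring H values, read the old E value,
# read 1 material coefficient, write the new E value (no cache reuse).
reads = 4 + 1 + 1
writes = 1
bytes_moved = (reads + writes) * bytes_per_value
flops = 6  # roughly 4 adds/subtracts + 2 multiplies per component
intensity = flops / bytes_moved  # flops per byte
print(f"~{intensity:.2f} flops/byte")
```

Even with generous counting this lands well below one flop per byte, while modern CPUs can sustain on the order of ten flops per byte of memory traffic, which is why cache blocking and bandwidth optimizations pay off so much in FDTD.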
Currently MPB, mainly because I wrote it before Meep. (MPB has the advantage of supporting periodic-waveguide modes, and the disadvantages that it finds ω(k), so it requires a Newton solve to get k(ω), and that it doesn’t support dispersive materials.) We’ve thought about plugging in additional mode solvers, e.g. wgms3d.
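For anyone unfamiliar with the Newton-solve point above: a solver like MPB returns ω for a given k, so getting k at a target ω means root-finding, with the group velocity dω/dk as the derivative. A minimal sketch, using a toy hollow-waveguide-like dispersion relation ω(k) = √(ωc² + c²k²) as a stand-in for a real mode solver:

```python
import math

c, wc = 1.0, 0.5  # toy parameters: speed of light, cutoff frequency

def omega(k):
    # Stand-in for a mode solver that returns omega(k).
    return math.sqrt(wc**2 + (c * k)**2)

def group_velocity(k):
    # d(omega)/dk; a real solver can supply this directly.
    return c**2 * k / omega(k)

def find_k(w_target, k0=1.0, tol=1e-12, maxiter=50):
    """Newton iteration for k(omega): solve omega(k) - w_target = 0."""
    k = k0
    for _ in range(maxiter):
        f = omega(k) - w_target
        if abs(f) < tol:
            break
        k -= f / group_velocity(k)
    return k

k = find_k(1.0)
print(k, omega(k))  # omega(k) converges to the target 1.0
```

The iteration is cheap per step, but each step requires a full eigensolve in a real mode solver, which is the cost being referred to above.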
If you only want to support constant cross-section waveguides, an FDFD mode solver shouldn’t be hard to write. In principle one should write a mode solver that uses exactly the same spatial discretization as your FDTD code, including a discretized propagation direction and discretized time (there is a correction factor for the frequency discussed in our Meep paper), so that you exactly match the numerical dispersion of your FDTD code.
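To illustrate the numerical-dispersion point, here is the standard 1-D vacuum Yee dispersion relation, sin(ωΔt/2)/(cΔt) = sin(kΔx/2)/Δx, solved for the discrete ω at a given k. (This is only the 1-D vacuum case for illustration; the actual correction factor used by Meep is the one discussed in the Meep paper.) The discrete ω deviates from the exact ω = ck unless the Courant factor S = cΔt/Δx is 1, the 1-D "magic time step":

```python
import math

c, dx = 1.0, 0.1
k = 2 * math.pi  # one wavelength resolved by 10 cells

def omega_num(S):
    """Discrete frequency from the 1-D vacuum Yee dispersion relation,
    for Courant factor S = c*dt/dx."""
    dt = S * dx / c
    return (2 / dt) * math.asin(S * math.sin(k * dx / 2))

print(omega_num(1.0) / (c * k))  # exactly 1.0 at S = 1
print(omega_num(0.5) / (c * k))  # slightly below 1: numerical dispersion
```

A mode solver using a different (or continuous) discretization would compute modes with the exact dispersion, which then don’t quite match the slightly slowed-down modes the FDTD grid actually propagates; matching the discretization removes that mismatch.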
Thanks @stevengj for the thorough explanation. Matching the spatial discretization is indeed important, especially considering there are only a few cells in the height direction of a thin waveguide. Thanks @smartalecH for mentioning VectorModesolver: very easy to use, a simple wrapper around EMpy.
Actually, I think Alec released Khronos.jl a couple of weeks ago. As for Lumi, we compiled the Julia FDTD backend to a binary with PackageCompiler.jl, so it’s much faster now. A new notebook with installation, S-parameters, and inverse design is on Google Colab.