[ANN] Exodus.jl v0.14.1 - Performant mesh IO that plays nice with MPI and juliac

I’m pleased to announce that I just released a new version of Exodus.jl, a Julia interface to the ExodusII data format that is frequently used in finite element codes. Files in this format are readily viewable in ParaView, either in serial or in parallel with decomposed meshes. It is among the most performant file formats used in this area, as can be seen from the file size, IO speed, and memory usage benchmarks.

In this new release, minor updates have been made so the package is compatible with static compilation via juliac. There were just a few print statements to patch up, so I’d say my overall experience with juliac so far has been great! Excellent work!

Below is an example snippet of code that can be statically compiled into an executable, which comes in at 3.7 MB on my machine. The example uses the system MPI library directly (mainly due to static compilation issues I encountered with MPI.jl); it opens an ExodusII file and copies it, so that data could later be written to the copy and visualized in ParaView.

Here is an example script.jl file:

using Exodus

# path to the system MPI library (adjust for your machine)
const libmpi = "/usr/lib/x86_64-linux-gnu/libmpi.so.12"
const MPI_Comm = Ptr{Cvoid}
# integer handle for MPI_COMM_WORLD as defined by MPICH-style MPI implementations
const MPI_COMM_WORLD = Cint(0x44000000)

Base.@ccallable function main()::Cint
    # Initialize MPI
    ccall((:MPI_Init, libmpi), Cint, (Ptr{Cvoid}, Ptr{Cvoid}), C_NULL, C_NULL)

    # get rank and total number of ranks
    rank = Ref{Cint}()
    size = Ref{Cint}()
    ccall((:MPI_Comm_rank, libmpi), Cint, (Cint, Ptr{Cint}), MPI_COMM_WORLD, rank)
    ccall((:MPI_Comm_size, libmpi), Cint, (Cint, Ptr{Cint}), MPI_COMM_WORLD, size)

    println(Core.stdout, "Hello from rank $(rank[]) of $(size[])")

    # open mesh file
    file_name = "hole_array.exo.$(size[]).$(rank[])"
    exo = ExodusDatabase{Int32, Int32, Int32, Float64}(file_name, "r")
    println(Core.stdout, "$exo")

    new_file_name = "output.exo.$(size[]).$(rank[])"
    copy(exo, new_file_name)

    # then do some stuff ...

    # Finalize MPI
    ccall((:MPI_Finalize, libmpi), Cint, ())

    return 0
end

which can be compiled with the following command

julia +1.12 --project=@. ~/.julia/juliaup/julia-1.12.0-beta4+0.x64.linux.gnu/share/julia/juliac.jl --output-exe a.out --compile-ccallable --experimental --trim script.jl

and then finally run with the shell commands below (this assumes you have made a Project.toml file with Exodus.jl in its [deps] section; one way to set that up is sketched next).
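If you have not created that project yet, one way to do it (assuming the package is registered under the name Exodus) is:

julia +1.12 -e 'using Pkg; Pkg.activate("."); Pkg.add("Exodus")'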

# decompose mesh into number of shards equaling MPI ranks you wish to use
julia +1.12 --project=@. -e 'using Exodus; decomp("hole_array.exo", 4)'
# run executable
mpirun -n 4 ./a.out
# optional - stitch output mesh shards back together
# paraview can visualize sharded exodus files but with a performance hit
# as the number of MPI ranks increases.
julia +1.12 --project=@. -e 'using Exodus; epu("output.exo")'

Do you have any tips on what to pay attention to for juliac-compatible package design?


My advice is as follows:

  1. @code_warntype is your friend. If you’re not familiar with this macro, it detects type instabilities in a method call. A type-unstable method is a non-starter for juliac (see the short sketch after this list).
  2. Whatever functionality you’re using from dependencies needs to play nicely with juliac as well. For instance, I ran into issues with MPI.jl in the example above and instead resorted to raw ccalls to my system-installed MPI library.
  3. Start small. Don’t try to compile your whole workflow on the first go with juliac. This mesh IO package happened to be the logical place to start experimenting with juliac in my workflow.
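Here is a minimal sketch of point 1; the functions are made up for illustration. A return type that depends on a runtime value shows up as a Union (or Any) in the @code_warntype report:

using InteractiveUtils  # provides @code_warntype outside the REPL

# return type depends on the runtime value of x: Int64 or Float64
unstable(x) = x > 0 ? 1 : 1.0
# always returns a Float64
stable(x) = x > 0 ? 1.0 : -1.0

@code_warntype unstable(2.0)  # Body::Union{Float64, Int64} -- a red flag for juliac
@code_warntype stable(2.0)    # Body::Float64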

One of the first problems I ran into in my package was the following. I have a general constructor ExodusDatabase(file_name, mode) which is type unstable, since it needs to peek at the file to figure out the types for the return type ExodusDatabase{M, I, B, F} where {M, I, B, F}. This threw a bunch of errors from juliac. Instead, I resorted to another constructor in my package, ExodusDatabase{Int32, Int32, Int32, Float64}(file_name, mode), which requires knowing the data types in the file a priori.
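In code, the difference between the two constructors looks roughly like this (the file name is borrowed from the example above):

using Exodus

# type unstable: {M, I, B, F} are only known after peeking at the file,
# so the inferred return type is the abstract ExodusDatabase
exo = ExodusDatabase("hole_array.exo", "r")

# type stable: the parameters are spelled out up front, so the return type is concrete;
# this requires knowing a priori that the file uses 32-bit integer IDs and Float64 data
exo = ExodusDatabase{Int32, Int32, Int32, Float64}("hole_array.exo", "r")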

Other issues I encountered were operations using map whose return type juliac could not reason about. In those cases, I resorted to pre-allocating arrays and using plain for loops.
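A hypothetical sketch of that rewrite (read_value and n are stand-ins here, not Exodus.jl API):

# before: if the element type of the closure cannot be inferred,
# the result may come back as Vector{Any}
vals = map(i -> read_value(exo, i), 1:n)

# after: pre-allocate with a concrete element type and fill with a plain for loop
vals = Vector{Float64}(undef, n)
for i in 1:n
    vals[i] = read_value(exo, i)
end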


Thanks @Craig_Hamel!
