I’m pleased to announce that I just released a new version of Exodus.jl, a Julia interface to the ExodusII data format that is frequently used in finite element codes. Files in this format are readily viewable in ParaView, either in serial or in parallel with decomposed meshes. It is among the most performant file formats used in this area, as can be seen from the file size, IO speed, and memory usage benchmarks.
In this new release, minor updates have been made so that the package is compatible with static compilation via juliac. There were just a few print statements to patch up, so I’d say my overall experience with juliac so far has been great. Excellent work!
Below is an example snippet of code that can be statically compiled into an executable that comes in at 3.7 MB on my machine. This example uses the system MPI library directly (mainly due to static compilation issues encountered with MPI.jl); it opens an ExodusII file and copies it, so that data can later be written to the copy and visualized in ParaView.
Here is an example script.jl file:
using Exodus
const libmpi = "/usr/lib/x86_64-linux-gnu/libmpi.so.12"
const MPI_Comm = Ptr{Cvoid}
const MPI_COMM_WORLD = Cint(0x44000000)
Base.@ccallable function main()::Cint
    # Initialize MPI
    ccall((:MPI_Init, libmpi), Cint, (Ptr{Cvoid}, Ptr{Cvoid}), C_NULL, C_NULL)

    # get rank and total number of ranks
    rank = Ref{Cint}()
    size = Ref{Cint}()
    ccall((:MPI_Comm_rank, libmpi), Cint, (Cint, Ptr{Cint}), MPI_COMM_WORLD, rank)
    ccall((:MPI_Comm_size, libmpi), Cint, (Cint, Ptr{Cint}), MPI_COMM_WORLD, size)
    println(Core.stdout, "Hello from rank $(rank[]) of $(size[])")

    # open mesh file
    file_name = "hole_array.exo.$(size[]).$(rank[])"
    exo = ExodusDatabase{Int32, Int32, Int32, Float64}(file_name, "r")
    println(Core.stdout, "$exo")

    # copy to an output file we can write to
    new_file_name = "output.exo.$(size[]).$(rank[])"
    copy(exo, new_file_name)
    # then do some stuff ...

    # Finalize MPI
    ccall((:MPI_Finalize, libmpi), Cint, ())
    return 0
end
which can be compiled with the following command:
julia +1.12 --project=@. ~/.julia/juliaup/julia-1.12.0-beta4+0.x64.linux.gnu/share/julia/juliac.jl \
    --output-exe a.out --compile-ccallable --experimental --trim script.jl
and then finally run with the following shell commands (assuming you have made a Project.toml file with Exodus.jl in the [deps] section):
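For reference, the Project.toml can be generated automatically by running `julia +1.12 --project=@. -e 'using Pkg; Pkg.add("Exodus")'`, which produces a file shaped like the sketch below. The UUID shown here is only a placeholder, not Exodus.jl's real UUID; Pkg records the correct one from the General registry.

```toml
[deps]
# placeholder UUID for illustration -- Pkg.add fills in the registered value
Exodus = "00000000-0000-0000-0000-000000000000"
```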
# decompose mesh into a number of shards equal to the MPI ranks you wish to use
julia +1.12 --project=@. -e 'using Exodus; decomp("hole_array.exo", 4)'
# run executable
mpirun -n 4 ./a.out
# optional - stitch output mesh shards back together
# paraview can visualize sharded exodus files but with a performance hit
# as the number of MPI ranks increases.
julia +1.12 --project=@. -e 'using Exodus; epu("output.exo")'