IncompleteLU.jl much slower than MATLAB's ilu

Hello Julia users,

I’m developing a model that builds a large sparse matrix and solves systems of linear equations. I plan to port the original MATLAB code to Julia, as I’ve heard Julia is much faster than MATLAB.

However, I found that ilu in Julia is very slow and can run out of memory with a small drop tolerance, while ilu() in MATLAB always finishes in about 1 s. My sparse matrix is 10^6 × 10^6. Am I using the wrong function or storage format (currently SparseMatrixCSC)?
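Here is a minimal sketch of the kind of call I’m making (assuming IncompleteLU.jl’s ilu with a drop tolerance τ; the matrix is reduced to a small tridiagonal stand-in so it runs quickly, not my real 10^6 × 10^6 system):

```julia
using SparseArrays
using IncompleteLU  # provides ilu()

# Small tridiagonal stand-in for the real model matrix.
n = 1_000
A = spdiagm(-1 => fill(-1.0, n - 1), 0 => fill(2.0, n), 1 => fill(-1.0, n - 1))

# τ is the drop tolerance; a smaller τ keeps more fill-in,
# so both time and memory grow as τ shrinks.
fact = ilu(A; τ = 0.001)
```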

I used @btime to measure the elapsed time, but I noticed that the code runs much faster without @btime.

I’m new to Julia. Thank you in advance for any replies and suggestions!


Could you show a minimal working example (MWE) demonstrating the issue?


Welcome to the Julia forum! :wave:

A small remark on @btime: it runs the code many times in order to report a reliable timing, unlike e.g. @time (or MATLAB's tic() ... toc()), which times a single run; see Manual · BenchmarkTools.jl. You can also use @benchmark for a fuller statistical summary.
So it is expected that the overall run takes longer with @btime than without benchmarking.
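To make the distinction concrete, here is a small sketch (the sparse matrix is just a random stand-in):

```julia
# @time reports a single run (the first call also includes JIT compilation),
# while @btime from BenchmarkTools runs the expression many times and
# reports the minimum, so the total wall-clock time spent is much larger.
using BenchmarkTools
using SparseArrays

A = sprand(1_000, 1_000, 0.01)
b = rand(1_000)

@time A * b     # single run, includes compilation the first time
@time A * b     # single run, already compiled
@btime $A * $b  # many samples; $ interpolates to avoid global-variable overhead
```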


Sorry to revive this thread, but it was never concluded and I just ran across it.

By default, MATLAB's ilu computes the no-fill ILU decomposition, ILU(0). The Julia package implementing the equivalent factorization is ILUZero.jl. This differs from IncompleteLU.jl (which the original poster appears to be using), which implements the Crout ILU. The Crout ILU is also available in MATLAB via its non-default 'crout' option.

So the original issue appears to be a result of comparing different algorithms computing different factorizations.
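A sketch of the two factorizations side by side (the matrix here is a small tridiagonal stand-in, not the original 10^6 × 10^6 system):

```julia
using SparseArrays
using ILUZero        # ilu0: no-fill ILU(0), like MATLAB's default ilu(A)
using IncompleteLU   # ilu: Crout ILU with a drop tolerance τ

n = 2_000
A = spdiagm(-1 => fill(-1.0, n - 1), 0 => fill(4.0, n), 1 => fill(-1.0, n - 1))

# ILU(0) keeps the sparsity pattern of A, so its memory use is bounded
# by nnz(A) regardless of the problem.
F0 = ilu0(A)

# The Crout ILU's fill-in (and hence time and memory) grows as τ shrinks,
# which matches the out-of-memory behavior reported above.
Fτ = ilu(A; τ = 1e-3)
```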
