Function for finding addition decompositions of an integer

Hello. Please tell me whether Julia has a function for splitting a number into sums of terms (preferably without repetitions)

Example:
fun(5)
[1,4], [2,3]

fun(7)
[1,6], [2,5], [3,4], [1,4,2]

fun(10)
[1,9], [2,8], [3,7], [4,6], [1,4,5], [1,4,3,2]…

I think you're phrasing the question a bit off. You are looking for all combinations of integers that sum to a given integer.

That said, I don’t know how to do this without resorting to writing a disgusting recursion myself, but I would be interested to hear good suggestions.

Here’s my quick hack:

function combinations!(list, root, N)
    if iszero(N)
        # nothing left to decompose: `root` is a complete decomposition
        return push!(list, root)
    end
    # start from the last term used, so each decomposition comes out non-decreasing
    start = isempty(root) ? 1 : last(root)
    for n = start:N
        arr = vcat(root, n)
        combinations!(list, arr, N-n)
    end
end

function combinations(N)
    list = Vector{Int64}[]
    combinations!(list, Int64[], N)
    return list
end

Note: this originally included duplicates, so combinations(3) returned [[1, 1, 1], [1, 2], [2, 1], [3]]. Nvm, I fixed it above by starting each level at last(root). Use start = 1 if you prefer to include the reordered duplicates, and last(root) + 1 if you want to exclude solutions with repeated numbers.
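For a concrete sense of the three variants, here is a small input worked out by hand rather than copied from a REPL:

combinations(4)   # start = last(root), as posted
# [[1, 1, 1, 1], [1, 1, 2], [1, 3], [2, 2], [4]]

# start = last(root) + 1 (distinct terms only):
# [[1, 3], [4]]

# start = 1 (reordered duplicates included): all 2^(N-1) = 8 compositions of 4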

1 Like

Using Combinatorics.jl (and collect-free after @DNF’s advice):

using Combinatorics

function sum_combinations(N)
    list = Vector{Int64}[]
    for i in 2:N
        for x in combinations(1:N,i)
            if sum(x) == N
                push!(list, x)
            end
        end
    end
    return list
end

julia> sum_combinations(7)
4-element Vector{Vector{Int64}}:
 [1, 6]
 [2, 5]
 [3, 4]
 [1, 2, 4]
3 Likes

If you intend to do this for larger numbers, you will need to optimize the code a bit. Some information you can use: an upper bound for the length of list can be determined with some combinatorics (and Wikipedia, I would guess), and an upper bound for the length of arr is N (non-unique) or -1/2 + sqrt(2N + 1/4) (unique).
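A minimal sketch of that second bound (the helper name is mine, not from any package): a decomposition into distinct terms has at most k terms, where 1 + 2 + … + k = k(k+1)/2 ≤ N, so

# longest possible run of distinct terms that still fits in N
maxlen_unique(N) = floor(Int, -1/2 + sqrt(2N + 1/4))

maxlen_unique(7)   # == 3, since 1 + 2 + 3 = 6 <= 7 < 1 + 2 + 3 + 4 = 10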

partitions, with Combinatorics.jl: Combinatorics.jl/partitions.jl at c2114a71ccfc93052efb9a9379e62b81b9388ef8 · JuliaMath/Combinatorics.jl · GitHub

4 Likes

That’s probably simpler, but it clocks in at three orders of magnitude slower than my version, because it generates and discards many arrays.

2 Likes

Why are you using collect? :grimacing:

Never use collect (If I ever get a tattoo, it will say that.)

3 Likes

For completeness,

Iterators.filter(allunique, partitions(N))

is a neat solution to the problem, and not a lot slower than my recursion thingy, even if you include a collect step.
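For example, you can consume it lazily without materializing anything at all; note that, unlike sum_combinations above, this also counts the trivial single-term partition [N]:

count(allunique, partitions(7))   # == 5: [7], [6, 1], [5, 2], [4, 3], [4, 2, 1]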

3 Likes

@DNF, LOL :rofl:
Thanks, fixed it, but that didn't really move things forward, for the reason Gustaphe explained.

2 Likes

For N = 100, your recursive solution runs in less than 0.5 s, while (sorry DNF!) collect(Iterators.filter(allunique, partitions(N))) takes > 90 s. Never mind my naive version…

1 Like

I think this is pretty interesting, running benchmarkcombinatorics(20) (100 made julia panic and kill itself):

using Combinatorics, BenchmarkTools, BenchmarkPlots, StatsPlots

function gustaphe_combinations!(list, root, N; unique=true)
    if iszero(N)
        return push!(list, root)
    end
    # `unique` is a Bool, so adding it bumps the start by 1 when repeats are excluded
    start = isempty(root) ? 1 : last(root) + unique
    for n = start:N
        arr = vcat(root, n)
        gustaphe_combinations!(list, arr, N-n; unique)
    end
end

function gustaphe_combinations(N; unique=true)
    list = Vector{Int64}[]
    gustaphe_combinations!(list, Int64[], N; unique)
    return list
end

function mschauer_combinations(N; unique=true)
    p = partitions(N)
    unique && return collect(Iterators.filter(allunique, p))
    return collect(p)
end

function benchmarkcombinatorics(N)
    suite = BenchmarkGroup()
    suite[:gustapheunique] = @benchmarkable gustaphe_combinations($N; unique=true)
    suite[:gustaphenonunique] = @benchmarkable gustaphe_combinations($N; unique=false)
    suite[:mschauerunique] = @benchmarkable mschauer_combinations($N; unique=true)
    suite[:mschauernonunique] = @benchmarkable mschauer_combinations($N; unique=false)
    res = run(suite)
    plot(res, [:gustapheunique, :gustaphenonunique, :mschauerunique, :mschauernonunique]; yscale=:log10, fontfamily="Computer Modern")
end

[benchmark plot comparing the four variants, timings on a log scale]

Obviously, the partitions route gives you an ordered iterator and is just generally prettier code, but mine does perform better, especially for the unique case (which isn't surprising), though even the non-unique case is faster.

2 Likes

Shouldn't you test for a strictly decreasing sequence instead of using allunique, since the partitions come out as sorted integers?

Thank you all, but all of these functions are very slow.
In my case, N can be 10,000 or more.

That makes sense. I tried it with x -> all(x[1:end-1] .> x[2:end]), which didn't improve it by that much. But I don't know whether that can short-circuit, so if you can think of a better isstrictlydecreasing, let me know.
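One allocation-free possibility, just a sketch (isstrictlydecreasing is the name from the post above, not an existing function): indexing pairwise avoids the temporary arrays that x[1:end-1] .> x[2:end] creates, and all can short-circuit at the first failure:

isstrictlydecreasing(x) = all(i -> x[i] > x[i + 1], 1:length(x) - 1)

Iterators.filter(isstrictlydecreasing, partitions(N))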

Partitioning 10 000 is going to be absolutely monstrous no matter how you do it. I let length(partitions(10_000)) run for a couple of minutes and it hasn't returned yet. Note that this is the efficient length computed from the generator (not by iterating), so even figuring out how many partitions there are takes longer than I have the patience to wait for. Sure, that's for the non-unique case, but I would not expect there to be an algorithm for this that isn't "very slow".

You'll want something that is parallelizable so you can put it on a cluster, so the recursion method is out the window.

It seems the number of partitions grows roughly like $\exp\left(\pi\sqrt{\frac{2n}{3}}\right)$ as $n$ grows, so $p(10\,000) \approx 2.5 \times 10^{111}$. You do not want to do this.
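For what it's worth, that is just the leading exponential term; the full Hardy–Ramanujan asymptotic divides it by $4n\sqrt{3}$, which brings $p(10\,000)$ down to a few times $10^{106}$, so the conclusion stands. A one-line check of the figure above:

exp(π * sqrt(2 * 10_000 / 3))   # ≈ 2.5e111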

10 000? We are approaching the number of atoms in the visible universe: A000009 - OEIS (https://oeis.org/A000009)

4 Likes

Yes, I already realized that this is not feasible.

No worries.

Fredrik Johansson holds the record for calculating the number of partitions. You don't actually compute the partitions themselves to do so:

https://fredrikj.net/blog/2014/03/new-partition-function-record/
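The record computations use the Hardy–Ramanujan–Rademacher formula, but even a plain dynamic-programming pass over Euler's pentagonal-number recurrence counts partitions without ever enumerating them. A rough sketch (mine, not from that post):

# Count partitions p(n) without enumerating them, via Euler's pentagonal
# number theorem:
#   p(n) = sum_{k >= 1} (-1)^(k+1) * (p(n - k(3k-1)/2) + p(n - k(3k+1)/2))
function partition_count(n)
    p = zeros(BigInt, n + 1)   # p[m + 1] holds p(m)
    p[1] = 1                   # p(0) = 1
    for m in 1:n
        acc = zero(BigInt)
        k = 1
        while true
            g1 = k * (3k - 1) ÷ 2   # generalized pentagonal numbers
            g2 = k * (3k + 1) ÷ 2
            g1 > m && break
            s = isodd(k) ? 1 : -1
            acc += s * p[m - g1 + 1]
            g2 <= m && (acc += s * p[m - g2 + 1])
            k += 1
        end
        p[m + 1] = acc
    end
    return p[n + 1]
end

partition_count(100)      # == 190569292
partition_count(10_000)   # a number with a bit over 100 digits; should take only seconds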

6 Likes

What a delightful article

1 Like