If Division II league posts are allowed here: the naive solution to this integer decomposition problem is simple and works fine for small N, but its “algorithm” is so bad that the execution time grows exponentially with N (compare it to Gustaphe’s turbo solution).
A rough estimate is that the naive algorithm would take about one million years to run the N = 100 case on a normal laptop (if it didn’t run out of memory long before that).
NB:
It would be nice to have a package that estimates the run time and disk space required for such massive problems, where the complexity grows dramatically with some parameter N.
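For what it’s worth, a rough estimate like the one above doesn’t strictly need a package: time the naive function for a handful of small N, fit log(t) against N, and extrapolate. Below is a minimal sketch of that idea (written in Julia, assuming that’s the language of this thread); `naive_solve` is only a hypothetical placeholder with deliberately exponential cost, so swap in the actual naive function discussed here.

```julia
# Sketch: time the naive solver for a few small N, fit an exponential model
# t(N) ≈ a * b^N on the log scale, and extrapolate to a big N.

function naive_solve(N)
    # hypothetical stand-in with deliberately exponential cost,
    # only here to make the sketch runnable on its own
    N <= 1 ? 1 : naive_solve(N - 1) + naive_solve(N - 2)
end

naive_solve(10)                     # warm up so compilation time isn't measured

Ns = 25:2:35
ts = [(@elapsed naive_solve(N)) for N in Ns]

# least-squares fit of log(t) = log(a) + N * log(b)
A = [ones(length(Ns)) collect(Ns)]
coeffs = A \ log.(ts)
a, b = exp(coeffs[1]), exp(coeffs[2])

# extrapolate to N = 100 and report in years
t100 = a * b^100
println("estimated time for N = 100: ≈ ",
        round(t100 / (365.25 * 24 * 3600), sigdigits = 2), " years")
```

Of course this only extrapolates run time, not memory, and it assumes the growth really is a clean exponential in N, which is exactly the regime the naive solution lives in.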