Hi @odow, I followed all your advice.
These are the results of a test that first runs everything using Clp 0.7.1 and then repeats the JuMP part with Clp 0.8.0. The test is available on GitHub.
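For context, each fba_JuMP-* entry below times a standard flux-balance-analysis LP: maximize one flux subject to steady-state mass balance S*v == 0 and flux bounds. A minimal sketch of what is being benchmarked, assuming hypothetical names S, lb, ub, and obj_idx for the data parsed from the model JSON (the actual script's names may differ):

```julia
using JuMP, Clp

# Minimal FBA sketch: maximize the flux v[obj_idx] subject to the
# steady-state mass balance S * v == 0 and bounds lb <= v <= ub.
# `S`, `lb`, `ub`, and `obj_idx` are placeholder names for this sketch,
# not necessarily the identifiers used in the test script.
function fba_jump(S, lb, ub, obj_idx; optimizer = Clp.Optimizer)
    m, n = size(S)
    model = Model(optimizer)
    set_silent(model)
    @variable(model, lb[i] <= v[i = 1:n] <= ub[i])
    @constraint(model, S * v .== 0)
    @objective(model, Max, v[obj_idx])
    optimize!(model)
    return objective_value(model)
end
```

The timings below come from wrapping a call like this in BenchmarkTools' @btime, once per model and solver combination.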
This is on macOS
Running test with Clp 0.7.1 ------------------
Your branch is up to date with 'origin/clp_0.7.1'.
loaded /Users/Pereiro/.julia/config/startup.jl
versioninfo -------------------
Julia Version 1.1.0
Commit 80516ca202 (2019-01-21 21:24 UTC)
Platform Info:
OS: macOS (x86_64-apple-darwin14.5.0)
CPU: Intel(R) Core(TM) i5-8210Y CPU @ 1.60GHz
WORD_SIZE: 64
LIBM: libopenlibm
LLVM: libLLVM-6.0.1 (ORCJIT, skylake)
Environment:
JULIA_NUM_THREADS = 4
JULIA_EDITOR = code
Project.toml -------------------
Status `~/University/Studying/JuMP_issue/Project.toml`
[6e4b80f9] BenchmarkTools v0.5.0
[e2554f3b] Clp v0.7.1
[60bf3e95] GLPK v0.13.0
[682c06a0] JSON v0.21.0
[4076af6c] JuMP v0.21.2
[fdba3010] MathProgBase v0.7.8
[b77e0a4c] InteractiveUtils
[44cfe95a] Pkg
[2f01184e] SparseArrays
Model: toy_model.json size: (5, 8) -------------------
fba_JuMP-GLPK.Optimizer
329.750 μs (1831 allocations: 112.19 KiB)
obj_val: 3.181818181818181
fba_JuMP-Clp.Optimizer
930.371 μs (3214 allocations: 193.47 KiB)
obj_val: 3.1818181818181817
fba_MathProgBase-ClpSolver
152.660 μs (32 allocations: 2.47 KiB)
obj_val: 3.1818181818181817
Model: e_coli_core.json size: (72, 95) -------------------
fba_JuMP-GLPK.Optimizer
1.965 ms (10908 allocations: 533.33 KiB)
obj_val: 0.8739215069685011
fba_JuMP-Clp.Optimizer
3.209 ms (19681 allocations: 973.48 KiB)
obj_val: 0.8739215069684311
fba_MathProgBase-ClpSolver
710.152 μs (32 allocations: 11.58 KiB)
obj_val: 0.8739215069684311
Model: iJR904.json size: (762, 976) -------------------
fba_JuMP-GLPK.Optimizer
44.393 ms (101450 allocations: 4.41 MiB)
obj_val: 0.5782403962872187
fba_JuMP-Clp.Optimizer
49.131 ms (190924 allocations: 14.62 MiB)
obj_val: 0.5782403962871316
fba_MathProgBase-ClpSolver
18.327 ms (35 allocations: 118.00 KiB)
obj_val: 0.5782403962871316
Model: HumanGEM.json size: (8461, 13417) -------------------
fba_JuMP-GLPK.Optimizer
18.933 s (1300964 allocations: 61.45 MiB)
obj_val: 2.334553007169305
fba_JuMP-Clp.Optimizer
4.307 s (2456567 allocations: 1.43 GiB)
obj_val: 2.334553007230854
fba_MathProgBase-ClpSolver
507.105 ms (45 allocations: 1.35 MiB)
obj_val: 2.3345530071688514
Running test with Clp up to date ------------------
Your branch is up to date with 'origin/clp_up_to_date'.
loaded /Users/Pereiro/.julia/config/startup.jl
versioninfo -------------------
Julia Version 1.1.0
Commit 80516ca202 (2019-01-21 21:24 UTC)
Platform Info:
OS: macOS (x86_64-apple-darwin14.5.0)
CPU: Intel(R) Core(TM) i5-8210Y CPU @ 1.60GHz
WORD_SIZE: 64
LIBM: libopenlibm
LLVM: libLLVM-6.0.1 (ORCJIT, skylake)
Environment:
JULIA_NUM_THREADS = 4
JULIA_EDITOR = code
Project.toml -------------------
Status `~/University/Studying/JuMP_issue/Project.toml`
[6e4b80f9] BenchmarkTools v0.5.0
[e2554f3b] Clp v0.8.0
[60bf3e95] GLPK v0.13.0
[682c06a0] JSON v0.21.0
[4076af6c] JuMP v0.21.3
[fdba3010] MathProgBase v0.7.8
[b77e0a4c] InteractiveUtils
[44cfe95a] Pkg
[2f01184e] SparseArrays
Model: toy_model.json size: (5, 8) -------------------
fba_JuMP-GLPK.Optimizer
327.498 μs (1831 allocations: 112.19 KiB)
obj_val: 3.181818181818181
fba_JuMP-Clp.Optimizer
885.740 μs (3297 allocations: 197.16 KiB)
obj_val: 3.1818181818181817
Model: e_coli_core.json size: (72, 95) -------------------
fba_JuMP-GLPK.Optimizer
1.870 ms (10908 allocations: 533.33 KiB)
obj_val: 0.8739215069685011
fba_JuMP-Clp.Optimizer
3.016 ms (20216 allocations: 860.78 KiB)
obj_val: 0.8739215069684309
Model: iJR904.json size: (762, 976) -------------------
fba_JuMP-GLPK.Optimizer
39.895 ms (101450 allocations: 4.41 MiB)
obj_val: 0.5782403962872187
fba_JuMP-Clp.Optimizer
35.624 ms (203135 allocations: 7.03 MiB)
obj_val: 0.5782403962871316
Model: HumanGEM.json size: (8461, 13417) -------------------
fba_JuMP-GLPK.Optimizer
12.222 s (1300962 allocations: 61.42 MiB)
obj_val: 2.334553007169305
fba_JuMP-Clp.Optimizer
929.113 ms (2617068 allocations: 90.43 MiB)
obj_val: 2.3345531079323396
Your branch is up to date with 'origin/master'.
This is on Linux
Your branch is up to date with 'origin/master'.
Running test with Clp 0.7.1 ------------------
Branch 'clp_0.7.1' set up to track remote branch 'clp_0.7.1' from 'origin'.
versioninfo -------------------
Julia Version 1.1.1
Commit 55e36cc308 (2019-05-16 04:10 UTC)
Platform Info:
OS: Linux (x86_64-pc-linux-gnu)
CPU: Intel(R) Core(TM) i3-8100 CPU @ 3.60GHz
WORD_SIZE: 64
LIBM: libopenlibm
LLVM: libLLVM-6.0.1 (ORCJIT, skylake)
Project.toml -------------------
Status `~/University/Projects/MaxEntEp/GitWorker/MaxEntEP_GWRepo/Worker2/JuMP_issue-gitworker-copy/origin/JuMP_issue.jl/Project.toml`
[6e4b80f9] BenchmarkTools v0.5.0
[e2554f3b] Clp v0.7.1
[60bf3e95] GLPK v0.13.0
[682c06a0] JSON v0.21.0
[4076af6c] JuMP v0.21.2
[fdba3010] MathProgBase v0.7.8
[b77e0a4c] InteractiveUtils
[44cfe95a] Pkg
[2f01184e] SparseArrays
Model: toy_model.json size: (5, 8) -------------------
fba_JuMP-GLPK.Optimizer
287.398 μs (1831 allocations: 112.19 KiB)
obj_val: 3.181818181818181
fba_JuMP-Clp.Optimizer
685.555 μs (3214 allocations: 193.47 KiB)
obj_val: 3.1818181818181817
fba_MathProgBase-ClpSolver
90.368 μs (32 allocations: 2.47 KiB)
obj_val: 3.1818181818181817
Model: e_coli_core.json size: (72, 95) -------------------
fba_JuMP-GLPK.Optimizer
1.661 ms (10913 allocations: 533.41 KiB)
obj_val: 0.8739215069685011
fba_JuMP-Clp.Optimizer
2.387 ms (19691 allocations: 973.64 KiB)
obj_val: 0.8739215069684311
fba_MathProgBase-ClpSolver
550.795 μs (32 allocations: 11.58 KiB)
obj_val: 0.8739215069684311
Model: iJR904.json size: (762, 976) -------------------
fba_JuMP-GLPK.Optimizer
37.492 ms (101449 allocations: 4.41 MiB)
obj_val: 0.5782403962872187
fba_JuMP-Clp.Optimizer
36.266 ms (190925 allocations: 14.62 MiB)
obj_val: 0.5782403962871316
fba_MathProgBase-ClpSolver
15.086 ms (35 allocations: 118.00 KiB)
obj_val: 0.5782403962871316
Model: HumanGEM.json size: (8461, 13417) -------------------
fba_JuMP-GLPK.Optimizer
9.987 s (1300959 allocations: 61.41 MiB)
obj_val: 2.334553007169305
fba_JuMP-Clp.Optimizer
2.198 s (2456567 allocations: 1.43 GiB)
obj_val: 2.334553007226292
fba_MathProgBase-ClpSolver
368.350 ms (45 allocations: 1.35 MiB)
obj_val: 2.334553007168189
Running test with Clp up to date ------------------
Branch 'clp_up_to_date' set up to track remote branch 'clp_up_to_date' from 'origin'.
versioninfo -------------------
Julia Version 1.1.1
Commit 55e36cc308 (2019-05-16 04:10 UTC)
Platform Info:
OS: Linux (x86_64-pc-linux-gnu)
CPU: Intel(R) Core(TM) i3-8100 CPU @ 3.60GHz
WORD_SIZE: 64
LIBM: libopenlibm
LLVM: libLLVM-6.0.1 (ORCJIT, skylake)
Project.toml -------------------
Status `~/University/Projects/MaxEntEp/GitWorker/MaxEntEP_GWRepo/Worker2/JuMP_issue-gitworker-copy/origin/JuMP_issue.jl/Project.toml`
[6e4b80f9] BenchmarkTools v0.5.0
[e2554f3b] Clp v0.8.0
[60bf3e95] GLPK v0.13.0
[682c06a0] JSON v0.21.0
[4076af6c] JuMP v0.21.3
[fdba3010] MathProgBase v0.7.8
[b77e0a4c] InteractiveUtils
[44cfe95a] Pkg
[2f01184e] SparseArrays
Model: toy_model.json size: (5, 8) -------------------
fba_JuMP-GLPK.Optimizer
290.409 μs (1831 allocations: 112.19 KiB)
obj_val: 3.181818181818181
fba_JuMP-Clp.Optimizer
695.442 μs (3297 allocations: 197.16 KiB)
obj_val: 3.1818181818181817
Model: e_coli_core.json size: (72, 95) -------------------
fba_JuMP-GLPK.Optimizer
1.677 ms (10913 allocations: 533.41 KiB)
obj_val: 0.8739215069685011
fba_JuMP-Clp.Optimizer
2.409 ms (20217 allocations: 860.80 KiB)
obj_val: 0.8739215069684309
Model: iJR904.json size: (762, 976) -------------------
fba_JuMP-GLPK.Optimizer
37.382 ms (101449 allocations: 4.41 MiB)
obj_val: 0.5782403962872187
fba_JuMP-Clp.Optimizer
28.847 ms (203010 allocations: 7.03 MiB)
obj_val: 0.5782403962871316
Model: HumanGEM.json size: (8461, 13417) -------------------
fba_JuMP-GLPK.Optimizer
9.924 s (1300958 allocations: 61.39 MiB)
obj_val: 2.334553007169305
fba_JuMP-Clp.Optimizer
700.783 ms (2617005 allocations: 90.41 MiB)
obj_val: 2.334553007169208