# Convex.jl: variables "disappearing" in cumulative constraint

Hello,

I am trying to build a sort of L_2-norm constraint piecewise, using
code like this (simplified form of the actual code):

```julia
x = [Convex.Variable(m*s+2r, m*s+2r) for _ in 1:N];
for t in 1:N
    @debug "x[$(t)].id_hash is: " x[t]
end

± = [-1.0, +1.0];  # well, Julia does allow crazy variable names
total = 0;
for t in 1:N
    @info "Starting computation of constraint A[$(t)] at $(now()) ..."
    # 1. introduce sign perturbation `R`
    a1 = (rand(±, size(x[t])) .* x[t]) + x[1];
    # 2. 2D-convolution with PSF matrix `Q`
    a2, cs = conv2d(Q, a1);
    @debug "Result of conv2d(Q,x): " a2
    # 3. build constraint
    for i in 1:s, j in 1:s
        total += a2[i,j] * a2[i,j];
    end
end;  # for t
constraint = (total < 42);
@debug "final constraint is: " constraint
```

Now the first 3 debug lines show that all involved variables have different IDs; in the final constraint, however, all variable IDs are that of `x[1]`. Here's an excerpt from the logs:

```
DEBUG: x[1].id_hash is id: 18435402678986543539
DEBUG: x[2].id_hash is id: 2006040311771778825
DEBUG: x[3].id_hash is id: 4770220102282770796
DEBUG: x[4].id_hash is id: 384228606706033487
DEBUG: x[5].id_hash is id: 10429728996535255788
[ Info: Starting computation of constraint A[1] at 2020-03-14T08:22:23.864 ...
┌ Debug: Entering conv2d(Q,x):
│   x =
│    + (affine; real)
│    ├─ .* (affine; real)
│    │  ├─ 102×102 Array{Float64,2}
│    │  └─ 102×102 real variable (id: 18435402678986543539)
│    └─ 102×102 real variable (id: 18435402678986543539)
└ @ Main ~/w/BEAM.jl/issue.jl:26
┌ Debug: Result of conv2d(Q,x):
│   a2 =
│    index (affine; real)
│    └─ reshape (affine; real)
│       └─ * (affine; real)
│          ├─ 14884×12424 SparseMatrixCSC{Float64,Int64}
│          └─ index (affine; real)
│             └─ reshape (affine; real)
│                └─ 122×122 real variable (id: 6461239074267313883)
└ @ Main ~/w/BEAM.jl/issue.jl:100
...
┌ Debug: final constraint is:
│   constraint =
│    <= constraint (convex)
│    ├─ + (convex; positive)
│    │  ├─ 0
│    │  ├─ qol_elem (convex; positive)
│    │  │  ├─ index (affine; real)
│    │  │  │  └─ index (affine; real)
│    │  │  │     └─ reshape (affine; real)
│    │  │  │        └─ * (affine; real)
│    │  │  │           ├─ 14884×12424 SparseMatrixCSC{Float64,Int64}
│    │  │  │           └─ index (affine; real)
│    │  │  │              └─ reshape (affine; real)
│    │  │  │                 └─ 122×122 real variable (id: 6461239074267313883)
│    │  │  └─ [1.0]
│    │  ├─ qol_elem (convex; positive)
│    │  │  ├─ index (affine; real)
│    │  │  │  └─ index (affine; real)
│    │  │  │     └─ reshape (affine; real)
│    │  │  │        └─ * (affine; real)
│    │  │  │           ├─ 14884×12424 SparseMatrixCSC{Float64,Int64}
│    │  │  │           └─ index (affine; real)
│    │  │  │              └─ reshape (affine; real)
│    │  │  │                 └─ 122×122 real variable (id: 6461239074267313883)
│    │  │  └─ [1.0]
...
```

Iβm relatively new to both Julia and Convex.jl so itβs possible that Iβm doing something wrong here, but I cannot see any issue in the codeβ¦

Thanks for any suggestion!

Hereβs a complete working example for reproducing the output above:

```julia
import Convex;
Convex.MAXDIGITS[] = 32;  # show IDs in full
Convex.MAXWIDTH[] = 120;
Convex.MAXDEPTH[] = 12;

using SparseArrays;
using Dates: now;

# parameters controlling the size of the problem
const N = 5;
const r = 10;
const m = 2;
const s = 4*r + 1;

# generate fake data
Q = randn(2r+1, 2r+1);

y = zeros(s, s, N);
for t = 1:N
    y[:, :, t] .= sprand(Float64, s, s, 0.15);
end

# aux functions
function conv2d(Q, x)
    @debug "Entering conv2d(Q,x): " x
    Q_hat = make_conv2d_matrix(Q, size(x));
    return conv2d(Q_hat, size(Q), x);
end  # conv2d/2

function conv2d(Q_hat, Q_size, x)
    m, n = Q_size;
    k, l = size(x);
    C_rows = m + k - 1;
    C_cols = n + l - 1;

    # NB: `Z` is an auxiliary zero-padded copy of `x`; the constraints
    # tying it to `x` are returned in `constraints` alongside the result.
    Z = Convex.Variable(C_rows, C_cols);
    constraints = [
        Z[1:k, 1:l] == x,
        Z[k+1:C_rows, 1:l] == 0,
        Z[1:C_rows, l+1:C_cols] == 0,
    ];
    x_vector = vec(Z)[1:(l-1)*C_rows + k];

    C_vector = Q_hat * x_vector;
    C = reshape(C_vector, C_rows, C_cols);

    return (C[m:k, n:l], constraints)
end  # conv2d/3

function make_conv2d_matrix(Q, sz::Tuple{Int, Int})
    m, n = size(Q);
    k, l = sz;

    Z_rows = m + k - 1;
    Z_cols = n + l - 1;

    # zero-pad Q into a Z_rows × Z_cols array, vectorize it column-major,
    # and keep only the entries up to the last nonzero (column n, row m)
    Q_pad = zeros(Z_rows, Z_cols);
    Q_pad[1:m, 1:n] .= Q;
    Q_vec = vec(Q_pad)[1:(n-1)*Z_rows + m];

    return make_conv1d_matrix(Q_vec, (l-1)*(m+k-1)+k);
end  # make_conv2d_matrix

function make_conv1d_matrix(v::AbstractVector, n::Int)
    m = length(v);

    # It is much more efficient to construct sparse matrices
    # this way rather than starting from `spzeros` and indexing into it.
    Is = Int[];
    Js = Int[];
    Vs = eltype(v)[];
    for i = 1:n
        # column i holds v in rows i:i+m-1
        for (ind, j) = enumerate(i:i+m-1)
            push!(Is, j);
            push!(Js, i);
            push!(Vs, v[ind]);
        end
    end
    return sparse(Is, Js, Vs, m+n-1, n);
end  # make_conv1d_matrix

# main
function main()
    x = [Convex.Variable(m*s+2r, m*s+2r) for _ in 1:N];
    for t in 1:N
        @debug "x[$(t)].id_hash=$(Convex.show_id(x[t]))"
    end

    ± = [-1.0, +1.0];  # well, Julia does allow crazy variable names
    total = 0;
    for t in 1:N
        @info "Starting computation of constraint A[$(t)] at $(now()) ..."
        # 1. introduce sign perturbation `R`
        #a1 = x[1] + (rand(±, size(x[t])) .* x[t]);
        a1 = (rand(±, size(x[t])) .* x[t]);
        # 2. 2D-convolution with PSF matrix `Q`
        a2, cs = conv2d(Q, a1);
        @debug "Result of conv2d(Q,x): " a2
        # 3. build constraint
        for i in 1:s, j in 1:s
            total += a2[i,j] * a2[i,j];
        end
    end;  # for t
    constraint = (total < 42);
    @debug "final constraint is: " constraint
end  # main

main()
```
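(Aside: the banded sparse-matrix construction in `make_conv1d_matrix` is easy to sanity-check in isolation. The sketch below is a standalone, renamed copy of that function, applied to a toy kernel and signal; the function name `conv1d_matrix` and the example values are mine, not from the code above.)

```julia
using SparseArrays

# Column i of C holds the kernel v in rows i:i+m-1, so that
# C * u equals the full 1-D convolution of v with a length-n signal u.
function conv1d_matrix(v::AbstractVector, n::Int)
    m = length(v)
    Is, Js, Vs = Int[], Int[], eltype(v)[]
    for i in 1:n
        for (ind, j) in enumerate(i:i+m-1)
            push!(Is, j)        # row index
            push!(Js, i)        # column index
            push!(Vs, v[ind])   # kernel value
        end
    end
    return sparse(Is, Js, Vs, m + n - 1, n)
end

C = conv1d_matrix([1.0, 2.0, 3.0], 4)  # 6×4 sparse matrix
C * [1.0, 0.0, 0.0, 2.0]               # == [1.0, 2.0, 3.0, 2.0, 4.0, 6.0]
```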

Edit: Simplified MWE a bit.

I donβt think they are all the same, itβs just that the same variable appears many times before another variable shows up. If you set

```julia
Convex.MAXWIDTH[] = typemax(Int);
Convex.MAXDEPTH[] = typemax(Int);
```

to unrestrict the amount of printing, and then print the final `constraint`, you'll see other ids too if you scroll down enough.

As a possibly nicer way to check this, we can use the fact that Convex.jl's objects support the AbstractTrees interface. First, we modify `main` to return `constraint`, and then run

```julia
constraint = main()

using AbstractTrees
counts = Dict()
for node in AbstractTrees.Leaves(constraint)
    if node isa Convex.Variable
        current_count = get(counts, node.id_hash, 0)
        counts[node.id_hash] = current_count + 1
    end
end
```

which yields

```
julia> counts
Dict{Any,Any} with 5 entries:
  0x21104573583c0ed2 => 1681
  0x3c713c740f42a9bb => 1681
  0xf335f373fd14a1c0 => 1681
  0x764bd22d8542bb71 => 1681
  0x94d7782e4003012a => 1681
```

That is, weβve found 5 different variables, which each appear 1681 times in `constraints`. Note I used `AbstractTrees.Leaves` which only iterates over the βleavesβ of the tree, meaning the nodes with no children, since this is where variables appear; in other cases you might want a depth-first search or similar (shown in the linked docs).


Many thanks @ericphanson! You're right, I was not printing enough of the tree. (And thanks for the `AbstractTrees` tip, it's indeed much better than perusing logs.)
