LoadError: @NLconstraint doesn't recognize scalar values

I am trying to build several linear and nonlinear constraints with Julia/JuMP. I know that @NLconstraint only accepts scalar expressions, and I have been careful to pass only scalars. The problem seems to be that, although I index into my variables element by element, JuMP still treats the expression as an array.

Below is my simplified code:

using JuMP
using Ipopt

m = Model(solver=IpoptSolver())

@variable(m, H[i=1:J, j=1:Nt])
@variable(m, x[i=1:E, j=1:Nt])
@variable(m, L[i=1:V, j=1:Nt] >= 0)

for e = 1:E
    for t = 1:Nt
        # dev is the vector A12*H[:,t] + A10*Hf, one entry per e
        dev = A12*H[1:J,t] + A10*Hf
        if e in V_index
            v = find(V_index -> V_index == e, V_index)  # position of e in V_index
            @constraint(m, dev[e,1] .== L[v,t])
        end
    end
end


where A12 and A10 are constant matrices, and Hf and V_index are column vectors.

The line

    @constraint(m, dev[e,1] .== L[v,t])

only works with the '.' before the equals sign. Otherwise I get "LoadError: The operators <=, >=, and == can only be used to specify scalar constraints. If you are trying to add a vectorized constraint, use the element-wise dot comparison operators (.<=, .>=, or .==) instead". As far as I can tell, dev[e,1] is just a combination of two entries of the H variable, e.g. H[1,1] - H[2,1], not a vector. Why is JuMP forcing me to make the equality comparison element-wise?
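To narrow down where the array-ness might come from, here is a standalone snippet of the same indexing pattern with hypothetical sizes (Julia 0.6, where find returns a vector of indices):

```julia
# Julia 0.6: find(pred, v) returns a Vector{Int} of all matching positions
V_index = [3, 5, 7]               # hypothetical index vector
e = 5
v = find(x -> x == e, V_index)    # v == [2]: a one-element Vector{Int}, not an Int

L = rand(3, 4)                    # stand-in for the JuMP variable matrix
println(typeof(L[v, 1]))          # indexing with a vector yields a one-element Array
println(typeof(L[v[1], 1]))       # indexing with the scalar v[1] yields a scalar
```

So L[v,t] in my constraint may itself be a one-element array rather than a scalar, even when v matches exactly one position.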

The error I get for the @NLconstraint is:

ERROR: LoadError: MethodError: no method matching parseNLExpr_runtime(::JuMP.Model, ::JuMP.GenericAffExpr{Float64,JuMP.Variable}, ::Array{ReverseDiffSparse.NodeData,1}, ::Int64, ::Array{Float64,1})
Closest candidates are:
  parseNLExpr_runtime(::JuMP.Model, ::Number, ::Any, ::Any, ::Any) at C:\...\JuliaPro\pkgs\v0.6\JuMP\src\parsenlp.jl
  parseNLExpr_runtime(::JuMP.Model, ::JuMP.Variable, ::Any, ::Any, ::Any) at C:\...\JuliaPro\pkgs\v0.6\JuMP\src\parsenlp.jl
  parseNLExpr_runtime(::JuMP.Model, ::JuMP.NonlinearExpression, ::Any, ::Any, ::Any) at C:\...\JuliaPro\pkgs\v0.6\JuMP\src\parsenlp.jl:208

I have also tried expressing dev as an @NLexpression. Any help would be appreciated.
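For completeness, the @NLexpression attempt looked roughly like this (a reconstruction, not the exact code; the sums expand the matrix products element-wise, since the nonlinear macros do not accept matrix algebra):

```julia
# Expand dev = A12*H[:,t] + A10*Hf entry by entry for the nonlinear macros
@NLexpression(m, dev_nl[e=1:E, t=1:Nt],
    sum(A12[e,j] * H[j,t] for j = 1:J) +
    sum(A10[e,k] * Hf[k]  for k = 1:length(Hf)))

# inside the loops, where v = find(V_index -> V_index == e, V_index):
@NLconstraint(m, dev_nl[e,t] == L[v,t])   # this is the line that errors
```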