Is it possible to solve a problem for a subset of the parameters?

I have a big dataset and I have already written the model in the following format:

@constraint(model, [f in F],
    sum(x[(i,j,k)] for (i,j,k) in families[f] if i == pref[f]) == 1)  # families[f] is a dictionary mapping f to a vector of (i,j,k) combinations
# etc ...

I have around 1M data points for f, and now I would like to run the model for a random subset of about 100 of them. However, I cannot simply write a line like F = F[1:100] and run the model; because everything is written in dictionary format, I get a KeyError.
Computing those dictionaries takes very long because of the huge number of combinations, which is why I calculated them all once beforehand, saved them, and now just load them as a whole to solve the model. But I am not able to select a subset (randomly or in order) and run the model on the smaller data. Is there any way around this?
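Since the dictionaries are already computed and saved, one way around this is to sample the keys first and rebuild each dictionary restricted to that sample before building the model. A minimal sketch, assuming `families` and `pref` are `Dict`s keyed by `f` (the `_sub` names are made up here):

```julia
using Random

# Sample 100 family keys at random from the precomputed dictionary.
F_sub = shuffle(collect(keys(families)))[1:100]

# Restrict every dictionary that is keyed by f to the sampled keys,
# so the model only ever looks up keys that actually exist.
families_sub = Dict(f => families[f] for f in F_sub)
pref_sub     = Dict(f => pref[f]     for f in F_sub)
```

Then build the model over `F_sub`, `families_sub`, and `pref_sub` instead of the full data.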

You mean, you want to add only a subset of the constraints? Or also all of the variables etc.

There’s nothing built into JuMP to do this. But you can achieve the result by writing something like:

for f in F
    if should_add(f)
        @constraint(model, sum(x[(i,j,k)] for (i,j,k) in families[f] if i == pref[f]) == 1)
    end
end
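If you also want to create only the variables for that subset, the same filtering works one level earlier. A sketch, with `should_add` being the same hypothetical filter as above (`unique` guards against the same tuple appearing under several families, which would be a duplicate index):

```julia
# Collect only the index tuples that belong to a selected family,
# then declare variables over that restricted index set.
keep = unique([(i, j, k) for f in F if should_add(f) for (i, j, k) in families[f]])
@variable(model, x[keep])
```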

Thanks @odow. I want to just downsize the data and then run the model, with all of its constraints and variables, on that subset. My problem is that because everything is written in Dict and Tuple form, when I do F = F[1:20] to work with the first 20 elements as a subset of F (instead of 1M data points), I get a dimensionality mismatch or sometimes a KeyError. For instance, suppose I do:

F = F[1:20]  # a dictionary that maps families to vectors of tuples (i,j,f), where (i,j) is a location
H = H[1:10]  # a dictionary that maps houses to vectors of location tuples
FH = [(f,h) for f in F for h in H if (f,h) in FH]  # downsizes the list of family-house pairs

Now when I run the model it returns many KeyErrors here and there, since some dictionaries inside the constraints have keys from the downsized “F” or “H” but values that still come from the bigger 1M dataset.
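One way to avoid those KeyErrors is to downsize everything in one pass, filtering the *values* of each dictionary as well as its keys against the chosen subsets. A sketch using the names above (`cand` is a hypothetical extra dictionary whose values reference houses):

```julia
# Take the first 20 family keys and first 10 house keys.
F_sub = collect(keys(F))[1:20]
H_sub = collect(keys(H))[1:10]
Fset, Hset = Set(F_sub), Set(H_sub)

# A pair survives only if both endpoints are in the downsized sets.
FH_sub = [(f, h) for (f, h) in FH if f in Fset && h in Hset]

# Any dictionary whose values mention houses must be filtered too;
# otherwise a valid key can still return entries from the full data.
cand_sub = Dict(f => [h for h in cand[f] if h in Hset] for f in F_sub)
```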

You’ll have to fix those as well? I don’t really understand the question. It’s hard to offer advice without a reproducible example.