Hi all, I’ve been trying to work through this problem, and I need a little bit of help.
I have a dataframe that looks something like this:
| Y | X1 | X2 | X3 | X4 | X5 |
|---|----|----|----|----|----|
| 1 | 22 | 23 | 25 | 98 | 23 |
| 8 | 52 | 23 | 15 | 28 | 63 |
And so on. I want to evaluate all the linear models between Y and every combination of 2 of the other columns. For that I am doing this:
list_comb = collect(combinations(list_cols, 2)) # where list_cols contains all the column names except Y
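Written out with the imports, the part I have so far looks roughly like this (I build list_cols with setdiff, which may or may not be the cleanest way to drop Y):

```julia
using DataFrames, Combinatorics

# every column name except the response Y
list_cols = setdiff(names(df), ["Y"])

# all pairs of predictor names: ["X1", "X2"], ["X1", "X3"], ...
list_comb = collect(combinations(list_cols, 2))
```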
But then I really don’t know how to continue.
I thought about using a loop like this:
for i in list_comb
    lm(@formula(Y ~ i), df)
end
But this gives the following error: ArgumentError: There isn't a variable called 'i' in your data; the nearest names appear to be:
(this is also inside a let block).
I later want to evaluate the r² of every combination so I can pick the best one, and I know that part is easily done with a list and append!, but I need some help with this first part.
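For reference, this is roughly the shape I’m aiming for once the formula part works; the term(...) call is just my guess from skimming the StatsModels docs on building formulas programmatically, so it may well be wrong:

```julia
using DataFrames, GLM, StatsModels, Combinatorics

results = []  # will hold (pair of column names, r²) tuples

for combo in list_comb
    # build Y ~ Xi + Xj from the column names instead of hard-coding them in @formula
    f = term(:Y) ~ sum(term.(Symbol.(combo)))
    model = lm(f, df)
    push!(results, (combo, r2(model)))
end

# pick the pair of predictors with the highest r²
best = results[argmax([r for (_, r) in results])]
```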
Thanks a lot!