I’m looking for a simple-to-use, single-variable least-absolute-deviation (LAD) linear regression library. My problem is modeled by a single-variable function with a small set (fewer than 32) of observations, and I want a linear approximation of it using least absolute deviation.
Something like this, perhaps? It computes the LAD estimate by iteratively reweighted least squares:
(T,K) = (size(x,1), size(x,2))
b_old = zeros(K)
b = fill(1e+6,K) .+ prec           #start far from b_old so the loop runs at least once
wii = ones(T)
while maximum(abs.(b - b_old)) > prec
    b_old = copy(b)
    xw = x .* wii
    b = (xw'x)\(xw'y)              #solve (xw'x)b = xw'y for b
    u = y - x*b
    wii = min.(1.0 ./ abs.(u), 1/epsu)  #Amemiya p 78, cap since 1/abs(u) can be infinite
end
yhat = x*b                         #fitted values
u = y - yhat
Sxx = x'x/T                        #Sum{x(t)*x(t)'}/T
h = 1.06*std(u)/T^0.2              #Silverman bandwidth for the kernel density
Kh = exp.(-0.5*((u/h).^2))/sqrt(2*pi)  #Gaussian kernel, with "std" = 1
f0 = mean(Kh)/h                    #estimated density of u at 0
Covb = f0^(-2)*inv(Sxx)/(4*T)      #asymptotic Cov(b) of the LAD estimator
Thanks, I will give it a try. What are the dependencies?
You need to do using Statistics (for the mean and std calculations). It’s one of the standard libraries, so no installation is required.
Before running the code, you need to define y (vector with the dependent variable), x (matrix of regressors), prec (perhaps 1e-5), and epsu (perhaps 1e-6).
What’s a matrix of regressors? I’m not familiar with the statistical lingo. Let’s say my observations are 1->3, 2->4, 3->7. Do I define:
x = [1, 2, 3]
y = [3, 4, 7]
How do I get predicted values?
The code wants x to be a matrix, so do x = [1 2 3]'. Then you find the fitted values in yhat.
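Putting the pieces together, here is a self-contained sketch run on those three observations. Wrapping the posted loop in a function (the name lad and the default prec/epsu values are my own choices, not from a library) also avoids Julia’s global-scope issues when running as a script:

```julia
# Least-absolute-deviation fit via iteratively reweighted least squares,
# following the loop posted above.
function lad(y, x; prec = 1e-5, epsu = 1e-6)
    (T, K) = (size(x,1), size(x,2))
    b_old = zeros(K)
    b = fill(1e+6, K) .+ prec
    wii = ones(T)
    while maximum(abs.(b - b_old)) > prec
        b_old = copy(b)
        xw = x .* wii
        b = (xw'x)\(xw'y)                  # weighted least squares step
        u = y - x*b
        wii = min.(1.0 ./ abs.(u), 1/epsu) # cap the weights, 1/abs(u) can blow up
    end
    return b
end

x = [1.0 2.0 3.0]'      # matrix of regressors, one column
y = [3.0, 4.0, 7.0]     # dependent variable
b = lad(y, x)
yhat = x*b              # fitted values
```

For these data the slope should converge to roughly 7/3, i.e. yhat near [2.33, 4.67, 7.0], with a sum of absolute residuals of about 4/3.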
It may be overkill here, but is there a succinct way to do it with Flux?
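For what it’s worth, LAD is just linear regression under a mean-absolute-error loss, so a Flux version can be sketched as below. This assumes Flux’s explicit-gradient API (Flux ≥ 0.14); Flux.mae is its built-in mean-absolute-error loss, and the learning rate and epoch count here are arbitrary guesses, not tuned values:

```julia
using Flux

# LAD regression as gradient descent on mean absolute error.
x = Float32[1 2 3]        # 1×3: Flux expects features × observations
y = Float32[3 4 7]
model = Dense(1 => 1)     # single-variable linear map: y ≈ w*x + b
opt = Flux.setup(Descent(0.01f0), model)
for epoch in 1:5000
    grads = Flux.gradient(m -> Flux.mae(m(x), y), model)
    Flux.update!(opt, model, grads[1])
end
yhat = model(x)           # predicted values
```

Because the absolute-error loss is piecewise linear, plain gradient descent only oscillates into the neighborhood of the optimum; the IRLS loop above is both quicker and more exact for a problem this small.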
I have a bit of a problem with the results. For:
x = [1 2 3 4]'
y = [0 1 0 1]'
I’m getting yhat:
4×1 Array{Float64,2}:
 0.24999448434228871
 0.49998896868457743
 0.7499834530268661
 0.9999779373691549
where sum(abs.(yhat-y)) (the LAD) is 1.5000110313154225. But
y0 = [0 1/3 2/3 1]'
provides a much better approximation, with a LAD of 1.3333333333333335.
y0 = [0 1/3 2/3 1]' is what you get when you also include a constant (an intercept) among the regressors. You get this by changing the matrix of regressors to
x = [1 1;
1 2;
1 3;
1 4]
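A quick way to verify this is to rerun the loop with both designs and compare the sums of absolute residuals. The helper lad below is just the posted loop wrapped in a function (the name is mine, not a library API):

```julia
# LAD via the iteratively reweighted least squares loop posted above.
function lad(y, x; prec = 1e-5, epsu = 1e-6)
    K = size(x, 2)
    b_old = zeros(K)
    b = fill(1e+6, K) .+ prec
    wii = ones(size(x, 1))
    while maximum(abs.(b - b_old)) > prec
        b_old = copy(b)
        xw = x .* wii
        b = (xw'x)\(xw'y)                  # weighted least squares step
        u = y - x*b
        wii = min.(1.0 ./ abs.(u), 1/epsu) # cap the weights
    end
    return b
end

y  = [0.0, 1.0, 0.0, 1.0]
x1 = reshape([1.0, 2.0, 3.0, 4.0], 4, 1)   # slope only
x2 = [ones(4) x1]                          # constant + slope
lad1 = sum(abs.(y - x1*lad(y, x1)))
lad2 = sum(abs.(y - x2*lad(y, x2)))
```

With the constant included, the LAD should drop from about 1.5 to about 4/3, matching the y0 above.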
Thank you. It works now.