It sounds like you are solving a least-squares problem by the "normal-equations" approach, i.e. you are computing $z = (X^T X)^{-1} X^T y$ to minimize the least-squares error $\Vert X z - y \Vert_2$. If you are getting "exploding" results, it is presumably because $X^T X$ is badly conditioned (nearly singular), in which case you have to be careful about how you solve the problem: your answer is very sensitive to roundoff errors and other errors (e.g. noise in the data).
If $X^T X$ is badly conditioned, then `inv(X'X) * (X'y)` could indeed give very different answers than `(X'X) \ (X'y)` because of differences in roundoff errors, but the correct interpretation is that both approaches are giving you an inaccurate result.
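To see this concretely, here is a sketch using a monomial (Vandermonde) fit as a stand-in for a badly conditioned `X`; the matrix and data below are illustrative assumptions of mine, not from the original question:

```julia
using LinearAlgebra

# Illustrative badly conditioned design matrix: monomials t^0, ..., t^9
# sampled at 50 points in [0, 1] (a Vandermonde matrix).
t = range(0, 1, length = 50)
X = t .^ (0:9)'          # 50×10 matrix
y = X * ones(10)         # data generated so the true coefficients are all 1

z1 = inv(X'X) * (X'y)    # explicit inverse of the normal equations
z2 = (X'X) \ (X'y)       # linear solve of the normal equations

# cond(X'X) == cond(X)^2, so the normal equations roughly double the
# number of digits lost to roundoff, and both z1 and z2 can be far
# from the true coefficient vector ones(10).
@show cond(X) cond(X'X)
@show norm(z1 - ones(10)) norm(z2 - ones(10))
```

Both `z1` and `z2` suffer from the squared condition number; the difference between them is just which roundoff errors you happen to commit.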
A better approach would be to use `X \ y`, which is equivalent to `qr(X, Val(true)) \ y` (spelled `qr(X, ColumnNorm()) \ y` in Julia ≥ 1.7): this uses a pivoted QR factorization to solve the least-squares problem, which is much more accurate than the normal-equations approach for badly conditioned `X`.
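Using the same kind of badly conditioned Vandermonde matrix as an illustration (the specific `X` and `y` here are assumptions of mine, not from the original question), the pivoted-QR solve recovers the coefficients accurately:

```julia
using LinearAlgebra

t = range(0, 1, length = 50)
X = t .^ (0:9)'               # badly conditioned 50×10 Vandermonde matrix
y = X * ones(10)              # true coefficient vector is ones(10)

z  = X \ y                    # least-squares solve via pivoted QR
zq = qr(X, ColumnNorm()) \ y  # the same factorization, spelled out
                              # (qr(X, Val(true)) in Julia < 1.7)

# The error scales like cond(X) * eps, not cond(X)^2 * eps as with
# the normal equations, so the recovered coefficients are accurate.
@show norm(z - ones(10)) norm(z - zq)
```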
If you need even more control over regularization for a badly conditioned problem, you could use a pseudo-inverse (`pinv`) approach. See also my explanation in another thread: Efficient way of doing linear regression - #33 by stevengj
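As a sketch of the `pinv` route (again with an illustrative badly conditioned matrix of my own choosing, not from the original question): `pinv` lets you drop singular values below a relative tolerance, which regularizes the solution:

```julia
using LinearAlgebra

t = range(0, 1, length = 50)
X = t .^ (0:9)'                             # badly conditioned 50×10 matrix
y = X * ones(10) .+ 1e-6 .* sin.(100 .* t)  # data with a little "noise"

# pinv computes the SVD and drops singular values below
# rtol * (largest singular value); raising rtol regularizes more
# aggressively, at the cost of some bias in the fit.
z = pinv(X; rtol = 1e-6) * y

@show norm(X * z - y)                       # residual stays small
```

Raising `rtol` trades fidelity to the data for a less noise-sensitive coefficient vector, which is often what you want when `X` is nearly rank-deficient.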