I need to test 2,000 pairs of symbols periodically, say every 5 minutes.
Each pair will be tested 8 times with different parameters, which makes 16,000 tests if only one timeframe is used.
A multiple linear regression (MLR) is only the first step of the test but it is the trickiest.
An MLR is a linear regression with two independent variables, like y = f(x1, x2), so each data point has the form ((x1, x2), y).
Its solution is quite different from that of a simple linear regression, so I am looking for the least CPU-hungry way to code it.
Has anyone done it yet?
If anyone could point me to a direction, that would be great.
The most common way to solve an MLR is with matrix notation: build the normal equations, then invert and multiply matrices. I would like to avoid that, if possible.
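For what it's worth, with exactly two independent variables the normal equations collapse to a 2x2 system that can be solved in closed form, so no general matrix inversion or multiplication is needed. Below is a minimal sketch of that idea (my own illustration, not code from anyone's tester; the function name `mlr2` is made up), fitting y = b0 + b1*x1 + b2*x2 from centered sums of squares:

```python
def mlr2(x1, x2, y):
    """Two-regressor OLS fit, y = b0 + b1*x1 + b2*x2, without matrices."""
    n = len(y)
    m1 = sum(x1) / n
    m2 = sum(x2) / n
    my = sum(y) / n
    # Centered sums of squares and cross-products.
    s11 = sum((a - m1) ** 2 for a in x1)
    s22 = sum((b - m2) ** 2 for b in x2)
    s12 = sum((a - m1) * (b - m2) for a, b in zip(x1, x2))
    s1y = sum((a - m1) * (c - my) for a, c in zip(x1, y))
    s2y = sum((b - m2) * (c - my) for b, c in zip(x2, y))
    det = s11 * s22 - s12 * s12  # zero only if x1 and x2 are collinear
    b1 = (s22 * s1y - s12 * s2y) / det
    b2 = (s11 * s2y - s12 * s1y) / det
    b0 = my - b1 * m1 - b2 * m2
    return b0, b1, b2

# Example: data generated exactly from y = 2 + 3*x1 - 1.5*x2
x1 = [1, 2, 3, 4, 5]
x2 = [2, 1, 4, 3, 6]
y  = [2.0, 6.5, 5.0, 9.5, 8.0]
print(mlr2(x1, x2, y))  # -> approximately (2.0, 3.0, -1.5)
```

The cost is a few linear passes over the window per test, and the five sums can also be maintained incrementally as new bars arrive, which matters when running thousands of tests every cycle.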