Optimizing speed

While doing walk-forward optimization with several parameters, I find that running on 1-hour bars built from a 1-minute primary database is much slower than running on data that consists of native 1-hour bars. For example, a run that takes a couple of hours on hourly bars compressed from 1-minute data takes about 30 minutes on true hourly data.

Is there a way to avoid this delay other than making new data?
Thanks

http://www.amibroker.com/guide/x_performance.html


Yes, I know.
I optimize on everything from 5-minute up to daily data,
so I would rather keep a single database in AmiBroker.
My base data is 1-minute.
I just import the data from MetaTrader into AmiBroker.

With 1-minute base data and daily bars, every optimization step takes a couple of seconds.
With 1-minute base data and 1-minute bars, every step also takes a couple of seconds.
But with the same daily bars imported natively into AmiBroker, it runs about 100 steps per second.
So it is roughly 300 times faster with native daily data.

I had thought that compressing a 1-minute AmiBroker database to daily data for optimization would be a one-time operation, but it appears to work differently. I hope I am missing something.
Thanks

I hope you are not using TimeFrame functions instead of choosing the proper Periodicity in the Analysis settings?


A quote from https://www.amibroker.com/guide/h_timeframe.html:

IMPORTANT: TimeFrame functions are NOT intended to replace Periodicity setting. To switch periodicity/interval you should only use Periodicity setting. TimeFrame functions are ONLY for formulas that MIX many different intervals at once in single formula.

http://www.amibroker.com/guide/w_settings.html
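
To illustrate the distinction, here is a minimal AFL sketch (the daily filter and the entry rule are placeholders, not anyone's actual system). The Analysis Periodicity setting determines the interval the whole formula runs in; TimeFrame functions only pull a second interval into that formula:

// The formula runs in whatever interval Periodicity is set to in
// Analysis settings (e.g. hourly). TimeFrame functions are used only
// to MIX IN a second interval, here a daily trend filter.
TimeFrameSet( inDaily );              // switch built-in arrays to daily bars
dailyUp = C > MA( C, 20 );            // daily trend condition
TimeFrameRestore();                   // restore the base interval

trendOK = TimeFrameExpand( dailyUp, inDaily );   // align daily array to base bars

Buy  = trendOK AND Cross( MACD(), Signal() );    // entry rule in the base interval
Sell = Cross( Signal(), MACD() );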


Besides, always perform your backtests/optimizations on a local database.



Thanks for your input.
Yes, I am doing it right.
I would expect that even with 1-minute base data the conversion to daily data would be a
one-time calculation, and that AmiBroker would afterwards run at the same speed as with
true daily data. But per calculation, 20 years of 1-minute base data is at least hundreds
of times slower than 20 years of true daily data.

See again http://www.amibroker.com/guide/x_performance.html and the KB article http://www.amibroker.com/kb/2017/10/06/limits-of-multithreading/ for what is required to run the code. Your code runs at exactly the same speed if the number of bars (BarCount) is the same.

If BarCount is higher, it will obviously run slower (more memory cells to process). Memory access speed is not linear with size: the CPU's on-chip cache is typically 10x faster than RAM.

If you are running code in a non-native interval, there is an extra step of compressing data from the native to the desired interval, but it is nowhere near the figures you bring up. Again, see http://www.amibroker.com/kb/2017/10/06/limits-of-multithreading/; it explains that you should look at the 'data' figure in the Info tab.
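
To separate the two costs in practice, one rough approach (a sketch, not taken from the KB article; the trading rule is a placeholder) is to time the formula body with GetPerformanceCounter() and compare that against the 'data' figure in the Info tab:

t0 = GetPerformanceCounter();         // high-resolution timer, milliseconds

// ... trading rules under test (placeholder below) ...
Buy  = Cross( MA( C, 10 ), MA( C, 50 ) );
Sell = Cross( MA( C, 50 ), MA( C, 10 ) );

elapsed = GetPerformanceCounter() - t0;
_TRACE( "Formula body: " + elapsed + " ms" );   // output appears in the Log window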


Is this a one-time operation before the optimization starts, or does the data
compression have to be done for every parameter setting, which can easily be
100k times?
When I time it, it seems to be the latter.

I have measured the time differences and it is really >100x.
I am optimizing a portfolio of 6 tickers with 20 years of 1-minute base data.
I would understand it being that slow if I optimized on 1-minute data, but when
compressing to daily I expected it to be reasonably close to native daily data.

Is there another way to make it faster without creating separate primary
data sets (1-hour, 4-hour, daily)?

Thanks

All the answers are already in the docs and the KB article I quoted, including precise instructions on how to find out how much time is consumed by every part of the process. I spent a whole day writing that KB article and I am not a big fan of repeating myself so often. As I wrote in the KB, the devil is in the details (settings, RAM size, optimization mode), and depending on those details the compression will be done once or more than once per optimization.


Yes, I know, and I did find out that it is very slow in this particular case.
I hope you can improve it in the future.
The documentation is very good, but it does not fix the issue I have.
I know the workaround.

Just wanted to let you know that the problem is not as bad as I first thought.

I did a new test with another database, now with 5-minute primary data, and in this case
there is no (roughly 100x) slowdown while optimizing on hourly/daily data.

The "slow" database with 1-minute primary data is not slow on hourly/4-hour intervals,
but only on daily data (>100x).

I ran a test under different conditions with other data.

Both are walk-forward runs on 15 years of Forex data:

True daily data, single ticker: 100 runs/second
True daily data, portfolio of 6 tickers: 60 runs/second

Daily bars from 5-minute primary data, single ticker: 18.5 runs/second
Daily bars from 5-minute primary data, portfolio of 6 tickers: 2 runs/second

So in this case, daily bars built from 5-minute data are 50 times slower than true daily data.

Sorry, not 50 times slower but 60/2 = 30 times slower on daily bars based on 5-minute data.

Run an INDIVIDUAL optimization and your timings will be totally different (equal speed). Also, you never gave the formula you use, but the timings you got tell me it is apparently trivial, as most of the time is spent on things other than running the formula. If you tried something even remotely CPU-intensive, you would get totally different results too.
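
To see this effect, compare a trivial rule against a deliberately CPU-heavy one; with the explicit per-bar loop below (purely illustrative), formula execution dominates the timing and compression overhead becomes a rounding error:

// Trivial rule: fixed per-run overhead (setup, data preparation) dominates
// Buy  = Cross( C, MA( C, 200 ) );

// Deliberately CPU-heavy variant: explicit per-bar loop
score = C * 0;                        // array of zeros
for( i = 1; i < BarCount; i++ )
{
    score[ i ] = 0.99 * score[ i - 1 ] + ( C[ i ] - O[ i ] );
}
Buy  = Cross( score, 0 );
Sell = Cross( 0, score );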


OK, but I need the custom backtester and I need several tickers in a portfolio;
optimizing on a single ticker often produces a curve-fitted result.
I am searching for OHLC patterns over the last 3 days, among other things.
Every day has 4 parameters (O, H, L, C), so when searching for a pattern across 3 days
there are 12 parameters, plus some others. So I do have quite
a few parameters, but the Tribes optimizer gives good results. (A rough sketch of such a setup follows below.)
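
For context, such a setup might look roughly like this sketch: twelve Optimize() calls (one weight per O/H/L/C value over the last 3 days) driven by the built-in Tribes engine. All names, ranges and the scoring rule here are invented for illustration:

OptimizerSetEngine( "trib" );         // non-exhaustive Tribes (swarm) optimizer

// One optimizable weight per OHLC value over the last 3 bars = 12 params
wO1 = Optimize( "wO1", 0, -1, 1, 0.1 );  wH1 = Optimize( "wH1", 0, -1, 1, 0.1 );
wL1 = Optimize( "wL1", 0, -1, 1, 0.1 );  wC1 = Optimize( "wC1", 0, -1, 1, 0.1 );
wO2 = Optimize( "wO2", 0, -1, 1, 0.1 );  wH2 = Optimize( "wH2", 0, -1, 1, 0.1 );
wL2 = Optimize( "wL2", 0, -1, 1, 0.1 );  wC2 = Optimize( "wC2", 0, -1, 1, 0.1 );
wO3 = Optimize( "wO3", 0, -1, 1, 0.1 );  wH3 = Optimize( "wH3", 0, -1, 1, 0.1 );
wL3 = Optimize( "wL3", 0, -1, 1, 0.1 );  wC3 = Optimize( "wC3", 0, -1, 1, 0.1 );

// Weighted score over the last 3 days' OHLC values (placeholder rule)
score = wO1*Ref(O,-1) + wH1*Ref(H,-1) + wL1*Ref(L,-1) + wC1*Ref(C,-1)
      + wO2*Ref(O,-2) + wH2*Ref(H,-2) + wL2*Ref(L,-2) + wC2*Ref(C,-2)
      + wO3*Ref(O,-3) + wH3*Ref(H,-3) + wL3*Ref(L,-3) + wC3*Ref(C,-3);

Buy  = Cross( score, 0 );
Sell = Cross( 0, score );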

Are there compression issues with this type of optimization?
I mean, I don't have any problem with uncompressed data.