Walk forward taking too long

does anyone know how to reduce walk-forward time? I have a pretty grunty server working on my formula


@mbirrell I don't know what data you have, but you may be testing thousands of symbols.

Perhaps try working on a smaller number of symbols (like a selected watch list)?

yes, true and a good point... however I am only getting around 5 trades a month and I don't want to reduce my profit (I am greedy :slight_smile:)

If you're really keen you can rent supercomputers from universities and other places. eg...


The screenshot suggests you are using a formula with inefficiently written custom backtester code. So, if you want help, show the code; see: How to ask a good question

Also make sure that you have set in-memory data cache large enough to accommodate ALL symbols in test (Tools->Preferences, "Data").

See http://www.amibroker.com/guide/x_performance.html

Ok... the answer is I got 5 SSD drives with fast write performance and configured them in a RAID 5 array. I went through the BIOS config and set it for maximum performance. I tweaked some settings in AmiBroker as per the doco and wow, it made a huge difference. One step of the walk forward went from around 30 minutes to less than 5... another example of "read the doco". So thank you all for your great suggestions and speedy replies.

Still most gains can usually be achieved from improving the formula.

A method I use is the Task Manager. Go to the Details tab and select Broker.exe. Right-click on it and change Set Priority to Above normal. Might not work for you if you don't have access to the server.

Setting priority has close to zero impact unless the machine is very busy with other processes, but in that case it would be better to reduce the number of other processes.

I set it to high but it didn't make any difference... that said it is really fast now.

As I wrote - show the code and the timings described here http://www.amibroker.com/kb/2017/10/06/limits-of-multithreading/. The screenshots you are showing suggest that the code is sub-optimal.

Thanks reading through it now...

Here is what I see in the info just FYI...
Optimization started.
Completed in 372.29 seconds. Number of rows: 1779
( Timings: data: 238.27, setup: 0.67, afl: 1171.41, job: 4744.92, lock: 4430.70, pbt: 0.05, UI thread: 238.99, worker threads: 5916.33/1485.64 )

As you can see, most of the time is spent acquiring data. Out of 372 seconds total, 238 seconds (64%) was spent reading the data from the database and preparing it. The formula itself executes very quickly, so there is little room for improvement there. You may consider going to the Settings and choosing Use QuickAFL if you have not done that already; that lowers the amount of data processed to the selected range. Also, in Tools->Performance Monitor, make sure that "symbols in the cache" during optimization is equal to or greater than the number of symbols under test.
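To see where the time goes, you can parse the timing line yourself. A quick sketch in Python (outside AmiBroker): the field names and numbers below are copied from the info output above, not a documented format, and the `worker threads: a/b` pair is omitted for simplicity since it uses a different shape.

```python
import re

# Timing line as printed in the AmiBroker info window above
# (the "worker threads: 5916.33/1485.64" entry is left out here)
timing_line = ("Timings: data: 238.27, setup: 0.67, afl: 1171.41, "
               "job: 4744.92, lock: 4430.70, pbt: 0.05, UI thread: 238.99")

total_seconds = 372.29  # "Completed in 372.29 seconds"

# Extract each stage name and its time in seconds
timings = {name.strip(): float(sec)
           for name, sec in re.findall(r"([A-Za-z ]+): ([\d.]+)", timing_line)}

# Share of wall-clock time spent reading and preparing data
data_share = timings["data"] / total_seconds
print(f"data: {timings['data']}s of {total_seconds}s ({data_share:.0%})")
```

Note that per-stage times such as `afl` can exceed the wall-clock total because they are summed across worker threads; only the ratio of `data` to the wall-clock total tells you how data-bound the run is.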

If you walk forward on hourly data with 1-min primary data, you could try exporting it as 1-hour data and then running with 60-min primary data. I have had big improvements doing that. It takes time to compress from 1 min to 60 min on every run, which you can save if you have a time-intensive run on a big portfolio.
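If you want to pre-compress the bars yourself before importing the hourly data, here is a minimal sketch of 1-min to 60-min OHLCV compression in plain Python. The bar layout (tuples of timestamp, open, high, low, close, volume) and the helper name are my own assumptions for illustration, not anything AmiBroker-specific.

```python
from datetime import datetime

def compress_to_hourly(minute_bars):
    """Compress 1-minute OHLCV bars into 60-minute bars.

    minute_bars: list of (datetime, open, high, low, close, volume)
    tuples sorted by time. Returns one bar per hour in the same shape.
    """
    hourly = {}
    for ts, o, h, l, c, v in minute_bars:
        key = ts.replace(minute=0, second=0, microsecond=0)
        if key not in hourly:
            # first bar of the hour sets the open
            hourly[key] = [key, o, h, l, c, v]
        else:
            bar = hourly[key]
            bar[2] = max(bar[2], h)   # highest high of the hour
            bar[3] = min(bar[3], l)   # lowest low of the hour
            bar[4] = c                # last close wins
            bar[5] += v               # accumulate volume
    return [tuple(bar) for bar in sorted(hourly.values())]

bars = [
    (datetime(2024, 1, 2, 9, 30), 10.0, 10.5, 9.9, 10.2, 100),
    (datetime(2024, 1, 2, 9, 45), 10.2, 10.8, 10.1, 10.7, 150),
    (datetime(2024, 1, 2, 10, 5), 10.7, 10.9, 10.6, 10.8, 80),
]
print(compress_to_hourly(bars))
```

The first two bars fold into a single 9:00 bar and the third starts a 10:00 bar, which is exactly the compression AmiBroker would otherwise repeat on every run.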

Thanks... will have a fiddle with that as soon as I have finished going over the light reading :nerd_face: that Tomasz has pointed me to. So much faster than my desktop. It really gets the job done way faster than I need it to.