Pro Tip: Making parameter entry faster

I've been working with perceptrons lately, which involve a lot of changing parameters. So I came up with this simple piece of code that makes entering them much faster.

params = ParamStr("Tab Sep Params","1	2	3	4	5	6"); // Enter your params as a tab-separated list.

x1 = Optimize("X1",StrToNum(StrExtract(params,0,separator = '	')),-10,10,1);
x2 = Optimize("X2",StrToNum(StrExtract(params,1,separator = '	')),-10,10,1);
x3 = Optimize("X3",StrToNum(StrExtract(params,2,separator = '	')),-10,10,1);
x4 = Optimize("X4",StrToNum(StrExtract(params,3,separator = '	')),-10,10,1);
x5 = Optimize("X5",StrToNum(StrExtract(params,4,separator = '	')),-10,10,1);
x6 = Optimize("X6",StrToNum(StrExtract(params,5,separator = '	')),-10,10,1);

This allows you to have a single parameter that controls as many sub-parameters as you'd like.
This is helpful because entering parameters normally requires a mouse click to get into the dialog box for each one, which adds up quickly when you have many parameter combinations to enter manually.

Now, you could use a comma, a space, or any other separating character between the values, but a TAB is used here for a reason: if you copy the contents of an optimization report and paste it into a notepad file (or a blank page in the AFL Editor), you can then simply copy the row of values you want and paste it into this single parameter!

Hopefully this helps someone out there.


It is highly recommended to use "\t" to denote the tab character.
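Following that recommendation, the code above could be rewritten with the escaped tab instead of a literal tab character (a sketch; only the first two lines are shown):

```
// Default value uses "\t" escapes instead of literal tab characters
params = ParamStr("Tab Sep Params", "1\t2\t3\t4\t5\t6");

x1 = Optimize("X1", StrToNum(StrExtract(params, 0, separator = '\t')), -10, 10, 1);
x2 = Optimize("X2", StrToNum(StrExtract(params, 1, separator = '\t')), -10, 10, 1);
```

The escaped form survives copy/paste and editor re-indentation, where a literal tab can silently be converted to spaces.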

Also remember that you really don't need parameters to store default/best values for optimization variables. AmiBroker has an auto-optimization feature that automatically uses the best result as the default value in Optimize() calls.

It is documented in Release Notes for version 6.25:


Very cool feature @Tomasz, thanks for mentioning it here! Also, \t noted.

I've personally been having trouble consistently finding the "best" parameters in an algorithmic way during walk-forward optimization. I've experimented with different custom backtesters without consistent luck. So I've resorted to a more traditional "hold out" method of optimization, which requires manually testing values from the training set on the unseen portion of my data.

So for me, the "best" values are those that perform well in the training data (though not necessarily the very best there), AND continue to perform well in the held-out data. Now I wonder: :thinking: is there a function for this sort of "hold out" testing?

EDIT: Wait, maybe what you describe is what I'm looking for? Can I take this * file with all of the best results and automatically test each of them on my held-out data?


You can modify the file externally (either manually or programmatically) if you wish and put in any parameter combinations you want.

To answer my own question: after playing with this, it does not seem to solve the issue of testing the profitable combinations found in training data against held-out data.

The code Tomasz posted above is nice if you are algorithmically choosing (and using) the best combination in the standard walk-forward manner.

I believe a solution for what I'd truly like would require a new "hold out" backtest method. The key difference from the walk-forward method would be that all profitable combinations from the training data would then be tested against the unseen data. The "best" combination of parameters would be chosen based on how performance on the held-out portion relates to performance on the training portion. Lower-level control would be better, letting the user dictate which statistics to compare for this ranking.

This might all be possible through SetCustomBacktestProc(), but I could imagine it getting pretty elaborate.
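As a starting point, a minimal custom backtester skeleton shows where such comparison logic would live. This is only a sketch: the hold-out comparison itself is left out, and the metric name "HoldOutScore" is hypothetical.

```
SetCustomBacktestProc("");  // use this same formula as the custom backtester

if( Status("action") == actionPortfolio )
{
    bo = GetBacktesterObject();
    bo.Backtest();  // run the default portfolio backtest first

    st = bo.GetPerformanceStats( 0 );  // 0 = statistics for all trades

    // Hypothetical ranking value; the actual hold-out comparison
    // would be computed here from whichever stats you care about
    score = st.GetValue( "NetProfit" );
    bo.AddCustomMetric( "HoldOutScore", score );
}
```

The custom metric then appears as an extra column in the optimization report, so combinations can be ranked by it instead of the built-in target.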


My point was that you can use #include file with parameter values and that file can be programmatically generated saving you from manual entry/modification of parameters.
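As a sketch of that approach (the file name and values below are hypothetical), the generated include file would contain plain AFL assignments:

```
// best_params.afl -- generated externally, e.g. by a script
// that parses the optimization report
x1default = 3;
x2default = -2;
```

and the main formula would pull them in:

```
#include "best_params.afl"

x1 = Optimize("X1", x1default, -10, 10, 1);
x2 = Optimize("X2", x2default, -10, 10, 1);
```

Regenerating the file updates the defaults without touching the main formula.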

