In backtesting over the last 20 years it performs "good enough" for my purposes: a CAR/MDD greater than 1 and a maximum drawdown under 20%. However, I'd like some thoughts from this incredible group. When I run an optimization using CMA-ES it takes about 3 minutes to give me some decent variables to try, and I do see several very similar combinations that all perform quite well. Ultimately I'd like to perform an exhaustive optimization of all parameters, but last I checked my computer said it would take around a billion years. I don't imagine I'll live that long (no relation to Soros).

My question is: non-exhaustive optimizers like CMA-ES are obviously good enough to get us in the ballpark for a given set of values, but when we have millions of permutations, is it reasonable to assume that the values we get are "close" to what we would get from an exhaustive test? Put another way, what are the capabilities of optimization engines like CMA-ES, SPSO, and TRIB? Is there a cut-off where I need to remove some variables and hardcode them?

Just a discussion, thanks for any replies!

Respectfully,

Tony Roylance

If you do the math and divide 3 minutes (the CMA-ES time) by a billion years (the exhaustive optimization), you will see that the smart optimizer is able to visit only a very small portion of the search space. Your odds of finding the global optimum increase when you allow the smart optimizer to work longer. The time it takes (and the number of evaluations) for the smart optimizer to look for solutions depends on its settings. For CMA-ES you can increase the "Runs" parameter (OptimizeSetOption). More details at the bottom of this article:

http://www.amibroker.com/guide/h_optimization.html
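To put those numbers in perspective, here is a quick back-of-the-envelope calculation. The 3 minutes and one billion years are the figures from the post above; the rest is plain unit conversion, assuming evaluations proceed at the same rate in both cases:

```python
# Rough estimate of what fraction of the search space a 3-minute
# smart-optimizer run covers, relative to an exhaustive search that
# would take about a billion years at the same evaluation rate.

MINUTES_PER_YEAR = 365.25 * 24 * 60           # ~525,960 minutes per year
exhaustive_minutes = 1e9 * MINUTES_PER_YEAR   # a billion years, in minutes
smart_minutes = 3

fraction = smart_minutes / exhaustive_minutes
print(f"Fraction of search space visited: {fraction:.2e}")  # ~5.7e-15
```

In other words, the smart optimizer sees only a few parts per quadrillion of the full space, which is why letting it run longer (more "Runs") directly improves the odds.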

Generally speaking, the smoother your search space is, the more likely a smart optimizer is to find the global optimum. The most challenging are boolean parameters; they do not work well with smart optimization methods.
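One practical workaround is to take the booleans out of the smart optimizer's hands entirely: enumerate every boolean combination exhaustively (there are only 2^n of them) and let the smart optimizer handle just the continuous parameters within each combination. The sketch below uses a toy objective with invented parameter names, and plain random search stands in for CMA-ES; in AmiBroker, the backtest score would play the role of `score`:

```python
import itertools
import random

# Toy objective standing in for a backtest score: two boolean switches
# plus one continuous parameter. Smooth in the continuous parameter,
# discontinuous in the booleans. All names/ranges are hypothetical.
def score(use_filter, use_stop, threshold):
    base = -(threshold - 0.6) ** 2            # peak at threshold = 0.6
    return base + (0.3 if use_filter else 0.0) + (0.1 if use_stop else 0.0)

best = None
# Exhaustively enumerate the 4 boolean combinations...
for bools in itertools.product([False, True], repeat=2):
    # ...and search only the continuous parameter inside each one.
    # A real setup would run CMA-ES here instead of random sampling.
    random.seed(0)                            # same samples per combo
    for _ in range(200):
        t = random.uniform(0.0, 1.0)
        s = score(*bools, t)
        if best is None or s > best[0]:
            best = (s, bools, t)

print(best)   # best score near 0.4, booleans (True, True), threshold near 0.6
```

This keeps the discontinuous part of the space out of the smart optimizer's way, at the cost of multiplying the run count by the number of boolean combinations.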

As to exhaustive optimization: if you have 9 variables, you can divide the total number of combinations by 1,000,000,000 (one billion) by making the step for each variable 10x larger than it is now. This allows you to do a coarse exhaustive search in a more reasonable time.
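As a quick sanity check on that arithmetic, assuming (hypothetically) 9 variables with 100 steps each:

```python
# With 9 variables at 100 steps each, exhaustive search needs 100**9
# evaluations. Making each step 10x coarser leaves 10 steps per
# variable, so the count shrinks by a factor of 10**9 (one billion).
n_vars = 9
fine = 100 ** n_vars      # combinations at the original step size
coarse = 10 ** n_vars     # combinations with a 10x larger step
print(fine // coarse)     # prints 1000000000
```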

Every day hundreds of links get invalidated because people remove or move pages. I don't know why, but Microsoft, for example, changes their links non-stop. Probably they have too many people on their staff with nothing better to do. I can't possibly spend every day finding and fixing links. If anybody is interested, a Google search on the old link can reveal the new location.

Indeed, I found this one: Tutorial: Covariance Matrix Adaptation (CMA) Evolution Strategy, and a link to another page that provides implementations of CMA-ES and links to libraries that contain such implementations.

Tomasz, thank you for the quick, informative reply! I (and I'm sure everyone else) am extremely appreciative when you answer our questions. I'll try a combination of both (increasing the steps, and increasing the number of runs of CMA-ES).

Thank you very much again.

Regards,

Tony Roylance

Note that increasing the step is OK for **exhaustive** search only. CMA-ES does not need an increased step value; in fact, it can work better with smaller steps because it is designed for continuous search spaces. If the step is too large, CMA-ES may finish prematurely.
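A small illustration of why (the 1..100 range below is hypothetical): the step size determines how many distinct values the optimizer can ever try, and a large step collapses a continuous range to just a handful of points, starving CMA-ES of anything to adapt to:

```python
# Number of distinct values a parameter can take for a given step,
# over a hypothetical optimization range of 1..100.
lo, hi = 1, 100
for step in (1, 10, 50):
    n_values = int((hi - lo) / step) + 1
    print(f"step={step:>2}: {n_values} distinct values")
# step= 1: 100 distinct values
# step=10: 10 distinct values
# step=50: 2 distinct values
```

With only two or three points per dimension there is effectively nothing continuous left for the covariance adaptation to work with, which is why the run can terminate early.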