Standard Deviation in Custom backtest metric

I'm trying to calculate Van Tharp's SQN as a custom backtest metric.
I don't believe it's the best objective function, but this is more a way to learn AFL and AmiBroker than anything else.

This code doesn't work, and I suspect the problem is in how I calculate the standard deviation.
You can see that I calculate it on an array, "trades", which I populated manually to be sure it contains the right values.

The problem is that the "stdrmultiples" variable is always 0.

Do you see any problem in my syntax that would prevent the standard deviation from being calculated properly on the "trades" array?

/* Now custom-backtest procedure follows */ 
if( Status("action") == actionPortfolio ) 
{
    bo = GetBacktesterObject(); 

    bo.Backtest(1); // run default backtest procedure 
    st = bo.GetPerformanceStats(0); // get stats for all trades 
    averageRisk = st.GetValue("LosersAvgLoss"); // average loss = 1R

    SumProfitPerRisk = 0; 
    NumTrades = 0; 

    // iterate through closed trades first 
    for( trade = bo.GetFirstTrade(); trade; trade = bo.GetNextTrade() ) 
    {
        RMultiple = trade.GetProfit() / abs( averageRisk ); // R-multiple per trade
        SumProfitPerRisk = SumProfitPerRisk + RMultiple;    // sum of all R-multiples
        NumTrades++;                                        // count trades so the average below is valid
        trade.AddCustomMetric( "R-Multiple", RMultiple ); 
    }

    // manually populated array, just to verify StDev() gets proper input
    trades = 0;
    trades[ 0 ] = 1;
    trades[ 1 ] = 11;
    trades[ 2 ] = 41;
    trades[ 3 ] = 21;
    _TRACE( "trades " + trades[ 3 ] ); // 21

    expectancy = SumProfitPerRisk / NumTrades; // average of the R-multiples = expectancy

    bo.AddCustomMetric( "Expectancy (per risk)", expectancy ); 

    stdrmultiples = StDev( trades, 4, False );
    _TRACE( "stdrmultiples " + stdrmultiples ); // 0

    MSQN = ( expectancy / stdrmultiples ) * sqrt( 4 );
    bo.AddCustomMetric( "stddev", MSQN ); 
}



Found this thread: "Trying to use stdev in a custom metric - I get no output from stdev but others work"

Should I assume that the standard deviation function doesn't work for calculating custom metrics?
If so, why is that?

It's not that the Standard Deviation function "doesn't work", it's that it's not a good match for what you're trying to do. The StDev() function takes an array and a lookback period as input, and it produces a new array as output. In the CBT, all arrays are the same size, with one element per bar in the backtest date range. In your example, you only used 4 of the N bars of the trades array, so the standard deviation value that you want is in the fourth element of the returned array (i.e. index 3). But what if your backtest had produced more trades than there were bars? You wouldn't have room to store all your values.
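To make that concrete, here is a hypothetical sketch (my own illustration, not code from this thread) showing where the value ends up if you stick with the array approach. It assumes the backtest range has at least 4 bars:

```
// StDev() returns an ARRAY: one value per bar, each covering the
// preceding "period" elements. With 4 values stored at indices 0-3,
// the standard deviation over all of them sits at index 3.
trades = 0;
trades[ 0 ] = 1;
trades[ 1 ] = 11;
trades[ 2 ] = 41;
trades[ 3 ] = 21;

sd = StDev( trades, 4, False ); // array output
stdrmultiples = sd[ 3 ];        // the 4-value window ends at index 3
_TRACE( "stdrmultiples " + stdrmultiples );
```

But as noted above, this breaks down as soon as the number of trades exceeds the number of bars, so it is not a robust pattern.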

The method described by @fxshrat in the post you linked to does not require that you put values (R multiples in this case) into an array first, because it accumulates the values needed to calculate Standard Deviation as they are encountered. This is a far superior way to approach the problem.
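A minimal sketch of that accumulator idea (my own illustration, not @fxshrat's exact code, and reusing the `bo` and `averageRisk` variables from the question): keep a running sum and sum of squares inside the trade loop, then apply the population standard deviation formula sqrt(E[x²] − E[x]²) at the end. No per-trade array is needed, so the trade count is not limited by the number of bars:

```
Sum = 0;
SumSq = 0;
N = 0;

for( trade = bo.GetFirstTrade(); trade; trade = bo.GetNextTrade() )
{
    RMultiple = trade.GetProfit() / abs( averageRisk );
    Sum   = Sum + RMultiple;        // running sum of R-multiples
    SumSq = SumSq + RMultiple ^ 2;  // running sum of squared R-multiples
    N++;
}

if( N > 0 )
{
    mean = Sum / N;                         // expectancy (in R)
    stdr = sqrt( SumSq / N - mean ^ 2 );    // population std dev of R-multiples
    SQN  = ( mean / stdr ) * sqrt( N );     // Van Tharp's SQN
    bo.AddCustomMetric( "SQN", SQN );
}
```

The same loop would also handle open positions if you iterate them separately with bo.GetFirstOpenPos() / bo.GetNextOpenPos().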
