Unusual behaviour using ODBC plugin 1.7

I’m trying to pull in some data from external sources, currently stored in MySQL, via an AFL that’s run periodically by Scan. The AFL saves the incoming data to StaticVars, so there’s little impact when scrolling the price chart. However, the following issues have come up in the past few days, and I’m having trouble finding the cause. I’d appreciate it if anyone can suggest what to look at next.

Data _is_ actually being returned from MySQL to AB via the ODBC plugin’s odbcOpenDatabase() and odbcGetArraySQL() functions; however, there are problems with what comes back:

  1. Even though there are ~70,000 rows in the MySQL table, the select … statement summarises them down to ~14,000 1-min “bars”, and all the data prior to 16/Aug/2019 00:00:00 is missing (~1,400 bars) – there should be data from 15/Aug/2019 00:00:00 onwards.
  2. The data points being displayed are all identical, whether viewed on the chart (with the “Data Window”) or in the debugger!?

I’m using the latest version of AB (6.31.0 x64), saving quotes to a Tick database with local storage, with real-time data from the IQFeed plugin, and the latest version of the ODBC plugin (1.7 x64).

Here’s a sample of what the raw data looks like in MySQL:

  • “dtmEvent” is an SQL DATETIME(6), which holds the actual timestamp of the event.
  • “dtmEventOneMin” is an SQL DATETIME, which holds the same timestamp as dtmEvent, simply truncated at the minute, ie no seconds. The column is redundant in principle, but it’s kept for query execution speed and for alignment with 1-min intervals in AB – the alternative would be to truncate “on the fly” with a function in the query, which adds processing time.
  • “Measure” is an SQL FLOAT, which holds some measurement that took place at the time of the event, could be anything!
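For anyone unfamiliar with the column setup, here’s a minimal Python sketch (purely illustrative – the helper name is mine, not part of AB or MySQL) of the truncation that dtmEventOneMin stores:

```python
from datetime import datetime

def truncate_to_minute(dt: datetime) -> datetime:
    """Drop seconds and microseconds - the same reduction that the
    dtmEventOneMin column stores (dtmEvent truncated at the minute)."""
    return dt.replace(second=0, microsecond=0)

# An event at 09:30:17.250 belongs to the 09:30 1-min bucket:
evt = datetime(2019, 8, 15, 9, 30, 17, 250000)
print(truncate_to_minute(evt))  # 2019-08-15 09:30:00
```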

Note: none of this data is stock-specific, so specifying “Symbol” in the query isn’t necessary.

This is the SQL query and the results that I ideally want to pass back to AB:

use testarea ;

select count(*) as 'numEvt', sum(Measure) as 'sumEvt', dtmEventOneMin
	from events
	group by dtmEventOneMin
	order by dtmEventOneMin asc ;
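To show the shape of the result set I’m after, here’s a small Python/sqlite3 stand-in for the MySQL table (the data is made up; only the query shape matters – one aggregated row per 1-min bucket):

```python
import sqlite3

# In-memory stand-in for the MySQL "events" table (illustrative data only).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE events (dtmEvent TEXT, dtmEventOneMin TEXT, Measure REAL)")
con.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [
        ("2019-08-15 00:00:05.1", "2019-08-15 00:00:00", 1.5),
        ("2019-08-15 00:00:42.9", "2019-08-15 00:00:00", 2.5),
        ("2019-08-15 00:01:10.0", "2019-08-15 00:01:00", 3.0),
    ],
)

# Same shape as the MySQL query: count and sum per 1-min bucket.
rows = con.execute(
    "SELECT COUNT(*) AS numEvt, SUM(Measure) AS sumEvt, dtmEventOneMin "
    "FROM events GROUP BY dtmEventOneMin ORDER BY dtmEventOneMin ASC"
).fetchall()
for r in rows:
    print(r)
# (2, 4.0, '2019-08-15 00:00:00')
# (1, 3.0, '2019-08-15 00:01:00')
```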


The “Analysis settings” to ensure alignment of the Scan results with the on-chart symbol:

Here’s the AFL code to produce the aggregated values at 1-min intervals, which are subsets of the query above:

sqlEvtNum = "select count(*) as 'numEvt', dtmEventOneMin from events group by dtmEventOneMin order by dtmEventOneMin asc" ;
sqlEvtMeas = "select sum(Measure) as 'sumMeas', dtmEventOneMin from events group by dtmEventOneMin order by dtmEventOneMin asc" ;
dtaEvtNum = odbcGetArraySQL(sqlEvtNum) ;
dtaEvtMeas = odbcGetArraySQL(sqlEvtMeas) ;

This is what the chart looks like on 1-min interval:

And, what the data looks like in the debugger:

There should be data from 15/Aug/2019 00:00:00 onwards!?:

… and now at the boundary between 15 & 16/Aug, there is data, but it’s all the same:

ODBC plugin is provided free of charge, "as is" without support of any kind. It is intended for users who are experienced enough to solve their own problems. It just works. It is also supplied with FULL SOURCE CODE, so you can modify it to suit your needs.

For users who can't figure out the docs and work with source code it is advised to just IMPORT the data into AmiBroker native database which is orders of magnitude faster than using ANY ODBC database.
ODBC databases, no matter which one you use, are NOT suited for tick data.
They are simply too SLOW. Regardless of what manufacturer of DB engine says.

The only way to work with TICK data is to use AmiBroker native database.

Having said that, you need to know that any data pulled from the plugin is MATCHED and ALIGNED so that the number of bars retrieved matches the bars that you already have in your database for the current symbol. So if your current symbol has 14,000 bars, then odbcGetArraySQL will ALSO retrieve 14,000 bars and attempt to match date/times from your table to the EXISTING DATA for the current symbol.
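That matching behaviour can be illustrated with a small Python sketch (purely conceptual – this is not the plugin’s actual code): for each bar datetime that already exists for the current symbol, the matching query row is emitted, and Null where nothing matches; query rows outside the chart’s existing bars simply never appear.

```python
from math import nan

def align_to_bars(bar_times, rows):
    """Conceptual sketch of the plugin's alignment: `rows` maps a
    row datetime to a value; the output has exactly one element per
    existing chart bar, with Null (nan here) where no row matches."""
    return [rows.get(t, nan) for t in bar_times]

# The chart already holds these 1-min bars for the current symbol:
bars = ["2019-08-16 00:00", "2019-08-16 00:01", "2019-08-16 00:02"]
# The query returned one row before the chart's first bar, and one matching row:
query = {"2019-08-15 23:59": 7.0, "2019-08-16 00:01": 3.0}

print(align_to_bars(bars, query))  # [nan, 3.0, nan]
```

The row timestamped 2019-08-15 23:59 is silently dropped because the symbol has no bar at that datetime, which is why data earlier than the symbol’s first bar never shows up.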


Thanks @Tomasz for the pointer on where to look next, and for the cautionary advice on using external databases, particularly for real-time data, which I also saw in some of your other posts from a while ago.