Architecture for taking AmiBroker/IB live to a large universe of stocks on intraday data?

So for anyone interested who reads this thread at a future date, here are some limitations/issues you should know about.

  1. As already covered, a retail subscription to the data - DTN is the best bet as of this writing - caps you at 2,500 symbols per account. There are other options that will let you pull the entire market for $500-$700/month, but I would advise against it for the reasons below.

  2. It's also been discussed in some of the KB articles, but the auto scan doesn't necessarily run at exactly :00. You can set up a batch process in the latest build, but you're still going to be slightly off. So be cognizant of anything time sensitive.

  3. The amount of history in the database and the timeframe you're scanning on have a huge impact on scan times.
    a. Scanning ~4400 stocks on the most recent bar (with a database holding approximately 6 years of history) takes slightly over 5 minutes on a 1 minute timeframe. Scanning on a 5 minute interval takes about 3:20.
    b. Scanning ~4400 stocks on a database with only 15 months of history takes slightly over 1 minute on a 1 minute timeframe. It's about 50 seconds on a 5 minute timeframe and 45 seconds on a 15 minute timeframe.
    c. So depending on the strategies you're running, if you're trying to run 1 minute data on anything with more than a year of history you're going to be hard pressed, because the scan itself takes over a minute, so you're constantly behind. Plus, if you're 30 or 45 seconds into a 1 minute bar, chances are you're way off from your backtest anyway.
    d. As an aside here, it was also previously mentioned that obviously you're just using OHLC data and you don't have bid/ask data. So the thinner the stock, the less chance you have of being filled, or the more slippage you'll experience. I already knew that going in - this is just for anyone else reading.

  4. The above was running one strategy with 50-75 lines of code. I’m not sure how much extra time would be needed if I was trying to execute multiple strategies.

  5. More than likely, I'll try to limit each instance of AB to 500-1k stocks to ensure quick scan times. Even on a 15 minute timeframe, a scan that takes 45 seconds to complete means you're 45 seconds into the next bar before firing, which would probably drastically alter live results versus the backtest.

  6. I'm going to write a script to compare backtest results to live auto-trading (AT) results. That would be a nice feature on the wishlist for future releases.

I hope this was helpful in providing quantitative numbers for what I was seeing. This was run on AmiBroker 6.25 on a Core i7-5820K with 16 GB of RAM.


@Term, thanks for providing that information.

… but there are many ways in which you can significantly improve your results:

  1. You can make your scans/explorations run much, much faster if you use QuickAFL in the Analysis window. Why use 1 000 000 bars if you only need 200? I wrote about it (and about using SetBarsRequired() at the end of a formula) in detail in this thread:
  2. Nevertheless - in your case you should generally try to make your database as compact as possible. The fewer bars, the better. It makes no sense to scan the live market every n seconds or every n minutes using a database with 6 years of history. When performing such scans/explorations you are interested in the latest changes, not those from a year ago; you overwhelm AmiBroker with lots of unnecessary data. Besides, if you perform such a scan/exploration on the live market and subscribe to all those issues (with such a long history), you will probably run out of memory.

  3. If you need to use some values which require a long history to be calculated, there is also a solution. You can import values from other Charts/Explorations/Scans running in a different timeframe:

If you set Periodicity (in the Analysis window's Backtester settings) to 1 Minute and want to calculate the 60 day average volume (and compare it to today's volume) or the 90 day ROC, you will need many thousands of 1 Minute bars just to perform this single calculation - even if you use TimeFrameSet() or TimeFrameGetPrice() - because those values will be calculated using 1 Minute bars. Your formula will become slow. However, you still don't have to use that many bars. You can calculate the things that you need in another Chart/Exploration/Scan (running in a Daily timeframe), save those values to static variables, and then easily import the required values into the first scan/exploration/indicator. As soon as I started doing so, my intra-day explorations became much, much lighter and faster.
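A minimal AFL sketch of that pattern (the static variable key and the 60-day average volume are illustrative choices of mine, not code from this thread): one exploration runs with Daily Periodicity and stores the slow value, and the 1-minute exploration reads it back instead of recalculating it from thousands of 1-minute bars:

```
// Exploration #1 - run with Periodicity = Daily.
// Stores the 60-day average volume per symbol in a static variable.
StaticVarSet( "AvgVol60_" + Name(), LastValue( MA( V, 60 ) ) );
Filter = True;
AddColumn( MA( V, 60 ), "60d avg volume" );

// Exploration #2 - run with Periodicity = 1 Minute.
// Reads the precomputed daily value - no long history needed here.
avgVol60 = Nz( StaticVarGet( "AvgVol60_" + Name() ) );
Filter = avgVol60 >= 1000000; // e.g. keep only liquid symbols
AddColumn( avgVol60, "60d avg volume (imported)" );
```

Static variables persist across formula runs within the same AmiBroker session, which is what makes this hand-off between the two Analysis windows possible.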

Some more appealing examples:

To calculate a daily rate of change for EUR/USD using 1 Minute data you need 1440 1 Minute bars. In a Daily timeframe - only 1.

To calculate a yearly rate of change for EUR/USD using 1 Minute data you need hundreds of thousands of 1 Minute bars. In a Daily timeframe - exactly 1440 times fewer!!

Think about the speed gains you can achieve using 1440 times fewer bars!!!

... and that's not all. You can achieve the same result using just 1 Yearly bar ... :slight_smile:
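To make that concrete, a hedged AFL sketch (mine, not from this thread) computing the same daily rate of change two ways. In a Daily-Periodicity formula it is a one-bar lookback; in a 1-Minute formula TimeFrameGetPrice() can fetch the previous daily close, but AmiBroker still has to compress roughly 1440 one-minute bars per day to deliver it:

```
// Variant A - formula running with Periodicity = Daily:
dailyROC = ROC( C, 1 ); // one bar of lookback is enough

// Variant B - formula running with Periodicity = 1 Minute:
// AmiBroker must first compress the 1-minute bars into daily bars.
prevDailyClose = TimeFrameGetPrice( "C", inDaily, -1 );
dailyROC1m = 100 * ( C - prevDailyClose ) / prevDailyClose;
```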



All is covered already in the manual:

The difference between efficient use of AmiBroker and incorrect use of AmiBroker can be 1000-fold. In the same way, using the same C++ compiler, a proficient programmer who has not only a practical but also a theoretical background in computer science can produce code 1000x faster than code produced by someone who has no idea what he is doing. Even very simple stuff can be coded wrong. Simple example: iterative vs. naive recursive implementation of the Fibonacci series. The first has O(n) complexity; the other has exponential complexity, because it recomputes the same subproblems over and over.
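As an AFL-flavoured illustration of that last point (the function is my own example): the iterative version below performs n additions, i.e. O(n), whereas the textbook naive recursive version spawns two recursive calls per step, so its running time grows exponentially with n:

```
// Iterative Fibonacci: one pass, O(n) additions.
function FibIter( n )
{
    a = 0;
    b = 1;
    for ( i = 0; i < n; i++ )
    {
        t = a + b;
        a = b;
        b = t;
    }
    return a; // FibIter(10) == 55
}
```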


DTN NxCore might be interesting.

Or perhaps some CQG service, like the Continuum client. In the end, providers make it hard to monitor every tradeable instrument, and the difficulty increases with data resolution. Don't forget the limits of margin/cash: a backtest may allocate trades in an A-Z manner, but in real life fills come in in any order, using up your available cash/margin. A best practice is to filter down to a smaller tradeable universe, by some "ideal" on the slower timeframe, that can then be monitored at the higher frequency. This also reduces processing and hardware strain.

As far as SetBarsRequired() goes, it would be nice if we could, in AFL, set active symbols and then call SetBarsRequired() for EOD and separately for other intraday resolutions, just for those active symbols, in a top-down "whittle down" manner. Perhaps that is already possible via watchlists, but I'm more thinking of feeding just those in real time and having AB use its resources just for those as well, not the whole DB. In this way each day can be different. Just thinking. Be easy on me. :thinking:


I agree, that's probably the best solution ...

I think, there's a misunderstanding here.

  1. AmiBroker doesn't refresh/feed all symbols in a database by default. It refreshes in real time only those symbols which are displayed on visible charts (according to the Tools/Preferences/Intraday/Real-time chart refresh interval setting:

(If we enter 0 into this field, the chart will be refreshed with every new tick - up to 10 times per second.)

... and those symbols which have been added to a Real time quote window.

So if you have a database with 6000 symbols, 3 charts with different issues opened, and no symbols in a Real time quote window, only those 3 symbols will be refreshed in real time. So AB's resources are used effectively.

Only you can make AmiBroker refresh all symbols in a database. You can do it, e.g., by running a Scan/Exploration on a watchlist containing all symbols (when you run the initial Scan/Exploration, AmiBroker will backfill any data holes). But those symbols will not be automatically refreshed during the session (if they are not added to the Real time quote window). They will be refreshed only when the next scan(s) is run --> on demand (when you click the Scan/Explore button) or automatically every x seconds/minutes/hours ...

You already decide which symbols are refreshed in real time, and you can already have different SetBarsRequired() settings for each timeframe, either using charts/indicators or scans/explorations. For example, you can have 3 explorations running in different timeframes (different Periodicity in the Analysis window's Backtester settings) and specify different SetBarsRequired() settings for each Analysis window.
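For example (the 200-bar figure is just an illustration), each Analysis formula can end with its own SetBarsRequired() call, so a Daily exploration and a 1-Minute exploration can each request only the history they actually need:

```
// Example exploration formula.
Filter = C > MA( C, 200 );
AddColumn( C, "Close" );

// Placed at the end of the formula, as discussed earlier in the thread:
// request at most 200 past bars and no future bars, so QuickAFL
// does not need to process the rest of the history.
SetBarsRequired( 200, 0 );
```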


A similar topic with additional information provided by Tomasz:

There is a big difference between refreshing and feeding. Every symbol that is, for example, in the real-time window list, or was accessed at any time and stays within the RT subscription limit, is FED in real-time constantly. Feeding means streaming RT updates. These updates are received from the external RT source, used to produce bar data, and AB gets notified with every new tick on every subscribed symbol. This happens on the PLUGIN side.

These notifications
a) mark symbols as “dirty”, so AmiBroker knows that it needs to get NEW data from the plugin for any such symbol when it wants to display updated charts or when any Analysis is run
b) cause real-time window to display new data
c) cause Time&Sales to display new data
d) trigger Alerts (EasyAlerts)

Chart refresh is a different story. It can be triggered by many events including new data, scroll, resizing, timed refresh, mouse click, symbol change, etc. On arrival of new data, charts are refreshed immediately if the time since the last refresh was >0.1 sec.


Yes, of course I am aware of that. I should have used the word "feeding" instead of "refreshing" in most cases. I am sorry for not being precise. English is not my native language. I do my best, but sometimes it is not enough ... :worried:

The good side, however, is that your replies are usually an excellent opportunity to learn something new about how AmiBroker works internally - I really appreciate it. And this is one of those cases :slight_smile:

  1. How can this RT subscription limit be recognized if the data provider doesn't specify it?
  2. So if I understand correctly: "if the Symbol was accessed at any time and stays within the RT subscription limit (but this Symbol is neither displayed on a chart nor present in the real-time quote window), it is FED in real-time constantly" means that it is marked as "dirty" when there is new data available, but only when such a Symbol is requested by (for example) a chart or Analysis window are the newest quotes downloaded/updated. Simply put: as long as this Symbol is not "requested", AmiBroker knows that it lacks some of the newest data, but doesn't download it until this newest data is really needed. Is that how it works internally?

    And as I recall, those Symbols within the RT subscription limit are added/removed according to the FIFO (first in, first out) rule.

  3. What is the relationship between the In-memory cache size and the number of Symbols which are FED in real-time constantly? Only the most obvious one - that the cache size should be big enough to fit all those Symbols' quotes?

Thank you.


No. Nothing is downloaded. The quotes of streaming symbols are already there in the plugin, ready to use immediately without downloading anything. Real-time data sources like IQFeed or eSignal stream every tick of a streaming-subscribed symbol, and the plugin holds that data for immediate access.

This is nothing new - everything is described / documented in the ADK (plugin development kit)

RT subscription limit is known. IB, eSignal and IQFeed have known limits and IQFeed even has methods to query for subscription limits.

Cache size and RT subscription limits have nothing in common.


I completely agree! This is great stuff, and frankly I wish there was more of it. Knowing how the "gears" work helps in going from playing with AB and conducting research to real-time application. I would even add that it helps keep research grounded in the practicality of how a strategy might actually be implemented.

My personal angle with AB, as it were, was maybe to use it as a front end for graphics and data management/storage (historical and real-time), and to give that access to my engines which generate my orders and manage my systems. Originally I thought that through the ADK I could simply pass a pointer to the data that is in AB as historical data, yet updated constantly via a real-time feed. Then it would be nice for my engines to label orders, trades, and risk management metrics out to AB for viewing, as well as labeling trades or orders on the applicable chart. That kind of stuff. However, the notion of all symbols in AB not getting auto-appended changes things a bit as I map this out in my head. From a marketing perspective, this type of ADK functionality (if it doesn't exist now) might be a good idea, since most quants build stuff in anything from R, Python, MATLAB, C++, C#, Java etc., and would love a front end to tie into.

With regards to feeding, perhaps you might want to simply make an on/off toggle for watchlists and other category assignments, to either be as they are (off) or be fed (on)? You might even make it so you can set an interval or timer to rotate the active symbols if the list exceeds the feed's limits. If the limit is 2000 symbols, set a timer for 2 minutes, then swap the most recent 2000 with the older 2000.

Just some thoughts.
Be Sweet.

You really should re-read my posts and the ADK docs. The data are auto-appended. BY THE DATA PLUGIN. See also

The data plugin has TOTAL control over the data; it communicates with external data vendor APIs, subscribes to the RT stream, and builds (auto-appends) data into the array that is passed AS A POINTER in the GetQuotesEx call. Everything (with examples) is in the ADK documentation.


I originally was speaking of the already built-in support for feeds like eSignal, IQFeed, DTN etc. From my reading above, they do not auto-append data files unless, for example, the symbol is in the quote window, currently charted, or part of a scan or exploration, etc. When you refer to the data plugin, are you referring to a data plugin we code ourselves and implement via the ADK? If so, then are all quotes stored, no matter whether the symbol is in the quote window, currently charted, etc.?

Not everything that you read from somebody other than me is actually precise. Except for me, everyone else is just a user. They may or may not be correct, or may be partially correct, because no one except me actually coded the program. The problem is that people take my description from somewhere, take 5 bits out of the 10 that I wrote, fill in the rest with their "own interpretation", and then post it somewhere. And information noise of half-truths is created this way.

What you "read from the above" is not (entirely) correct. In short, with eSignal or IQFeed, ALL symbols WITHIN the subscription limit will get updated and appended if you only access them once. This is more or less the STANDARD way of working with ANY real-time data source, as you need to SUBSCRIBE to updates. And a symbol once subscribed remains subscribed, and data are collected, as long as you don't EXCEED your subscription limit.

Things are A LOT more complicated than the very simplified picture presented above. If I were to describe ALL the inner workings of AmiBroker I would need to write a book with 100000 pages. I don't have time for that and I have no desire to do that, because all the competition, with everything handed to them on a silver platter, would steal ideas from us (they already do that).


lol, ok, understood regarding the depth of material and disclosure.

Just so I understand then: even if you ran an empty exploration with Filter = True, which creates the subscription (an automatically subscribed watchlist might be nice to have; then we could run exploration logic on that list, etc.) - then all those symbols (post-subscription) will be on autopilot, meaning initial backfill and real-time updates (10/sec each) appended to the price array and local storage? Then completely referenced through the ADK?

Again, this is your interpretation, not fact. The reader's interpretation is, as always, skewed a bit. It is similar to the kids' game called "Chinese whispers" or "telephone". The person on one end says something, but on the receiving end we get something completely different. Yes, an exploration would subscribe to RT updates, but for example you don't even need to set Filter = True; it can be set to False. Also, real-time updates are appended not 10/sec but immediately (if there are 5000 ticks per second then it gets updated 5000 times per second). 10/sec is NOT the data update rate, but the maximum frequency of single CHART redraws when triggered by the plugin. But if you use other refreshes (such as mousemove) it can be 1000 per second. As I wrote, there are GAZILLIONS of details. And RT data are kept in RAM, not in "local storage" (i.e. disk).
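Following that correction, a minimal "warm-up" exploration could look like this (a sketch of my own, not an official recipe): run it once over the watchlist before the session, and the mere act of scanning subscribes each symbol to the RT stream, within the vendor's subscription limit:

```
// Minimal pre-scan formula: the exploration itself triggers the
// RT subscription and backfill for every symbol it touches.
// Per Tomasz above, Filter does not even need to be True.
Filter = False;
```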


Tomasz, thanks to your replies I know that the part of my reply in which I wrote that, after running a Scan/Exploration on some issues, those issues are not fed in real time after the scan, is not true. Those symbols become subscribed and are updated - as long as they stay within the RT subscription limit. Sorry for that. I don't want to misinform anybody, so you can edit or delete the part of my reply containing the wrong information. I hope that some other things I have mentioned are useful to some users ...

... but let me briefly explain why I wrote it. I've been scanning the market (using the Statica plugin) for many years. I always start with a pre-scan to allow AmiBroker to subscribe the issues and fill data holes. Only when this phase is completed (it takes some time) am I ready to run live scans during the session. The session starts at 9 a.m. and the market closes at 5 p.m. What I have observed right from the beginning is that as long as those scans/explorations are run, all symbols' quotes are updated, and when I save the database, the quotes are complete. But if my last scan is run, for example, at 4 p.m. (and I save the database at 5 p.m. when the market closes), then when I later review the charts/quotes after the session (offline), most symbols' history ends at 4 p.m. (their quotes have not been updated after 4 p.m.). What might be the reason for that? Is it because:

  1. The RT subscription limit is small and most issues are not within it

  2. The plugin downloads and collects all the necessary data, but because those symbols have not been requested by AmiBroker before the end of the session, they have not been saved in AmiBroker's database.

I have asked "How can the RT subscription limit be recognized if the data provider doesn't specify it?" because I can't find any information concerning Statica's limit.

One more question. I have not used eSignal or IQFeed, so forgive my ignorance, but what happens if a user whose plugin has an RT subscription limit of, let's say, 1000 issues tries to scan 2000 symbols? He should be able to scan the first 1000 issues (because they are within the RT subscription limit), but when he exceeds this limit, (according to the FIFO rule) those 1000 issues should gradually be removed from the RT subscription list and replaced by the newest issues ... so the first 1000 symbols should be replaced by the newest 1000 symbols. If that is the case, is the user able to get the latest quotes of all those 2000 issues in one scan? An important remark: I am aware that the latest quotes for those issues which are outside the RT subscription limit may not be available at once (but they should be available in subsequent scans, even if the Wait for backfill option is not available). Of course, I assume that the database is updated regularly (for example every day) and misses only the newest quotes. A quote from Tomasz's reply in another thread (I have already provided the link to it):

Wait for backfill flag instructs Analysis window to wait until IQFeed sends all requested data. Without that Analysis would progress with your existing data (stored already in the database), without waiting for update. The update will eventually arrive later and will be included in subsequent scans.

Thank you

Statica is NOT our plugin. It is 3rd party development that is done without our supervision. We are not responsible for 3rd party developments. Everything what I wrote applies only to official plugins that are developed by us.

That's not how it typically works. It's a hard, fixed limit. However, it's theoretically possible that a plugin could unsubscribe - if the data feed supports that - to free up slots for new tickers during a scan. But that would be gaming the limits, and it's unlikely to be tolerated or officially supported.

All our plugins indeed unsubscribe the 'oldest' symbols once the limit is hit, but it is not really reliable and, as @TraderX noted, it is not officially supported by data vendors. It is highly recommended to stay within the limits.

Hello Term,

I found this thread that you started very interesting, and your last post - the summary of issues for future reference - most useful. The same applies to the comments by Tomasz, Milosz and Sean. I am considering developing an RT system myself and these issues have popped up on my list too. The exchange here helped me organize my thoughts.

I am wondering whether you continued with your project. If you did, could you give an update now (a year later) on what happened and what the most important findings were for you? What did you do (which data feed, how many symbols etc.), and what challenges and external resources (code, developers etc.) did you come across during the process?