Loading more than 100,000 symbols in memory cache

I have a watchlist of 300 thousand symbols, and when I run any AFL on this watchlist it takes a lot of time. As per my understanding, this is due to the limit of max 100 thousand symbols in the memory cache. Please correct me if I am wrong, and also let me know if there is a way out of this.

Try playing with Preferences window settings:

  • In-memory cache (max. symbols) - defines how many symbols' data should be kept in RAM (for very fast access) - this works together with the next setting
  • In-memory cache (max. MegaBytes) - defines how many MB of RAM should be used for the temporary data cache (for very fast access)
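To see why the two settings work together, here is a back-of-envelope estimate of how much RAM a given symbol limit implies. The 40-bytes-per-bar figure and the 390-bars-per-day session length are assumptions for illustration only, not AmiBroker's documented internals:

```python
# Rough estimate of RAM implied by an in-memory cache symbol limit.
# NOTE: BYTES_PER_BAR is an assumed figure, not AmiBroker's documented
# record size; BARS_PER_DAY assumes a regular US 1-minute session.

BYTES_PER_BAR = 40       # assumed size of one cached quote record
BARS_PER_DAY = 390       # 1-minute bars in a 6.5-hour trading session
DAYS_PER_SYMBOL = 30     # trading days of history per symbol

def cache_estimate_mb(max_symbols: int) -> float:
    """Rough RAM needed to keep max_symbols fully cached, in MB."""
    total = max_symbols * DAYS_PER_SYMBOL * BARS_PER_DAY * BYTES_PER_BAR
    return total / (1024 * 1024)

print(f"100,000 symbols -> ~{cache_estimate_mb(100_000):,.0f} MB")
print(f"300,000 symbols -> ~{cache_estimate_mb(300_000):,.0f} MB")
```

Under these assumptions, 300K fully cached symbols would need roughly three times the RAM of 100K, which is why the MB limit matters as much as the symbol count.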

Performance is not restricted to symbol count; many other factors kick in, especially the processing time of the formula calculation!

Consider reading Performance tuning tips, notably with respect to cache:

Decrease the size of in-memory cache, if you are using very large databases (>2GB in disk size)

Apart from that, try searching the forum, as similar topics were discussed in depth in the past. Also, without detailed elaboration it is impossible to ascertain the exact reasons for the delay.


Unfortunately your question isn't clear enough and does not provide all necessary details to give you an answer. Please follow this advice: How to ask a good question

Your post is seriously lacking details:

  1. What is the formula?
  2. What are these symbols?
  3. Do you really need to scan 300K symbols all the time?
  4. How much RAM do you have?
  5. How many GB is the database?

The cache works on a FIFO basis. Frankly, I doubt you really need all 300K symbols scanned all the time.
The cache needs to fit in about half of your RAM.

  1. We were scanning the formula below for all the symbols in the database:
    X = ( O + H + L + C ) / 4;

  2. Symbols are SPX options, and each symbol contains data for 30 trading days.
    Number of symbols in the DB is ~300,000, and it is a 1-minute database.

  3. The database period is 2020 - 2023 (to date); we need to see backtest results for this period, so we
    have to scan / backtest the entire database.

  4. Our rig is a Ryzen Threadripper 3990X (64C / 128T), 256 GB RAM, 2 TB NVMe.

  5. Database is nearly 120 GB (the size of the database folder in the AmiBroker directory).
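A quick cross-check of the numbers above, to see whether 120 GB is consistent with 300K symbols of 30 days of 1-minute data. The 390-bars-per-day figure assumes a regular US session; the per-bar size here is derived from the stated totals, not taken from any documentation:

```python
# Cross-check: does ~120 GB on disk fit 300K symbols x 30 days of
# 1-minute bars? Assumes 390 bars per trading day (regular session).

db_bytes = 120 * 1024**3   # stated database folder size
symbols = 300_000
bars = 30 * 390            # 30 trading days of 1-minute bars

per_symbol_kb = db_bytes / symbols / 1024
per_bar_bytes = db_bytes / symbols / bars
print(f"~{per_symbol_kb:.0f} KB per symbol, ~{per_bar_bytes:.0f} bytes per bar")
```

The derived figure of a few tens of bytes per bar is plausible for OHLCV records, so the stated database size and symbol counts hang together.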

Hi, here is a discussion similar to yours:
https://forum.amibroker.com/t/import-wizard-for-huge-files-5-million-symbols-2-5gb-into-230gb-database/27760

OK, I did not think that people would use machines with 256GB of RAM, it is not really "mainstream". I guess it is time to bump up the limits.

This topic was automatically closed 100 days after the last reply. New replies are no longer allowed.