AmiBroker is such a versatile tool that we can go from backtesting 30 years of historical data to live trading the latest tick.
Data sources are numerous too, and the methods for importing data into AmiBroker vary: from RTD plug-ins like the awesome universal one that @nsm51 is currently developing, to ASCII imports via the Import Wizard, to imports via the OLE API.
Each import method has its pros and cons, and each needs its own workarounds.
For example, importing via ASCII is slow (possibly due to file I/O overhead).
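One workaround I've been experimenting with is scripting the ASCII import over OLE instead of clicking through the wizard for every file. Here is a rough sketch in Python via pywin32; the folder path and the .format definition name are just placeholders, so treat the details as my assumptions rather than gospel:

```python
# Rough sketch: batch ASCII imports through AmiBroker's OLE automation interface
# instead of the Import Wizard. The folder and the .format definition file are
# placeholders for whatever your own setup uses.
import glob

import win32com.client

ab = win32com.client.Dispatch("Broker.Application")   # attach to (or start) AmiBroker

for csv_file in glob.glob(r"C:\data\5m\*.csv"):        # placeholder: one CSV per symbol
    # Type 0 = ASCII import; the third argument names an import definition file
    # (assumed to live in AmiBroker's Formats folder).
    ab.Import(0, csv_file, "my5m.format")

ab.RefreshAll()       # make the imported bars visible
ab.SaveDatabase()     # persist them to the local database
```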
JSON imports are amazing for several (even tens of) thousands of rows, but when importing thousands of symbols of 5-minute data spanning 8 years, the dictionary is either too large to import at all, or, if I paginate it (even slowly), it gets written over. I suspect this is not a limitation of the plug-in or of AmiBroker, but of my understanding of database design principles.
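The other route I keep coming back to is pushing bars in directly over the OLE API mentioned above. This is roughly the shape of it as I understand the object model; the ticker, CSV path and column order are placeholders, error handling is omitted, and I'm assuming pywin32 converts the Python datetime to an OLE date:

```python
# Minimal sketch: push OHLCV bars into AmiBroker via its OLE automation interface.
# Assumes AmiBroker is installed locally and pywin32 is available.
# The CSV layout (datetime, open, high, low, close, volume) is a placeholder.
import csv
from datetime import datetime

import win32com.client

ab = win32com.client.Dispatch("Broker.Application")   # attach to (or start) AmiBroker
stock = ab.Stocks.Add("BTCUSDT")                       # placeholder ticker; returns the symbol if it already exists

with open(r"C:\data\BTCUSDT_5m.csv", newline="") as f: # placeholder path
    for row in csv.reader(f):
        bar_time = datetime.strptime(row[0], "%Y-%m-%d %H:%M")
        quote = stock.Quotations.Add(bar_time)          # one Quotation per bar timestamp
        quote.Open = float(row[1])
        quote.High = float(row[2])
        quote.Low = float(row[3])
        quote.Close = float(row[4])
        quote.Volume = float(row[5])

ab.RefreshAll()       # make the new bars visible in charts
ab.SaveDatabase()     # persist to the local database
```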
I have had AmiBroker for three years and still haven't settled on a database design that suits me, so I'm looking for inspiration from other users on how they would suggest configuring a database.
What would I like to be able to do?
-Backtest historical 5-minute data for crypto and forex.
-View historical data from low to high timeframes, up to monthly, and use AmiBroker as my charting package.
-Run a scan every 5 minutes for entry and exit signals, which I then parse (rough sketch below).
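For that last point, the approach I've been sketching is to drive the Analysis window over OLE on a timer and export the scan results to a file I can parse. Something like the following; the .apx project path, the output path and the run-type code (0 = Scan) reflect my reading of the OLE Analysis interface and may need checking against the docs:

```python
# Rough sketch: run an AmiBroker scan every 5 minutes via OLE and export the
# results for external parsing. Paths and the .apx project are placeholders.
import time

import win32com.client

APX_PATH = r"C:\AmiBroker\Projects\entries_exits.apx"   # placeholder analysis project
OUT_PATH = r"C:\AmiBroker\Out\signals.csv"               # placeholder export target

ab = win32com.client.Dispatch("Broker.Application")

while True:
    analysis = ab.AnalysisDocs.Open(APX_PATH)
    if analysis is not None:
        analysis.Run(0)                  # 0 = Scan (as I understand the run-type codes)
        while analysis.IsBusy:           # wait for the scan to finish
            time.sleep(1)
        analysis.Export(OUT_PATH)        # dump the result list to CSV for parsing
        analysis.Close()
    time.sleep(300)                      # repeat every 5 minutes
```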
I'm certain the obstacles I'm facing are not insurmountable, but I also suspect I'm looking at the problem the wrong way.
How would/do you construct an AmiBroker database? Or would you use two databases, one historical (ASCII) and one for real-time data?