Amibroker Binance plugin

Hi, I'm looking for an AmiBroker-to-Binance plugin for data and order placement.

  1. Can anybody recommend a plugin for this?
  2. I found one - Binance Data Plugin. Has anybody used it? How reliable is it?

Thanks

I tried it in 2021 but didn't like how it worked. Also, sometimes all the historical prices would vanish. It just wasn't in the same class as "real" data vendors.


Hi Peter, do you know of any vendor for Binance data?

I haven't found one specific to Binance.

Polygon.io has 1-minute data, which is the closest I've found, but it's aggregated from all the exchanges, so there are huge price spikes that you wouldn't see at any single exchange.


Hi Keer,

I spoke with the developer of that plugin a few months back. He seems like a nice chap, and there is a free version for you to trial so you can decide if you like it.

If you decide not to go down that path, then it's doable to get crypto data into AmiBroker with some engineering. DDE didn't work for me, and setting up an SQL database to feed the ODBC.dll plugin would have required me to learn how to do that, and I'll be honest, I couldn't really see the benefit over just doing it with CSVs on a local machine.

Binance publishes a websocket on their back end, and the data is updated every 1000 ms for 1s data and every 2000 ms for everything above. Their REST API has libraries written in a few languages, and the Python one is robust.

If you can familiarize yourself with another language (best to choose a 'popular' one, such as C++ or Python, as there are libraries galore on GitHub) and with the AmiBroker OLE interface, then you can push data in via a background process. It IS fiddly, though: the CSV files used for import can drift and grow in size if an exception is thrown from the script, and cause a headache over time, so it's important to handle exceptions appropriately. I'm getting fewer and fewer exceptions as each day and modification passes. I'm aware that Tomasz advises that although this can be done, it is far more advisable to develop your own plugin (or, even better, find one from a data vendor), but I simply don't have that level of skill yet. I have bought a course to learn C++ for that purpose and will get round to it, but I'm inching closer with my current approach, and admittedly there's a bit of 'sunk cost' fallacy at play at the moment.

I've got this working to an acceptable level, but I'm having difficulty understanding the nuances of whether I can have another instance of AmiBroker (same database) open at the same time, so it still needs some work (I have been trying to make this work for about a year... I have a full-time job!).

Anyway, feel free to have a play with this, reverse engineer it, and use or modify it as you like. I'm not an experienced coder and make no warranty for how safe or useful this Python script is.

import websocket
import json
import pandas as pd
from datetime import datetime
import streamUSDT1
import os
import win32com.client  # lets you use OLE and AmiBroker objects (https://www.amibroker.com/guide/objects.html) from outside AmiBroker
from threading import Thread  # the importer runs in a 'while' loop: placed before the websocket call it blocks, placed after it never runs... unless you thread the websocket
import time

try:
    os.remove("C:\\Program Files\\AmiBroker\\......\\Binance\\Data\\symbolcsv\\WebSockets\\webSockImport.tdmytohlcv")  # 'tdmytohlcv' is a file type I created using the AmiBroker import wizard
except FileNotFoundError:
    pass
try:
    os.remove("tempDF.csv")
except FileNotFoundError:
    pass

def on_open(_wsa):
    print("On open function")
    data = dict(
        method='SUBSCRIBE',
        id=1,
        params=streamUSDT1.dictionary
    )
    _wsa.send(json.dumps(data))

def on_message(_wsa, data):
    df = pd.read_json(data, orient='index')
    data2 = df[0]['k']
    df2 = pd.DataFrame(data2, index=[0])
    ticker = df2['s'][0]
    candleOpen = df2['t'][0]
    normTimeCandleOpen = str(datetime.fromtimestamp(candleOpen / 1000))
    indSpace = normTimeCandleOpen.find(' ')
    dateVal = normTimeCandleOpen[:indSpace]
    timeVal = normTimeCandleOpen[indSpace + 1:]
    openPrice = df2['o'][0]
    highPrice = df2['h'][0]
    lowPrice = df2['l'][0]
    closePrice = df2['c'][0]
    volume = df2['v'][0]
    titled_columns = {'Ticker': ticker,
                      'Date': dateVal,
                      'Time': timeVal,
                      'Open': openPrice,
                      'High': highPrice,
                      'Low': lowPrice,
                      'Close': closePrice,
                      'Volume': volume}
    df2 = pd.DataFrame(titled_columns, index=[0], columns=None)
    # I'm not sure this temp-file step is really necessary, but I was having issues with
    # headers appearing, which causes an exception when you later import into AmiBroker.
    # Note that each ticker arrives as its own message; appending directly is probably
    # fine, but I haven't tested that.
    df2.to_csv("tempDF.csv", header=False, index=False, mode='w')
    df3 = pd.read_csv("tempDF.csv", header=None, index_col=None)
    os.remove("tempDF.csv")
    df3.to_csv("webSockImport.tdmytohlcv", header=False, index=False, mode='a')

def run():
    print("Running Websockets")
    stream_name = 'some_name'  # this is not important
    wss = 'wss://stream.binance.com:9443/ws/%s' % stream_name
    wsa = websocket.WebSocketApp(wss, on_message=on_message, on_open=on_open)
    wsa.run_forever()

def amiImporter():
    try:
        # Copy 'webSockImport.tdmytohlcv' to 'webSockImportA.tdmytohlcv' and delete the
        # original, so a fresh 'webSockImport.tdmytohlcv' can be created and filled with
        # data while we import 'webSockImportA.tdmytohlcv' into AmiBroker.
        df4 = pd.read_csv("webSockImport.tdmytohlcv", header=None, index_col=None)
        os.remove("webSockImport.tdmytohlcv")
        df4.to_csv("webSockImportA.tdmytohlcv", header=False, index=False, mode='w')
        AB = win32com.client.Dispatch("Broker.Application")
        time.sleep(1)
        AB.LoadDatabase("C:\\Program Files\\AmiBroker\\Databases\\CryptoWebSockTester")
        time.sleep(1)
        AB.Import(0, "C:\\Program Files\\AmiBroker\\.......\\Binance\\Data\\symbolcsv\\WebSockets\\webSockImportA.tdmytohlcv", "tdmytohlcv.format")
        print("Importing Data into Amibroker")
        time.sleep(1)
        os.remove("webSockImportA.tdmytohlcv")
        AB.RefreshAll()
        time.sleep(1)
        AB.SaveDatabase()
        AB.Quit()
    except Exception:
        print("File access conflict error")

if __name__ == '__main__':
    Thread(target=run).start()  # start the websocket
    time.sleep(10)  # let it run and create some data before trying to import
    while True:
        amiImporter()
        time.sleep(1)
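As an aside, the pandas round-trip inside on_message above can be replaced with plain json parsing, which avoids the temp file entirely. A minimal sketch (the kline field names 's', 't', 'o', 'h', 'l', 'c', 'v' come from Binance's documented payload; the sample message is hand-written in that shape, not captured live):

```python
import json
from datetime import datetime

def kline_to_row(raw):
    """Convert one Binance kline websocket message into a Ticker,Date,Time,OHLCV CSV row."""
    k = json.loads(raw)['k']
    dt = datetime.fromtimestamp(k['t'] / 1000)  # candle open time, ms since epoch
    return ','.join([k['s'], dt.strftime('%Y-%m-%d'), dt.strftime('%H:%M:%S'),
                     k['o'], k['h'], k['l'], k['c'], k['v']])

sample = ('{"e":"kline","E":1672515782136,"s":"BTCUSDT","k":{"t":1672515780000,'
          '"T":1672516079999,"s":"BTCUSDT","i":"5m","o":"16500.0","c":"16510.5",'
          '"h":"16512.0","l":"16498.2","v":"42.7","x":false}}')
print(kline_to_row(sample))
```

on_message would then just append the returned row to the .tdmytohlcv file directly.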

Note that streamUSDT1 is a Python module containing the list of ticker streams that I created.

dictionary=[
	'btcusdt@kline_5m',
	'ethusdt@kline_5m',
	'bnbusdt@kline_5m',
	'neousdt@kline_5m',
	'ltcusdt@kline_5m',
	'qtumusdt@kline_5m',
	'adausdt@kline_5m',
	'xrpusdt@kline_5m',
	'eosusdt@kline_5m',
	'tusdusdt@kline_5m',
	'iotausdt@kline_5m',
	...
	'othercryptoetc@kline_5m'
]

You can create the above list file like this...

import pandas as pd

# Build streamUSDT1.py from the watchlist of ticker symbols. (The raw string avoids
# backslash-escape problems in the Windows path.)
dfSym = pd.read_csv(r"C:\Program Files\AmiBroker\Databases\Crypto\WatchLists\BinancePairsUSDT.tls", header=None)
streams = ["'" + symbol.lower() + "@kline_5m'" for symbol in dfSym[0]]
with open('streamUSDT1.py', 'w') as f:
    f.write('dictionary=[\n')
    f.write(',\n'.join('\t' + s for s in streams))
    f.write('\n]\n')

Interestingly, I'm not getting the file conflicts I worried I might with

df4 = pd.read_csv("webSockImport.tdmytohlcv", header=None, index_col=None)

and in any case, if we do, the loop is immediately re-run and the exception is handled.

I use 8 lists and run each in its own script. I chop the lists up (manually) into about 90 tickers each, as with much more than that it becomes difficult to subscribe to the socket. There are about 650-750 tickers on Binance if you stick to USDT, BTC and ETH symbol pairs, but modify as you see fit.
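That manual chopping is easy to automate. A minimal sketch (the chunk size of 90 is just the limit I found workable; the symbols below are placeholders):

```python
def chunk_streams(streams, size=90):
    """Split a long stream list into subscription-sized chunks."""
    return [streams[i:i + size] for i in range(0, len(streams), size)]

# e.g. ~700 placeholder tickers split into groups of 90
streams = ['sym%d@kline_5m' % n for n in range(700)]
chunks = chunk_streams(streams)
print(len(chunks), len(chunks[0]), len(chunks[-1]))  # 8 90 70
```

Each chunk can then be written out as its own streamUSDTn.py module.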

Where do you get your ticker symbols?
You can use the code below (again, modify it to your heart's content to suit your needs):

from binance.spot import Spot
from binance.spot import config
import pandas as pd
import eventlog  # my own logging helper module
import logging
import time

logging.basicConfig(level=logging.DEBUG, filename='log.txt', filemode='a')
eventlog.timelogger()
eventlog.scriptid('ExInfo.py')

Client = Spot(<YOUR API KEY>, <YOUR API SECRET>)
exInfo = Client.exchange_info()  # this endpoint is public anyway, but you'd use this approach for all the functions in the Python library linked above
print("Starting")
sym=exInfo['symbols']
df = pd.DataFrame(sym)
print("Reading Dataframe")
trad = df[df['status'].str.endswith('TRADING')]
print("Writing initial Dataframe to CSV")
trad.to_csv('C:\\Program Files\\AmiBroker\\.......\\Binance\\Data\\Tickers\\BinancePairs.csv', index=False, mode='w')
print("Reading CSV file")
df2 =pd.read_csv('C:\\Program Files\\AmiBroker\\.......\\Binance\\Data\\Tickers\\BinancePairs.csv')
print("Identifying pairs where trading is allowed")
allowed = df2[df2['isSpotTradingAllowed']==True]
print("Writing pairs to csv")
allowed.to_csv('C:\\Program Files\\AmiBroker\\.......\\Binance\\Data\\Tickers\\BinancePairs.csv', index=False, mode='w')
print("Done! Check csv")
df3=pd.read_csv('..........\Watchlists\pairs.csv')
symList = df3['symbol']
symList.to_csv('C:\\Program Files\\AmiBroker\\.......\\Binance\\Data\\Tickers\\BinancePairs.csv', index=False, mode='w')
df4 = pd.read_csv('C:\\Program Files\\AmiBroker\\.......\\Binance\\Data\\Tickers\\BinancePairs.csv')
usdt= df4[df4['symbol'].str.endswith('USDT')]
usdt.to_csv("C:\\Program Files\\AmiBroker\\.......\\Binance\\Data\\Tickers\\PairsUSDT.tls", index=False, header=False)
print("USDT Pairs done")
btc= df4[df4['symbol'].str.endswith('BTC')]
btc.to_csv("C:\\Program Files\\AmiBroker\\.......\\Binance\\Data\\Tickers\\PairsBTC.tls", index=False, header=False)
print("BTC Pairs done")
eth= df4[df4['symbol'].str.endswith('ETH')]
eth.to_csv("C:\\Program Files\\AmiBroker\\.......\\Binance\\Data\\Tickers\\PairsETH.tls", index=False, header=False)
print("ETH Pairs done")

In fact, using the above, I'm going to create a folder containing a file for each pair specifying the number of decimals for the order, as that needs to be specified. Most are 8, but if you're trading bitcoin pairs they can be higher, and you can get an exception from Binance.
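Those per-pair decimal counts can be derived from the stepSize strings in each symbol's filters in the exchange_info() response. A sketch of the conversion (the example stepSize values are illustrative; check the real response for your pairs):

```python
def step_to_decimals(step):
    """Count the decimal places implied by a Binance filter stepSize string,
    e.g. '0.00100000' -> 3, '1.00000000' -> 0."""
    step = step.rstrip('0').rstrip('.')
    return len(step.split('.')[1]) if '.' in step else 0

print(step_to_decimals('0.00100000'))  # 3
print(step_to_decimals('1.00000000'))  # 0
print(step_to_decimals('0.00000001'))  # 8
```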

Trading your signals, then, is easy. Put together an AmiBroker .abb that captures your signals using Explore and AddColumn functions (your Filter is your buy criteria), and export the results as a CSV. Then have a Python script read the CSV with the pandas library, call the new_order() function with the parameters pulled from your CSV, and delete the CSV so it is not accidentally read later by another process. Simultaneously, have another websocket script listening to the user data stream, basically listening out for the order id. Once it hears the order id, it replicates your stop loss (spot is more fiddly than futures... there's no native stop-loss function if you don't already own a coin).
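A sketch of that CSV-to-order step, assuming a hypothetical signals file with Symbol and Qty columns (names of my choosing, not from the exploration above; the actual new_order() call is left commented out so nothing is sent):

```python
import os
import pandas as pd
# from binance.spot import Spot  # uncomment to actually place orders

SIGNALS = "signals.csv"  # hypothetical export from the AmiBroker exploration

def read_signals(path):
    """Turn the exploration CSV into a list of order-parameter dicts, then delete it."""
    df = pd.read_csv(path)
    orders = [dict(symbol=row.Symbol, side='BUY', type='MARKET', quantity=row.Qty)
              for row in df.itertuples()]
    os.remove(path)  # delete so it is not re-read by another process
    return orders

# demo with a fake one-row signals file
pd.DataFrame({'Symbol': ['BTCUSDT'], 'Qty': [0.001]}).to_csv(SIGNALS, index=False)
orders = read_signals(SIGNALS)
# client = Spot(API_KEY, API_SECRET)
# for o in orders:
#     client.new_order(**o)
print(orders[0]['symbol'], orders[0]['side'])
```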

Honestly though, if I were starting the last 12-18 months again, I'd learn how to use the ADK. I've read the documentation and think I understand 80% of it. What could possibly go wrong? :woozy_face:


Does this give you tick/1s/5s candles in the chart?

Not specifically, and I don't think you're going to get tick data from this particular endpoint. If you are specifically interested in tick data, I would suggest you look at the trade stream endpoint instead, which, with a bit of work, you could easily tweak to return tick data, if that is what you are hoping for. Unless you are looking specifically at being a market maker, I'm not certain myself why tick data would be advantageous compared to 1s, but obviously that is for you to decide. Also, if market-maker strategies are something you are interested in, I think you'd be better off looking at the 'order book' endpoint. Too much work, to be honest, and the law of diminishing returns for what I am doing, so I don't intend to develop in that direction. Feel free to copy-paste the code above and modify it to suit your needs, though. Obviously I make no warranties for its effectiveness.
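For reference, handling the trade stream would look much like the kline handler: each message carries one executed trade with a price 'p', quantity 'q' and trade time 'T'. A sketch (the sample message is hand-written in the documented shape, not captured live):

```python
import json
from datetime import datetime

def trade_to_row(raw):
    """Convert one Binance @trade message into a Ticker,Date,Time,Price,Qty CSV row."""
    t = json.loads(raw)
    dt = datetime.fromtimestamp(t['T'] / 1000)  # trade time, ms since epoch
    return ','.join([t['s'], dt.strftime('%Y-%m-%d'), dt.strftime('%H:%M:%S'),
                     t['p'], t['q']])

sample = ('{"e":"trade","E":1672515782136,"s":"BTCUSDT","t":12345,'
          '"p":"16500.01","q":"0.002","T":1672515782134,"m":true,"M":true}')
print(trade_to_row(sample))
```

You would subscribe with e.g. 'btcusdt@trade' in place of 'btcusdt@kline_5m'.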

If you're not sure how to implement tick data or the order book yourself, and are satisfied with 1-second data, I would suggest you modify my code above so that instead of

dictionary=[
	'btcusdt@kline_5m',
	'ethusdt@kline_5m',
	...
]

you would just go with

dictionary=[
	'btcusdt@kline_1s',
	'ethusdt@kline_1s',
	...
]

Also, my apologies: the above is not a dictionary, it is a list (I haven't got around to refactoring the code yet).