How to set up ODBC from Excel or CSV (The Painless Way)

Instructions for setting up an AmiBroker data source from stock prices stored in Excel or CSV using the ODBC plugin connector. This automates price-data updates into AmiBroker -- THE PAINLESS WAY.

This method worked for me with AmiBroker 6.40.4 and ODBC Plugin 1.7.0.

Step 1) Set up an SQLite 3 database and connect it to AmiBroker
Follow this video https://www.youtube.com/watch?v=c7UWJ0z40kg

Step 2) Convert the Excel data and upload it to the SQLite 3 database created in Step 1
Follow the Python code at the end.

Step 3) If the "Select Data Source (DSN Name)" window keeps popping up

Thanks to DingoCC -- this post solved the issue after I spent 10 painful hours looking for a solution.

The default ODBC connection string has DSN in position #2 and Database in position #3:

ODBC;DSN=name_setup_in_step1;Database=C:\your\db\filepath.db.....

You have to connect this way first so you can select the Table_Name and click Retrieve all symbols. Then swap the order: Database must be in position #2 and DSN in position #3. After the swap, the string should look like this:

ODBC; Database=C:\your\db\filepath.db;DSN=name_setup_in_step1.....

Python Code for Step 2

import sqlite3
import pandas as pd
import os


def clean_data(file_path, tab_name):
    
    # Adjust your data-cleaning / ETL logic here as needed

    df = pd.read_excel(file_path, sheet_name=tab_name, usecols=list(range(7)))
    df.columns = ["Symbol", "Date", "Open", "High", "Low", "Close", "Volume"]
    df["Date"] = pd.to_datetime(df["Date"]).dt.strftime("%Y-%m-%d")
    
    return df


def upsert_data(df, conn, table_name):
    """
    Upsert function: insert/update rows into SQLite based on Symbol+Date.
    Table name is controlled by the table_name parameter.
    """
    cursor = conn.cursor()

    # Ensure table exists
    cursor.execute(f"""
    CREATE TABLE IF NOT EXISTS {table_name} (
        Symbol TEXT NOT NULL,
        Date TEXT NOT NULL,
        Open REAL,
        High REAL,
        Low REAL,
        Close REAL,
        Volume INTEGER,
        PRIMARY KEY (Symbol, Date)
    );
    """)

    # UPSERT statement
    sql = f"""
    INSERT INTO {table_name} (Symbol, Date, Open, High, Low, Close, Volume)
    VALUES (?, ?, ?, ?, ?, ?, ?)
    ON CONFLICT(Symbol, Date) DO UPDATE SET
        Open=excluded.Open,
        High=excluded.High,
        Low=excluded.Low,
        Close=excluded.Close,
        Volume=excluded.Volume;
    """

    # Efficient batch insert
    cursor.executemany(sql, df.values.tolist())
    conn.commit()


# -------------------------------
# Example usage with multiple files
# -------------------------------

path = r"C:\Data" #<-----Change file path
files = [
    "Excel - PxDnLoad - HK.xlsx",
    "Excel - PxDnLoad - TW.xlsx",
    "Excel - PxDnLoad - CN.xlsx",
    "Excel - PxDnLoad - Index.xlsx"
]

db_path = r"C:\Data\sqlite3.db" #<-----Change sqlite3 database path
table_name = "ohlc_d_1y"

conn = sqlite3.connect(db_path)

for file in files:
    excel_file = os.path.join(path, file)
    print(f"Processing {excel_file}...")
    df = clean_data(excel_file, "OHLC") #<-----Change tab name
    # print(df)
    upsert_data(df, conn, table_name)

conn.close()

If these are daily (EOD) data in CSV format (which is really just plain text), it is actually easier to just use ASCII import (you can automate it using Batch and/or OLE automation)

And it doesn't require any external tools / plugins.
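For reference, an ASCII import definition file matching the Symbol, Date, Open, High, Low, Close, Volume column layout used in this thread might look like the sketch below -- check the directives against the AmiBroker ASCII importer documentation for your version:

```
$FORMAT Ticker, Date_YMD, Open, High, Low, Close, Volume
$SKIPLINES 1
$SEPARATOR ,
$AUTOADD 1
$GROUP 255
```

Save it as e.g. mydata.format in AmiBroker's Formats folder and reference it from the import wizard, a Batch step, or the OLE Import() call.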

Yes, Tomasz, your solution is the simple way. The Batch ASCII import works very well (fully automated), and I had been using that solution. It works wonders!

I decided to try the ODBC route out of curiosity, and gave myself a black eye.

Thank you for creating such a great product.

About that “black eye”: please don’t get me wrong, I did not mean to discourage you. I appreciate your willingness to share solutions with others, and more knowledge is never a bad thing. Curiosity is also a great thing. So thank you for providing these instructions.


Thank you @Wealthero for sharing this. I actually find that adjusting data in SQL for dividends, splits, and even a few wrong quotes works great.
I have been using it recently, since my broker provides an SQLite db for history and intraday data, and it has reduced a lot of headache for me.

And I thought I would have to live with that dialog prompting forever -- after trying the solution by @DingoCC, it is really resolved. :smiley:


Just for your information: I used to have an ODBC connector to store the results of a SQL query on my laptop. My server was migrated to a new one, and the technical staff mentioned that the new servers don't allow external connections to the database anymore, for security reasons. I had to create a PHP script to run the query directly on the server and an HTTP endpoint to download the data.
In many environments, the trend is to restrict direct ODBC/database access from outside the server and expose the data through APIs or scripts instead.
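If you end up in the same situation, a server-side CSV endpoint can still feed the SQLite database used earlier in the thread. A hedged sketch using only the standard library -- the endpoint URL and helper names are hypothetical; the table layout matches the upsert script above:

```python
import csv
import io
import sqlite3
import urllib.request


def fetch_csv(url):
    # Download CSV text from the server-side HTTP endpoint (URL is hypothetical)
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8")


def load_csv_text(csv_text, conn, table_name="ohlc_d_1y"):
    # Parse CSV with a Symbol,Date,Open,High,Low,Close,Volume header row
    # and upsert into the same table the Excel script uses
    reader = csv.DictReader(io.StringIO(csv_text))
    rows = [(r["Symbol"], r["Date"], float(r["Open"]), float(r["High"]),
             float(r["Low"]), float(r["Close"]), int(r["Volume"]))
            for r in reader]
    conn.execute(f"""
        CREATE TABLE IF NOT EXISTS {table_name} (
            Symbol TEXT NOT NULL, Date TEXT NOT NULL,
            Open REAL, High REAL, Low REAL, Close REAL, Volume INTEGER,
            PRIMARY KEY (Symbol, Date))""")
    conn.executemany(f"""
        INSERT INTO {table_name} VALUES (?, ?, ?, ?, ?, ?, ?)
        ON CONFLICT(Symbol, Date) DO UPDATE SET
            Open=excluded.Open, High=excluded.High, Low=excluded.Low,
            Close=excluded.Close, Volume=excluded.Volume""", rows)
    conn.commit()
    return len(rows)
```

fetch_csv() obviously needs a live endpoint; load_csv_text() is the reusable part and slots in where clean_data() does in the Excel version.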


For anyone interested in using DuckDB (instead of SQLite) to connect to AmiBroker, here are some tips for a painless setup. FYI, DuckDB supports full SQL and is columnar (thus faster). The setup is just as easy as SQLite (both are file-based databases).

  1. After setting up DuckDB with an OHLC data table, install the DuckDB ODBC plugin.

Tip: You will download a zip file. Extract it in your DuckDB location, rename the extracted folder, then run odbc_install.exe. I initially installed from the temp download folder and later moved the folder location; I ended up having to go into regedit to change the driver paths manually.

  2. The DuckDB ODBC plugin page shows how to set up the DSN.

  3. Connect AmiBroker with the DSN by following the video for the SQLite setup (see original post up top).

  4. In the ODBC connection string, ODBC;DSN=YourName; is the default when you select the DSN. Choose the correct DuckDB table and press OK.

  5. You will get an error message popup afterwards.

  6. Go back to the ODBC connection string, remove the ODBC; at the beginning, and keep only the DSN info (DSN=YourName;).

The ODBC connection then worked fine for me (pop-up goes away).

If the above process sounds too complicated, use the ASCII import mentioned by Tomasz above. If you are downloading data from Eodhd.com, consider using AmiQuote or Eodhd's downloader.

P.S. My use case:

  1. Victim of MetaStock download policy change
  2. Setting up Eodhd.com datasource as replacement
  3. I chose API + code (instead of downloader programs) to automate, minimize download time, and save API credits
  4. Using DuckDB to ETL the data and do price adjustments for corporate actions
  5. Connecting DuckDB to AmiBroker via ODBC.
    I also tried exporting DuckDB to CSV and using ASCII import into AmiBroker. Both ways work.
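On point 4 (price adjustments for corporate actions), a minimal DuckDB SQL sketch of a back-adjustment for a single split -- the symbol, date, and ratio are made up, and the table name follows the earlier examples:

```sql
-- Back-adjust for a hypothetical 2-for-1 split in 'AAA' effective 2024-05-01:
-- bars before the ex-date get prices halved and volume doubled.
UPDATE ohlc_d_1y
SET Open   = Open / 2,
    High   = High / 2,
    Low    = Low / 2,
    Close  = Close / 2,
    Volume = Volume * 2
WHERE Symbol = 'AAA'
  AND Date < '2024-05-01';
```

With many symbols and actions you would drive this from a corporate-actions table instead of hard-coded literals, but the per-split arithmetic is the same.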