Home Assistant statistic database migration from SQLite to PostgreSQL

๐Ÿ› ๏ธ Home Assistant History Migration: SQLite โ†’ PostgreSQL 17 on TrueNAS (Docker)

This guide helps you migrate full Home Assistant history from the default home-assistant_v2.db (SQLite) to a PostgreSQL 17 instance deployed via TrueNAS SCALE Apps.

It supports:

  • ✅ statistics_meta, statistics, statistics_short_term
  • ✅ recorder_runs, schema_changes
  • ✅ event_types, states_meta, event_data, state_attributes, events, states

With full support for foreign keys, batching, and progress logging.

NOTE: this guide was written after the whole process was completed, so some slight adjustments may be needed.

The process was carried out in a TrueNAS SCALE environment, but it is not tied to that particular OS. This guide should work in any scenario where you want to migrate a Home Assistant SQLite database to PostgreSQL.

The procedure runs inside a Python Docker container to minimize setup time, avoiding Python or library version conflicts and clutter on the host system.


โš™๏ธ 1. Prerequisites

  • PostgreSQL 17 installed (or, in my specific case, Home Assistant app installed from TrueNAS SCALE Apps)
  • The PostgreSQL credentials (database name, username, password)
  • Your SQLite file (home-assistant_v2.db) is accessible, e.g. /mnt/data/homeassistant/config/

๐Ÿณ 2. Start the Python Docker container

docker run -it --rm \
  --network <network_name_of_postgres> \
  -v /mnt/data/homeassistant/config:/config \
  python:3.12 bash

Replace <network_name_of_postgres> with the correct Docker network (e.g. ix-home-assistant_default).


📦 3. Install PostgreSQL driver

Inside the container:

pip install psycopg2-binary

🧹 4. Full migration script with cleanup and batching

Create the script:

cat > /migrate_all.py << 'EOF'
import sqlite3
import psycopg2
from psycopg2.extras import execute_values

SQLITE_PATH = "/config/home-assistant_v2.db"
PG_CONFIG = {
    "dbname": "<your-db-name>",
    "user": "<your-db-user>",
    "password": "<your-db-password>",
    "host": "<your-postgres-host>",
    "port": "5432"
}

BATCH_SIZE = 5000

ALL_TABLES = [
    "states",
    "events",
    "state_attributes",
    "event_data",
    "states_meta",
    "event_types",
    "schema_changes",
    "recorder_runs",
    "statistics_short_term",
    "statistics",
    "statistics_meta"
]

def clean_database(pg_conn):
    print("🧹 Cleaning all tables in PostgreSQL...")
    with pg_conn.cursor() as cur:
        for table in ALL_TABLES:
            try:
                cur.execute(f"DELETE FROM {table}")
                pg_conn.commit()
                print(f"   ⨯ {table} cleared")
            except Exception as e:
                # Roll back, otherwise the aborted transaction blocks every later DELETE
                pg_conn.rollback()
                print(f"   ⚠️ Error cleaning {table}: {e}")
    print("✅ Cleanup complete.\n")

def migrate_table(table_name, sqlite_cursor, pg_conn):
    print(f"\n📦 Starting migration of: {table_name}")
    try:
        sqlite_cursor.execute(f"SELECT * FROM {table_name}")
    except Exception as e:
        print(f"❌ Error accessing {table_name} in SQLite: {e}")
        return

    column_names = [desc[0] for desc in sqlite_cursor.description]
    columns = ", ".join([f'"{col}"' if col.lower() in ('end',) else col for col in column_names])
    count = 0

    while True:
        rows = sqlite_cursor.fetchmany(BATCH_SIZE)
        if not rows:
            break

        with pg_conn.cursor() as pg_cursor:
            try:
                insert_sql = f"INSERT INTO {table_name} ({columns}) VALUES %s ON CONFLICT DO NOTHING"
                execute_values(pg_cursor, insert_sql, rows)
                pg_conn.commit()
                count += len(rows)
                print(f"  → {count} rows migrated...")
            except Exception as e:
                pg_conn.rollback()
                print(f"❌ Error in batch ({count} processed): {e}")
                break

    print(f"✅ Finished migration of {table_name} ({count} rows)")

def main():
    sqlite_conn = sqlite3.connect(SQLITE_PATH)
    sqlite_cursor = sqlite_conn.cursor()
    pg_conn = psycopg2.connect(**PG_CONFIG)

    clean_database(pg_conn)

    # Order for FK constraints
    ordered_tables = [
        "event_types",
        "states_meta",
        "event_data",
        "state_attributes",
        "events",
        "states",
        "schema_changes",
        "recorder_runs",
        "statistics_meta",
        "statistics",
        "statistics_short_term"
    ]

    for table in ordered_tables:
        migrate_table(table, sqlite_cursor, pg_conn)

    sqlite_conn.close()
    pg_conn.close()
    print("\n🎉 All tables migrated successfully.")

if __name__ == "__main__":
    main()
EOF

Replace the PG_CONFIG values with your actual credentials.


โ–ถ๏ธ 5. Run the migration

python /migrate_all.py

You'll see live progress for each table, logged every 5000 rows.


✅ Result

You've successfully migrated the full Home Assistant history:

  • ✔ Long-term statistics
  • ✔ Full state history and events
  • ✔ Valid foreign key references
  • ✔ Optimized for PostgreSQL

Test it via Developer Tools → History or the "Statistics" graphs.
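One caveat worth checking: the script inserts rows with their original primary keys, so the PostgreSQL sequences behind those keys may still be at their initial values, and Home Assistant can then hit duplicate-key errors on new inserts. A hedged sketch that builds `setval` statements to advance each sequence past the migrated data (the id column names are assumptions based on the Home Assistant schema; verify them with `\d <table>` in psql before running the generated SQL):

```python
# (table, primary key column) pairs; assumed from the Home Assistant schema
ID_COLUMNS = [
    ("states", "state_id"),
    ("events", "event_id"),
    ("statistics", "id"),
    ("statistics_short_term", "id"),
]

def setval_statements(pairs):
    """Build SQL that advances each serial sequence past the migrated rows."""
    return [
        f"SELECT setval(pg_get_serial_sequence('{table}', '{col}'), "
        f"COALESCE(MAX({col}), 1)) FROM {table};"
        for table, col in pairs
    ]

for stmt in setval_statements(ID_COLUMNS):
    print(stmt)
```

Run the printed statements against the PostgreSQL database (e.g. in psql) after the migration and before starting Home Assistant.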


🙌 Credits

ChatGPT :D

Tested on:

  • Home Assistant Core (Docker)
  • PostgreSQL 17 (TrueNAS SCALE App)
  • Python 3.12 + psycopg2-binary

Feel free to share this Gist or extend it.
