Postgres as a search backend is one of those decisions that looks wrong on paper but works really well in practice. tsvector handles full-text search, pg_trgm does fuzzy matching, pgvector covers semantic search — and you don't need to babysit an Elasticsearch cluster or worry about sync lag.
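Rough shapes of the three query styles, for anyone who hasn't mixed them. This is a sketch, not anyone's production schema — the `docs` table, `body` column, and `embedding` column are assumptions:

```python
# Illustrative Postgres queries for the three search modes mentioned
# above. Held as Python strings so the shapes are easy to compare;
# %s placeholders are driver-style parameters (e.g. psycopg).

# Full-text: tsvector + tsquery, ranked with ts_rank.
full_text = """
SELECT id, ts_rank(to_tsvector('english', body), q) AS rank
FROM docs, plainto_tsquery('english', %s) AS q
WHERE to_tsvector('english', body) @@ q
ORDER BY rank DESC;
"""

# Fuzzy: pg_trgm's % similarity operator (uses a trigram GIN index).
fuzzy = """
SELECT id, similarity(body, %s) AS sim
FROM docs
WHERE body %% %s
ORDER BY sim DESC;
"""

# Semantic: pgvector's <-> L2-distance operator for nearest neighbors.
semantic = """
SELECT id
FROM docs
ORDER BY embedding <-> %s::vector
LIMIT 10;
"""
```

Nothing stops you from combining them either — e.g. a trigram prefilter feeding a vector rerank, all in one SQL statement.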

The part that's easy to overlook: your search index is transactionally consistent with everything else. No stale results because some background sync job fell over at 3am.
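Concretely, the search index is just another index on the table, so a row and its searchability commit atomically — there's no second system to drift. A minimal illustration (table and column names are made up):

```python
# In Postgres, a GIN index over to_tsvector(...) is maintained inside
# the same transaction as the row write. No separate indexing pipeline:
# after COMMIT, either both the row and its index entry are visible,
# or neither is.
txn = """
BEGIN;
INSERT INTO docs (id, body) VALUES (%s, %s);
-- the expression GIN index on to_tsvector('english', body) is
-- updated here, in the same transaction as the INSERT
COMMIT;
"""

# Contrast with the dual-write pattern (DB insert + push to an
# external search engine), where the second write can lag or fail
# independently -- that's where 3am stale-result incidents come from.
```

With an external engine you'd need change-data-capture or a sync job to approximate this, and its failure modes are exactly the stale results mentioned above.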

With 3000+ schemas I'd keep an eye on GIN index bloat. The per-index overhead across that many schemas adds up, and autovacuum can have trouble keeping pace.
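If you want to watch this, GIN's fastupdate pending list is the usual culprit, and it's inspectable. A sketch of the monitoring/maintenance queries — the shapes are real (pgstattuple's `pgstatginindex` and the core `gin_clean_pending_list` function), but treat the exact catalog joins as something to adapt:

```python
# Per-index GIN pending-list stats across all schemas.
# Requires: CREATE EXTENSION pgstattuple;
check_pending = """
SELECT n.nspname AS schema, c.relname AS index_name,
       s.pending_pages, s.pending_tuples
FROM pg_class c
JOIN pg_namespace n ON n.oid = c.relnamespace
CROSS JOIN LATERAL pgstatginindex(c.oid) AS s
WHERE c.relkind = 'i'
  AND c.relam = (SELECT oid FROM pg_am WHERE amname = 'gin')
ORDER BY s.pending_tuples DESC;
"""

# Manually flush one index's pending list between autovacuum runs.
flush = "SELECT gin_clean_pending_list(%s::regclass);"
```

Beyond that, `gin_pending_list_limit` (or `fastupdate = off` on write-light indexes) and more aggressive per-table autovacuum settings are the usual knobs.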

is this AI?