Database Optimization Tool
Collector docs

Read-only PostgreSQL audit collector

The collector is designed for small teams that need performance evidence without granting a third-party tool write access.

Session-level read-only mode
Local JSON export
No automatic SQL changes

What it collects

The collector reads relation size, table scan counts, index usage, dead-row signals, pg_stat_statements query statistics, and selected PostgreSQL settings.
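To make the signal list concrete, here is a hedged sketch of how two of those signals (dead-row pressure and unused indexes) can be derived from rows shaped like PostgreSQL's pg_stat_user_tables and pg_stat_user_indexes views. The column names (seq_scan, idx_scan, n_live_tup, n_dead_tup) are the real view columns; the sample numbers and the 10% threshold are invented for illustration and are not the collector's actual logic.

```python
# Sample rows shaped like pg_stat_user_tables / pg_stat_user_indexes.
# All values are invented for illustration.
tables = [
    {"relname": "orders",   "seq_scan": 9500, "n_live_tup": 1_000_000, "n_dead_tup": 250_000},
    {"relname": "sessions", "seq_scan": 12,   "n_live_tup": 50_000,    "n_dead_tup": 400},
]
indexes = [
    {"indexrelname": "orders_status_idx", "idx_scan": 0},
    {"indexrelname": "orders_pkey",       "idx_scan": 88_000},
]

def dead_row_ratio(t):
    """Fraction of dead tuples: a rough VACUUM-pressure signal."""
    total = t["n_live_tup"] + t["n_dead_tup"]
    return t["n_dead_tup"] / total if total else 0.0

# Flag tables above an illustrative 10% dead-row threshold,
# and indexes that have never been scanned.
bloated = [t["relname"] for t in tables if dead_row_ratio(t) > 0.1]
unused = [i["indexrelname"] for i in indexes if i["idx_scan"] == 0]

print(bloated)  # ['orders']  (250k dead vs 1M live -> ratio 0.2)
print(unused)   # ['orders_status_idx']
```

The collector exports these raw counters; interpretation and thresholds are left to the review step, which is why no SQL changes happen automatically.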

What it does not do

It does not create indexes, run VACUUM, change settings, install extensions, or execute optimization SQL on your database.

Recommended command

Run:

npx postgresaudit collect --url "$DATABASE_URL" --out audit.json

Use a restricted PostgreSQL user, and review the exported JSON locally before upload if your company's policy requires it.
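Because the export is plain JSON, the pre-upload review can be schema-agnostic. A minimal sketch, assuming nothing about the file's actual layout: list every key path in the document so you can spot fields your policy forbids sharing (for example, raw query text). The sample dict below stands in for the contents of audit.json; its shape is hypothetical.

```python
import json

# Hypothetical stand-in for the contents of audit.json;
# the real export schema is not assumed here.
sample = json.loads("""
{
  "settings": {"shared_buffers": "128MB"},
  "tables": [{"relname": "orders", "seq_scan": 9500}]
}
""")

def key_paths(node, prefix=""):
    """Recursively collect dotted key paths from nested dicts and lists."""
    paths = set()
    if isinstance(node, dict):
        for k, v in node.items():
            path = f"{prefix}.{k}" if prefix else k
            paths.add(path)
            paths |= key_paths(v, path)
    elif isinstance(node, list):
        for item in node:
            paths |= key_paths(item, prefix)
    return paths

for path in sorted(key_paths(sample)):
    print(path)
```

Swapping `sample` for `json.load(open("audit.json"))` reviews a real export the same way.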

Related topics

Use these focused guides to compare query pressure, index decisions, and maintenance signals before you change anything in production.

Frequently asked questions

These answers stay inside the current Database Optimization Tool product boundary: read-only collection, evidence-gated findings, and human-reviewed next steps.

Do I need superuser access to run the collector?

No. The intended workflow is a restricted PostgreSQL user with enough read access to gather statistics and export the JSON locally. PostgreSQL's built-in pg_monitor role, for example, grants read access to the statistics views without any write privileges.

What if pg_stat_statements is not enabled?

The collector still exports table and index signals, but query-level prioritization is much stronger when pg_stat_statements data is available.

Can I use this against managed PostgreSQL services?

Yes, as long as your service allows the required read-only statistics queries and your team is comfortable exporting the JSON locally.

Will the collector change production settings or run maintenance?

No. It only exports evidence for later review inside Database Optimization Tool.