PostgreSQL#
What Was Established#
PostgreSQL 16 runs as an LXC on Proxmox, serving as the central database for n8n, the monitoring pipeline, and pgvector embeddings for the wiki.
Deployment#
| Detail | Value |
|---|---|
| LXC host | postgresql |
| Container ID | 108 |
| IP | 192.168.1.57 |
| Port | 5432 |
| Version | 16.13 (Debian 16.13-1.pgdg13+1) |
| OS | Debian 13 (unprivileged LXC) |
| Disk | 4 GB (App-Storage ZFS pool) |
| RAM | 1024 MiB |
| CPU | 1 core |
| Installed via | tteck Proxmox helper scripts |
| Web UI | Adminer — not yet installed |
| SSH user | iluvatar (sudo, PermitRootLogin no) |
Databases and Users#
| Database | User | Purpose |
|---|---|---|
| homelab | homelab | n8n workflows, monitoring pipeline, pgvector wiki embeddings |
Setup commands (run as postgres user)#
CREATE DATABASE homelab;
CREATE USER homelab WITH PASSWORD '<password>';
GRANT ALL PRIVILEGES ON DATABASE homelab TO homelab;
GRANT ALL ON SCHEMA public TO homelab; -- required for n8n
\q

The GRANT ALL ON SCHEMA public step is required — without it n8n fails to start with a permissions error even though the database and user exist.
Tables#
homelab_analysis (monitoring pipeline output)#
Created 2026-04-16. Stores hourly AI-generated homelab status reports.
CREATE TABLE homelab_analysis (
id SERIAL PRIMARY KEY,
created_at TIMESTAMPTZ DEFAULT NOW(),
overall_summary TEXT,
summary_prometheus TEXT,
summary_uptime_kuma TEXT,
summary_unifi TEXT,
summary_synology TEXT,
raw_metrics JSONB
);

unifi_snapshots (monitoring pipeline — hourly UniFi raw data)#
Created 2026-04-19. Stores one row per pipeline run with raw WAN counters and environment data for delta computation in the next run.
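The delta computation mentioned above can be sketched in Python. This is an illustration, not the pipeline's actual code; in particular, the counter-reset handling is an assumption:

```python
def wan_byte_delta(prev_bytes, curr_bytes):
    """Delta between two cumulative WAN byte counters from consecutive runs.

    If the counter went backwards (e.g. a gateway reboot reset it), treat
    the current value as the delta since the reset. This reset handling is
    an assumption, not confirmed pipeline behaviour.
    """
    if prev_bytes is None:       # first run: no previous snapshot row yet
        return None
    if curr_bytes < prev_bytes:  # counter reset between runs
        return curr_bytes
    return curr_bytes - prev_bytes


print(wan_byte_delta(1_000_000, 1_250_000))  # 250000
```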
CREATE TABLE unifi_snapshots (
id SERIAL PRIMARY KEY,
recorded_at TIMESTAMPTZ DEFAULT NOW(),
wan_drops INT,
wan_downtime_seconds INT,
wan_rx_bytes BIGINT,
wan_tx_bytes BIGINT,
wan_latency_ms FLOAT,
wan_availability_pct FLOAT,
gateway_temp_c FLOAT,
gateway_cpu_pct FLOAT,
gateway_mem_pct FLOAT,
client_count INT,
vlan_data JSONB,
ap_radio_data JSONB,
switch_data JSONB
);

prometheus_snapshots (monitoring pipeline — hourly Prometheus raw data)#
Created 2026-04-19. Stores full host metrics per run. No delta logic currently implemented (Prometheus uses rate queries, so cumulative deltas are less useful).
CREATE TABLE prometheus_snapshots (
id SERIAL PRIMARY KEY,
recorded_at TIMESTAMPTZ DEFAULT NOW(),
hosts JSONB
);

wiki_embeddings (pgvector)#
pgvector extension installed 2026-04-20. The homelab database hosts wiki_embeddings for semantic wiki search across all wikis.
-- Extension (installed once)
CREATE EXTENSION IF NOT EXISTS vector;
-- Table schema
CREATE TABLE wiki_embeddings (
id SERIAL PRIMARY KEY,
wiki TEXT NOT NULL,
namespace TEXT NOT NULL,
file_path TEXT NOT NULL UNIQUE,
content TEXT,
summary TEXT,
embedding VECTOR(768),
updated_at TIMESTAMPTZ DEFAULT NOW()
);
-- IVFFlat index for cosine similarity search
CREATE INDEX idx_wiki_embeddings_embedding ON wiki_embeddings
USING ivfflat (embedding vector_cosine_ops) WITH (lists = 100);
CREATE INDEX idx_wiki_embeddings_namespace ON wiki_embeddings (namespace);
CREATE INDEX idx_wiki_embeddings_wiki ON wiki_embeddings (wiki);

Embedding model: nomic-embed-text (768-dim) via Ollama on Pavilion (http://192.168.2.192:11434/api/embeddings)
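A minimal stdlib-only call to the Ollama embeddings endpoint above, with a guard for the payload ceiling described below, could look like this. The pipeline's real HTTP client and error handling may differ:

```python
import json
import urllib.request

OLLAMA_URL = "http://192.168.2.192:11434/api/embeddings"  # Pavilion, from this page
MAX_PAYLOAD_BYTES = 11_000  # approximate ceiling; larger prompts return HTTP 500


def embed(text: str) -> list[float]:
    """Fetch a 768-dim nomic-embed-text embedding from Ollama."""
    if len(text.encode("utf-8")) > MAX_PAYLOAD_BYTES:
        raise ValueError("text exceeds payload ceiling; chunk it first")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps({"model": "nomic-embed-text", "prompt": text}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["embedding"]
```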
Embedding strategy for large pages: nomic-embed-text has a payload ceiling (~11KB; HTTP 500 above this). The pipeline handles large pages by:
- Splitting content into 5000-char chunks (EMBED_CHUNK_SIZE, tunable via env var)
- Embedding each chunk independently
- Averaging all chunk vectors into a single 768-dim embedding
- If > 8 chunks (EMBED_MAX_CHUNKS), sampling beginning + middle + end before averaging
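The steps above can be sketched as follows. The chunking and averaging follow the description directly; the exact beginning/middle/end sampling split is an assumption:

```python
import os

EMBED_CHUNK_SIZE = int(os.environ.get("EMBED_CHUNK_SIZE", 5000))
EMBED_MAX_CHUNKS = int(os.environ.get("EMBED_MAX_CHUNKS", 8))


def chunk_and_sample(content: str) -> list[str]:
    """Split into fixed-size chunks; if too many, keep beginning + middle + end."""
    chunks = [content[i:i + EMBED_CHUNK_SIZE]
              for i in range(0, len(content), EMBED_CHUNK_SIZE)]
    if len(chunks) > EMBED_MAX_CHUNKS:
        # even three-way split is an assumption; the page only says
        # "beginning + middle + end"
        k = EMBED_MAX_CHUNKS // 3
        mid = len(chunks) // 2
        chunks = chunks[:k] + chunks[mid:mid + k] + chunks[-k:]
    return chunks


def average_embedding(vectors: list[list[float]]) -> list[float]:
    """Element-wise mean of chunk embeddings -> one 768-dim vector."""
    n = len(vectors)
    return [sum(col) / n for col in zip(*vectors)]
```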
summary column: stores first 500 chars of page content for quick display without fetching full content.
Upsert pattern:
INSERT INTO wiki_embeddings (wiki, namespace, file_path, content, summary, embedding, updated_at)
VALUES (%s, %s, %s, %s, %s, %s::vector, NOW())
ON CONFLICT (file_path) DO UPDATE
SET content = EXCLUDED.content, summary = EXCLUDED.summary,
embedding = EXCLUDED.embedding, updated_at = NOW();

Query pattern:
SELECT file_path, 1 - (embedding <=> '[query_vector]'::vector) AS similarity
FROM wiki_embeddings
WHERE namespace = 'general' AND wiki = 'homelab'
ORDER BY embedding <=> '[query_vector]'::vector
LIMIT 6;

Pages below 0.5 similarity are weak matches.
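On the Python side, two small helpers clarify what these patterns rely on: formatting a list as a pgvector literal for the ::vector cast, and what 1 - (a <=> b) computes (<=> is pgvector's cosine distance operator). The helper names are illustrative, not the pipeline's actual ones:

```python
import math


def to_pgvector(vec: list[float]) -> str:
    """Format a Python list as a pgvector literal, e.g. '[0.1,0.2,...]'."""
    return "[" + ",".join(str(x) for x in vec) + "]"


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Equivalent of 1 - (a <=> b): cosine similarity in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)
```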
Connection String#
postgresql://homelab:<password>@192.168.1.57:5432/homelab

SSH Access#
The LXC is hardened per 2026-04-19 session:
- Root SSH login disabled: PermitRootLogin no in /etc/ssh/sshd_config
- Non-root user iluvatar added with sudo access
- Standard Proxmox shell access still available as root via the Proxmox UI
Data Retention#
homelab_analysis rows are kept indefinitely. At ~21 MB per 90 days the storage cost is trivial and full history has more value than disk savings.
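A back-of-envelope check of the ~21 MB / 90 days figure, assuming one row per hourly pipeline run as described above:

```python
# 90 days of hourly runs -> 2160 rows; ~21 MB over that window
rows_per_90_days = 90 * 24
bytes_per_row = 21_000_000 / rows_per_90_days
print(round(bytes_per_row))  # roughly 10 KB per row
print(21 * 4)                # about 84 MB per year, still trivial
```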
Related Pages#
n8n, AI-Driven Monitoring Pipeline, Wiki Pipeline Scripts, Wiki System - Architecture
Sources#
Homelab AI - 2026-04-15 · raw/conversations/chunks/2026-04-15-31-Homelab AI.json
Homelab AI - 2026-04-19 · ingested/chats/2026-04-19-31-Homelab AI.json
Homelab AI - 2026-04-20 · ingested/chats/2026-04-20-31-Homelab AI.json