How Luminarr Works
A single Go binary that embeds a React frontend, serves both the API and UI from the same port, and uses a plugin system for indexers, download clients, media servers, and notifications.
Architecture
Luminarr compiles into a single self-contained binary. The React frontend is embedded at build time
via Go's embed package and served as static files. The API and UI share the same HTTP port (default 8282).
Request and Background Flows
API request flow
Every API call follows the same path through the stack.
Background flow
Scheduled jobs and event-driven workflows run independently of HTTP requests.
WebSocket updates: The queue page receives live progress via WebSocket. The server pushes updates when download client state changes — no polling required on the client side.
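The push model can be sketched as a fan-out hub. Plain buffered channels stand in for WebSocket connections here; all names are illustrative rather than Luminarr's actual API.

```go
package main

import (
	"fmt"
	"sync"
)

// QueueUpdate is a progress snapshot pushed to connected clients.
type QueueUpdate struct {
	ID       string
	Progress float64 // 0.0 to 1.0
}

// Hub fans one update out to every subscriber. In the real app each
// subscriber channel would feed a WebSocket connection.
type Hub struct {
	mu   sync.Mutex
	subs map[chan QueueUpdate]struct{}
}

func NewHub() *Hub { return &Hub{subs: make(map[chan QueueUpdate]struct{})} }

func (h *Hub) Subscribe() chan QueueUpdate {
	ch := make(chan QueueUpdate, 8)
	h.mu.Lock()
	h.subs[ch] = struct{}{}
	h.mu.Unlock()
	return ch
}

// Publish is called when a download client reports new state; clients
// never poll, they just receive.
func (h *Hub) Publish(u QueueUpdate) {
	h.mu.Lock()
	defer h.mu.Unlock()
	for ch := range h.subs {
		select {
		case ch <- u: // deliver
		default: // drop if the client is too slow to drain its buffer
		}
	}
}

func main() {
	hub := NewHub()
	client := hub.Subscribe()
	hub.Publish(QueueUpdate{ID: "movie-42", Progress: 0.5})
	fmt.Println(<-client) // {movie-42 0.5}
}
```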
Plugin System
Plugins implement one of four interfaces. Each plugin registers itself via an init() function
and is activated by a blank import in the main entrypoint. Settings are stored as opaque JSON per config record.
Indexer
Search(ctx, query, categories) → releases. Torznab and Newznab protocols.
DownloadClient
Add(ctx, release), Status(ctx, id), queue management. 5 clients supported. Torrent clients also implement SeedLimiter for per-indexer seed ratio/time enforcement.
MediaServer
RefreshLibrary(ctx, path), Test(ctx). Plex, Emby, Jellyfin.
Notifier
Notify(ctx, event), Test(ctx). 9 channels including custom scripts.
Adding a plugin
- Create `plugins/{kind}/{name}.go`
- Implement the interface
- Call `registry.Default.Register*(kind, factory)` in `init()`
- Blank-import the package in `cmd/luminarr/main.go`
- Add the settings form to the React UI
Settings sanitization: Each plugin can register a sanitizer that redacts sensitive fields (passwords, tokens) from API responses. The registry calls it automatically when serializing config records.
Quality Model
Luminarr's quality model has four explicit dimensions. Unlike Radarr's Custom Formats (regex + scoring), Luminarr lets you pick what you want from dropdowns. The profile is self-documenting.
| Dimension | Values | Score weight |
|---|---|---|
| Resolution | SD, 720p, 1080p, 2160p | × 100 |
| Source | CAM, DVD, HDTV, WebRip, WebDL, Bluray, Remux | × 10 |
| Codec | Unknown, XviD, x264, x265, AV1 | × 1 |
| HDR | None, HDR10, Dolby Vision, HLG, HDR10+ | Separate flag |
Scoring formula: Score = resolution × 100 + source × 10 + codec.
Used for upgrade decisions and cutoff comparisons. HDR is tracked as a separate flag — you either want it or you don't.
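The formula is straightforward to sketch. The ordinal values below are assumed from the left-to-right order of each Values row in the table above (leftmost = 0); Luminarr's internal numbering may differ.

```go
package main

import "fmt"

// Ordinals follow the left-to-right order of the quality tables:
// a higher ordinal means a better value within that dimension.
var (
	resolutions = map[string]int{"SD": 0, "720p": 1, "1080p": 2, "2160p": 3}
	sources     = map[string]int{"CAM": 0, "DVD": 1, "HDTV": 2, "WebRip": 3, "WebDL": 4, "Bluray": 5, "Remux": 6}
	codecs      = map[string]int{"Unknown": 0, "XviD": 1, "x264": 2, "x265": 3, "AV1": 4}
)

// Score implements the documented formula:
// resolution*100 + source*10 + codec.
// HDR is a separate flag and never enters the score.
func Score(resolution, source, codec string) int {
	return resolutions[resolution]*100 + sources[source]*10 + codecs[codec]
}

func main() {
	fmt.Println(Score("1080p", "Bluray", "x265")) // 2*100 + 5*10 + 3 = 253
	fmt.Println(Score("2160p", "CAM", "XviD"))    // 3*100 + 0*10 + 1 = 301
}
```

Note how the weights interact: because resolution carries the ×100 weight, it dominates any combination of source and codec, so a 2160p CAM outscores even a 1080p Remux.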
Quality profiles
- Cutoff — minimum acceptable quality; anything below is always upgraded if available
- Qualities — ordered list of acceptable quality combinations
- Upgrade allowed — whether to grab a better release when one appears
- Upgrade until — stop upgrading once this quality is reached
Event System
The event bus is an in-process pub/sub system. Services publish events; registered handlers receive them in separate goroutines. No external message broker required.
| Event | When it fires |
|---|---|
| grab_started | A release was sent to a download client |
| grab_failed | A grab attempt failed |
| download_done | Download client reports completion |
| import_complete | File moved/hardlinked into library |
| import_failed | Import attempt failed |
| health_issue | Health check detected a problem |
| health_ok | Previously failing check recovered |
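An in-process bus of the kind described, with each handler invoked in its own goroutine, can be sketched as below. The API names are illustrative; the `Wait` helper exists only so the example terminates cleanly.

```go
package main

import (
	"fmt"
	"sync"
)

// Event names mirror the table above.
type Event struct {
	Name    string
	Payload any
}

type Bus struct {
	mu       sync.Mutex
	handlers map[string][]func(Event)
	wg       sync.WaitGroup
}

func NewBus() *Bus { return &Bus{handlers: map[string][]func(Event){}} }

func (b *Bus) Subscribe(name string, h func(Event)) {
	b.mu.Lock()
	defer b.mu.Unlock()
	b.handlers[name] = append(b.handlers[name], h)
}

// Publish delivers to each handler in its own goroutine, so a slow
// notifier cannot block the importer.
func (b *Bus) Publish(e Event) {
	b.mu.Lock()
	hs := b.handlers[e.Name]
	b.mu.Unlock()
	for _, h := range hs {
		b.wg.Add(1)
		go func(h func(Event)) {
			defer b.wg.Done()
			h(e)
		}(h)
	}
}

// Wait blocks until all in-flight handlers finish (for the demo).
func (b *Bus) Wait() { b.wg.Wait() }

func main() {
	bus := NewBus()
	done := make(chan string, 1)
	bus.Subscribe("download_done", func(e Event) {
		done <- fmt.Sprint("import ", e.Payload)
	})
	bus.Publish(Event{Name: "download_done", Payload: "movie-42"})
	bus.Wait()
	fmt.Println(<-done) // import movie-42
}
```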
Subscribers
Importer
Subscribes to download_done. Moves or hardlinks the completed file into the library, triggers media scan if ffprobe is available.
Notification Dispatcher
Subscribes to all events. Fans out to every enabled notification channel — Discord, Slack, webhook, custom scripts, etc.
Seed Enforcer
Subscribes to import_complete. Loads the indexer's seed criteria and tells the download client to enforce per-torrent ratio and time limits.
Media Server Sync
Luminarr connects to your media server for two purposes: automatic library refresh after imports, and bidirectional library comparison via the Library Sync page.
Supported servers
| Server | API format | Auth method |
|---|---|---|
| Plex | XML | X-Plex-Token header |
| Emby | JSON REST | api_key query param |
| Jellyfin | JSON REST | MediaBrowser Token= header |
Library sync data flow
The sync service compares TMDB IDs from both sides to produce a bidirectional diff:
Matching by TMDB ID only. No fuzzy title+year matching. Movies without a TMDB GUID on your media server are counted as "unmatched" and excluded from the comparison.
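The ID-only comparison might look like the sketch below, where a zero TMDB ID marks an entry with no TMDB GUID (counted as unmatched and excluded). Function and field names are illustrative.

```go
package main

import "fmt"

// Movie carries the TMDB ID each side exposes; zero means no TMDB GUID.
type Movie struct {
	Title  string
	TMDBID int
}

// Diff compares the two libraries strictly by TMDB ID:
// no fuzzy title+year matching.
func Diff(luminarr, server []Movie) (onlyLuminarr, onlyServer []Movie, unmatched int) {
	inLum := make(map[int]bool)
	inSrv := make(map[int]bool)
	for _, m := range luminarr {
		if m.TMDBID == 0 {
			unmatched++
			continue
		}
		inLum[m.TMDBID] = true
	}
	for _, m := range server {
		if m.TMDBID == 0 {
			unmatched++
			continue
		}
		inSrv[m.TMDBID] = true
	}
	for _, m := range luminarr {
		if m.TMDBID != 0 && !inSrv[m.TMDBID] {
			onlyLuminarr = append(onlyLuminarr, m)
		}
	}
	for _, m := range server {
		if m.TMDBID != 0 && !inLum[m.TMDBID] {
			onlyServer = append(onlyServer, m)
		}
	}
	return
}

func main() {
	lum := []Movie{{"Heat", 949}, {"Ronin", 9322}}
	srv := []Movie{{"Heat", 949}, {"Home Movie", 0}}
	missing, extra, unmatched := Diff(lum, srv)
	fmt.Println(len(missing), len(extra), unmatched) // 1 0 1
}
```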
Database
Luminarr uses SQLite by default. Schema changes are managed with goose numbered SQL migrations that run automatically on startup. Queries are generated by sqlc — type-safe Go from plain SQL.
- Migrations — numbered SQL files in `internal/db/migrations/`, applied automatically
- Query layer — edit `.sql` files, run `sqlc generate`, get type-safe Go
- ID strategy — UUID v4 strings for all primary keys
- No ORM — direct SQL with generated types, no abstraction overhead
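For context, a sqlc query file names each query and its result cardinality in a comment annotation. The table and query names below are hypothetical, not Luminarr's actual schema:

```sql
-- name: GetMovieByTMDBID :one
SELECT * FROM movies WHERE tmdb_id = ? LIMIT 1;

-- name: ListMonitoredMovies :many
SELECT * FROM movies WHERE monitored = 1 ORDER BY title;
```

Running `sqlc generate` turns each annotated query into a typed Go method on the generated `Queries` struct, which is what "type-safe Go from plain SQL" means in practice.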
Zero-config storage. The database file defaults to /config/luminarr.db in Docker (the persistent volume) or ~/.config/luminarr/luminarr.db locally. No external database server to install or manage.
Scheduled Jobs
Background jobs run on simple time.Ticker intervals. No cron syntax, no external scheduler.
| Job | Interval | What it does |
|---|---|---|
| RSS sync | 15 min | Checks all indexers for new releases matching monitored movies |
| Queue poll | 30 sec | Polls download clients for progress, publishes events on completion |
| Library scan | 6 hours | Verifies files on disk match the database, detects missing or new files |
| Metadata refresh | 24 hours | Refreshes TMDB metadata for movies that haven't been updated recently |
| Storage snapshot | 24 hours | Records library storage stats for the growth chart on the Stats page |
Building from Source
Prerequisites: Go 1.23+, Node.js 20+.
```
# Clone and build
git clone https://github.com/luminarr/luminarr
cd luminarr
cd web/ui && npm install && npm run build && cd ../..
make build
./bin/luminarr

# Development (two terminals)
make dev                    # Go backend with hot reload
cd web/ui && npm run dev    # React dev server on :5173
```
The React dev server proxies /api requests to localhost:8282.
In production, the Go binary embeds the built frontend via go:embed — a single
binary with zero external dependencies.