# InfoXtractor (ix)

Async, on-prem, LLM-powered structured information extraction microservice. Given a document (PDF, image, text) and a named *use case*, ix returns a structured JSON result whose shape matches the use-case schema — together with per-field provenance (OCR segment IDs, bounding boxes, cross-OCR agreement flags) that lets the caller decide how much to trust each extracted value.

**Status:** MVP deployed. Live on the home LAN at `http://192.168.68.42:8994` (REST API + browser UI at `/ui`).

## Web UI

A minimal browser UI lives at [`http://192.168.68.42:8994/ui`](http://192.168.68.42:8994/ui): drop a PDF, pick a registered use case or define one inline, submit, and see the pretty-printed result. HTMX polls the job status every 2 s until the pipeline finishes. LAN-only, no auth.

## Documentation

- Full reference spec: [`docs/spec-core-pipeline.md`](docs/spec-core-pipeline.md) (aspirational; MVP is a strict subset)
- **MVP design:** [`docs/superpowers/specs/2026-04-18-ix-mvp-design.md`](docs/superpowers/specs/2026-04-18-ix-mvp-design.md)
- **Implementation plan:** [`docs/superpowers/plans/2026-04-18-ix-mvp-implementation.md`](docs/superpowers/plans/2026-04-18-ix-mvp-implementation.md)
- **Deployment runbook:** [`docs/deployment.md`](docs/deployment.md)
- Agent / development notes: [`AGENTS.md`](AGENTS.md)

## Principles

- **On-prem always.** LLM = Ollama, OCR = local engines (Surya first). No OpenAI / Anthropic / Azure / AWS / cloud.
- **Grounded extraction, not DB truth.** ix returns best-effort fields + provenance; the caller decides what to trust.
- **Transport-agnostic pipeline core.** REST + Postgres-queue adapters run in parallel on one job store.
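The "caller decides what to trust" principle can be sketched in a few lines of Python. The per-field provenance shape below (`provenance.ocr_agreement`) is **hypothetical** — the real response shape is documented in the MVP design spec — but it illustrates splitting an extraction result into trusted vs. needs-review fields using a cross-OCR agreement flag:

```python
def split_by_trust(result: dict) -> tuple[dict, dict]:
    """Split extracted fields into trusted vs needs-review buckets.

    NOTE: assumes a hypothetical per-field provenance shape with an
    `ocr_agreement` flag; the real shape lives in the MVP design spec.
    """
    trusted, review = {}, {}
    for name, field in result["fields"].items():
        bucket = trusted if field["provenance"]["ocr_agreement"] else review
        bucket[name] = field["value"]
    return trusted, review


# Hypothetical job result, for illustration only.
result = {
    "fields": {
        "vendor": {"value": "ACME AG", "provenance": {"ocr_agreement": True}},
        "total": {"value": "1299.00", "provenance": {"ocr_agreement": False}},
    }
}
trusted, review = split_by_trust(result)
# trusted -> {"vendor": "ACME AG"}; review -> {"total": "1299.00"}
```

How strict to be (require agreement, require a bounding box, etc.) is entirely the caller's policy — ix only supplies the evidence.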
## Submitting a job

```bash
curl -X POST http://192.168.68.42:8994/jobs \
  -H "Content-Type: application/json" \
  -d '{
    "use_case": "bank_statement_header",
    "ix_client_id": "mammon",
    "request_id": "some-correlation-id",
    "context": {
      "files": [{
        "url": "http://paperless.local/api/documents/42/download/",
        "headers": {"Authorization": "Token …"}
      }],
      "texts": [""]
    }
  }'
# → {"job_id":"…","ix_id":"…","status":"pending"}
```

Poll `GET /jobs/{job_id}` until `status` is `done` or `error`. Optionally pass `callback_url` to receive a webhook on completion (one-shot, no retry; polling stays authoritative).

### Ad-hoc use cases

For one-offs where a registered use case doesn't exist yet, ship the schema inline:

```jsonc
{
  "use_case": "adhoc-invoice",   // free-form label (logs/metrics only)
  "use_case_inline": {
    "use_case_name": "Invoice totals",
    "system_prompt": "Extract vendor and total amount.",
    "fields": [
      {"name": "vendor", "type": "str", "required": true},
      {"name": "total", "type": "decimal"},
      {"name": "currency", "type": "str", "choices": ["USD", "EUR", "CHF"]}
    ]
  },
  // ...ix_client_id, request_id, context...
}
```

When `use_case_inline` is set, the pipeline builds the response schema on the fly and skips the registry. Supported types: `str`, `int`, `float`, `decimal`, `date`, `datetime`, `bool`. `choices` is only allowed on `str` fields. Precedence: inline wins over `use_case` when both are present.

Full REST surface + provenance response shape documented in the MVP design spec.

## Running locally

```bash
uv sync --extra dev
uv run pytest tests/unit -v                   # hermetic unit + integration suite
IX_TEST_OLLAMA=1 uv run pytest tests/live -v  # needs LAN access to Ollama + GPU
```

## Deploying

```bash
git push server main          # rebuilds Docker image, restarts container, /healthz deploy gate
python scripts/e2e_smoke.py   # E2E acceptance against the live service
```

See [`docs/deployment.md`](docs/deployment.md) for full runbook + rollback.
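### Polling from Python

The submit-then-poll flow can be sketched as a small stdlib-only client. The endpoints (`POST /jobs`, `GET /jobs/{job_id}`), the `job_id` response field, and the `pending`/`done`/`error` statuses come from this README; the interval and timeout defaults are illustrative assumptions, not part of the API:

```python
import json
import time
import urllib.request

BASE = "http://192.168.68.42:8994"  # LAN address from this README


def submit_job(payload: dict) -> str:
    """POST a job and return its job_id."""
    req = urllib.request.Request(
        f"{BASE}/jobs",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["job_id"]


def poll_until_done(fetch, interval: float = 2.0, timeout: float = 120.0) -> dict:
    """Call fetch() until the job status is "done" or "error".

    `fetch` is an injected zero-arg callable (e.g. a GET on /jobs/{job_id})
    so the loop is testable without a live service.
    """
    deadline = time.monotonic() + timeout
    while True:
        job = fetch()
        if job["status"] in ("done", "error"):
            return job
        if time.monotonic() > deadline:
            raise TimeoutError("job did not finish in time")
        time.sleep(interval)
```

Injecting `fetch` keeps the retry loop decoupled from HTTP; in real use it would wrap `urllib.request.urlopen(f"{BASE}/jobs/{job_id}")`.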