InfoXtractor (ix)
Async, on-prem, LLM-powered structured information extraction microservice.
Given a document (PDF, image, text) and a named use case, ix returns a structured JSON result whose shape matches the use-case schema — together with per-field provenance (OCR segment IDs, bounding boxes, cross-OCR agreement flags) that lets the caller decide how much to trust each extracted value.
Status: MVP deployed. Live on the home LAN at http://192.168.68.42:8994 (REST API + browser UI at /ui).
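To make the provenance idea concrete, here is a hypothetical result for a two-field use case — the key names below are illustrative only; the authoritative response shape is documented in the MVP design spec:

```python
# Hypothetical extraction result with per-field provenance.
# Key names are illustrative, not the real wire format.
result = {
    "fields": {
        "vendor": {
            "value": "ACME GmbH",
            "provenance": {
                "segment_ids": ["ocr-0012", "ocr-0013"],  # OCR segments the value came from
                "bbox": [112, 48, 390, 72],               # x0, y0, x1, y1 on the source page
                "cross_ocr_agreement": True,              # both OCR passes agree
            },
        },
        "total": {
            "value": "1234.50",
            "provenance": {
                "segment_ids": ["ocr-0087"],
                "bbox": [420, 700, 510, 718],
                "cross_ocr_agreement": False,  # engines disagreed -> caller may distrust
            },
        },
    },
}

# A caller deciding what to trust might keep only agreed-upon fields:
trusted = {
    name: f["value"]
    for name, f in result["fields"].items()
    if f["provenance"]["cross_ocr_agreement"]
}
print(trusted)  # only fields where the OCR passes agreed
```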
Web UI
A minimal browser UI lives at http://192.168.68.42:8994/ui: drop a PDF, pick a registered use case or define one inline, submit, see the pretty-printed result. HTMX polls the job status every 2 s until the pipeline finishes. LAN-only, no auth.
- Full reference spec: `docs/spec-core-pipeline.md` (aspirational; MVP is a strict subset)
- MVP design: `docs/superpowers/specs/2026-04-18-ix-mvp-design.md`
- Implementation plan: `docs/superpowers/plans/2026-04-18-ix-mvp-implementation.md`
- Deployment runbook: `docs/deployment.md`
- Agent / development notes: `AGENTS.md`
Principles
- On-prem always. LLM = Ollama, OCR = local engines (Surya first). No OpenAI / Anthropic / Azure / AWS / cloud.
- Grounded extraction, not DB truth. ix returns best-effort fields + provenance; the caller decides what to trust.
- Transport-agnostic pipeline core. REST + Postgres-queue adapters in parallel on one job store.
Submitting a job
curl -X POST http://192.168.68.42:8994/jobs \
-H "Content-Type: application/json" \
-d '{
"use_case": "bank_statement_header",
"ix_client_id": "mammon",
"request_id": "some-correlation-id",
"context": {
"files": [{
"url": "http://paperless.local/api/documents/42/download/",
"headers": {"Authorization": "Token …"}
}],
"texts": ["<Paperless Tesseract OCR content>"]
}
}'
# → {"job_id":"…","ix_id":"…","status":"pending"}
Poll GET /jobs/{job_id} until status is done or error. Optionally pass callback_url to receive a webhook on completion (one-shot, no retry; polling stays authoritative).
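A minimal polling loop in Python (a sketch: `fetch_status` stands in for a GET /jobs/{job_id} call via whatever HTTP client you use):

```python
import time

TERMINAL = {"done", "error"}


def poll_job(fetch_status, interval_s: float = 2.0, timeout_s: float = 300.0) -> dict:
    """Poll until the job reaches a terminal status or the timeout expires.

    fetch_status: callable returning the decoded JSON body of GET /jobs/{id},
    e.g. lambda: httpx.get(f"{base}/jobs/{job_id}").json()
    """
    deadline = time.monotonic() + timeout_s
    while True:
        body = fetch_status()
        if body["status"] in TERMINAL:
            return body
        if time.monotonic() >= deadline:
            raise TimeoutError(f"job still {body['status']} after {timeout_s}s")
        time.sleep(interval_s)


# Demo with a fake fetcher that reaches "done" on the third call:
responses = iter([
    {"status": "pending"},
    {"status": "running"},
    {"status": "done", "result": {"vendor": "ACME"}},
])
final = poll_job(lambda: next(responses), interval_s=0.0)
print(final["status"])  # → done
```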
Ad-hoc use cases
For one-offs where a registered use case doesn't exist yet, ship the schema inline:
{
"use_case": "adhoc-invoice", // free-form label (logs/metrics only)
"use_case_inline": {
"use_case_name": "Invoice totals",
"system_prompt": "Extract vendor and total amount.",
"fields": [
{"name": "vendor", "type": "str", "required": true},
{"name": "total", "type": "decimal"},
{"name": "currency", "type": "str", "choices": ["USD", "EUR", "CHF"]}
]
},
// ...ix_client_id, request_id, context...
}
When use_case_inline is set, the pipeline builds the response schema on the fly and skips the registry. Supported types: str, int, float, decimal, date, datetime, bool. choices is only allowed on str fields. Precedence: inline wins over use_case when both are present.
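The type and choices rules above can be sketched with a stdlib-only coercion helper (`coerce_field` is a hypothetical illustration, not the actual pipeline code):

```python
from datetime import date, datetime
from decimal import Decimal

# Map the supported wire types to Python coercions.
COERCERS = {
    "str": str,
    "int": int,
    "float": float,
    "decimal": Decimal,
    "date": date.fromisoformat,
    "datetime": datetime.fromisoformat,
    "bool": bool,
}


def coerce_field(field_def: dict, raw):
    """Coerce one extracted value against an inline field definition."""
    if raw is None:
        if field_def.get("required"):
            raise ValueError(f"{field_def['name']} is required")
        return None
    value = COERCERS[field_def["type"]](raw)
    choices = field_def.get("choices")
    if choices is not None:
        if field_def["type"] != "str":
            raise ValueError("choices is only allowed on str fields")
        if value not in choices:
            raise ValueError(f"{value!r} not in {choices}")
    return value


print(coerce_field({"name": "total", "type": "decimal"}, "1234.50"))  # → 1234.50
print(coerce_field({"name": "currency", "type": "str",
                    "choices": ["USD", "EUR", "CHF"]}, "CHF"))        # → CHF
```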
Full REST surface + provenance response shape documented in the MVP design spec.
Running locally
uv sync --extra dev
uv run pytest tests/unit -v                 # hermetic unit suite
IX_TEST_OLLAMA=1 uv run pytest tests/live -v # needs LAN access to Ollama + GPU
UI queue + progress UX
The /ui job page polls GET /ui/jobs/{id}/fragment every 2 s and surfaces:
- Queue position while pending: "Queue position: N ahead — M jobs total in flight (single worker)" so it's obvious a new submission is waiting on an earlier job rather than stuck. "About to start" when the worker has just freed up.
- Elapsed time while running ("Running for MM:SS") and on finish ("Finished in MM:SS").
- Original filename — the UI stashes the client-provided upload name in `FileRef.display_name` so the browser shows `your_statement.pdf` instead of the on-disk UUID.
- CPU-mode notice when `/healthz` reports `ocr_gpu: false` (the Surya OCR client observed `torch.cuda.is_available() == False`): a collapsed `<details>` pointing at the deployment runbook.
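The queue-position and elapsed-time displays above can be sketched as two small helpers (hypothetical names mirroring what the repository query and fragment renderer do, not the actual code):

```python
from datetime import datetime, timedelta

ACTIVE = ("pending", "running")


def queue_position(jobs: list[dict], job_id: str) -> tuple[int, int]:
    """(jobs ahead of this one, total active jobs) for an active job.

    jobs: all jobs ordered oldest-first, each {"id": ..., "status": ...}.
    Terminal (or unknown) jobs get (0, 0).
    """
    active = [j for j in jobs if j["status"] in ACTIVE]
    ids = [j["id"] for j in active]
    if job_id not in ids:
        return (0, 0)
    return (ids.index(job_id), len(active))


def format_elapsed(started: datetime, now: datetime) -> str:
    """MM:SS, as shown while running and on finish."""
    total = int((now - started).total_seconds())
    return f"{total // 60:02d}:{total % 60:02d}"


jobs = [{"id": "a", "status": "running"},
        {"id": "b", "status": "pending"},
        {"id": "c", "status": "pending"}]
print(queue_position(jobs, "c"))  # → (2, 3): two jobs ahead, three in flight
start = datetime(2026, 4, 18, 12, 0, 0)
print(format_elapsed(start, start + timedelta(seconds=95)))  # → 01:35
```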
Deploying
git push server main # rebuilds Docker image, restarts container, /healthz deploy gate
python scripts/e2e_smoke.py # E2E acceptance against the live service
See docs/deployment.md for full runbook + rollback.