capsule-ocr is the open source OCR engine powering photrap.io. Self-host it, build on it, or call our hosted endpoint for $0.002 per request via L402 Lightning micropayment — no account, no API key, no billing dashboard.
The TRAP runtime and capsule-ocr are open source. Run them on your own hardware — homelab, VPS, or cloud. No telemetry, no call-home.
Don't want to run infrastructure? Use our hosted endpoint. Every call is metered via L402 Lightning micropayments — pay exactly what you use, settle instantly, no signup.
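Under the hood, an unauthenticated request comes back as `402 Payment Required` with a `WWW-Authenticate: L402` challenge carrying a macaroon and a Lightning invoice. A minimal sketch of parsing that challenge in Python — the header shape is the standard L402 form, but this helper is illustrative and not part of any shipped client:

```python
import re

def parse_l402_challenge(header: str) -> dict:
    """Parse a 'WWW-Authenticate: L402' challenge into macaroon + invoice.

    Assumes the common L402 header shape:
        L402 macaroon="<base64>", invoice="<bolt11>"
    """
    match = re.match(r'L402\s+macaroon="([^"]+)",\s*invoice="([^"]+)"', header)
    if not match:
        raise ValueError("not an L402 challenge")
    return {"macaroon": match.group(1), "invoice": match.group(2)}

# Example challenge (values are placeholders)
challenge = 'L402 macaroon="AgEEbHNhdA==", invoice="lnbc20n1p..."'
creds = parse_l402_challenge(challenge)
# creds["invoice"] is the BOLT11 invoice to pay with any Lightning wallet
```

Pay the invoice, keep the preimage, and retry the request with the macaroon and preimage as credentials.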
| Method | Endpoint | Description | Price |
|---|---|---|---|
| GET | /v1/health | Service health check | free |
| POST | /v1/ocr/suggest | OCR image → text + ranked suggestions (title, author, provider, URL) | $0.002 |
| POST | /v1/qr/decode | Decode QR code from image → URL or payload | $0.001 |
| POST | /v1/trap/resolve | Resolve media URL → platform + resolver chain JSON | $0.001 |
| POST | /v1/trap/full | Full pipeline: image → QR/OCR → resolve → play link | $0.004 |
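Since every endpoint has a fixed per-call price, estimating a session's cost is a dict lookup. A small sketch in Python — the prices are copied from the table above, but the helper itself is illustrative, not part of any shipped client:

```python
# Per-call prices from the endpoint table above (USD)
PRICES = {
    "/v1/health": 0.0,
    "/v1/ocr/suggest": 0.002,
    "/v1/qr/decode": 0.001,
    "/v1/trap/resolve": 0.001,
    "/v1/trap/full": 0.004,
}

def session_cost(calls: list[str]) -> float:
    """Total USD cost for a list of endpoint calls."""
    return round(sum(PRICES[endpoint] for endpoint in calls), 6)

# e.g. one full pipeline run plus two QR decodes
session_cost(["/v1/trap/full", "/v1/qr/decode", "/v1/qr/decode"])  # 0.006
```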
Call the API from curl, Python, or JavaScript, or wire it directly into your AI agent. The L402 flow is handled by our client library or any standard Lightning wallet.
```bash
# Step 1: attempt call — receive L402 invoice
curl -X POST https://api.photrap.io/v1/trap/full \
  -F "file=@screenshot.png" \
  -w "%{http_code}"
# → 402 Payment Required
# → WWW-Authenticate: L402 macaroon="...", invoice="lnbc20..."

# Step 2: pay invoice with any Lightning wallet, get preimage

# Step 3: retry with credentials
curl -X POST https://api.photrap.io/v1/trap/full \
  -H "Authorization: L402 <macaroon>:<preimage>" \
  -F "file=@screenshot.png"
# → { "play_link": "https://photrap.io/p/ab3x9z2m",
#     "platform": "Spotify", "type": "track",
#     "resolver_chain": [...] }
```
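Once the invoice is paid and you hold the preimage, the retry only needs the `Authorization: L402 <macaroon>:<preimage>` header from step 3. A trivial illustrative helper — the function name is ours, not from the client library:

```python
def l402_auth_header(macaroon: str, preimage: str) -> dict:
    """Build the Authorization header for the paid retry (step 3 above)."""
    return {"Authorization": f"L402 {macaroon}:{preimage}"}

# Pass the result as request headers on the retry
headers = l402_auth_header("AgEEbHNhdA==", "f2ca1bb6...")
# → {"Authorization": "L402 AgEEbHNhdA==:f2ca1bb6..."}
```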
```python
from trap_client import TRAPClient

# Client handles L402 handshake automatically
# Uses your Lightning node or Alby wallet
client = TRAPClient(
    lightning_node="https://your-alby-or-lnd",
    budget_sats=100  # auto-pay up to 100 sats per session
)

# OCR a screenshot
result = client.ocr("screenshot.png", profile="books")
# → { text, suggestions: [{type, value, confidence}] }

# Full pipeline: image → play link
link = client.trap("now_playing.png")
# → "https://photrap.io/p/ab3x9z2m"

# Resolve a URL to platform info
chain = client.resolve("https://open.spotify.com/track/4iV5...")
# → { platform, type, resolver_chain, play_link }
```
```javascript
import { TRAPClient } from '@photrap/trap-client'

const client = new TRAPClient({
  wallet: 'https://api.getalby.com/payments',
  budgetSats: 50
})

// Screenshot → play link
const { playLink, platform, resolverChain } = await client.trap(imageBlob)
console.log(playLink)  // "https://photrap.io/p/ab3x9z2m"

// Or just OCR
const { text, suggestions } = await client.ocr(imageBlob, { profile: 'media' })

// Each call auto-handles L402:
// 402 received → pay invoice → retry with preimage
```
```javascript
// Give Claude (or any LLM) the TRAP endpoint as a tool
const tools = [{
  name: "trap_media",
  description: "Convert a media URL or image screenshot into a platform-agnostic play link that works across Spotify, YouTube, Plex, Apple Music, and 30+ other platforms.",
  input_schema: {
    type: "object",
    properties: {
      url: { type: "string", description: "Media URL to resolve" },
      image: { type: "string", description: "Base64 image (screenshot/QR)" }
    }
  }
}]

// When Claude calls this tool, your handler:
async function trap_media({ url, image }) {
  // Cost: ~$0.002 deducted from your L402 budget
  // Claude gets back the play link + resolver chain
  return await client.trap(url || image)
}
```
AI agents, home assistants, and media tools call the TRAP API directly. Each call costs a fraction of a cent ($0.001–$0.004, per the table above), and each solves a real problem.
Self-host for free. Pay per call. Or get a license for the full stack.