Documentation

NoPII is a PII-tokenizing reverse proxy for LLM APIs. Point your existing OpenAI or Anthropic SDK at NoPII, and PII is automatically detected and replaced with tokens before the request reaches the LLM provider. Responses are detokenized on the way back.

No SDK changes. No custom headers. Just change your base URL.
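Conceptually, only the host changes; the request path, query, and body stay exactly as your SDK built them. A minimal sketch, assuming a hypothetical proxy host of `api.nopii.example` (use the base URL from your NoPII dashboard):

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical proxy host -- the real base URL comes from your NoPII dashboard.
NOPII_HOST = "api.nopii.example"

def reroute(url: str) -> str:
    """Point an existing provider URL at the proxy; path and query are unchanged."""
    s = urlsplit(url)
    return urlunsplit((s.scheme, NOPII_HOST, s.path, s.query, s.fragment))

# With an SDK this is one line, e.g. OpenAI(base_url="https://api.nopii.example/v1")
print(reroute("https://api.openai.com/v1/chat/completions"))
```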

Quick Start

Get up and running in under 5 minutes. Register your API key, change one line of code, and you're protected.

How It Works

Understand the PII detection and tokenization flow, supported entity types, and deterministic tokenization.
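To illustrate what "deterministic" means here, the sketch below derives a token from a keyed hash, so the same value always maps to the same token. This is an illustrative scheme, not NoPII's actual implementation; the tenant key and token format are assumptions.

```python
import hashlib
import hmac

# Assumption: tokens are derived with a per-tenant keyed hash, so equal inputs
# always yield equal tokens and plaintext never needs to be sent upstream.
TENANT_KEY = b"per-tenant-secret"  # illustrative key, not a real credential

def tokenize(entity_type: str, value: str) -> str:
    """Deterministic token: same (entity_type, value) pair -> same token."""
    digest = hmac.new(TENANT_KEY, f"{entity_type}:{value}".encode(), hashlib.sha256)
    return f"<{entity_type}_{digest.hexdigest()[:12]}>"
```

Determinism matters for multi-turn chats: the LLM sees a stable placeholder for the same person or email every time it appears.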

API Reference

Full reference for the OpenAI and Anthropic proxy endpoints, including headers, request/response formats, and examples.

Sessions

Learn how sessions speed up multi-turn conversations by remembering token mappings across messages.
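The idea can be sketched as a session-scoped bidirectional map: once a value has been tokenized in a conversation, later turns reuse the cached token instead of re-detecting. The class below is an assumed model of that behavior, not the real session store.

```python
class Session:
    """Sketch of a session-scoped token map (assumed behavior)."""

    def __init__(self):
        self.fwd = {}  # plaintext value -> token
        self.rev = {}  # token -> plaintext value

    def tokenize(self, value: str, entity_type: str) -> str:
        # Reuse the cached token on repeat mentions across turns.
        if value not in self.fwd:
            token = f"<{entity_type}_{len(self.fwd) + 1}>"
            self.fwd[value] = token
            self.rev[token] = value
        return self.fwd[value]

    def detokenize(self, text: str) -> str:
        # Restore original values in the model's response.
        for token, value in self.rev.items():
            text = text.replace(token, value)
        return text
```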

Streaming

How NoPII handles Server-Sent Events, buffering, and real-time token replacement in streaming responses.
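The core difficulty is that a token can be split across SSE chunks, so some text must be buffered until it is clearly not a partial token. A minimal sketch of that buffering, assuming tokens look like `<EMAIL_1>` (the real chunking logic may differ):

```python
def detokenize_stream(chunks, mapping):
    """Replace tokens in a chunked stream, buffering possible partial tokens.

    `mapping` is token -> original value. Any trailing text that could be
    the start of a token (an unclosed "<...") is held back until the next chunk.
    """
    buf = ""
    for chunk in chunks:
        buf += chunk
        for token, value in mapping.items():
            buf = buf.replace(token, value)
        cut = buf.rfind("<")
        if cut != -1 and ">" not in buf[cut:]:
            # Possible partial token at the end -- emit the safe prefix only.
            yield buf[:cut]
            buf = buf[cut:]
        else:
            yield buf
            buf = ""
    if buf:
        yield buf  # flush whatever remains at end of stream
```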

Error Handling

NoPII's fail-safe error policy, status codes, and what happens when upstream services are unavailable.
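A fail-safe policy here means failing closed: if the detection stage cannot run, the request is rejected rather than forwarded with raw PII. The sketch below illustrates that shape under assumed status codes; it is not the actual proxy code.

```python
class DetectionUnavailable(Exception):
    """Raised when the PII detection stage cannot run."""

def unavailable(_text):
    # Stand-in for a detector whose backing service is down.
    raise DetectionUnavailable

def forward(text: str, detect) -> dict:
    """Fail-closed sketch (assumed policy): never forward un-tokenized text.

    If detection is unavailable, return an error instead of passing the
    raw request through to the upstream LLM provider.
    """
    try:
        detect(text)
    except DetectionUnavailable:
        return {"status": 503, "forwarded": False}
    return {"status": 200, "forwarded": True}
```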

GDPR Compliance

Right-to-erasure via token purge, automatic token expiration via configurable TTL, and audit log scrubbing.

Supported Providers

OpenAI, Anthropic, xAI, DeepSeek, Mistral, Gemini, Groq, Together, and Fireworks — all supported out of the box.

PII Configuration

Configure entity types, confidence thresholds, credential detection, and context phrase neutralization.
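To show the shape of such a configuration, here is a hypothetical example; the key names and values below are illustrative, not the real schema (see the PII Configuration page for the actual options):

```python
# Hypothetical configuration shape -- key names are illustrative, not the real schema.
pii_config = {
    "entity_types": ["EMAIL", "PHONE", "PERSON", "CREDIT_CARD"],
    "confidence_threshold": 0.6,   # discard detections scored below this
    "credential_detection": True,  # also catch API keys, passwords, bearer tokens
    "context_phrases": ["my email is", "card number is"],  # neutralized before forwarding
}
```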

Team Management

Role-based access control with Owner, Admin, and Member roles. Invite team members and manage permissions.

Langfuse Integration

LLM observability with trace spans for each stage of the NoPII pipeline. W3C distributed tracing support.
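W3C distributed tracing means requests carry a standard `traceparent` header (per the W3C Trace Context spec: version, 32-hex-char trace ID, 16-hex-char span ID, flags) so NoPII's spans can join your existing traces. A sketch of building one:

```python
import secrets

def make_traceparent() -> str:
    """Build a W3C `traceparent` header value (version 00, sampled flag 01)."""
    trace_id = secrets.token_hex(16)  # 32 hex chars, shared across the whole trace
    span_id = secrets.token_hex(8)    # 16 hex chars, unique to this span
    return f"00-{trace_id}-{span_id}-01"
```

Sending this header on your request lets each pipeline stage (detection, tokenization, upstream call, detokenization) appear as a child span under your trace.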

Billing & Pricing

Free tier with 1M tokens/month. Pro at $50/month with 50M tokens included, then $1 per additional million tokens.
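As a worked example of that pricing: 120M tokens on Pro is $50 base plus 70M of overage at $1 per million, i.e. $120.

```python
MILLION = 1_000_000

def monthly_cost_usd(tokens_used: int, plan: str = "pro") -> float:
    """Cost per the pricing above: Pro is $50 with 50M tokens included,
    then $1 per additional million; the free tier covers up to 1M at $0."""
    if plan == "free":
        return 0.0
    overage_millions = max(0, tokens_used - 50 * MILLION) / MILLION
    return 50.0 + overage_millions

print(monthly_cost_usd(120 * MILLION))  # 50 + 70 overage -> 120.0
```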

Examples

Working code samples for Python, Node.js, and other popular languages and frameworks. Clone and run in minutes.