# Keeptrusts

> Keeptrusts is an AI environment for organizations. It sits between applications and LLM providers to enforce policy, route traffic, improve output quality, and keep a complete request history — all in one place.

Keeptrusts is built for organizations that need a consistent, auditable AI environment across providers and models. It is not a model or AI provider itself; it is the infrastructure layer that connects, governs, and observes AI traffic.

## What Keeptrusts Does

- **Policy enforcement**: Define routing, data-handling, spend, and output rules once. They apply across every AI request regardless of provider or team.
- **Multi-provider routing**: Point traffic at any combination of 50+ hosted or self-managed LLM providers — OpenAI, Anthropic, Azure OpenAI, AWS Bedrock, Google Gemini, Mistral, Cohere, local models, and more — without rewriting application code.
- **Output quality management**: Apply real-time guardrails, redaction, and output shaping inline before responses reach end users.
- **Audit trail**: Every request and response is recorded in a permanent, queryable history. Useful for compliance, debugging, and cost attribution.
- **Escalation workflows**: Route sensitive or uncertain outputs to human review rather than returning them automatically.
- **Flexible deployment**: Runs in cloud (SaaS), on-premises, Kubernetes clusters, or air-gapped environments.

## Who Uses Keeptrusts

- **Engineering teams** building AI-powered products who need a shared infrastructure layer instead of rebuilding routing and guardrails in every service.
- **Compliance and legal teams** who need evidence of AI behavior for audits, GDPR, HIPAA, EU AI Act, or internal governance.
- **Operations teams** who need spend visibility and control across multiple AI providers and business units.
- **Security teams** who need prompt-injection defenses, PII redaction, and tenant isolation across LLM workloads.
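In practice, "without rewriting application code" usually means applications keep their existing request shape and only change the endpoint they call. The following Python sketch illustrates that pattern under stated assumptions: the gateway URL, the `X-Team` header, and the `default-chat` model alias are all hypothetical values invented for illustration, not documented Keeptrusts identifiers.

```python
import json

# Hypothetical gateway endpoint: applications keep an OpenAI-style request
# shape and swap the base URL. The URL, header names, and model alias are
# illustrative assumptions, not documented Keeptrusts values.
GATEWAY_URL = "https://gateway.example.internal/v1/chat/completions"

def build_gateway_request(prompt: str, team: str) -> dict:
    """Build a chat request addressed to the gateway instead of a provider.

    The gateway, not the application, picks the upstream provider based on
    the active routing policy, so no provider SDK appears in app code.
    """
    return {
        "url": GATEWAY_URL,
        "headers": {
            # Key issued by the gateway; provider keys stay server-side.
            "Authorization": "Bearer <gateway-api-key>",
            # Hypothetical header used for per-team cost attribution.
            "X-Team": team,
        },
        "body": json.dumps({
            # A logical model alias; routing policy would map it to a
            # concrete provider/model pair behind the gateway.
            "model": "default-chat",
            "messages": [{"role": "user", "content": prompt}],
        }),
    }

req = build_gateway_request("Summarize this contract.", team="legal")
print(req["url"])  # the gateway URL, not any provider's URL
```

Because only the URL and credential change, switching or mixing upstream providers becomes a policy change on the gateway side rather than a code change in each application.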
## Key Concepts

**AI Environment**: A shared infrastructure layer that governs, routes, and observes AI traffic. Analogous to how an API gateway works for traditional services, but purpose-built for LLM requests.

**Policy**: A declarative configuration that defines what Keeptrusts should do with a request — block it, redact data in it, route it to a specific provider, add a disclaimer to the response, escalate it, or log it.

**Gateway**: The runtime component (the Keeptrusts gateway binary) that intercepts requests on behalf of applications and applies the policy chain before forwarding to upstream LLM providers.

**Trace**: The full lifecycle record for a single AI request — input, output, provider used, policy decisions applied, latency, and token cost.

**Escalation**: A workflow where specific requests or outputs are routed to a human reviewer instead of being returned automatically.

## Supported Providers

OpenAI (GPT-4, GPT-4o, o1, o3), Anthropic (Claude), Azure OpenAI, AWS Bedrock, Google Gemini, Mistral, Cohere, Llama (via Ollama or hosted endpoints), GitHub Models, and 40+ additional hosted and self-managed endpoints.

## Deployment Options

- **Cloud (SaaS)**: Managed hosting by Keeptrusts — fastest to start, zero infrastructure.
- **Self-hosted**: Deploy the API control plane and gateway binary in your own cloud or data center.
- **Air-gapped**: Full on-premises operation with no outbound internet dependency.
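To make the Policy, Gateway, and Trace concepts concrete, here is a toy Python sketch of a policy chain that records each decision into a trace. The rule names, the redaction regex, and the length-based routing rule are assumptions invented for illustration; they are not Keeptrusts's actual policy language.

```python
import re
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Trace:
    """Toy lifecycle record: input, policy decisions applied, provider used."""
    input: str
    decisions: List[str] = field(default_factory=list)
    provider: Optional[str] = None

def apply_policy_chain(prompt: str) -> Trace:
    """Run a two-step illustrative policy chain and record each decision."""
    trace = Trace(input=prompt)
    # 1. Redact: mask email-like strings before the request leaves the boundary.
    redacted = re.sub(r"\S+@\S+", "[REDACTED]", prompt)
    if redacted != prompt:
        trace.decisions.append("redact:email")
    trace.input = redacted
    # 2. Route: a simple length rule stands in for a real routing policy.
    trace.provider = "provider-a" if len(redacted) < 500 else "provider-b"
    trace.decisions.append("route:" + trace.provider)
    return trace

trace = apply_policy_chain("Contact alice@example.com about the invoice.")
print(trace.decisions)  # ['redact:email', 'route:provider-a']
```

The point of the sketch is the ordering and the record-keeping: each policy step both transforms the request and appends to the trace, which is what makes the resulting history queryable for audits and debugging.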
## Regulatory and Compliance Profiles

Keeptrusts ships with pre-built policy templates for:

- EU AI Act (risk classification, human oversight, logging requirements)
- HIPAA (PHI redaction, audit trails, minimum-necessary controls)
- GDPR (data minimization, right-to-erasure flows, cross-border data restrictions)
- Defense and government (content classification, chain-of-custody, data sovereignty)
- Financial services (FINRA/SEC, advice disclaimers, transaction-data restrictions)

## Contact

- **Sales and evaluations**: sales@keeptrusts.com
- **Security disclosures**: security@keeptrusts.com
- **General**: contact@keeptrusts.com
- **Website**: https://www.keeptrusts.com

## Documentation

- Full product documentation: https://docs.keeptrusts.com
- User quickstart: https://docs.keeptrusts.com/quickstart

## Optional: Extended Content

For full, structured content optimized for LLM consumption, see: https://www.keeptrusts.com/llms-full.txt