Enterprise

Security, compliance, and governance features.

Enterprise Capabilities

Dwizi is designed to fit into enterprise environments where security, compliance, and governance are prerequisites for software adoption. This guide details the technical features that enable safe AI tool execution at scale.

Security Architecture

Container Isolation

The core of Dwizi's security model is strict container isolation.

  • Per-Run Isolation: Every tool execution spawns a new, ephemeral container.
  • No Shared State: Containers share no file system or memory with the host or other containers.
  • Resource Constraints: CPU and memory limits are enforced at the kernel level (cgroups), preventing any single tool from degrading system performance.
  • Network Policies: By default, tools have no outbound internet access. Enterprise deployments can enable allowlisted egress or private routing for approved services.
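
For illustration, the sketch below approximates this isolation model using the Docker SDK for Python. It is not Dwizi's actual runtime; the image name, resource limits, and flags are assumptions chosen to show how per-run containers, cgroup constraints, and a disabled network fit together.

    # Illustrative only: approximates per-run isolation with docker-py.
    # The image name and resource limits are placeholder assumptions.
    import docker

    client = docker.from_env()

    def run_tool_isolated(command: list[str]) -> str:
        output = client.containers.run(
            image="dwizi-tool-runtime:latest",  # hypothetical runtime image
            command=command,
            mem_limit="256m",                   # memory cap enforced via cgroups
            nano_cpus=500_000_000,              # 0.5 CPU, also a cgroup limit
            network_disabled=True,              # no outbound access by default
            read_only=True,                     # no writable shared file system
            remove=True,                        # ephemeral: deleted after the run
        )
        return output.decode("utf-8")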

Secrets Management

  • Encryption: Environment variables (API keys, database credentials) are encrypted at rest.
  • Runtime Injection: Secrets are injected into the container environment only at runtime and are never exposed in the client-side code editor or logs.
  • Scope: Secrets are scoped to the specific tool, ensuring that a compromise of one tool does not leak credentials for another.
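
A minimal sketch of this flow follows, assuming Fernet symmetric encryption as a stand-in for the at-rest encryption and a hypothetical per-tool secret store; Dwizi's actual key management is not shown here.

    # Sketch only: secrets are stored encrypted, decrypted for a single tool at
    # runtime, and injected as environment variables at container start.
    from cryptography.fernet import Fernet
    import docker

    MASTER_KEY = Fernet.generate_key()          # in practice held in a KMS, not in code
    fernet = Fernet(MASTER_KEY)

    # Hypothetical encrypted-at-rest store, keyed by tool.
    SECRET_STORE = {
        "crm-lookup": {"CRM_API_KEY": fernet.encrypt(b"example-value")},
    }

    def decrypt_secrets(tool_id: str) -> dict[str, str]:
        # Only the requesting tool's entries are decrypted, so compromising one
        # tool does not expose another tool's credentials.
        encrypted = SECRET_STORE.get(tool_id, {})
        return {name: fernet.decrypt(token).decode() for name, token in encrypted.items()}

    def run_tool_with_secrets(tool_id: str, command: list[str]) -> str:
        client = docker.from_env()
        output = client.containers.run(
            image="dwizi-tool-runtime:latest",     # hypothetical runtime image
            command=command,
            environment=decrypt_secrets(tool_id),  # injected only at runtime
            network_disabled=True,
            remove=True,
        )
        return output.decode("utf-8")              # secret values are never logged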

Governance & Compliance

Role-Based Access Control (RBAC)

Dwizi provides a granular permission system to manage team access.

  • Owners: Full administrative control, including billing and member management.
  • Admins: Can manage tools and settings but cannot delete the organization.
  • Members: Can create and modify tools.
  • Viewers: Read-only access to tool source code and retained execution logs (if enabled).
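
One way to picture the hierarchy is as a role-to-permission table. The sketch below is illustrative, and the permission names are placeholders rather than Dwizi's internal schema.

    # Illustrative role-to-permission table; permission names are placeholders.
    ROLE_PERMISSIONS = {
        "owner":  {"billing", "manage_members", "delete_org", "manage_settings",
                   "edit_tools", "view_tools", "view_logs"},
        "admin":  {"manage_settings", "edit_tools", "view_tools", "view_logs"},
        "member": {"edit_tools", "view_tools", "view_logs"},
        "viewer": {"view_tools", "view_logs"},
    }

    def can(role: str, action: str) -> bool:
        """Return True if the role is allowed to perform the action."""
        return action in ROLE_PERMISSIONS.get(role, set())

    assert can("admin", "manage_settings")
    assert not can("admin", "delete_org")    # only Owners can delete the organization
    assert not can("viewer", "edit_tools")   # Viewers are read-only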

Audit Trails

Executions, policy actions, and access decisions are logged and retrievable.

  • Audit Events: Timestamp, duration, user ID, and exit status for every execution.
  • Execution Logs (Optional): Stdout/stderr capture when output/log retention is enabled.
  • Retention Controls: Output and log retention is configurable and can be disabled for compliance requirements.
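
For concreteness, an audit event might carry the fields listed above; the record shape below is an assumption for illustration, not Dwizi's exported schema.

    # Illustrative audit event shape; field names are assumptions, not Dwizi's
    # exported schema.
    from dataclasses import dataclass
    from datetime import datetime, timezone

    @dataclass(frozen=True)
    class AuditEvent:
        tool_id: str
        user_id: str
        started_at: datetime
        duration_ms: int
        exit_status: int
        stdout: str | None = None   # captured only when log retention is enabled

    event = AuditEvent(
        tool_id="crm-lookup",
        user_id="user_123",
        started_at=datetime.now(timezone.utc),
        duration_ms=842,
        exit_status=0,
        stdout=None,                # retention disabled: no output stored
    )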

See Audit Events and Data Retention for details.

Code Provenance

Since every tool is versioned, you can trace exactly which version of the code was executed at any point in time. This is critical for debugging and regulatory compliance.
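
A sketch of what that traceability can look like in practice, assuming a hypothetical execution record that pins the tool version and a content hash of the source that ran:

    # Hypothetical execution record: every run is pinned to the immutable tool
    # version (and a hash of its source) so it can be audited or replayed later.
    import hashlib

    tool_source = "def handler(event):\n    return {'ok': True}\n"

    execution_record = {
        "run_id": "run_0042",
        "tool_id": "crm-lookup",
        "tool_version": 17,                                   # version executed
        "source_sha256": hashlib.sha256(tool_source.encode()).hexdigest(),
    }

    # An auditor can later fetch exactly the source that ran for this record by
    # looking up (tool_id, tool_version) in the version history.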

Integration Patterns

Private Networking

For self-hosted or VPC-peered deployments, Dwizi can run within your private network.

  • Internal APIs: Securely access internal microservices or databases without exposing them to the public internet.
  • VPN/Direct Connect: Route traffic through your existing secure channels.

Identity Management

Note: SSO is available on Enterprise plans.

  • SAML/OIDC: Integrate with your existing Identity Provider (Okta, Azure AD, Auth0) to manage user access centrally.
  • Automated Provisioning: Support for SCIM to automatically onboard and offboard team members.

Operational Reliability

High Availability

  • Stateless Gateway: The API gateway holds no execution state and scales horizontally with request volume.
  • Warm Pools: Dwizi maintains a pool of pre-warmed containers to keep cold starts out of the critical path for latency-sensitive AI applications, as sketched below.
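
The sketch below illustrates the warm-pool idea using docker-py. It is a conceptual illustration, not Dwizi's scheduler; the pool size, image, and idle command are assumptions.

    # Conceptual warm-pool sketch: containers are started ahead of demand so a
    # run never pays the container cold-start cost.
    import queue
    import docker

    POOL_SIZE = 4
    client = docker.from_env()
    warm_pool = queue.Queue()

    def refill_pool() -> None:
        while warm_pool.qsize() < POOL_SIZE:
            container = client.containers.run(
                image="dwizi-tool-runtime:latest",  # hypothetical runtime image
                command=["sleep", "infinity"],      # idle until work arrives
                network_disabled=True,
                detach=True,
            )
            warm_pool.put(container)

    def run_tool(command: list[str]) -> str:
        refill_pool()
        container = warm_pool.get()                 # pre-warmed, no startup wait
        _exit_code, output = container.exec_run(command)
        container.remove(force=True)                # still one container per run
        return output.decode("utf-8")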

Rate Limiting & Quotas

Protect your downstream services and manage costs.

  • Global Limits: Set a hard cap on the number of concurrent executions per organization.
  • Tool Limits: Restrict specific tools to a defined request rate (e.g., 5 runs/minute) to prevent API quota exhaustion on external services.
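
A per-tool limit such as "5 runs/minute" amounts to a sliding-window counter. The sketch below shows the pattern with placeholder limits, not Dwizi's configuration surface.

    # Sliding-window rate limit sketch; the limits are placeholders.
    import time
    from collections import defaultdict, deque

    TOOL_LIMITS = {"crm-lookup": (5, 60.0)}   # 5 runs per 60-second window
    _recent_runs = defaultdict(deque)

    def allow_run(tool_id: str) -> bool:
        max_runs, window = TOOL_LIMITS.get(tool_id, (60, 60.0))
        now = time.monotonic()
        runs = _recent_runs[tool_id]
        while runs and now - runs[0] > window:  # drop timestamps outside the window
            runs.popleft()
        if len(runs) >= max_runs:
            return False                        # reject: quota exhausted
        runs.append(now)
        return True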

Use Cases

Secure Data Processing

Process sensitive customer data (PII) without it leaving your controlled environment.

  • Example: An AI agent that summarizes support tickets. The tool fetches the ticket from your internal CRM, redacts sensitive information locally, and returns only the summary to the LLM (see the sketch below).
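
A simplified sketch of that pattern follows. The CRM endpoint, auth header, and redaction rules are placeholders; the point is that raw PII stays inside the sandbox and only sanitized text is returned to the model.

    # Placeholder sketch: fetch internally, redact inside the sandbox, and return
    # only sanitized text. The CRM URL and redaction rules are assumptions.
    import re
    import requests

    EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
    PHONE_RE = re.compile(r"\+?\d[\d\s().-]{7,}\d")

    def redact(text: str) -> str:
        text = EMAIL_RE.sub("[EMAIL]", text)
        return PHONE_RE.sub("[PHONE]", text)

    def fetch_redacted_ticket(ticket_id: str, crm_token: str) -> str:
        resp = requests.get(
            f"https://crm.internal.example/api/tickets/{ticket_id}",  # hypothetical internal CRM
            headers={"Authorization": f"Bearer {crm_token}"},
            timeout=10,
        )
        resp.raise_for_status()
        # The LLM receives only the redacted text to summarize, never the raw record.
        return redact(resp.json()["description"])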

Automated Remediation

Give AI agents the ability to fix production issues safely.

  • Example: A tool that restarts a specific service or clears a cache. By wrapping this logic in a Dwizi tool, you define the exact boundaries of what the AI can do (e.g., "restart service X" but not "shutdown server Y"), as in the sketch below.
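
The boundary can be made explicit with an allowlist. In the sketch below, the service names and the systemctl call are assumptions.

    # Bounded remediation sketch: the allowlist, not the model, defines what the
    # agent can touch. Service names and the restart command are assumptions.
    import subprocess

    ALLOWED_SERVICES = {"checkout-api", "cache-worker"}   # "restart service X" only

    def restart_service(name: str) -> str:
        if name not in ALLOWED_SERVICES:
            # Anything else (e.g. "shutdown server Y") is refused outright.
            raise ValueError(f"service {name!r} is not in the remediation allowlist")
        subprocess.run(["systemctl", "restart", name], check=True)
        return f"restarted {name}"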

Internal Knowledge Retrieval

Connect LLMs to your internal documentation or wiki.

  • Example: A tool that searches your internal Confluence or Notion instance. The tool handles the authentication and search, returning only relevant snippets to the model.
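
An illustrative retrieval tool is sketched below. The search endpoint and response shape are hypothetical; a real integration would call the Confluence or Notion API, but the pattern is the same: the tool holds the credentials and the model sees only snippets.

    # Hypothetical wiki search endpoint and response shape, for illustration only.
    import requests

    WIKI_SEARCH_URL = "https://wiki.internal.example/api/search"   # hypothetical

    def search_wiki(query: str, token: str, max_snippets: int = 3) -> list[str]:
        resp = requests.get(
            WIKI_SEARCH_URL,
            params={"q": query, "limit": max_snippets},
            headers={"Authorization": f"Bearer {token}"},
            timeout=10,
        )
        resp.raise_for_status()
        # Authentication and search stay inside the tool; the model receives only
        # short, relevant snippets rather than whole pages or credentials.
        return [hit["snippet"] for hit in resp.json()["results"][:max_snippets]]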