
feat: As a user, I want to reference a secret in ai-proxy plugin, so that I don't have to write the apikey in plain text in plugin configuration #13132

@mikyll

Description


Context

Currently, the ai-proxy plugin supports auth in three ways:

  • auth.header - plain text value in plugin config
  • auth.query - plain text value in plugin config
  • auth.gcp - a JSON auth file with a service account, provided in the plugin config or via an ENV var (GCP_SERVICE_ACCOUNT)
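For context, this is roughly what the plain-text configuration looks like today (the key value below is a placeholder):

```yaml
# Current behaviour: the API key sits in plain text in the plugin config.
routes:
  - id: ai_endpoint
    uri: /ai/chat/completions
    plugins:
      ai-proxy:
        provider: openai-compatible
        auth:
          header:
            Authorization: Bearer sk-plaintext-example-key  # exposed in config
```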

Goal/Use Case

I have a Kubernetes deployment (APISIX Ingress Controller) with Secret resources created/updated by an external secret vault service. This makes it possible, for example, to rotate secret values without redeploying APISIX instances.

I'd like to avoid writing the API key value in my plugin's config and instead use, for example, a reference to the Secret resource, similarly to how the *-auth plugins work on Consumer with authParameter (APISIX Ingress Controller Docs | ApisixConsumer).
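For reference, this is roughly what that Consumer-side mechanism looks like (resource names are hypothetical; syntax per the ApisixConsumer CRD, assuming I recall it correctly):

```yaml
apiVersion: apisix.apache.org/v2
kind: ApisixConsumer
metadata:
  name: ai-consumer            # hypothetical name
spec:
  authParameter:
    keyAuth:
      secretRef:
        name: my-api-secret    # Kubernetes Secret holding the key
```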

Proposal

I'd like to discuss the following possible implementations and understand whether (and why) each can or cannot be implemented:

  1. Support auth via an environment variable reference, similar to the $ENV://$env_name/$sub_key syntax used by the Consumer *-auth plugins.

    Example usage:

    routes:
      - id: ai_endpoint
        uri: /ai/chat/completions
        plugins:
          ai-proxy:
            provider: openai-compatible
            auth:
              header:
                Authorization: Bearer $ENV://MY_API_TOKEN
            # ...

    For reference: APISIX Docs | Secret - Use environment variables to manage secrets

  2. Support auth via some sort of secret reference, e.g. $secret.secret_id:

    routes:
      - id: ai_endpoint
        uri: /ai/chat/completions
        plugins:
          ai-proxy:
            provider: openai-compatible
            auth:
              header:
                Authorization: Bearer $secret.my_secret_id
            # ...
  3. Perform context variable resolution on auth parameters (query and header). The token could be fetched and injected via another plugin or something similar:

    routes:
      - id: ai_endpoint
        uri: /ai/chat/completions
        plugins:
          serverless-pre-function:
            phase: rewrite
            functions:
              - |
                return function(conf, ctx)
                  -- Some logic to fetch the secret
                  ctx.var.my_secret = "abc123"
                end
          ai-proxy:
            provider: openai-compatible
            auth:
              header:
                Authorization: Bearer $my_secret
            # ...
  4. Other approaches? Maybe something consumer-based?
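To make proposal 1 concrete in my Kubernetes setup, the environment variable itself would be populated from the externally managed Secret, e.g. (all names below are hypothetical):

```yaml
# Deployment snippet: expose the Secret as MY_API_TOKEN on the APISIX
# container, so a hypothetical $ENV://MY_API_TOKEN reference could resolve it.
containers:
  - name: apisix
    env:
      - name: MY_API_TOKEN
        valueFrom:
          secretKeyRef:
            name: vault-synced-secret  # Secret created/rotated by the vault service
            key: api-token
```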

I also have another question: why do the *-auth plugins support secret resolution via $ENV:// and $secret:// while other plugins do not? Is there a specific design choice behind this?
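For comparison, a sketch of how secret resolution already works for a Consumer *-auth plugin in a standalone apisix.yaml, if I read the Secret docs correctly (Vault details below are placeholders):

```yaml
secrets:
  - id: vault/1
    uri: http://vault.example.com:8200  # placeholder Vault address
    prefix: kv/apisix
    token: root                         # placeholder token
consumers:
  - username: alice
    plugins:
      key-auth:
        key: $secret://vault/1/alice/key  # resolved at request time
```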
