This guide demonstrates how to enable TrustLens metrics in TrustGate to monitor AI gateway requests.

Prerequisites

  • TrustGate installed and running
  • A TrustLens API token
  • Basic understanding of TrustGate gateway configuration

Create a Gateway with TrustLens Telemetry

Create a gateway with the telemetry configuration for TrustLens:

curl -X POST "http://localhost:8080/api/v1/gateways" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "OpenAI Token Rate Limiter",
    "subdomain": "your-subdomain",
    "telemetry": {
        "config": [
            {
                "name": "trustlens",
                "settings": {
                    "url": "https://data.neuraltrust.ai/v1/trace",
                    "token": "YOUR_TOKEN"
                }
            }
        ]
    }
  }'
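
After the request succeeds, you can confirm that the telemetry block was stored on the gateway. The sketch below assumes TrustGate exposes a list endpoint at GET /api/v1/gateways alongside the POST used above; check your TrustGate API reference for the exact path:

# List gateways and confirm a "trustlens" entry appears under "telemetry"
# (GET endpoint is an assumption; adjust the path to your TrustGate version)
curl -s "http://localhost:8080/api/v1/gateways" \
  -H "Content-Type: application/json"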

Configuration Parameters

Below are the key configuration parameters used to enable TrustLens telemetry:

Parameter | Type   | Description
name      | string | Must be set to "trustlens" to enable the TrustLens telemetry provider.
url       | string | The endpoint where telemetry data is sent. Use https://data.neuraltrust.ai/v1/trace.
token     | string | API token used to authenticate with TrustLens. Required for successful transmission.

Best Practices

  • Use a secure token management system (e.g., environment variables or secret managers) to store your TrustLens API token; see the sketch after this list.
  • Monitor the TrustLens dashboard to validate incoming telemetry.
  • Ensure outbound access to https://data.neuraltrust.ai/v1/trace is not blocked by firewalls or network policies.
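
Keeping the token in an environment variable (or injecting it from a secret manager) avoids hard-coding it in scripts. Below is a minimal sketch of the same creation request using a shell heredoc so the token is expanded at request time; the variable name TRUSTLENS_TOKEN and the gateway name are illustrative:

# Load the token from your secret store; the plain export here is only for illustration
export TRUSTLENS_TOKEN="YOUR_TOKEN"

# -d @- tells curl to read the request body from stdin (the heredoc below)
curl -X POST "http://localhost:8080/api/v1/gateways" \
  -H "Content-Type: application/json" \
  -d @- <<EOF
{
  "name": "TrustLens Telemetry Gateway",
  "subdomain": "your-subdomain",
  "telemetry": {
    "config": [
      {
        "name": "trustlens",
        "settings": {
          "url": "https://data.neuraltrust.ai/v1/trace",
          "token": "${TRUSTLENS_TOKEN}"
        }
      }
    ]
  }
}
EOF

Note that the single-quoted JSON body in the earlier example prevents variable expansion; the unquoted heredoc form lets the shell substitute the token before the request is sent.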

Troubleshooting

  • No data in TrustLens dashboard: Confirm that the gateway is sending telemetry and the token is correct.
  • Network errors: Check your firewall rules and network connectivity to the TrustLens endpoint (see the connectivity check below).
  • 401 Unauthorized: The provided token may be invalid or expired.
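
To rule out network problems, you can check outbound reachability of the TrustLens endpoint from the TrustGate host. A quick sketch; the HTTP status returned by the endpoint is not guaranteed here, but a DNS failure or connection timeout points to a firewall or routing issue:

# Verify outbound connectivity to the TrustLens trace endpoint.
# A connection error or timeout indicates a network/firewall problem;
# any HTTP status code means the endpoint is reachable.
curl -sS -o /dev/null -w "HTTP %{http_code}\n" \
  --connect-timeout 5 \
  "https://data.neuraltrust.ai/v1/trace"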