Currently, only LLM models that expose a REST API endpoint can be tested through the web UI.
Key Features
- LLM Agnostic: Works with any LLM that exposes a REST API endpoint
- Web UI Integration: Run and manage all your tests through an intuitive web interface
Deployment Recommendations
TrustTest is optimized to evaluate LLM models that are not running in the same process. For best performance and reliability, we recommend:
- Running TrustTest in a separate Python process from your LLM, or on a different server
- Using the Http target interface to connect to your LLM’s API endpoint
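To illustrate the pattern, here is a minimal sketch of what an HTTP target wrapper could look like. This is not TrustTest's actual API: the `HttpTarget` class, the endpoint URL, and the request/response JSON fields are assumptions you would adapt to your own LLM service.

```python
# Sketch of an HTTP target: the evaluation process talks to the LLM only over
# REST, so the model can run in a different Python process or on another server.
# The endpoint URL and JSON schema below are placeholders for your deployment.
import requests


class HttpTarget:
    """Wraps a remote LLM REST endpoint behind a simple generate() call."""

    def __init__(self, endpoint_url: str, api_key: str | None = None, timeout: float = 30.0):
        self.endpoint_url = endpoint_url
        self.timeout = timeout
        self.headers = {"Content-Type": "application/json"}
        if api_key:
            self.headers["Authorization"] = f"Bearer {api_key}"

    def generate(self, prompt: str) -> str:
        # POST the prompt to the LLM service and return its text completion.
        response = requests.post(
            self.endpoint_url,
            json={"prompt": prompt},  # placeholder request schema
            headers=self.headers,
            timeout=self.timeout,
        )
        response.raise_for_status()
        return response.json()["completion"]  # placeholder response field


if __name__ == "__main__":
    # Hypothetical local LLM server exposing a REST endpoint.
    target = HttpTarget("http://localhost:8000/v1/generate")
    print(target.generate("Summarize the deployment recommendations."))
```

Keeping the target behind a plain HTTP interface like this is what allows TrustTest and the model to run in separate processes or on separate servers.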