Request Tracing
Request Tracing is a powerful Pro feature that allows you to record the full content of requests and responses made to AI providers. This detailed logging gives you comprehensive visibility into how your users and applications interact with AI models.
Request Tracing is available exclusively on the Pro plan. It can be enabled at both the policy level and the individual user level.
Benefits of Request Tracing
Request Tracing provides several key advantages:
- Debugging: Troubleshoot issues by seeing exactly what was sent and received
- Auditing: Maintain a record of all AI interactions for compliance or verification
- Training Data: Build datasets for fine-tuning your own models in the future
- Quality Assurance: Evaluate the quality of responses and identify improvement opportunities
- Compliance: Monitor content for policy violations or inappropriate usage
Enabling Request Tracing
Navigate to Policies
Go to the “Policies” section in the sidebar.
Edit a Policy
Click the pencil icon on the policy where you want to enable tracing.
Enable Tracing
Toggle “Request Tracing” to the on position.
Request Tracing must be enabled before Cache Requests can be enabled, as caching relies on the tracing infrastructure.
Save Changes
Click “Update Policy” to save your changes. All new requests using this policy will now be traced.
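With tracing enabled on the policy, every request routed through the platform under that policy is recorded. As a minimal sketch, assuming your application sends provider calls through the platform's proxy endpoint with a project API key (the base URL and key below are placeholders, not the product's real values):

```python
# Minimal sketch: a chat completion sent through a proxy endpoint so it is traced.
# "https://gateway.example.com/v1" and "YOUR_PROJECT_KEY" are placeholders;
# use the base URL and key from your own project settings.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_PROJECT_KEY",
    base_url="https://gateway.example.com/v1",
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Summarize our refund policy in one sentence."}],
)
print(response.choices[0].message.content)
# With Request Tracing on, the full prompt and completion above now appear
# under Usage for this request.
```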
Enabling Tracing for a Single User
You can also enable tracing for a specific user:
- Navigate to “Users” and select the user
- Click “Create Policy” in the User Policy section
- Toggle “Request Tracing” on
- Set an expiration date if this is a temporary change
- Click “Create Policy” to save
This is useful when you need to debug issues for a specific user without enabling tracing for everyone.
Viewing Traced Requests
Once tracing is enabled, you can view the recorded data:
- Navigate to “Usage” in the sidebar
- Find the request you want to examine
- Click on the request to open the details
- View the complete request and response information (an example record is sketched below):
  - Request URL and headers
  - Request body/prompt
  - Response body/completion
  - Timing information
  - Token counts and cost data
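The exact layout in the dashboard and in exports may differ, but conceptually a traced record bundles the fields above into a single object. A hypothetical shape, with illustrative field names only:

```python
# Hypothetical shape of a single traced request; field names are illustrative,
# not the platform's actual schema.
trace_record = {
    "request": {
        "url": "https://api.openai.com/v1/chat/completions",
        "headers": {"content-type": "application/json"},  # whether sensitive headers are redacted depends on the platform
        "body": {
            "model": "gpt-4o-mini",
            "messages": [{"role": "user", "content": "Summarize our refund policy."}],
        },
    },
    "response": {
        "body": {
            "choices": [
                {"message": {"role": "assistant", "content": "Refunds are issued within 30 days of purchase."}}
            ]
        }
    },
    "timing": {"started_at": "2024-05-01T10:32:01Z", "duration_ms": 842},
    "usage": {"prompt_tokens": 21, "completion_tokens": 14, "cost_usd": 0.00005},
}
```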
Privacy and Security Considerations
When using Request Tracing, be mindful of privacy and security:
- User Consent: Inform users that their interactions with AI may be recorded
- Sensitive Data: Avoid enabling tracing for workflows that process highly sensitive information
- Data Retention: Consider how long you need to retain traced data
- Access Control: Limit access to traced data to authorized personnel
Request Tracing captures the full content of prompts and responses. Be sure your use of this feature complies with your privacy policy and applicable regulations.
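If a workflow must handle personal data but you still want tracing for the rest of the request, one option is to redact obvious identifiers in your application before the prompt is sent, so traces never contain the raw values. A minimal sketch, with illustrative patterns that are not exhaustive PII detection:

```python
# Sketch: strip obvious identifiers from a prompt before it leaves your application,
# so traced requests do not contain the raw values. Patterns are illustrative only.
import re

REDACTIONS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "<email>"),
    (re.compile(r"\b(?:\d[ -]?){13,16}\b"), "<card-number>"),
]

def redact(text: str) -> str:
    for pattern, placeholder in REDACTIONS:
        text = pattern.sub(placeholder, text)
    return text

prompt = redact("Customer jane.doe@example.com paid with 4111 1111 1111 1111.")
# -> "Customer <email> paid with <card-number>."
```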
Building Training Datasets
One powerful use of Request Tracing is to build datasets for fine-tuning custom models:
- Enable tracing for a policy used by typical users
- Allow usage over time to collect diverse examples
- Export the data (contact support for assistance)
- Clean and format the data for fine-tuning
- Use the dataset to create specialized models that perform better at your specific tasks
This approach can significantly reduce costs in the long run by allowing you to train smaller, more efficient models that achieve similar results for your specific use cases.
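As a sketch of the "clean and format" step, assuming the export is a JSON file containing an array of records shaped like the hypothetical trace shown earlier, converting it to the chat-style JSONL format commonly used for fine-tuning might look like this:

```python
# Sketch: turn exported traces into chat-format JSONL for fine-tuning.
# Assumes traces.json is a list of records shaped like the hypothetical
# example above; adjust the field access to match the real export format.
import json

with open("traces.json") as f:
    traces = json.load(f)

with open("finetune.jsonl", "w") as out:
    for record in traces:
        prompt_messages = record["request"]["body"]["messages"]
        completion = record["response"]["body"]["choices"][0]["message"]
        out.write(json.dumps({"messages": prompt_messages + [completion]}) + "\n")
```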
Combining with Request Caching
Request Tracing works seamlessly with Request Caching:
- Enable both features on a policy
- Monitor which requests are frequently repeated
- Use these insights to optimize your application design
- Let caching automatically improve performance for common requests
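Traced data is also a quick way to see which prompts repeat. A small sketch, again assuming the hypothetical export format used above, that counts duplicate prompts to show where caching will pay off:

```python
# Sketch: count repeated prompts in exported traces to find caching candidates.
# Assumes the same hypothetical export format as the earlier examples.
import json
from collections import Counter

with open("traces.json") as f:
    traces = json.load(f)

prompt_counts = Counter(
    json.dumps(record["request"]["body"]["messages"], sort_keys=True)
    for record in traces
)

for prompt, count in prompt_counts.most_common(5):
    if count > 1:
        print(f"{count}x  {prompt[:80]}")
```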
Next Steps
After enabling Request Tracing:
- Explore Request Caching to improve performance
- Learn how to track usage across your project
- Consider building training datasets based on the traced data