Critical Update: Prompt Configuration Fixes, OAuth 2.0 Improvements, and Enhanced Prompt Edit History
This release resolves critical issues in the API Manager: prompt seed and temperature settings are now correctly sent to LLM APIs, and OAuth 2.0 authentication properly refreshes tokens that have been invalidated externally. In the Console, users can now add optional custom headers to tool services for greater flexibility. A major new feature, the prompt edit history display, lets users view, compare, and revert to previous versions of prompts, improving prompt management and collaboration. Together, these changes improve reliability, security, and usability across the platform.
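For context, below is a minimal sketch of the kind of request the API Manager now produces, with the seed and temperature from a prompt configuration forwarded in the request body to the LLM provider. The endpoint, model name, and field names are illustrative assumptions, not the platform's actual API.

```python
# Hypothetical illustration only: shows seed and temperature being forwarded
# to an LLM completion endpoint, as the fixed API Manager now does.
import requests

API_URL = "https://llm.example.com/v1/completions"  # placeholder endpoint
API_KEY = "sk-..."                                  # placeholder credential

payload = {
    "model": "example-model",                # hypothetical model name
    "prompt": "Summarize the release notes.",
    "seed": 42,                              # deterministic sampling seed
    "temperature": 0.2,                      # sampling temperature
}

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=30,
)
response.raise_for_status()
print(response.json())
```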