Critical Update: Prompt Configuration Fixes, OAuth 2.0 Improvements, and Enhanced Prompt Edit History

This release resolves critical issues in the API Manager: prompt seed and temperature settings are now correctly sent to LLM APIs, and OAuth 2.0 authentication now refreshes tokens that have been invalidated outside the system. In the Console, users can add optional custom headers to tool services for greater flexibility. A major new feature lets users view, compare, and revert to previous versions of a prompt through the prompt edit history display, improving prompt management and collaboration. These improvements enhance reliability, security, and usability across the platform.

v25.9.4

Sep 04, 2025

API Manager

  • Fixed an issue where the seed and temperature values configured on prompts were not sent to the LLM APIs.

  • Fixed an issue where calling a tool that uses OAuth 2.0 authentication did not refresh the token when the currently stored token had been invalidated outside of the system.
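The OAuth 2.0 fix follows a common refresh-and-retry pattern: if the stored token is rejected (for example, because it was revoked outside the system), the stale token is discarded, a fresh one is fetched, and the call is retried once. A minimal sketch of that pattern in Python; all names here (`TokenStore`, `call_tool_with_refresh`, the use of `PermissionError` for an auth failure) are hypothetical, not the platform's actual API:

```python
class TokenStore:
    """Hypothetical cache for an OAuth 2.0 access token."""

    def __init__(self, fetch):
        self._fetch = fetch   # callable that obtains a fresh token
        self._token = None

    def get(self):
        # Fetch lazily and cache until invalidated.
        if self._token is None:
            self._token = self._fetch()
        return self._token

    def invalidate(self):
        self._token = None


def call_tool_with_refresh(store, call):
    """Call a tool; on an auth failure, refresh the token once and retry.

    `call(token)` is assumed to raise PermissionError when the token is
    rejected, e.g. because it was revoked outside the system.
    """
    try:
        return call(store.get())
    except PermissionError:
        store.invalidate()           # drop the stale token
        return call(store.get())     # retry with a freshly fetched one
```

Retrying exactly once keeps the behavior predictable: a token that fails immediately after a refresh indicates a real authorization problem rather than a stale cache.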

Console

  • Added support for optional custom headers on tool services.

  • Enabled display of the prompt’s edit history, allowing users to view, compare, and revert to previous versions of the prompt.
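The edit-history feature can be thought of as an append-only list of prompt versions, where a revert appends a copy of an older version rather than rewriting history. A minimal sketch under that assumption; the `PromptHistory` class and its methods are hypothetical and not the Console's actual data model:

```python
import difflib


class PromptHistory:
    """Hypothetical append-only edit history for a single prompt."""

    def __init__(self, initial_text):
        self._versions = [initial_text]

    def edit(self, new_text):
        """Record a new version of the prompt."""
        self._versions.append(new_text)

    def latest(self):
        return self._versions[-1]

    def compare(self, a, b):
        """Return a unified line diff between versions a and b (0-based)."""
        return "\n".join(difflib.unified_diff(
            self._versions[a].splitlines(),
            self._versions[b].splitlines(),
            fromfile=f"v{a}", tofile=f"v{b}", lineterm=""))

    def revert(self, n):
        """Revert by appending version n as the newest version."""
        self._versions.append(self._versions[n])
```

Keeping reverts as new entries preserves the full history, so a revert itself remains visible and can in turn be undone.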