Upgrades and Change Management
This chapter describes how platform operators should manage change safely across tenants, including:
- platform software upgrades
- AI Engine changes
- global Custom Field / AI Task updates
- prompt and schema changes
Because AI outputs can shift whenever prompts or models change, change management is not purely technical: it directly affects customer reporting and trust.
Change types and risk levels
Platform software changes
- UI and workflow changes
- performance and scaling changes
- schema migrations
AI Engine changes
- model upgrades (new versions)
- provider changes
- parameter changes (timeouts, token limits)
AI configuration changes
- global AI Task prompt edits
- output schema changes (JSON keys/types)
- mapping changes
- new/removed Custom Fields
- default filter changes
Recommended change process
- Plan
- identify impacted tenants and tasks
- classify change as safe vs breaking
- Test
- run regression tests on a transcript test suite
- verify JSON compliance and schema validation
- measure token usage and latency changes
- Stage
- deploy to staging environment
- run pilot tenants first
- Roll out
- gradual rollout
- monitor errors, cost, and output shifts
- Communicate
- release notes for tenant admins (what changed and why)
- highlight expected metric shifts
- Rollback
- have a rollback plan (revert engine/task/prompt; disable feature)
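The Test step above calls for verifying JSON compliance and schema validation against a transcript test suite. A minimal sketch of such a regression check follows; the expected schema and the output shapes are illustrative assumptions, not a real platform API.

```python
# Hypothetical sketch: schema regression check for one AI Task's outputs.
# EXPECTED_SCHEMA mirrors the keys/types the task's output schema declares.
import json

EXPECTED_SCHEMA = {"score": int, "reason": str}  # assumed task schema

def validate_output(raw: str) -> list[str]:
    """Return a list of schema violations for one raw task output."""
    errors = []
    try:
        data = json.loads(raw)
    except json.JSONDecodeError:
        return ["output is not valid JSON"]
    for key, expected_type in EXPECTED_SCHEMA.items():
        if key not in data:
            errors.append(f"missing key: {key}")
        elif not isinstance(data[key], expected_type):
            errors.append(f"wrong type for {key}: {type(data[key]).__name__}")
    for key in data:
        if key not in EXPECTED_SCHEMA:
            errors.append(f"unexpected key: {key}")
    return errors

# One compliant and one broken output from a test transcript
assert validate_output('{"score": 4, "reason": "polite"}') == []
assert "missing key: reason" in validate_output('{"score": 4}')
```

Running this check over the full transcript suite before and after a prompt or model change surfaces schema regressions before any tenant sees them.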
Versioning strategy (recommended)
For global AI Tasks
Prefer a versioned task approach:
- create "CSAT Scoring v2" rather than editing the existing prompt in place when the logic changes significantly.
- allow tenants to migrate when ready.
- deprecate v1 after a transition period.
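The versioned-task lifecycle above can be sketched as a small registry. All class and field names here are assumptions for illustration, not the platform's actual data model.

```python
# Illustrative sketch of "new version = new task" with tenant-paced migration.
from dataclasses import dataclass

@dataclass
class TaskVersion:
    name: str              # e.g. "CSAT Scoring"
    version: int           # 1, 2, ...
    prompt: str
    deprecated: bool = False

@dataclass
class TenantAssignment:
    tenant_id: str
    task: TaskVersion

def migrate(assignment: TenantAssignment, new_task: TaskVersion) -> TenantAssignment:
    """Move a tenant to the new version only when they opt in."""
    return TenantAssignment(assignment.tenant_id, new_task)

v1 = TaskVersion("CSAT Scoring", 1, "Rate satisfaction 1-5 ...")
v2 = TaskVersion("CSAT Scoring", 2, "Rate satisfaction 1-5 using rubric ...")
acme = TenantAssignment("acme", v1)
acme = migrate(acme, v2)   # tenant migrates when ready
v1.deprecated = True       # deprecate v1 after the transition period
```

Keeping v1 around (deprecated rather than deleted) preserves historical comparability for tenants that have not yet migrated.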
For Custom Fields
Treat a Custom Field's type and name as immutable. For breaking changes:
- create a new field
- migrate the mapping in a new task version
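A breaking field change can be handled by adding a new field instead of mutating the old one. The sketch below is purely illustrative; the field names and registry shape are assumptions.

```python
# Illustrative sketch: breaking Custom Field change = new field + new mapping.
fields = {
    "csat_score": {"type": "int"},   # v1 field, kept immutable for history
}

def add_field(name: str, ftype: str) -> None:
    """Register a new Custom Field; existing fields are never redefined."""
    if name in fields:
        raise ValueError(f"field {name!r} exists; create a new field instead")
    fields[name] = {"type": ftype}

# Breaking change: score becomes fractional -> new field, mapped by the new
# task version, while the old field keeps serving historical data.
add_field("csat_score_v2", "float")
task_v2_mapping = {"score": "csat_score_v2"}
```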
Tenant overrides and drift
Tenant overrides (prompt or filter) cause tenant configurations to drift from the global defaults over time.
Recommended operator practices:
- maintain an “Overrides inventory” (who overrides what)
- when shipping global updates, document:
  - which tenants are affected directly (no overrides)
  - which tenants are not affected (overridden tasks)
- provide a workflow to “reset to defaults” if supported
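An overrides inventory can be as simple as a mapping from tenant to overridden tasks, queried at release time to split tenants into affected and shielded groups. The data shape below is an assumption about how override state might be stored.

```python
# Sketch of an "Overrides inventory" and a per-release impact report.
overrides = {
    "acme":   {"CSAT Scoring": ["prompt"]},  # acme overrides this task's prompt
    "globex": {},                            # globex runs pure defaults
}

def impact_report(task: str, tenants: list[str]) -> tuple[list[str], list[str]]:
    """Split tenants into directly affected vs shielded by an override."""
    affected = [t for t in tenants if task not in overrides.get(t, {})]
    shielded = [t for t in tenants if task in overrides.get(t, {})]
    return affected, shielded

affected, shielded = impact_report("CSAT Scoring", ["acme", "globex"])
# globex receives the global update; acme keeps its overridden prompt
```

Generating this report before each global update makes the release-notes audience list ("who actually changes") mechanical rather than guesswork.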
Communicating AI changes to customers (recommended template)
Include:
- what changed (prompt/model/schema)
- why it changed (accuracy, consistency, performance)
- expected effect (score distribution may shift)
- what customers should do (recalibrate thresholds, revalidate)
- rollback/opt-out option (if available)
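The template above can be mechanized so every customer-facing notice covers the same five points. The wording and field values below are placeholders, not real release content.

```python
# Minimal release-note generator following the five-point template (illustrative).
RELEASE_NOTE = """\
What changed: {what}
Why: {why}
Expected effect: {effect}
Action required: {action}
Rollback/opt-out: {rollback}
"""

note = RELEASE_NOTE.format(
    what="CSAT Scoring prompt updated to a rubric-based v2",
    why="improve scoring consistency across agents",
    effect="score distribution may shift; averages are not comparable to v1",
    action="recalibrate alert thresholds after revalidating on recent data",
    rollback="tenant admins may remain on v1 during the transition period",
)
```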
Implementation notes
- Tenants automatically receive global task updates unless they have applied overrides
- Tenant admins can override prompt and filter settings; they can also revert to defaults
- New tasks and changes apply only to new conversations by default; backfill requires manual coordination
- Adopt "new version = new task" for any change that can shift metrics significantly
- Maintain a partner-facing changelog and tenant-admin facing release notes
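Gradual rollout of a metric-shifting change is commonly gated by deterministic tenant bucketing, so the same tenants stay enabled as the percentage ramps up. The function below is a generic sketch of that technique; nothing here is a platform API.

```python
# Sketch of percentage-based rollout gating via deterministic tenant hashing.
import hashlib

def in_rollout(tenant_id: str, feature: str, percent: int) -> bool:
    """Deterministically bucket a tenant into a 0-99 slot per feature."""
    digest = hashlib.sha256(f"{feature}:{tenant_id}".encode()).hexdigest()
    return int(digest, 16) % 100 < percent

# At 0% nobody is enabled; at 100% everyone is; ramping 5% -> 25% -> 100%
# only ever adds tenants, never flaps a tenant off and back on.
assert not in_rollout("acme", "csat_v2", 0)
assert in_rollout("acme", "csat_v2", 100)
```

Hashing on `feature:tenant_id` (rather than tenant alone) keeps rollout cohorts independent across features.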