We've all experienced it: that moment when an AI assistant creates more problems than it solves because of its robotic responses and lack of contextual awareness. It operates in a vacuum, making customer experiences less than delightful. In this session, you will learn how to create meaningful, hyper-personalized customer experiences with Amazon Nova Sonic, along with design patterns that enable new use cases. You will discover how Amazon Nova Sonic's tool use and function calling capabilities enable integration with external APIs and with internal and external data sources, providing the context that makes voice-based customer interactions delightful.
What this session is about
Playbook
Editorial commentary · what to actually do about this on Monday
The concept
Voice AI with tool use and function calling. Integration with external APIs. Context-aware responses powered by Amazon Nova Sonic.
Why it matters
Voice is becoming a primary interaction channel for many use cases — accessibility, hands-free, regulated phone-based industries.
The hard parts
Latency. A voice conversation tolerates roughly 300 ms of response delay before it starts to feel broken; a single agent tool call often exceeds that on its own.
Playbook moves
(1) Optimise for first-token latency. (2) Stream when possible. (3) Pre-compute likely contexts.
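The three moves above can be sketched in a few lines. This is a toy simulation, not Nova Sonic's actual API (which is a bidirectional streaming interface): `generate_tokens` and the `PRECOMPUTED` cache are hypothetical stand-ins, there to show why pre-computing likely context takes the tool-call round trip off the critical path and why streaming makes time-to-first-token, not total generation time, the number to watch.

```python
import time
from typing import Iterator

def generate_tokens(prompt: str, context: str) -> Iterator[str]:
    # Hypothetical stand-in for a streaming model call: yields
    # tokens as they are produced rather than one final string.
    for token in f"Reply to '{prompt}' using: {context}".split():
        yield token + " "

# Move 3: pre-compute likely context (e.g. via a tool call made while the
# caller is still speaking) so no API wait happens after the prompt lands.
PRECOMPUTED = {"order_status": "order #1234 shipped yesterday"}

def respond(prompt: str, intent: str) -> tuple[float, str]:
    start = time.monotonic()
    context = PRECOMPUTED.get(intent, "no context")  # cache hit: no blocking call
    first_token_at = 0.0
    chunks = []
    for i, tok in enumerate(generate_tokens(prompt, context)):
        if i == 0:
            # Move 1: first-token latency is what the caller actually hears.
            first_token_at = time.monotonic() - start
        chunks.append(tok)  # Move 2: hand each chunk to TTS immediately
    return first_token_at, "".join(chunks)

latency, text = respond("where is my order?", "order_status")
```

With the context cached, first-token latency is effectively just model time; a cache miss would put the full tool-call round trip back inside the ~300 ms budget.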
The surprise
The most important "personalisation" in voice isn't preference matching — it's *interruption handling*. Real conversations have interruptions. Voice agents that handle them gracefully feel real; ones that don't feel robotic. Low latency is necessary; graceful interruption handling is what makes the interaction feel sufficient.
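A minimal sketch of what graceful barge-in means mechanically: play agent audio in small chunks and check for user speech between chunks, yielding the floor the instant the caller starts talking. The event names and chunking here are illustrative assumptions; a real pipeline would poll a voice-activity detector between audio buffers rather than a queue.

```python
from collections import deque

def speak_with_barge_in(sentences, events):
    """Play agent audio chunk by chunk; stop the moment the user barges in.

    `events` is a toy deque of user-side events, one checked per chunk,
    standing in for a voice-activity detector (hypothetical interface).
    """
    spoken = []
    for sentence in sentences:
        event = events.popleft() if events else None
        if event == "user_started_speaking":
            # Cut playback mid-response and hand the turn back to the caller,
            # instead of talking over them to the end of the script.
            return spoken, "interrupted"
        spoken.append(sentence)
    return spoken, "finished"

spoken, status = speak_with_barge_in(
    ["Your order shipped yesterday.", "It should arrive Friday.", "Anything else?"],
    deque([None, "user_started_speaking"]),
)
# Only the first sentence plays; the barge-in lands before the second.
```

The design point is that interruption handling is a playback-control problem as much as a model problem: the agent must be able to abandon queued speech, not just generate it quickly.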
---
Independent editorial perspective — not an official AWS or speaker statement. Designed for executives evaluating what to brief their teams on next.