Apple Live Captions — Designing a consistent accessibility system across Apple devices.

Deliverables

Concept Development

Product Design

Accessibility

The brief

Live Captions exist across Apple devices, but inconsistent implementations create confusion for users who depend on them in critical moments.



Problem

Live Caption behavior varies across devices, increasing cognitive load and reducing trust in accessibility features.

Opportunity

Unify Live Captions into a consistent, system-level experience across Apple devices while respecting platform constraints.

How Can We Create a More Cohesive Apple Live Captions Experience?

Bento grid showcasing Live Captions on iPhone, with the UI displaying captions on screen.

Apple Watch on a user's wrist showing Apple Live Captions.

A new Live accessibility icon.


Apple Live Captions interface design on iPhone and Apple Watch

Apple Live Captions UI variations across devices

Apple Live Captions user needs mapping diagram

How can we bring Apple Intelligence into existing products?

Surround Sound Dictation with Apple Intelligence

As Apple products increasingly leverage intelligent features, I explored how Apple Live Captions can extend this intelligence to provide richer contextual information.


One inspiring example comes from Christine Sun Kim, who advocates for adding background noise descriptions to captions.


Since iOS 14, Sound Recognition has let Apple devices detect sounds like door knocks, running water, or a crying baby. Surfacing these environmental sounds directly in Live Captions could further enhance accessibility.
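To make the concept concrete, here is a minimal Swift sketch of how on-device sound classification could feed environmental descriptions into a caption stream, using Apple's SoundAnalysis framework. The confidence threshold, the label-to-caption mapping, and the appendCaption hook are illustrative assumptions, not part of any shipping Live Captions API.

```swift
import AVFoundation
import SoundAnalysis

/// Turns high-confidence ambient-sound classifications into bracketed
/// caption lines such as "[running water]".
final class AmbientSoundCaptioner: NSObject, SNResultsObserving {
    // Hypothetical hook into the caption view; not a real Live Captions API.
    var appendCaption: ((String) -> Void)?

    // Labels chosen to surface, mapped to caption-friendly text (assumed set).
    private let captionableSounds: [String: String] = [
        "water": "running water",
        "knock": "knocking at the door",
        "baby_crying": "baby crying"
    ]

    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first,
              top.confidence > 0.8,                               // assumed threshold
              let description = captionableSounds[top.identifier] else { return }
        appendCaption?("[\(description)]")
    }
}

/// Attach Apple's built-in sound classifier to a live audio stream.
func makeAnalyzer(format: AVAudioFormat,
                  captioner: AmbientSoundCaptioner) throws -> SNAudioStreamAnalyzer {
    let analyzer = SNAudioStreamAnalyzer(format: format)
    let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
    try analyzer.add(request, withObserver: captioner)
    return analyzer
}
```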

UI design showing an iPhone close-up with Apple Intelligence describing what is happening in the environment.

Apple Intelligence Helps Manage Conversation Flow

What if Apple Intelligence could temporarily disable Live Captions when users are singing, creating more screen space?


Since iOS 14, FaceTime’s Automatic Prominence enlarges the tile of the person speaking or signing in group calls, highlighting who is taking the floor.
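As a rough sketch of that idea, the same sound-classification results could drive caption visibility: hide captions while singing or music dominates, and bring them back when speech resumes. The label names, the threshold, and the setCaptionsVisible hook below are assumptions for illustration, not a documented Live Captions API.

```swift
import Foundation
import SoundAnalysis

/// Conversation-flow sketch: hide Live Captions while singing is detected,
/// and restore them once speech resumes.
final class ConversationFlowObserver: NSObject, SNResultsObserving {
    // Hypothetical hook for toggling the caption overlay.
    var setCaptionsVisible: ((Bool) -> Void)?

    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first,
              top.confidence > 0.75 else { return }   // assumed threshold

        switch top.identifier {                        // assumed label names
        case "singing", "music":
            setCaptionsVisible?(false)                 // free screen space while singing
        case "speech":
            setCaptionsVisible?(true)                  // captions return with conversation
        default:
            break
        }
    }
}
```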

UI design showing an iPhone close-up with Apple Intelligence pausing Live Captions while a person is singing.

Reflections

Growth Opportunities

Unifying complexity through simplicity

I balanced functional constraints with brand consistency across devices—designing for each platform’s rules while protecting a single, cohesive system. Long-term thinking helped unify the experience into one shared rhythm.

Designing from lived experience

Most products are built by teams with strong research, but without the lived experience to fully grasp the reality of user needs. Designing from daily reliance, not just research, creates a deeper level of precision, empathy, and impact that can't be simulated.

Next Project

Duolingo ASL - a sign language learning app to help expand Duolingo's market reach.