Apple Launches Live Translation for Messages, FaceTime, and Calls

Apple unveils Live Translation, an on-device AI feature enabling real-time translation in Messages, FaceTime, and Phone calls. This privacy-focused tool translates text as you type, offers live captions during FaceTime, and provides spoken translations in phone conversations—even with non-Apple users. Developers can easily integrate this via a new API.

Published June 9, 2025 at 02:12 PM EDT in Artificial Intelligence (AI)

At WWDC 2025, Apple introduced a groundbreaking feature called Live Translation, powered by Apple Intelligence. This new capability enables real-time translation across Messages, FaceTime, and Phone calls, transforming how users communicate across languages.

Unlike many cloud-based translation services, Apple’s Live Translation runs entirely on-device. Leslie Ikemoto, Apple’s director of input experience, emphasized that this approach keeps personal conversations private and secure, addressing growing concerns about data privacy.

In Messages, Live Translation automatically translates text as you type, delivering messages in your preferred language. When your contact replies, their texts are instantly translated back, creating a seamless multilingual chat experience.

FaceTime users benefit from live captions that translate spoken words in real time, making video calls more accessible and inclusive across language barriers.

Perhaps most impressively, Live Translation extends to phone calls. Whether you’re speaking to an Apple user or not, your words are translated on the fly and spoken aloud to the recipient. Their responses are similarly translated back to you, enabling natural conversations across languages without interruption.

For developers, Apple offers a new API that makes it easy to integrate Live Translation into communication apps. This opens exciting opportunities to build multilingual features that respect user privacy and run efficiently on-device.
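Apple has not yet published details of the new Live Translation API, but its existing Translation framework (introduced at WWDC 2024) suggests the likely shape: a SwiftUI view requests an on-device translation session and translates text locally. The sketch below is illustrative, not Apple's confirmed Live Translation interface; the English-to-Spanish language pair and the view name are assumptions.

```swift
import SwiftUI
import Translation  // Apple's on-device Translation framework (iOS 18+)

// Hypothetical message view: shows an incoming message and replaces it
// with an on-device translation once the language models are available.
struct TranslatedMessageView: View {
    let original: String
    @State private var translated = ""
    @State private var configuration: TranslationSession.Configuration?

    var body: some View {
        Text(translated.isEmpty ? original : translated)
            // Fires when `configuration` becomes non-nil; the system
            // downloads language models on demand and keeps all text
            // on-device, in line with Apple's privacy framing.
            .translationTask(configuration) { session in
                do {
                    let response = try await session.translate(original)
                    translated = response.targetText
                } catch {
                    // On failure, fall back to the untranslated text.
                    translated = original
                }
            }
            .onAppear {
                // Illustrative language pair; a real app would derive
                // these from the user's and contact's preferred locales.
                configuration = TranslationSession.Configuration(
                    source: Locale.Language(identifier: "en"),
                    target: Locale.Language(identifier: "es"))
            }
    }
}
```

Because translation runs in a system-managed session rather than against a cloud endpoint, an app built this way inherits the same on-device privacy guarantees the article describes.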

While Apple has not yet disclosed the number of supported languages, this innovation marks a significant step forward in real-time, private, and accessible communication technology.

Why On-Device Translation Matters

Running translation models directly on the device enhances privacy by eliminating the need to send sensitive conversations to cloud servers. It also reduces latency, providing instant translations that keep conversations flowing naturally. This approach aligns with increasing user demand for data security without sacrificing functionality.

Opportunities for Developers and Businesses

The new API enables developers to embed live translation into their own communication platforms, expanding accessibility and user engagement globally. Businesses can leverage this to break down language barriers in customer support, remote collaboration, and social networking apps.

Imagine a customer service app that instantly translates calls and messages, allowing agents to assist clients worldwide without language constraints. Or a social app where friends chat effortlessly in their native tongues. Apple’s Live Translation paves the way for these possibilities.

Looking Ahead

As Apple continues to refine its AI models and expand language support, Live Translation could become a standard feature that redefines global communication. For users, it means conversations without borders. For developers, it’s a powerful tool to create inclusive, privacy-conscious apps.
