
Google Podcasts already transcribing episodes to improve search results

With the official launch of Google Podcasts last June, the company promised AI features like translation and automatic transcription. Earlier this year, we spotted the latter feature in development on Android, where it would let users read a podcast. While not yet user-facing, transcription is already working in the background to power search.

Last April, Google detailed its podcast plans in depth, with artificial intelligence and machine learning playing a key role. By transcribing podcasts, Google could understand what an episode is about without having to rely solely on podcaster-generated show notes and descriptions.

The technology already exists in Google Cloud Speech-to-Text, which is rated for transcribing recordings with four or more speakers, background noise, and runtimes of over two hours. As spotted by Android Police, this functionality appears to already be in use.
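Google has not said which stack actually powers Podcasts transcription, but the public Cloud Speech-to-Text API illustrates the capability. The sketch below uses the google-cloud-speech Python client against a hypothetical episode file in a Cloud Storage bucket; the bucket path, audio encoding, and speaker counts are placeholders, not anything confirmed by Google.

```python
# Minimal sketch: transcribing a long podcast episode with Google Cloud
# Speech-to-Text. The gs:// URI, encoding, and speaker counts are
# placeholders, not anything Google Podcasts is confirmed to use.
from google.cloud import speech

client = speech.SpeechClient()

config = speech.RecognitionConfig(
    encoding=speech.RecognitionConfig.AudioEncoding.FLAC,
    language_code="en-US",
    enable_automatic_punctuation=True,
    enable_word_time_offsets=True,  # per-word timestamps, useful for indexing
    diarization_config=speech.SpeakerDiarizationConfig(
        enable_speaker_diarization=True,  # separate multiple speakers
        min_speaker_count=2,
        max_speaker_count=6,
    ),
)
audio = speech.RecognitionAudio(uri="gs://example-bucket/episode.flac")

# Long-running recognition is required for audio longer than about a minute.
operation = client.long_running_recognize(config=config, audio=audio)
response = operation.result(timeout=3600)

for result in response.results:
    print(result.alternatives[0].transcript)
```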

The recently discovered web experience makes it possible to confirm that Google Podcasts is transcribing episodes. This feature is not user-facing, unlike the Closed Captioning button we spotted on Android in January. Rather, the transcript is only visible in the page source via developer tools.

This transcription appears to be used to improve search results. The built-in search feature, which just added episode lookup, can find terms that appear only in the transcript and nowhere in show notes or podcast details.
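Google has not published how this lookup works internally; the toy sketch below only illustrates the general idea of matching a query against transcript text when show notes and descriptions miss the term. All names and fields here are hypothetical.

```python
# Toy illustration of transcript-backed episode search. The Episode fields
# and matching logic are hypothetical, not Google's implementation.
from dataclasses import dataclass

@dataclass
class Episode:
    title: str
    show_notes: str
    transcript: str  # machine-generated text of the spoken audio

def matches(episode: Episode, query: str) -> bool:
    q = query.lower()
    # A transcript hit counts even when the metadata never mentions the term.
    return any(q in field.lower()
               for field in (episode.title, episode.show_notes, episode.transcript))

ep = Episode(
    title="Weekly tech roundup",
    show_notes="We discuss phones and laptops.",
    transcript="...later in the show we cover foldable displays in depth...",
)
print(matches(ep, "foldable displays"))  # True, thanks only to the transcript
```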

It’s not yet clear whether Google is leveraging transcription for more advanced features, like understanding the topics discussed in an episode rather than just finding word-for-word matches.

Transcription also allows for timestamps and indexing: creators would no longer have to manually add chapters, and listeners could jump to an exact section straight from a search result. Paired with translation and text-to-speech, Google could make episodes available globally in other languages. The Android app is already working on letting users read a podcast as they listen, which is useful in noisy environments.
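None of this has been announced, but per-word timestamps from a transcription service make the jump-to-section idea straightforward to sketch. The data shapes below are hypothetical, modeled loosely on the word time offsets Speech-to-Text can return.

```python
# Hypothetical sketch: seeking playback to the first occurrence of a search
# term, given per-word timestamps like those Speech-to-Text can return.
from typing import Optional

def seek_offset(words: list[tuple[str, float]], term: str) -> Optional[float]:
    """words is a list of (word, start_time_in_seconds) pairs."""
    term = term.lower()
    for word, start in words:
        if word.lower().strip(".,!?") == term:
            return start
    return None

transcript_words = [("Today", 0.0), ("we", 0.4), ("review", 0.6),
                    ("foldables", 1.1), ("again", 1.8)]
print(seek_offset(transcript_words, "foldables"))  # 1.1 -> seek the player here
```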

Of course, given that advertising is still Google’s money-maker, transcription could also let the company serve better-targeted ads by knowing what you just listened to in an episode. Ads could hypothetically appear in the Podcasts player or follow users around the internet, as they do today.


