On-Device Processing Explained by Techspert

If you've ever picked up a new Pixel phone, you've probably heard about "on-device processing" being the magic behind its cool new features. Take the Pixel 9 phones, for example—features like Pixel Studio and Call Notes run "on device." And it's not just phones; Nest cameras, Pixel smartwatches, and Fitbit devices are all in on this "on-device processing" action. Given how widespread it is and the features it powers, it's clearly a big deal.
So, what exactly does "on-device processing" mean? Well, it's pretty much what it sounds like—the processing happens right on the device. But to dive deeper, we chatted with Trystan Upstill, a Google veteran of nearly 20 years who's worked on engineering teams across Android, Google News, and Search.
Trystan, you were part of the team that developed some of the exciting features for the new Pixel devices. What did you work on?
Recently, I led a team within Android that focused on blending Google's various tech stacks into an amazing user experience. It's all about figuring out how to build and ship these features.
With technology constantly improving and new features being introduced, it must feel like a never-ending job.
Absolutely! The past few years have seen a huge leap in generative AI capabilities. At first, the idea of running large language models on devices seemed like a distant dream—maybe something for 2026. But technology evolved so quickly that we were able to launch features using Gemini Nano, our on-device model, on the Pixel 8 Pro in December 2023.
Let's break down "on-device processing." What does "processing" actually mean?
The main processor in your devices, the system-on-a-chip (SoC), has several processing units, each designed for a different kind of task. That's why you'll hear chips like the Tensor chip in Pixels called a "system-on-a-chip": it's not just one processor but a bunch of them, along with memory, interfaces, and more, all on one piece of silicon.
Let's use Pixel smartphones as an example: You've got the Central Processing Unit (CPU) as the main "engine"; the Graphics Processing Unit (GPU) for rendering visuals; and now, the Tensor Processing Unit (TPU), which Google designed specifically to handle AI/ML workloads on the device. They all work together to get things done—aka, processing.
For instance, when you snap photos, you're using all these processing powers. The CPU runs core tasks, the GPU helps render what the lens sees, and on a premium Android device like a Pixel, the TPU works to enhance your photos.
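To make that concrete, here's a minimal sketch of how an Android app can aim a machine learning workload at different processing units using TensorFlow Lite: the interpreter runs on the CPU by default, while a delegate can hand the work to the GPU or, via NNAPI, to an accelerator like the TPU. This assumes the TensorFlow Lite Android libraries are on the classpath, and the model file name is just a placeholder.

```kotlin
import android.content.Context
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.nnapi.NnApiDelegate
import org.tensorflow.lite.support.common.FileUtil

// A minimal sketch: load a bundled .tflite model and let Android pick an accelerator.
// "enhance.tflite" is a placeholder asset name, not a real Pixel model.
fun enhanceOnDevice(context: Context, input: Array<FloatArray>, output: Array<FloatArray>) {
    val model = FileUtil.loadMappedFile(context, "enhance.tflite")

    val options = Interpreter.Options().apply {
        // The NNAPI delegate lets Android route the graph to an available accelerator
        // (for example the TPU on Tensor-based Pixels); unsupported ops fall back to the CPU.
        // For graphics-style workloads, a GpuDelegate from org.tensorflow.lite.gpu could be used instead.
        addDelegate(NnApiDelegate())
        setNumThreads(4) // CPU threads for anything the delegate doesn't cover
    }

    val interpreter = Interpreter(model, options)
    try {
        interpreter.run(input, output) // single-input, single-output inference
    } finally {
        interpreter.close()
    }
}
```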
Got it. "On-device" processing implies there's off-device processing. Where does that happen?
Off-device processing happens in the cloud. Your device sends a request over the internet to servers, which perform the task and send the results back to your phone. To make this happen on-device, we take the large machine learning model from the cloud, make it smaller and more efficient, and run it on your device's operating system and hardware.
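As a rough illustration of that difference, the off-device path is a network round trip to a server, while the on-device path is just a local function call. The endpoint and helper below are made-up placeholders, not any real Google API.

```kotlin
import java.net.HttpURLConnection
import java.net.URL

// Off-device: the request travels over the internet to a server and the result comes back.
// The endpoint is a hypothetical placeholder.
fun summarizeInCloud(text: String): String {
    val connection = URL("https://example.com/v1/summarize").openConnection() as HttpURLConnection
    connection.requestMethod = "POST"
    connection.doOutput = true
    connection.outputStream.use { it.write(text.toByteArray()) }
    return connection.inputStream.bufferedReader().use { it.readText() }
}

// On-device: no network round trip; a smaller, optimized model runs locally
// (for example through an interpreter like the one sketched earlier).
fun summarizeOnDevice(text: String): String = runLocalModel(text)

// Hypothetical stand-in for local inference, just so the sketch is self-contained.
fun runLocalModel(text: String): String = "summary: " + text.take(100)
```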
What hardware makes this possible?
New, more powerful chipsets. For example, the Pixel 9 Pro uses our SoC called Tensor G4, which enables these phones to run models like Gemini Nano and handle high-performance computations.
So, Tensor is designed specifically to run Google AI, which powers a lot of Pixel's new gen AI capabilities.
Exactly! And while generative AI features are a big part of it, on-device processing also makes possible things like rendering video, playing games, HDR photo editing, and language translation—pretty much everything you do with your phone. These tasks happen on your phone, not on a server.
TalkBack with Gemini, which analyzes images and reads descriptions out loud to blind or low-vision users, is an example of on-device processing that uses Tensor, Pixel’s system on a chip.
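To pick one item from that list, language translation already runs entirely on the phone. Here's a minimal sketch using ML Kit's on-device translation API (assuming the com.google.mlkit:translate dependency; the language pair and callback shape are just for illustration). The language model downloads once, after which translation works offline.

```kotlin
import com.google.mlkit.nl.translate.TranslateLanguage
import com.google.mlkit.nl.translate.Translation
import com.google.mlkit.nl.translate.TranslatorOptions

// Translate a phrase on the device with ML Kit's on-device translator.
fun translateOnDevice(text: String, onResult: (String) -> Unit) {
    val options = TranslatorOptions.Builder()
        .setSourceLanguage(TranslateLanguage.ENGLISH)
        .setTargetLanguage(TranslateLanguage.SPANISH)
        .build()
    val translator = Translation.getClient(options)

    translator.downloadModelIfNeeded()
        .addOnSuccessListener {
            translator.translate(text)
                .addOnSuccessListener { translated -> onResult(translated) }
                .addOnFailureListener { onResult(text) } // fall back to the original text
        }
        .addOnFailureListener { onResult(text) }
}
```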
The computation power of today's smartphones is incredible. They're thousands of times faster than early high-performance computers, which used to fill entire rooms. Those old computers were the cutting edge for data analysis, image processing, anomaly detection, and early AI research. Now, we can do all this on our devices, opening up new opportunities for helpful features.
Is on-device processing better than off-device?
Not necessarily. If you used Search entirely on-device, it would be slow or limited because searching the web is like finding a needle in a haystack. The entire web index can't fit on your phone! Instead, Search uses the cloud and our data centers to access trillions of web pages.
But for more specific tasks, on-device processing is really useful. It's faster because there's no network round trip, and it works without an internet connection, which makes it more reliable. Plus, since the AI chip is right there in your pocket, apps can tap into these LLM capabilities at no extra cost.
Both have their advantages: The cloud has more powerful models and can store lots of data, like your photos and videos. It also supports actions like searching massive stores of data such as Drive, Gmail, and Google Photos.
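A toy routing sketch captures that tradeoff: tasks that need the web index or huge personal corpora go to the cloud, while latency-sensitive, offline-friendly tasks stay on the device. The names here are illustrative only, not any real API.

```kotlin
// Illustrative only: which kinds of tasks tend to run where.
enum class Target { CLOUD, ON_DEVICE }

enum class Task { WEB_SEARCH, MAILBOX_SEARCH, PHOTO_ENHANCE, LIVE_TRANSLATE, SCREEN_READER }

fun whereToRun(task: Task): Target = when (task) {
    // Needs data-center-scale indexes (the web, Drive, Gmail, Google Photos).
    Task.WEB_SEARCH, Task.MAILBOX_SEARCH -> Target.CLOUD
    // Low latency, works offline, and the data never leaves the phone.
    Task.PHOTO_ENHANCE, Task.LIVE_TRANSLATE, Task.SCREEN_READER -> Target.ON_DEVICE
}
```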
I'm already impressed with what my Pixel can do, but it sounds like it's only going to get better.
Yes, the models we use for these complex tasks on Android devices are getting more capable. But it's not just about better technology; we also focus on what will actually benefit people. We don't just introduce products because the on-device processing can handle it; we want to make sure it's something people want to use in their everyday lives.
Comments (26)
MichaelMartínez
July 31, 2025 at 7:35:39 AM EDT
Loved the article on on-device processing! It's wild how much power is packed into my Pixel 9 for stuff like Pixel Studio. Makes me wonder if this is the future of all smart devices or just a Google flex. 😎
GeorgeLopez
April 17, 2025 at 4:40:32 PM EDT
On-Device Processing Explained by Techspert is a lifesaver for tech newbies like me! It breaks down the magic behind Pixel phones in such a simple way. I finally get why my Pixel 9 runs so smoothly. Only wish it had more examples for other devices too! 😅
HenryTurner
April 15, 2025 at 1:33:50 AM EDT
On-Device Processing Explained by Techspert is a lifesaver for tech beginners like me! It explains the magic behind Pixel phones so simply. I finally understood why my Pixel 9 works so well. I just wish it had more examples for other devices too! 😅
ChristopherTaylor
April 14, 2025 at 12:35:12 AM EDT
On-Device Processing Explained by Techspert is a lifesaver for tech newbies like me. It explains the magic behind Pixel phones so simply. I finally understand why my Pixel 9 runs so smoothly. I just wish it had more examples for other devices too! 😅
HaroldMoore
April 13, 2025 at 9:52:12 AM EDT
Techspert's "On-Device Processing Explained" is good for learning the details of Pixel phones, but it's a bit too technical. If it were explained a little more simply, I might have been more interested. Still, I think it's useful for people who know their tech.
JeffreyThomas
April 13, 2025 at 6:44:38 AM EDT
On-Device Processing Explained by Techspert is super useful if you're interested in technology. It explains how Pixel 9 features like Pixel Studio work without needing the internet. It's a bit technical, but it's worth it if you want to understand your device better. They could use simpler terms, don't you think? 🤓