{"id":19404,"date":"2024-10-14T18:14:45","date_gmt":"2024-10-14T18:14:45","guid":{"rendered":"http:\/\/scannn.com\/google\/what-is-on-device-processing-a-google-engineer-explains\/"},"modified":"2024-10-14T18:14:45","modified_gmt":"2024-10-14T18:14:45","slug":"what-is-on-device-processing-a-google-engineer-explains","status":"publish","type":"post","link":"https:\/\/scannn.com\/lv\/what-is-on-device-processing-a-google-engineer-explains\/","title":{"rendered":"What is on-device processing? A Google engineer explains"},"content":{"rendered":"<p> <br \/>\n<\/p>\n<div>\n<p data-block-key=\"51m6r\">Every time a new Pixel phone comes out, you might hear that \u201con-device processing\u201d makes its cool new features possible. Just take a look at the new Pixel 9 phones \u2014 things like Pixel Studio and Call Notes run \u201con device.\u201d And it\u2019s not just phones: Nest cameras, Pixel smartwatches and Fitbit devices also use this whole \u201con-device processing\u201d thing. Given the devices that use it and the features it\u2019s powering, it sounds pretty important.<\/p>\n<p data-block-key=\"6tt8g\">It\u2019s safe to assume that the, er, processing, is happening on the, uh\u2026well, the device. But to get a better understanding of what that means, we talked to Trystan Upstill, who has been at Google for nearly 20 years working on engineering teams across Android, Google News and Search.<\/p>\n<p data-block-key=\"6647u\"><b>You were on a team that helped develop some of the exciting features that shipped with our new Pixel devices \u2014 can you tell me a little about what you worked on?<\/b><\/p>\n<p data-block-key=\"4f6pq\">Most recently, I worked within Android where I led a team that focuses on melding Google\u2019s various technology stack into an amazing experience that\u2019s meaningful to the user. 
Then it's figuring out how to build it and ship it.

**Since we're improving technologies and introducing new ones quite often, it seems like that would be a never-ending job.**

Exactly! In recent years, there's been an explosion in generative AI capabilities. When we first started thinking about running large language models on devices, we thought it was kind of a joke — like, "Sure, we can do that, but maybe by 2026." But then we began scoping it out, and the technology's performance evolved so quickly that we were able to launch features using Gemini Nano, our on-device model, on the Pixel 8 Pro in December 2023.

**That's what I want to know more about: "on-device processing." Let's break it down and start with what exactly "processing" means.**

The main processor, or system-on-a-chip (SoC), in your device has a number of processing units designed specifically to handle the tasks you want to do with that device. That's why you'll see the chip (like the Tensor chip found in Pixels) referred to as a "system-on-a-chip": there's not just one processor, but several processing units, memory, interfaces and much more, all together on one piece of silicon.

Let's use Pixel smartphones as an example. The processing units include a Central Processing Unit, or CPU, as the main "engine" of sorts; a Graphics Processing Unit, or GPU, which renders visuals; and, today, a Tensor Processing Unit, or TPU, specially designed by Google to run AI/ML workloads on the device.
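The division of labor described above (the CPU as the general-purpose engine, the GPU for visuals, the TPU for AI/ML) can be pictured with a toy routing table. This is a minimal sketch for illustration only; the task names and routing rules are invented here, and this is not how Android actually schedules work across the chip.

```python
# Toy model of a system-on-a-chip (SoC) routing tasks to its processing
# units. Task names and routing rules are illustrative only.

TASK_ROUTING = {
    "run_app_logic": "CPU",  # general-purpose "engine" of the phone
    "render_frame": "GPU",   # graphics and visuals
    "run_ml_model": "TPU",   # AI/ML workloads, e.g. photo processing
}

def dispatch(task: str) -> str:
    """Return which processing unit on the SoC would handle a task."""
    # The CPU acts as the general-purpose fallback for everything else.
    unit = TASK_ROUTING.get(task, "CPU")
    return f"{task} -> {unit}"

for task in ("run_app_logic", "render_frame", "run_ml_model"):
    print(dispatch(task))
```

In real systems this routing is handled by the OS and frameworks (for example, an ML runtime deciding whether a model runs on the CPU, GPU or an accelerator), not by application code.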
These all work together to help your phone get things done — aka, processing.

For example, when you take photos, you're often using every element of your phone's processing power. The CPU will be busy running the core tasks that control what the phone is doing, the GPU will be helping render what the lens is seeing and, on a premium Android device like a Pixel, there's also a lot of work happening on the TPU to process what the optical lens sees and make your photos look great.

**Got it. "On-device" processing implies there's off-device processing. Where is "off-device processing" happening, exactly?**

Off-device processing happens in the cloud. Your device connects to the internet and sends your request to servers elsewhere, which perform the task and then send the output back to your phone. So if we want to take that process and make it happen on device, we take the large machine learning model that powered the task in the cloud and make it smaller and more efficient, so it can run on your device's operating system and hardware.

**What hardware makes that possible?**

New, more powerful chipsets. For example, with the Pixel 9 Pro, that's happening thanks to our SoC, Tensor G4. Tensor G4 enables these phones to run models like Gemini Nano — it can handle these high-performance computations.

**So basically, Tensor is designed specifically to run Google AI, which is *also* what powers a lot of Pixel's new gen AI capabilities.**

Right! And the generative AI features are definitely part of it, but there are lots of other things on-device processing makes possible, too.
Rendering video, playing games, HDR photo editing, language translation — almost everything you do with your phone. These all happen on your phone, not sent up to a server for processing.

[Source link](https://blog.google/technology/ai/on-device-processing/)