{"id":19929,"date":"2025-01-24T23:19:54","date_gmt":"2025-01-24T23:19:54","guid":{"rendered":"http:\/\/scannn.com\/facebook\/accelerating-the-future-ai-mixed-reality-and-the-metaverse\/"},"modified":"2025-01-24T23:19:54","modified_gmt":"2025-01-24T23:19:54","slug":"accelerating-the-future-ai-mixed-reality-and-the-metaverse","status":"publish","type":"post","link":"https:\/\/scannn.com\/lv\/accelerating-the-future-ai-mixed-reality-and-the-metaverse\/","title":{"rendered":"Accelerating the Future: AI, Mixed Reality and the Metaverse"},"content":{"rendered":"<div>\n<p><span style=\"font-weight: 400\">When we first began giving demos of <\/span><a href=\"https:\/\/www.meta.com\/blog\/quest\/orion-ar-glasses-augmented-reality\/\"><span style=\"font-weight: 400\">Orion<\/span><\/a><span style=\"font-weight: 400\"> early this year I was reminded of a line that you hear a lot at Meta \u2013 in fact it was even in our first letter to prospective shareholders back in 2012: <\/span><a href=\"https:\/\/money.cnn.com\/2012\/02\/01\/technology\/zuckerberg_ipo_letter\/index.htm?iid=EAL\"><span style=\"font-weight: 400\">Code wins arguments<\/span><\/a><span style=\"font-weight: 400\">. We probably learned as much about this product space from a few months of real-life demos as we did from the years of work it took to make them. There is just no substitute for actually building something, putting it in people\u2019s hands and learning from how they react to it.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">Orion wasn\u2019t our only example of that this year. Our <\/span><a href=\"https:\/\/www.meta.com\/quest\/\"><span style=\"font-weight: 400\">mixed reality hardware<\/span><\/a><span style=\"font-weight: 400\"> and <\/span><a href=\"https:\/\/www.meta.com\/smart-glasses\"><span style=\"font-weight: 400\">AI glasses<\/span><\/a><span style=\"font-weight: 400\"> have both reached a new level of quality and accessibility. 
The stability of those platforms allows our software developers to move much faster on everything from operating systems to new AI features. This is how I see the metaverse starting to come into greater focus and why I\u2019m so confident that the coming year will be the most important one in the history of Reality Labs.\u00a0<\/span><\/p>\n<p><strong>2024 was the year AI glasses hit their stride.<\/strong> <span style=\"font-weight: 400\">When we first started making smart glasses in 2021 we thought they could be a nice first step toward the AR glasses we eventually wanted to build. While mixed reality headsets are on track to becoming a general purpose computing platform much like today\u2019s PCs, we saw glasses as the natural evolution of today\u2019s mobile computing platforms. So we wanted to begin learning from the real world as soon as possible.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">The biggest thing we\u2019ve learned is that glasses are by far the best form factor for a truly AI-native device. In fact they might be the first hardware category to be completely defined by AI from the beginning. For many people, glasses are the place where an AI assistant makes the most sense, especially when it\u2019s a <\/span><a href=\"https:\/\/www.meta.com\/blog\/quest\/ray-ban-meta-glasses-collection-news-updates-connect-2024\/\"><span style=\"font-weight: 400\">multimodal system<\/span><\/a><span style=\"font-weight: 400\"> that can truly understand the world around you.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">We\u2019re right at the beginning of the S-curve for this entire product category, and there are endless opportunities ahead. One of the things I\u2019m most excited about for 2025 is the evolution of AI assistants into tools that don\u2019t just respond to a prompt when you ask for help but can become a proactive helper as you go about your day. 
At Connect we showed how Live AI on glasses can become more of a real-time participant when you\u2019re getting things done. As this feature <\/span><a href=\"https:\/\/www.meta.com\/blog\/quest\/ray-ban-meta-v11-software-update-live-ai-translation-shazam\/\"><span style=\"font-weight: 400\">begins rolling out in early access this month<\/span><\/a><span style=\"font-weight: 400\"> we\u2019ll see the first step toward a new kind of personalized AI assistant.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">It won\u2019t be the last. We\u2019re currently in the middle of an industry-wide push to make AI-native hardware. You\u2019re seeing phone and PC makers scramble to rebuild their products to put AI assistants at their core. But I think a much bigger opportunity is to make devices that are AI-native from the start, and I\u2019m confident that glasses are going to be the first to get there. Meta\u2019s Chief Research Scientist Michael Abrash has been talking about the potential of a personalized, context-aware AI assistant on glasses <\/span><a href=\"https:\/\/www.meta.com\/blog\/quest\/inventing-the-future\/\"><span style=\"font-weight: 400\">for many years<\/span><\/a><span style=\"font-weight: 400\">, and building the technologies that make it possible has been a huge focus of our research teams. <\/span><\/p>\n<p><span style=\"font-weight: 400\"><strong>Mixed reality<\/strong> has been another place where having the right product in the hands of a lot of people has been a major accelerant for progress. We\u2019ve seen <\/span><a href=\"https:\/\/www.meta.com\/quest\/quest-3\/\"><span style=\"font-weight: 400\">Meta Quest 3<\/span><\/a><span style=\"font-weight: 400\"> get better month after month as we continue to iterate on the core system \u2013 passthrough, multitasking, spatial user interfaces and more. 
All these gains extended to <\/span><a href=\"https:\/\/www.meta.com\/quest\/quest-3s\/\"><span style=\"font-weight: 400\">Quest 3S<\/span><\/a><span style=\"font-weight: 400\"> the moment it launched. This meant the $299 Quest 3S was in many respects a better headset on day 1 than the $499 Quest 3 was when it first launched in 2023. And the entire Quest 3 family keeps getting better with <\/span><a href=\"https:\/\/www.meta.com\/blog\/quest\/meta-quest-v72-update-remote-desktop-hand-tracking-keyboard-more\"><span style=\"font-weight: 400\">every update<\/span><\/a><span style=\"font-weight: 400\">.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">In 2025 we\u2019ll see the next iteration of this as Quest 3S brings a lot more people into mixed reality for the first time. While Quest 3 has been a hit among people excited to own the very best device on the market, we\u2019re seeing that Quest 3S is <\/span><a href=\"https:\/\/www.meta.com\/blog\/quest\/why-you-should-buy-a-quest\/\"><span style=\"font-weight: 400\">our most giftable headset<\/span><\/a><span style=\"font-weight: 400\"> yet. Sales were strong over the Black Friday weekend, and we\u2019re expecting a surge of people activating their new headsets over the holiday break.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">This will continue a trend that took shape over the last year: growth in new users <\/span><span style=\"font-weight: 400\">who are seeking out a wider range of things to do with their headsets. We want these new users to see the magic of MR and stick around for the long run, which is why we\u2019re funding developers to build the new types of apps and games they\u2019re looking for.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">There\u2019s been a significant influx of younger people in particular, and they have been gravitating toward social and competitive multiplayer games, as well as freemium content. 
And titles like <\/span><a href=\"https:\/\/www.meta.com\/experiences\/skydances-behemoth\/5767635983306831\/\"><i><span style=\"font-weight: 400\">Skydance\u2019s BEHEMOTH<\/span><\/i><\/a><span style=\"font-weight: 400\">, <\/span><a href=\"https:\/\/www.meta.com\/experiences\/batman-arkham-shadow\/3551691271620960\/\"><i><span style=\"font-weight: 400\">Batman: Arkham Shadow<\/span><\/i><\/a><span style=\"font-weight: 400\"> (which just won Best AR\/VR game at The Game Awards!), and <\/span><a href=\"https:\/\/www.meta.com\/experiences\/metro-awakening\/5096918017089406\/\"><i><span style=\"font-weight: 400\">Metro Awakening<\/span><\/i><\/a><span style=\"font-weight: 400\"> showed once again that some of the best new games our industry produces are now only possible on today\u2019s headsets.\u00a0\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">As the number of people in MR grows, the quality of the social experience it can deliver is growing in tandem. This is at the heart of what Meta is trying to achieve with Reality Labs and where the greatest potential of the metaverse will be unlocked: \u201cthe chance to create the most social platform ever\u201d is how we described it when we <\/span><a href=\"https:\/\/about.fb.com\/news\/2014\/03\/facebook-to-acquire-oculus\/\"><span style=\"font-weight: 400\">first began working on it<\/span><\/a><span style=\"font-weight: 400\">. We took two steps forward on this front in 2024: first with a broad set of improvements to Horizon Worlds including its expansion to mobile, and second with the next-generation Meta Avatars system that lets people represent themselves across our apps and headsets. And as the visual quality and overall experience with these systems improve, more people are getting their first glimpse of a social metaverse. 
We\u2019re seeing similar trends with new Quest 3S users spending more time in Horizon Worlds, making it a Top 3 immersive app for Quest 3S, and people continue creating new Meta Avatars across mobile and MR.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">Getting mixed reality into the mainstream has helped illuminate what will come next. One of the first trends that we discovered after the launch of Quest 3 last year was people using mixed reality to watch videos while multitasking in their home\u2014doing the dishes or vacuuming their living rooms. This was an early signal that people love having a big virtual screen that you can take anywhere and place in the physical world around you. We\u2019ve seen that trend take off with all sorts of entertainment experiences growing fast across the whole Quest 3 family. <\/span><a href=\"https:\/\/www.meta.com\/blog\/quest\/youtube-co-watch-party-vr-mixed-reality\/?srsltid=AfmBOoraQ4rAYD0N9z0RUXuavBaqGOgr49RGTlpHmrWDAHa-xcy6evSO\"><span style=\"font-weight: 400\">New features like YouTube Co-Watch<\/span><\/a><span style=\"font-weight: 400\"> show how much potential there is for a whole new kind of social entertainment experience in the metaverse.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">That\u2019s why James Cameron, one of the most technologically innovative storytellers of our lifetimes, is now working to help more filmmakers and creators <\/span><a href=\"https:\/\/www.meta.com\/blog\/quest\/james-cameron-lightstorm-vision-partnership\/\"><span style=\"font-weight: 400\">produce great 3D content for Meta Quest<\/span><\/a><span style=\"font-weight: 400\">. While 3D films have been produced for decades, there\u2019s never been a way to view them that\u2019s quite as good as an MR headset, and next year more people will own a headset than ever before. 
\u201cWe\u2019re at a true, historic inflection point,<\/span><i><span style=\"font-weight: 400\">\u201d<\/span><\/i> <a href=\"https:\/\/www.threads.net\/@jamescameronofficial\/post\/DDNPBvgS8wo?xmt=AQGzo-jHEFO2bxyYwH6Nk8AZBzT8uCWQas-tvn5S9D6OYw\"><span style=\"font-weight: 400\">Jim said<\/span><\/a><span style=\"font-weight: 400\"> when we <\/span><a href=\"https:\/\/www.meta.com\/blog\/quest\/james-cameron-lightstorm-vision-partnership\"><span style=\"font-weight: 400\">launched our new partnership<\/span><\/a><span style=\"font-weight: 400\"> this month.<\/span><\/p>\n<p><span style=\"font-weight: 400\">This is happening alongside a larger shift Mark Rabkin <\/span><a href=\"https:\/\/developers.facebook.com\/m\/meta-connect-developer-sessions\/developer-keynote\/\"><span style=\"font-weight: 400\">shared at Connect this year<\/span><\/a><span style=\"font-weight: 400\">: Our vision for Horizon OS is to build a new kind of general purpose computing platform capable of running every kind of software, supporting every kind of user, and open to every kind of creator and developer. Our recent releases for 2D\/3D multi-tasking, panel positioning, better hand tracking, Windows Remote Desktop integration and Open Store have started to build momentum along this new path. Horizon OS is on track to be the first platform that supports the full spectrum of experiences from immersive VR to 2D screens, mobile apps and virtual desktops \u2013 and the developer community is central to that success.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\"><strong>The next big step toward the metaverse<\/strong> will be combining AI glasses with the kind of true augmented reality experience we revealed this year with Orion. It\u2019s not often that you get a glimpse of the future and see a totally new technology that shows you where things are heading. 
The people who saw Orion immediately understood what it meant for the future, much like the people who saw the first personal computers taking shape at Xerox PARC in the 1970s (\u201cwithin ten minutes it was obvious to me that all computers would work like this someday,\u201d Steve Jobs later said of his 1979 demo of the Xerox Alto).\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">Being able to put people in a time machine and show them how the next computing platform will look was a highlight of 2024 \u2013 and of my career so far. But the real impact of Orion will be in the products we ship next and the ways it helps us better understand what people love about AR glasses and what needs to get better. We spent years working on user research, product planning exercises, and experimental studies trying to understand how AR glasses should work, and that work is what enabled us to build Orion. But the pace of progress will be much more rapid from here on out now that we have a real product to build our intuition around.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">This has been the lesson time and again <\/span><a href=\"https:\/\/www.meta.com\/blog\/quest\/reality-labs-10-year-anniversary-next-computing-platform-vr-mr-ar-xr\/\"><span style=\"font-weight: 400\">over the last decade at Reality Labs<\/span><\/a><span style=\"font-weight: 400\">. The most important thing you can do when you\u2019re trying to invent the future is to ship things and learn from how real people use them. They won\u2019t always be immediate smash hits, but they\u2019ll always teach you something. And when you land on things that really hit the mark, like mixed reality on Quest 3 or AI on glasses, that\u2019s when you put your foot on the gas. This is what will make 2025 such a special year: With the right devices on the market, people experiencing them for the first time and developers discovering all the opportunities ahead, it\u2019s time to accelerate. 
<\/span><\/p>\n<\/div>\n<p><a href=\"https:\/\/about.fb.com\/news\/2024\/12\/accelerating-the-future-ai-mixed-reality-and-the-metaverse\/\">Source link<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>When we first began giving demos of Orion early this year I was reminded of a line that you hear a lot at Meta \u2013 in fact it was even in our first letter to prospective shareholders back in 2012: Code wins arguments. We probably learned as much about this product space from a few [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":19930,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[123],"tags":[],"class_list":["post-19929","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-facebook"],"_links":{"self":[{"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/posts\/19929","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/comments?post=19929"}],"version-history":[{"count":0,"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/posts\/19929\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/media\/19930"}],"wp:attachment":[{"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/media?parent=19929"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/categories?post=19929"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/scannn.com\/lv\/wp-jso
n\/wp\/v2\/tags?post=19929"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}