{"id":21708,"date":"2026-04-08T16:05:02","date_gmt":"2026-04-08T16:05:02","guid":{"rendered":"https:\/\/scannn.com\/msls-first-model-purpose-built-to-prioritize-people\/"},"modified":"2026-04-08T16:05:02","modified_gmt":"2026-04-08T16:05:02","slug":"msls-first-model-purpose-built-to-prioritize-people","status":"publish","type":"post","link":"https:\/\/scannn.com\/lv\/msls-first-model-purpose-built-to-prioritize-people\/","title":{"rendered":"MSL\u2019s First Model, Purpose-Built to Prioritize People"},"content":{"rendered":"<p> <br \/>\n<\/p>\n<div>\n<p><span style=\"font-weight: 400;\">Today we are announcing Muse Spark, the first in a new series of large language models built by Meta Superintelligence Labs. We are on our way to personal superintelligence: an assistant that can help anyone, anywhere with the things that matter most to them.<\/span><\/p>\n<h2>A New Model: Muse Spark<\/h2>\n<p><span style=\"font-weight: 400;\">Over the last nine months, Meta Superintelligence Labs rebuilt our AI stack from the ground up, moving faster than any development cycle we have run before. <\/span><a href=\"http:\/\/ai.meta.com\/blog\/introducing-muse-spark-msl\"><span style=\"font-weight: 400;\">Muse Spark<\/span><\/a><span style=\"font-weight: 400;\"> is the first model in our new Muse series \u2014 a deliberate and scientific approach to model scaling where each generation validates and builds on the last before we go bigger. This initial model is small and fast by design, yet capable enough to reason through complex questions in science, math, and health. It is a powerful foundation, and the next generation is already in development. Muse Spark now powers the Meta AI assistant in the Meta AI app and meta.ai, built to support complex reasoning and multimodal tasks.<\/span><\/p>\n<h2>What\u2019s Changed With Meta AI<\/h2>\n<p><span style=\"font-weight: 400;\">The Meta AI app and meta.ai are getting an upgrade today, along with a new look. 
Whether you need a quick answer or help with complex problems that need strong reasoning, Meta AI now handles both. You can switch between Instant and Thinking modes depending on the task, and Meta AI can launch multiple subagents in parallel to tackle your question. For example, when planning a family trip to Florida, one agent can draft the itinerary, another compare Orlando vs. the Keys, and a third find kid-friendly activities \u2014 all at the same time, giving you a better answer, faster.\u00a0<\/span><\/p>\n<p><a href=\"https:\/\/about.fb.com\/wp-content\/uploads\/2026\/04\/01_Subagent-1.gif\"><img loading=\"lazy\" data-recalc-dims=\"1\" decoding=\"async\" class=\"alignnone size-full wp-image-47806\" src=\"https:\/\/about.fb.com\/wp-content\/uploads\/2026\/04\/01_Subagent-1.gif?resize=960%2C836\" alt=\"Phone screen capture showing Meta AI subagents\" width=\"960\" height=\"836\"\/><\/a><\/p>\n<h2>Ask Meta AI: It Understands<\/h2>\n<p><span style=\"font-weight: 400;\">The real world moves fast, and most of it does not fit into a text box. That is why we built strong multimodal perception into Muse Spark, so Meta AI can see and understand what you are looking at, not just read what you type. Snap a photo of an airport snack shelf and Meta AI can identify and rank the snacks with the most protein \u2014 no label-squinting required. Scan a product and ask how it compares to alternatives. It is the difference between an AI that waits for you to explain the world and one that can simply look at the world with you. 
And when Meta AI powered by Muse Spark comes to our AI glasses, the assistant will be able to better see and understand the world around you.<\/span><\/p>\n<p><a href=\"https:\/\/about.fb.com\/wp-content\/uploads\/2026\/04\/02_Nutrition.jpg\"><img loading=\"lazy\" data-recalc-dims=\"1\" decoding=\"async\" class=\"alignnone size-full wp-image-47793\" src=\"https:\/\/about.fb.com\/wp-content\/uploads\/2026\/04\/02_Nutrition.jpg?resize=960%2C836\" alt=\"Two phone screens showing a chat with Meta AI estimating calorie count of food shown in a photo\" width=\"960\" height=\"836\" srcset=\"https:\/\/about.fb.com\/wp-content\/uploads\/2026\/04\/02_Nutrition.jpg?w=1920 1920w, https:\/\/about.fb.com\/wp-content\/uploads\/2026\/04\/02_Nutrition.jpg?w=300 300w, https:\/\/about.fb.com\/wp-content\/uploads\/2026\/04\/02_Nutrition.jpg?w=768 768w, https:\/\/about.fb.com\/wp-content\/uploads\/2026\/04\/02_Nutrition.jpg?w=1024 1024w, https:\/\/about.fb.com\/wp-content\/uploads\/2026\/04\/02_Nutrition.jpg?w=1536 1536w, https:\/\/about.fb.com\/wp-content\/uploads\/2026\/04\/02_Nutrition.jpg?w=1240 1240w, https:\/\/about.fb.com\/wp-content\/uploads\/2026\/04\/02_Nutrition.jpg?w=689 689w\" sizes=\"auto, (max-width: 960px) 100vw, 960px\"\/><\/a><\/p>\n<p><span style=\"font-weight: 400;\">Multimodal perception is especially valuable for health. With Muse Spark, Meta AI can now help you navigate health questions with more detailed responses, including some questions involving images and charts. Health is one of the top reasons people turn to AI, so we worked with a team of physicians to develop the model\u2019s ability to provide helpful information on common health questions and concerns.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Muse Spark excels at visual coding, letting you create custom websites and mini-games straight from a prompt. 
Ask Meta AI to build a dashboard for planning a big surprise party, spin up a retro arcade game to chase a high score, or launch a whimsical flight simulator \u2014 and share any of them with friends.<\/span><\/p>\n<h2>Ask Meta AI: It\u2019s Plugged Into What You Care About<\/h2>\n<p><span style=\"font-weight: 400;\">Meta AI can now help you discover what to wear, how to style a room, or what to buy for someone you know. Shopping mode draws from the styling inspiration and brand storytelling already happening across our apps, surfacing ideas from the creators and communities people already follow.<\/span><\/p>\n<p><a href=\"https:\/\/about.fb.com\/wp-content\/uploads\/2026\/04\/03_Shopping.gif\"><img loading=\"lazy\" data-recalc-dims=\"1\" decoding=\"async\" class=\"alignnone size-full wp-image-47795\" src=\"https:\/\/about.fb.com\/wp-content\/uploads\/2026\/04\/03_Shopping.gif?resize=960%2C836\" alt=\"Phone screen recording showing Meta AI Shopping mode\" width=\"960\" height=\"836\"\/><\/a><\/p>\n<p><span style=\"font-weight: 400;\">And when you are looking up a place to go or a topic that is trending, Meta AI surfaces rich and relevant context right alongside the conversation. Tap into a location and see public posts from locals who know the area. Ask what people are buzzing about and get the full picture, pulled from content and community posts. 
It is context from your people, right where you need it.<\/span><\/p>\n<p><a href=\"https:\/\/about.fb.com\/wp-content\/uploads\/2026\/04\/04_Search.gif\"><img loading=\"lazy\" data-recalc-dims=\"1\" decoding=\"async\" class=\"alignnone size-full wp-image-47796\" src=\"https:\/\/about.fb.com\/wp-content\/uploads\/2026\/04\/04_Search.gif?resize=960%2C836\" alt=\"Two phone screens showing Meta AI search\" width=\"960\" height=\"836\"\/><\/a><\/p>\n<h2>Looking Ahead<\/h2>\n<p><span style=\"font-weight: 400;\">The Meta AI app and meta.ai will offer the upgraded experience with Instant and Thinking modes everywhere they are available today. The new Meta AI features are starting to roll out in the US on both. In the coming weeks, we will bring these new modes and capabilities to more countries and to the places where people use Meta AI, including Instagram, Facebook, Messenger, WhatsApp, and our AI glasses \u2014 where these perception capabilities become even more powerful. We are also opening access to the underlying technology: it will be available in private preview via API to select partners, and we hope to open-source future versions of the model.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">This is only the start. As we expand these features, expect richer, more visual results, with Reels, photos, and posts woven directly into your answers, with credit back to the content creators. And as our models improve, we\u2019ll continue to build <\/span><a href=\"https:\/\/ai.meta.com\/blog\/scaling-how-we-build-test-advanced-ai\/\"><span style=\"font-weight: 400;\">safeguards<\/span><\/a><span style=\"font-weight: 400;\"> for safety and privacy, starting with the strengthened risk framework and other protections we\u2019re sharing today.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The future of Meta AI is rooted in the relationships and context already at the center of your life. 
We are building toward personal superintelligence \u2014 an AI that does not just answer your questions but truly understands your world because it is built on it.<\/span><\/p>\n<\/div>\n<p><a href=\"https:\/\/about.fb.com\/news\/2026\/04\/introducing-muse-spark-meta-superintelligence-labs-first-model-built-to-prioritize-people\/\">Source link<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Today we are announcing Muse Spark, the first in a new series of large language models built by Meta Superintelligence Labs. We are on our way to personal superintelligence: an assistant that can help anyone, anywhere with the things that matter most to them. A New Model: Muse Spark Over the last nine months, Meta [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":21709,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[123],"tags":[],"class_list":["post-21708","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-facebook"],"_links":{"self":[{"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/posts\/21708","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/comments?post=21708"}],"version-history":[{"count":0,"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/posts\/21708\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/media\/21709"}],"wp:attachment":[{"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/media
?parent=21708"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/categories?post=21708"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/tags?post=21708"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}