We believe technology is at its best when it works for everyone. That’s especially true when it comes to accessibility. For too long, people have had to adapt to technology — we want to build technology that adapts to them.
That’s the idea behind Natively Adaptive Interfaces (NAI), an approach that uses AI to make accessibility a product’s default, not an afterthought. The goal of our research is to build assistive technology that is more personal and effective from the beginning.
How Natively Adaptive Interfaces work
Instead of treating accessibility features as a separate, “bolted-on” option, NAI bakes adaptability directly into a product’s design from the start. For instance, an AI agent built with the NAI framework can help you accomplish tasks with your guidance and oversight, intelligently reconfiguring the interface to deliver a more accessible, personalized experience. In the prototypes we built to validate this framework, a main AI agent interprets your overall goal and then works with smaller, specialized agents that handle specific tasks, like adjusting the UI and scaling text to make a document more accessible. It might generate audio descriptions for someone who is blind, for example, or simplify a page’s layout for someone with ADHD.
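To make that orchestration pattern concrete, here is a minimal sketch in Python. The class names (`Orchestrator`, `UserProfile`, the specialized agent classes) and the adaptation logic are hypothetical illustrations of the main-agent/specialized-agent idea described above, not an actual NAI API; in a real system the main agent would use AI to interpret the user’s goal and decide which specialists to invoke.

```python
from dataclasses import dataclass, field


@dataclass
class UserProfile:
    """Hypothetical description of a user's access needs and preferences."""
    needs: set[str] = field(default_factory=set)  # e.g. {"screen_reader", "large_text"}


class TextScalingAgent:
    """Specialized agent: enlarges and reflows text in the UI."""
    handles = "large_text"

    def adapt(self, document: dict) -> dict:
        return {**document, "font_scale": 1.5, "layout": "reflowed"}


class AudioDescriptionAgent:
    """Specialized agent: generates audio descriptions for images."""
    handles = "screen_reader"

    def adapt(self, document: dict) -> dict:
        described = [f"Audio description of {img}" for img in document.get("images", [])]
        return {**document, "audio_descriptions": described}


class LayoutSimplificationAgent:
    """Specialized agent: reduces visual clutter and simplifies the page layout."""
    handles = "simplified_layout"

    def adapt(self, document: dict) -> dict:
        return {**document, "layout": "single_column", "animations": False}


class Orchestrator:
    """Main agent: understands the user's overall goal and delegates to specialists."""

    def __init__(self, agents):
        self.agents = agents

    def personalize(self, document: dict, profile: UserProfile) -> dict:
        # Delegate to every specialized agent whose capability matches a stated need.
        for agent in self.agents:
            if agent.handles in profile.needs:
                document = agent.adapt(document)
        return document


if __name__ == "__main__":
    orchestrator = Orchestrator(
        [TextScalingAgent(), AudioDescriptionAgent(), LayoutSimplificationAgent()]
    )
    profile = UserProfile(needs={"screen_reader", "simplified_layout"})
    doc = {"title": "Quarterly report", "images": ["chart.png"], "layout": "multi_column"}
    print(orchestrator.personalize(doc, profile))
```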
This often creates a “curb-cut effect,” where a feature designed for a specific need ends up being helpful for everyone. A voice-controlled app designed for someone with motor disabilities, for instance, can also help a parent holding a child.
Building with and for people with disabilities
The NAI framework is guided by a core principle: “Nothing about us, without us.” Developers collaborate with the disability community throughout the design and development process, ensuring the solutions they create are both useful and usable. With support from Google.org, we’re funding leading organizations that serve disability communities, including the Rochester Institute of Technology’s National Technical Institute for the Deaf (RIT/NTID), The Arc of the United States, RNID and Team Gleason, to build adaptive AI tools that address real-world friction points for their communities.