Designing for AI products đŸ“±


Welcome back. Last week, startup Cognition AI came out with an AI-powered software engineer called Devin. It’s pretty mind-boggling stuff—the tool can complete entire coding projects (websites, apps, Chrome extensions, etc.) from scratch. Kinda like combining all the no-code tools we discussed last week in one platform...plus a lot more. It’s still in testing, but Devin definitely has me wondering: What will the “Devin” of design look like?

P.S. ChatGPT's new "Read Aloud" feature created the most passive-aggressive response of a designer chatting with PMs 😆

—Tommy (@DesignerTom)


The Wireframe:

  • New design realities in the AI era
  • What’s the deal with zero UI?
  • Intentional friction on Microsoft Copilot

How to Design for AI Patterns & Products

A bet I’m willing to make: You’re going to design an AI product or feature in the next year.

If you’re working in software, you’re about to be working in AI. In fact, a recent Figma survey showed that the number of designers, developers, and execs who think that AI will transform their products will double over the next year.

What that means: To build great AI products, designers need to recognize that AI isn’t just an extension of existing design processes; it’s an entirely new frontier. And today, we’re digging into four of the realities of this new era →

Heads up: I’m not going to talk about how to use AI in your design process today—you can read more about that here if you’re interested, though.

Reality 1: Research is more important than ever.

Recently, UX research has taken a backseat across software orgs. But AI represents a major paradigm shift that challenges existing mental models. So instead of feeling around in the dark, efficient orgs will look to UXR for answers.

UX researcher Claire Jin explores some of the ways that UX teams and researchers will adapt to address AI:

  • The “Wizard of Oz” method → Users interact with what they *think* is an automated AI interface, but it’s actually controlled by a human behind the scenes.
  • Longitudinal studies → Understanding how user perception of AI interaction evolves over time.
  • Ethical assessments → Addressing challenges like bias, transparency, safety, and user control in AI development.

Reality 2: We’re still establishing a shared language.

Unlike traditional software, there is no shared language around AI patterns yet—but it’s starting to develop. Three examples of this new AI pattern lexicon emerging at different orgs?

#1: A designer on the Firefly team at Adobe told me that they think of AI patterns in terms of primary and secondary layers:

  • Primary layer patterns include “integrated” (AI built into existing features), “split” (e.g. a document editor on one side, AI chatbot on the other), “immersive” (AI wraps around the entire product), and “dialog” (chatbot).
  • Secondary layer patterns include triggers, response, system loading, highlight/focus of key info, auditing, and sources/fact-checking.
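
One way to make a lexicon like this truly shared between design and dev is to write it down as types. Here’s a rough TypeScript sketch of the Adobe framing above; the names and structure are my own translation of the description, not an official Adobe vocabulary:

  // My own sketch of the primary/secondary pattern lexicon as types.
  type PrimaryPattern = "integrated" | "split" | "immersive" | "dialog";
  type SecondaryPattern =
    | "trigger" | "response" | "system-loading"
    | "highlight-focus" | "auditing" | "sources";

  interface AiSurface {
    primary: PrimaryPattern;
    secondary: SecondaryPattern[];
  }

  // Example: a document editor with an AI panel alongside it.
  const splitEditor: AiSurface = {
    primary: "split",
    secondary: ["trigger", "response", "sources"],
  };

Once the vocabulary lives in code like this, naming a pattern in a design review and naming it in a pull request mean the same thing.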

#2: Similarly, a design group at Microsoft established a UX/UI framework for presenting its AI tool, Copilot, in one of three ways:

  • The “immersive” mode presents Copilot as an overlay atop all relevant apps.
  • The “assistive” mode presents Copilot as a “sidecar” within a single app.
  • The “embedded” mode presents the AI as a small pop-up within an app's UI.

#3: Design legend Jakob Nielsen believes that AI introduces the first new UI paradigm in 60 years: intent-based outcome specification. In this approach, the user describes their desired result...but does not specify how to achieve this outcome →

  • The primary interaction is entering a text description of intent upfront—not clicking GUI elements.
  • A secondary interaction could include rounds of refinement, where the user issues additional prompts to improve the AI’s output.
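
Here’s a minimal sketch of that intent-plus-refinement loop in TypeScript. It’s purely illustrative: generateFromIntent and refine are hypothetical stand-ins for whatever model call your product actually makes, and getFeedback is wherever your UI collects the user’s follow-up prompt.

  // Intent-based outcome specification, sketched as a loop (hypothetical names).
  interface Draft { content: string; }

  // Stub "model" calls so the sketch runs on its own.
  async function generateFromIntent(intent: string): Promise<Draft> {
    return { content: `First draft for: ${intent}` };
  }
  async function refine(draft: Draft, feedback: string): Promise<Draft> {
    return { content: `${draft.content} (revised per: ${feedback})` };
  }

  // Primary interaction: the user states the outcome they want.
  // Secondary interaction: rounds of refinement until they accept (null = accept).
  async function intentLoop(
    intent: string,
    getFeedback: (draft: Draft) => Promise<string | null>,
  ): Promise<Draft> {
    let draft = await generateFromIntent(intent);
    let feedback = await getFeedback(draft);
    while (feedback !== null) {
      draft = await refine(draft, feedback);
      feedback = await getFeedback(draft);
    }
    return draft;
  }

The design question isn’t the loop itself; it’s how much of the original intent and each refinement round you surface back to the user along the way.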

Reality 3: Friction is your friend.

In other tech, the more we could “blackbox” the backend ops, the better. But with AI, the tech itself is the black box, and humans still need a sense of agency. Intentional friction is how you give it back.

Friction encourages human review, fact-checking, and collaboration with the AI system to 1) retain human agency and 2) produce the most accurate results. Some examples (the first two are sketched in code below):

  • Pausing the operation to ask the user if the output is heading in the right direction.
  • Making room for more information that gives the user insight into the configurations and settings being used by the AI.
  • Designing turn-by-turn interactions between AI and humans, such as conversational UIs (e.g. chat), that allow the user to steer the operation back toward the intended use case.
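
To make those first two examples concrete, here’s a rough sketch of a “pause and check” friction point in TypeScript. Everything here is hypothetical: the confirm callback stands in for whatever dialog or inline prompt your product would show, and settingsInUse is the configuration you’d surface to the user.

  // A checkpoint that pauses an AI operation, shows the user where it's
  // heading plus the settings in play, and asks whether to continue.
  interface Checkpoint<T> {
    partialOutput: T;
    settingsInUse: Record<string, string>;
  }

  async function withCheckpoint<T>(
    runStep: () => Promise<T>,
    settingsInUse: Record<string, string>,
    confirm: (cp: Checkpoint<T>) => Promise<boolean>, // true = keep going
  ): Promise<T | null> {
    const partialOutput = await runStep();
    const proceed = await confirm({ partialOutput, settingsInUse });
    return proceed ? partialOutput : null; // null = the user steered away
  }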

Reality 4: High-fidelity prototypes are necessary upfront.

AI is already challenging to visualize. But beyond that, deploying AI raises ethical considerations around transparency, privacy, and bias—and static prototypes simply won’t do enough to secure stakeholder buy-in. That’s why hi-fi is critical at the ideation stage, before you get into the detailed design work.

With complex prototypes, you can:

  • Model the real-time updates as the AI generates outputs (see the sketch after this list).
  • Demonstrate the feedback loops for iterative refinement.
  • Showcase safety guardrails like confirmation prompts.
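
Even your prototype’s “AI” can be faked. Here’s a hypothetical sketch that simulates a streaming response so a hi-fi prototype can show real-time output without calling a real model; onUpdate is wherever your prototype re-renders the text.

  // Fake a streaming AI response for a hi-fi prototype (no real model involved).
  async function simulateStreamingResponse(
    cannedAnswer: string,
    onUpdate: (textSoFar: string) => void, // re-render the prototype here
    delayMs = 40,
  ): Promise<void> {
    const words = cannedAnswer.split(" ");
    let shown = "";
    for (const word of words) {
      shown += (shown ? " " : "") + word;
      onUpdate(shown);
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }

A canned answer streamed word by word is usually enough for stakeholders to feel the latency, the partial states, and where a guardrail should interrupt.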

Start with hi-fi, get buy-in, and then move back to lo-fi to begin the design work.

The takeaway: As AI upends tech, the design community has to stay adaptable. Designers today are setting the stage for the next generation of UX frameworks—acknowledge this responsibility, be open to change, and share with the community along the way.


TOGETHER WITH CODUX​

What If Designers and Developers Worked in Harmony?

For designers, working with developers can often be a pain point. Instead of a hand-off between siloed teams, wouldn’t collaborating directly be ideal?

That’s why Codux was developed. Now, designers and developers can bridge the gap in a familiar, visual environment. A local app, Codux works on top of the source code of your project, translating all of your visual changes directly into the code—and vice versa.

​Ready to unlock the dev environment for designers? Try it now.


News, Tools, and Resources: Designing for AI

  • I created this page-by-page teardown of 100 popular AI products.
  • A great example of longitudinal user testing to uncover new mental models in AI.
  • This post explains how to unleash AI’s potential with zero UI.
  • The three most common implementations of text-prompt UI in SaaS.
  • An awesome talk by Ioana Teleanu on crafting AI-powered experiences.
  • There were a ton of AI x UX talks at SXSW—check out the highlights.
  • If you want to learn more about Figma’s AI survey, register for its webinar here.

Got a great tool, podcast episode, idea, or something else? Hit reply and tell me what’s up.


Case Study: Microsoft Copilot’s Intentional Friction

Last year, Microsoft set the standard for conversational AI with Microsoft 365 Copilot. Copilot brings AI tech to all Microsoft 365 products—and thoughtfully introduces friction in each.

Here are three ways purposeful friction preserves human agency in Copilot:

​Analyzing trends in Excel. In this example, Copilot uses a chat dialog as a turn-by-turn interaction pattern that requires users to prompt the conversation. The output is a short synthesis—with a thoughtfully designed button to “Explain” the synthesis further.

​Content configurations in Word. In Word, Copilot provides a dismissible prompt popover with clear, editable properties that alter the output configuration. It then asks the human to keep, regenerate, delete, or adjust after content is generated.

​Image generation in Microsoft Designer. Here, Copilot allows the user to add reference media with prompt suggestions. There are multiple outputs generated, requiring the user to select the desired result (with a configurable resolution) before continuing—and then allowing for further customization in the designer.

As the design team explains, introducing appropriate friction prevents treating the AI as an “autopilot” and maintains the intended “copilot” relationship.


Thanks for reading! How else have you changed your UX mindset in the age of AI? Hit reply and let me know.

See you next week!

Enjoying this newsletter? Let us know here.

UX Tools

Practical lessons, resources, and news for the UX/UI community. Learn the real-world skills, methods, and tools that help you build user-first experiences. We make resources like practical tutorials, the Design Tools Survey, the Design Tools Database, and UX Challenges. Join 60k+ other designers and sign up for the newsletter to get product design mastery in just 5 minutes a week.
