
What Does It Mean to Design Products in the Age of AI?

A look at what it really means to design in the age of AI: a mix of stories, real lessons (like what we learned while building a routing system at Leta), and a call to design with more empathy.


There’s a lot of buzz today about how AI is changing the game for product design. Tools that generate wireframes in seconds. AI copilots that claim to turn ideas into screens. Industry experts predicting the end of the designer as we know it.

But amidst all this hype, I believe we’re missing a more important conversation — not about what AI can do, but what it should do. And more importantly, what it means to design in this new age.

You see, design has never just been about pushing pixels or making things look good (though good visuals matter). At its core, design is about people.

As Anthony Ulwick puts it in his Jobs to be Done framework:

People “hire” products to help them do a job.

And people aren’t isolated actors. We live in intricate, messy, and deeply social systems. Designing for a person means designing within their context — their environment, culture, limitations, routines, and goals.

Which brings me to something Mark Weiser, one of the pioneers of ubiquitous computing, once said:

“The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it.”

That’s the bar. For AI to be useful — and even invisible — we must move past the surface. We must design AI products that understand, respect, and support real human life.

So what does that look like in practice?

1. Deep Understanding of People and Context

Knowing your user is no longer enough. You need to know where and how they live and work. What are their values? What’s their cultural context? What does trust look like to them? What do they fear?

For example, if you’re designing a generative AI product for a culture where modesty or religious values are deeply held, failing to consider that could easily lead to outputs that feel disrespectful or offensive — even harmful.

But understanding context also means understanding professional and emotional needs. If you’re designing a developer tool powered by AI, have you considered how the developer feels when their ideas are rewritten or overridden? Are you preserving their sense of authorship, or diminishing it? Does the tool acknowledge their intent, build on it, and give them a sense of psychological ownership?

The same goes for a doctor using a diagnostic AI tool. Are we designing systems that merely replace their judgment, or ones that amplify their insights — tools that recognize their effort, help them think more clearly by perhaps giving them pointers, and promote psychological ownership of decisions?

That’s what it means to truly understand people — not just their jobs, but their identities, their values, and how they relate to the tools they use.

Unfortunately, we see far too few designers asking these kinds of questions when creating AI-powered tools or agents.

Designing good AI products means embedding empathy and context into every layer of the experience.

2. Transparency by Design

AI systems are often black boxes. But people don’t trust what they don’t understand — especially in critical areas like healthcare, finance, or justice.

Imagine an AI loan system. One version simply says: “Loan denied.”
Another explains: “Loan denied due to income history and repayment risk score — see details.”

The second version respects the user’s need for understanding. It invites questions, builds trust, and gives people a sense of fairness.
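To make the contrast concrete, here is a minimal sketch of the idea: a decision object that carries its reasons alongside the verdict, so the interface can always explain itself. All names here are illustrative, not taken from any real lending system.

```python
from dataclasses import dataclass, field

@dataclass
class LoanDecision:
    approved: bool
    # Human-readable reasons, surfaced in the UI rather than hidden in logs
    reasons: list = field(default_factory=list)

def explain(decision: LoanDecision) -> str:
    """Render the decision together with its reasons, never a bare verdict."""
    verdict = "Loan approved" if decision.approved else "Loan denied"
    if not decision.reasons:
        return verdict
    return f"{verdict} due to: " + "; ".join(decision.reasons)

print(explain(LoanDecision(False, ["income history", "repayment risk score"])))
# Loan denied due to: income history; repayment risk score
```

The design choice is small but deliberate: the reasons travel with the decision through the whole system, so no screen can accidentally strip them out.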

Or consider a doctor using an AI-powered X-ray scanner. One system gives a diagnosis with no explanation. Another highlights key areas on the scan and explains the patterns it detected, helping the doctor follow the reasoning and make their own judgment about whether it is correct.

Transparency is not just good UX — it’s an ethical necessity.

3. Balancing Autonomy and Human Control

The world is not linear, and users don’t always want AI to take full control. A good system offers intelligent support but leaves room for human judgment.

When I was at Leta, we built the first version of a routing system to automate delivery paths. It worked — technically — but drivers in the field started overriding it frequently. At first, we thought the system needed better optimization. But through research, we realized something deeper: drivers were responding to real-world changes — sudden weather, road closures, local knowledge — that the system didn’t account for.

That experience taught us that autonomy must be flexible. The system needed to adapt to the human, not just expect the human to adapt to it. We redesigned it to allow easy manual adjustments and began capturing those overrides as learning data for future improvements.
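The override-capture idea can be sketched in a few lines. Everything here is hypothetical (field names, file format); the real system was more involved, but the principle is the same: record what the driver actually did, and why, so those corrections become training signal.

```python
import json
import time

def log_override(suggested_route, driver_route, reason, path="overrides.jsonl"):
    """Record a driver's manual adjustment so the planner can learn from it."""
    record = {
        "timestamp": time.time(),
        "suggested": suggested_route,   # what the routing system proposed
        "actual": driver_route,         # what the driver actually drove
        "reason": reason,               # e.g. "road closure", "weather"
    }
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")

# Example: the planner suggested A -> B -> C, the driver drove A -> C -> B
log_override(["A", "B", "C"], ["A", "C", "B"], reason="road closure")
```

Each override becomes a labeled example of where the model's assumptions diverged from reality, which is exactly the data a future version needs.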

Designing AI is about finding that balance: support, not replacement.

4. Continuous Feedback Loops

Uncertainty is stressful. Clear feedback reduces it. AI systems need to communicate progress, confirm actions, and invite corrections in a human-friendly way.

Think of a voice assistant that doesn’t just execute commands silently but confirms what it’s doing, or asks for clarification when unsure. That feedback loop reassures users and strengthens interaction.
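That clarification loop can be sketched as a simple confidence threshold. The threshold value and names are illustrative, but the pattern is the core of it: confirm when confident, ask when unsure, and never silently guess.

```python
def respond(intent: str, confidence: float, threshold: float = 0.75) -> str:
    """Confirm the action when confident; ask for clarification when unsure."""
    if confidence >= threshold:
        return f"Okay, {intent}."        # confirm what the assistant is doing
    return f"Did you mean: {intent}?"    # invite correction instead of guessing

print(respond("turning off the lights", 0.92))  # Okay, turning off the lights.
print(respond("turning off the lights", 0.40))  # Did you mean: turning off the lights?
```

In a real system the confidence would come from the speech or intent model, and the user's answer to the clarifying question would feed back into it.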

So, What Else Should We Be Thinking About?

There are many more questions we, as designers, need to ask:

  • How do we design for inclusivity when AI systems are trained on biased data?
  • How do we embed ethics into everyday design decisions?
  • How do we make AI systems learn from users — without exploiting their data?

Design in the age of AI isn’t just about mastering tools. It’s about understanding humanity more deeply than ever before.

Because at the end of the day, good design isn’t about AI.
It’s about people. And they’re not going anywhere.

