Shortwave: AI fatigue and the long middle of adoption
A field report from the people watching users closely.
In its State of UX 2026 report, the Nielsen Norman Group (NN/g) includes a section titled “AI: From Hype to Fatigue.”
This matters in a very specific way. Donald Norman is one of the foundational figures of human-computer interaction, and when his firm speaks, the industry tends to listen. Not because the message is loud, but because it is usually grounded in observation rather than speculation.
The section is framed neither as a takedown of AI nor as a retreat. It reads more like a status update from the field, taking the position that we are at a moment of recalibration after several years of accelerated enthusiasm.
Checking signal strength
What the report calls out is a normalization phase: the point at which a technology stops being judged by its promise and starts being judged by its performance in real settings. The report gets there by first naming a set of friction points.
The friction points they surface are practical and familiar. AI features added because competitors added them. Tools that work well in demos but poorly in lived workflows. Automation introduced into situations that still demand human judgment, without clarifying where responsibility now sits. These are not failures of capability. They are signs of immature integration.
This pattern of hasty early adoption followed by measured implementation has appeared before. After the dot-com boom, when nearly everyone mistakenly assumed they had to have a website, the web did not crumble. Instead, teams stopped asking whether they should be online and started asking how to design systems that were usable, sustainable, and trustworthy. The same thing happened with mobile, with service-oriented architectures, and with early enterprise platforms: early enthusiasm was followed by stock-taking and a maturing of how the new technology was used.
NN/g’s use of the word “fatigue” signals that same transition. UX teams are no longer reacting to novelty. They are responding to consequences. That shift represents progress.
Capturing the return signal
What makes this moment notable is how clearly NN/g ties fatigue to user experience rather than tooling preference. Users, they note, are increasingly impatient with superficial AI additions and loosely defined “AI-powered” features. The issue is not accuracy alone. It is coherence, intent, and trust.
In this framing, fatigue is a feedback mechanism. It reflects a growing sensitivity to when systems feel thoughtful versus opportunistic. When AI recedes into the background and supports existing goals, users respond positively. When it asserts itself without clear purpose, annoyance increases.
Rather than asking where AI can be inserted, teams are encouraged to ask where it genuinely belongs and apply long-standing design skills as needed.
Logging the frequency
AI fatigue is not a signal to slow down. It is a sign that the field is settling into more durable work.


