
Heuristic evaluation for signals

In complex organisations, metrics and dashboards can reassure us even when things are quietly going wrong. A heuristic is not a tool for design alone: it is a way of asking better questions of your data, your processes, and your assumptions. This article shows a simple method for using one heuristic question to separate signal from noise.

In complex organisations, problems are rarely missed because there is no data.
They are missed because there is too much reassurance.

Dashboards glow green. Reports show progress. Meetings close with confidence.
And yet — quietly, persistently — something isn’t right.

A heuristic is not a design trick or a scoring method.
It is a thinking shortcut that helps you notice what matters before it becomes unavoidable.

This article introduces a simple heuristic you can use to separate signal from noise — especially when metrics are plentiful, comforting, and misleading.


When more data makes problems harder to see

Most organisations don’t lack measurement. They lack meaningful interpretation.

Over time, metrics tend to drift into one of three roles:

  • Reassurance – they make leaders feel confident
  • Compliance – they demonstrate process adherence
  • Defence – they justify decisions already taken

What they stop doing is changing judgement.

This is how organisations end up surprised by failures that were, in hindsight, “obvious”.

Not because nobody saw the signals —
but because the system trained people to treat those signals as noise.


A heuristic is a question, not a checklist

A heuristic is a deliberately simple question that focuses attention.

It does not replace judgement.
It creates the conditions for judgement.

The heuristic below can be applied to:

  • dashboards
  • KPIs
  • progress reports
  • status indicators
  • AI-generated summaries
  • any metric used to support decisions

The Signal Test (the core heuristic)

If this metric improved significantly tomorrow,
what decision would actually change?

Pause before answering.

If the honest answer is:

  • “Nothing”
  • “We’d feel more confident”
  • “It would look better in the report”

Then this metric is probably noise, not signal.

Signal is information that forces a reconsideration —
of priorities, actions, or assumptions.
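As a rough illustration, the Signal Test can be sketched in a few lines of code. Everything here is hypothetical: the metric names, and the idea of recording the honest answer as a string, are assumptions made for the example, not a real tool.

```python
# Illustrative sketch of the Signal Test (hypothetical, not a real tool).
# Each metric records the honest answer to: "If this improved
# significantly tomorrow, what decision would actually change?"

# Answers that indicate reassurance rather than a changed decision.
NOISE_ANSWERS = {
    "nothing",
    "we'd feel more confident",
    "it would look better in the report",
}

def signal_test(answer: str) -> str:
    """Classify a metric as 'signal' or 'noise' from its honest answer."""
    if answer.strip().lower() in NOISE_ANSWERS:
        return "noise"
    return "signal"

# Hypothetical metrics, for illustration only.
metrics = {
    "percentage complete": "nothing",
    "open critical incidents": "we'd re-plan next sprint's scope",
}

for name, answer in metrics.items():
    print(f"{name}: {signal_test(answer)}")
```

The point of the sketch is the shape of the test, not the code: the classification depends entirely on an honest, human answer, which no dashboard can supply by itself.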


Why this works (and why it feels uncomfortable)

This heuristic feels uncomfortable because it challenges three deeply embedded habits:

  1. Proxy comfort
    We mistake indicators about the work for the work itself.
  2. Narrative momentum
    Once a story of success forms, contradictory data feels disruptive.
  3. Risk displacement
    It becomes safer to question the metric than the reality it represents.

The heuristic doesn’t accuse anyone of failure.
It simply asks whether the metric is doing the job we claim it does.


A simple example

Imagine a programme dashboard showing “percentage complete” — consistently green.

Ask the heuristic question:

If “percentage complete” jumped by ten points tomorrow, what decision would change?

If the answer is:

  • No resourcing decision changes
  • No delivery approach changes
  • No risk conversation changes

Then the metric is performing a reassurance function, not a sensing function.

It may still be useful — but it is not telling you where to look next.


Heuristics are mental models, not scoring systems

In complex environments:

  • You can’t analyse everything
  • You can’t measure everything
  • You can’t foresee everything

Heuristics help by narrowing attention to what matters.

They:

  • expose hidden assumptions
  • surface uncomfortable questions
  • legitimise doubt early

Used well, they don’t slow organisations down;
they stop them running confidently in the wrong direction.


A lightweight heuristic prompt you can actually use

You don’t need a spreadsheet or a scoring sheet.

Use these two questions instead:

  1. If this metric improved tomorrow, what would change?
  2. If this metric got significantly worse, what would change?

If neither answer leads to a meaningful decision, escalation, or conversation –
treat the metric as context, not signal.

Then ask: what are we not measuring that would actually change how we act?
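The two-question prompt above can also be sketched as code. As before, this is a hypothetical illustration of the idea, not an established method; the function and argument names are invented for the example.

```python
# Illustrative sketch of the two-question prompt (hypothetical names).
# A metric counts as signal only if moving in either direction would
# trigger a meaningful decision, escalation, or conversation.
from typing import Optional

def classify(change_if_better: Optional[str],
             change_if_worse: Optional[str]) -> str:
    """Return 'signal' if either answer names a real change, else 'context'."""
    if change_if_better or change_if_worse:
        return "signal"
    return "context"

# Hypothetical examples: None means "nothing would change".
print(classify(None, None))
print(classify(None, "escalate to the delivery board"))
```

Note the asymmetry the sketch captures: a metric whose deterioration would change something is still signal, even if its improvement would change nothing.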


Why signals are often ignored even when they exist

Even when signals are present, organisations often fail to act because:

  • Qualitative information feels subjective
  • Exceptions are labelled “edge cases” or “outliers”
  • Raising concerns carries social or reputational risk
  • Metrics become targets rather than sensing tools

Over time, people learn which information is welcome –
and which is better left unsaid.

This is how silence becomes systemic.


Reflection: where might noise be masking signal for you?

Take a moment to reflect:

  • Which metric reassures you the most right now?
  • Which metric would you struggle to challenge in a meeting?
  • What information would actually change your next decision — but isn’t visible?

If this feels familiar, you’re not alone.
These patterns repeat across sectors, technologies, and organisations.


Related reading on Failure Hackers

If you want to explore this pattern further:

  • The Signal in the Noise – how dashboards can hide reality
  • The Culture of Silence – why risks go unspoken
  • What Is a Problem? – redefining what actually matters

These are some of the failure patterns we unpack live in the Failure Hackers sessions — one real breakdown, one missed signal, one better way to think.
