Measuring What Matters in Digital Coaching

In our age of on-demand data and massive databases, it’s a common mistake to think that ‘capturing everything’ is the key to unlocking a wealth of insights down the line. Capturing too much of the wrong things is likely to yield fewer insights, because it increases the cost of curating and sorting through the data. In the words of the economist and political scientist Herbert Simon:

“In an information-rich world, the wealth of information means a dearth of something else: a scarcity of whatever it is that information consumes. What information consumes is rather obvious: it consumes the attention of its recipients.”

What you see vs what’s happening

A concept I find very helpful for making better decisions about data capture and analytics in digital coaching is that of content creation as opposed to content consumption.

Content consumption happens whenever a user opens a learning module, reads a web page or flips through a PDF. Content creation, on the other hand, happens when users answer deep questions and formulate complex thoughts in reaction to the content they’re exposed to.

The monitoring and analytics strategies of elearning – all based on content consumption – tend to be applied to digital coaching, when they should actually be avoided. A consumption focus typically leverages raw usage data and end-of-module quizzes, leaving you with a battery of disconnected metrics that paint a very detailed picture of what’s been happening on your digital platform while saying nothing about the long-term repercussions on your people.

Focus on content creation and you’ll recenter your efforts on what matters. You’ll need to:
1. Have a clear idea of the medium- and long-term outcomes of your digital coaching initiative
2. Identify the key signs of behavior change in your employees, choose digital coaching tools that let those signs surface, and understand how they’re moving your employees toward your long-term outcomes.

Stop monitoring, start questioning

Part of my work as a data analyst is helping people think about the metrics of their projects, and across the dozen projects I’ve worked on, the most consistent mistake has been focusing on the wrong metrics entirely.

One example was a client who wanted to assess the success of their digital coaching program by how many times each module had been opened by participants. However, they had made the modules mandatory! So 100% of their learner base opened the modules – on the surface, a shining result. But a closer look at the data showed that some participants had taken a rather slapdash approach to the initiative, simply clicking through the content to reach a 100% completion rate.
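To make this concrete, here’s a minimal sketch of how that kind of click-through behavior can be surfaced from raw event data. The file name, column names and threshold are assumptions for illustration, not the client’s actual setup:

```python
import pandas as pd

# Hypothetical export of module-open events; the columns
# (participant_id, module_id, seconds_spent) are assumed for this sketch.
events = pd.read_csv("module_events.csv")

# A module "opened" for under 30 seconds was probably not read.
# The threshold is illustrative; calibrate it against your own content length.
CLICK_THROUGH_THRESHOLD = 30

per_participant = (
    events
    .assign(click_through=events["seconds_spent"] < CLICK_THROUGH_THRESHOLD)
    .groupby("participant_id")
    .agg(
        modules_opened=("module_id", "nunique"),
        click_through_rate=("click_through", "mean"),
    )
)

# Every participant shows "100% completion", yet engagement varies wildly.
print(per_participant.sort_values("click_through_rate", ascending=False).head())
```

Even this crude view separates participants who engaged with the content from those who merely generated the completion statistic.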

This was a problem: in elearning, it’s enough for a participant to open a module and get 100% on a test to call it a day – in effect, you’re simply tracking consumption. In digital coaching, that’s not enough, because consumption tells you nothing about how your participants’ behaviors and mindsets have changed for the better.

We focus on consumption because it’s easier to see and talk about – but digital coaching isn’t about what’s evident! The question then becomes: what are my participants creating during the digital coaching process that I can look at to understand the changes happening in their minds?

Use tools that measure outcomes, not usage

Focusing on platform usage is often the result of having a purely tools-based approach to digital coaching. If your starting point is that you want to leverage a Google Analytics integration for your data needs, chances are you’ll immediately think in terms of page views and dropout rates, because that’s what’s available through this particular tool. But these metrics will only ever tell you whether your users are consuming what’s available on the platform, and nothing much else.

By focusing on the tool first, you’re constraining your thinking and losing the opportunity to generate great insights from untapped data sources (in this regard, our data team loves to follow the Spine Model to steer away from tools-based data approaches).

What would these untapped data sources be? Start from the needs and desired outcomes of your digital coaching program – more often than not, those outcomes are behavioral: mindset changes, new ways of thinking, improved creativity. No amount of usage data will tell you whether your participants have developed greater leadership skills, but tools that capture and analyse how participants interact with the content will deliver invaluable insights on exactly these questions.

For example, functionalities to look for in digital coaching solutions include:
1. The ability to build mindset and behavioral surveys whose results can be correlated with actual business outcomes and company values
2. Follow-through mechanisms that help participants reflect regularly on their coaching, asking them how they feel their behaviors have evolved and letting them give concrete examples of behavior change
3. Advanced analyses of learner input, leveraging linguistic analysis to detect key trends in participants’ thinking and understand mindset shifts (see the sketch after this list).
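As an illustration of the third point, here’s a deliberately simple sketch of linguistic analysis on participant reflections: plain word-frequency shifts between early and late reflections. The reflections below are invented examples, and real digital coaching platforms typically use far richer language models; the point is only that text created by participants can be mined for signals of changing mindsets.

```python
from collections import Counter
import re

# Hypothetical reflections collected at two points in a coaching journey;
# in practice these would come from your coaching platform's export.
early_reflections = [
    "I struggle to delegate and end up doing everything myself.",
    "My team waits for my instructions before acting.",
]
late_reflections = [
    "I now delegate the weekly review and coach my team through it.",
    "We agreed on shared ownership, and I give feedback instead of instructions.",
]

def term_frequencies(texts):
    """Very rough proxy for 'mindset' language: relative word frequencies."""
    words = re.findall(r"[a-z']+", " ".join(texts).lower())
    counts = Counter(words)
    total = sum(counts.values())
    return {word: count / total for word, count in counts.items()}

early = term_frequencies(early_reflections)
late = term_frequencies(late_reflections)

# Words whose relative frequency rose the most between the two snapshots.
shifts = {w: late.get(w, 0) - early.get(w, 0) for w in set(early) | set(late)}
for word, delta in sorted(shifts.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"{word:>12}  {delta:+.3f}")
```

In practice you would correlate these shifts with your behavioral surveys and business outcomes rather than read them in isolation.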

Behavior change is a long-term outcome

The main problem you’ll face when focusing on content creation is that it never yields quick wins. Mindset changes are always a long-term process, and their effects aren’t always immediately visible. This may seem like a curse – but in reality, it pushes you to take a more long-term, strategic view of your data collection and analytics.

What long-term behavior changes would you like to track and develop with digital coaching?