
Some insight as we head into another great year of the compliance cycle. At previous orgs, with other systems, I will admit I either dreaded this time of year or understood that we were essentially only nudging the manager and the user. We would get to a 75-80% goal on average with learning campaigns before handing the story back to more local personnel...and then local operations drove the last steps of compliance.

There is a lot more to it (is “mandatory learning” defined at your org? what change management techniques are used up to and during the campaign? etc.), but after tens of large learning campaigns, they mostly fell into a consistent pattern for 30-day campaigns:

  • 25-30% engagement within the first week to week and a half, showing that your messaging was heard and that you engaged the Type A personalities ("if you put it in my queue, I will get it done")
  • Sharing dashboards and data for top-down pushes from executives got the message out to their folks - even more so when their side of the org held them accountable - typically yielding a 7-10% response per push
  • Nudging users directly with reminders yielded another 5-10% per reminder
  • A bit of a rush at the due date

And so by day 30 we would see that goal achieved - end of story - and those numbers (80-90%) would resonate with me, and I would tell the story to leaders. Some shared back that without defining what is mandatory, you could not drive higher numbers. Others would push the limits, taking measures to really track down the folks who had not completed their learning, to raise the completion rate after the due date.
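Just to show how those contributions stack up, here is a rough back-of-the-envelope sketch. The number of executive pushes and reminders, and the size of the due-date rush, are assumptions I picked for illustration, not figures from any specific campaign:

```python
# Rough sketch of how the campaign pattern above can add up to the 80-90% range
# by day 30. The counts of pushes/reminders and the due-date rush are assumptions.

first_wave = 0.275          # midpoint of the 25-30% first-week engagement
exec_pushes = 2 * 0.085     # two executive dashboard pushes at ~7-10% each
reminders = 2 * 0.075       # two reminder nudges at ~5-10% each
due_date_rush = 0.20        # assumed size of the rush right at the due date

total = first_wave + exec_pushes + reminders + due_date_rush
print(f"Estimated completion by day 30: {total:.0%}")  # ~80% under these assumptions
```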

Sharing from our implementation: I was initially pushed toward fashioning an internal process of escalating reminders. For the first few learning campaigns in Docebo, I could not make that internal process work. Later that year we got it working, and the yield of engagement per notification went up.

And so what I am trying to figure out: do you all have (do we have, as a community) a set of common measures to check the performance of a learning campaign? I would really like to learn from you if you do.

I work at a human services agency, so there are several dozen mandatory trainings that are required by licensing, regulators, and our organization itself. We do measure engagement over time, including attempts and completions. Other KPIs are under discussion at the VP and C-level. 

As we start pulling our Docebo data into our Snowflake data warehouse, we are focusing on dashboards and on-demand reports to give managers actionable data and insights.
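Something like the sketch below is the shape of on-demand report we have in mind, using the Snowflake Python connector. The table and column names (DOCEBO.ENROLLMENTS, manager_name, status, due_date) are placeholders for illustration; the actual Docebo export schema will differ.

```python
# Sketch of a per-manager completion report against the warehouse.
# Table/column names are placeholders, not the real Docebo export schema.
import snowflake.connector

COMPLETION_RATE_SQL = """
    SELECT
        manager_name,
        course_name,
        COUNT_IF(status = 'completed') AS completed,
        COUNT(*)                       AS enrolled,
        ROUND(100.0 * COUNT_IF(status = 'completed') / COUNT(*), 1) AS pct_complete
    FROM DOCEBO.ENROLLMENTS
    WHERE due_date >= CURRENT_DATE - 30       -- current 30-day campaign window
    GROUP BY manager_name, course_name
    ORDER BY pct_complete ASC                 -- surface the teams that need a nudge first
"""

def completion_report(conn_params: dict):
    """Return per-manager, per-course completion rates for the active campaign."""
    conn = snowflake.connector.connect(**conn_params)
    try:
        cur = conn.cursor()
        cur.execute(COMPLETION_RATE_SQL)
        return cur.fetchall()
    finally:
        conn.close()
```

Sorting ascending by completion rate is a deliberate choice here: managers see the courses and teams most at risk of missing the due date first.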

