Are you doing curriculum-wide feedback/assessment?

  • 19 September 2022
  • 17 replies
  • 119 views

Userlevel 7
Badge +3

Curious what assessment of curriculum, if any, folks are doing within their systems.

Is it formal or informal?

Is it at the course level, the learning plan level, or somewhere else?

Does anyone actually look at the reporting/feedback from it?

Any good stories of changes due to it?

Are you doing any curriculum-wide assessment?


17 replies

Userlevel 7
Badge +5

I write or redesign curriculum based on feedback - basically just Level 1 and 2 (reaction and learning) stuff. I do like to analyze test responses longitudinally as a way to identify gaps in content coverage, or to review certain questions that are maybe just confusing and need to be rewritten.

I have noticed that sometimes test questions will start to show a higher wrong-answer count when we swap out an instructor, and we have to have another chat about what the main takeaways are supposed to be. Some instructors have decided to go rogue and visit all sorts of tangents. They don’t teach as much anymore. LOL
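The longitudinal review is nothing fancy. Here is a hypothetical sketch of the idea in pandas; the file name and column names are placeholders standing in for whatever your test export actually produces:

```python
# Hypothetical sketch of the longitudinal item analysis described above.
# The CSV layout and column names are placeholders for your test export.
import pandas as pd

# One row per answered question: timestamp, question ID, and correctness.
responses = pd.read_csv("test_responses.csv", parse_dates=["answered_at"])
responses["wrong"] = ~responses["correct"].astype(bool)
responses["quarter"] = responses["answered_at"].dt.to_period("Q")

# Wrong-answer rate per question per quarter.
rates = (
    responses.groupby(["quarter", "question_id"])["wrong"]
    .mean()
    .unstack("question_id")
)

# Flag questions whose latest rate jumped well above their history -
# candidates for a rewrite, or for a chat with a new instructor.
latest, history = rates.iloc[-1], rates.iloc[:-1].mean()
print(latest[latest > history + 0.15].sort_values(ascending=False))
```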

I suppose you might consider this an informal, course/assessment level review.

I look at every single survey filled out.

Userlevel 7
Badge +3

So you have a survey for every course and every attempt?

Userlevel 7
Badge +5

A survey is offered for every course but we do not have 100% participation. We don’t make the surveys mandatory. Our face-to-face courses have a much higher submission rate compared to online.

Userlevel 7
Badge +3

Interesting. The face-to-face sessions - do you do the surveys in the room, or do attendees get them later?

Our VILTs have a mandatory survey after every session too, but I feel like it ends up too focused on one area. I also tend to see the same users give feedback, which skews the results, as several will do all of them in the same week and put the same feedback in.

I feel like a survey on every course gets aggressive to a degree, and I have been debating using progress through learning plans to trigger different surveys instead. Just thinking about it.

Userlevel 7
Badge +5

For face-to-face, they get a survey at the end - in the room.

To be fair, I use the word “course” perhaps more broadly than the way I see it being used nowadays.

I mean - nowadays you can upload a video and call it a “course”.

For me that is, at best, a lesson. More often, it is simply a part of a lesson.

To me, a course is made up of several lessons.

Our online courses have several hours worth of audio, video, text, interactions, and assessments. On top of that - they all include a downloadable study guide to accompany the materials.

One survey at the end of all that.

Userlevel 7
Badge +3

Haha, totally understand, and I like the distinction.

Userlevel 3

A survey is offered for every course but we do not have 100% participation. We don’t make the surveys mandatory. Our face-to-face courses have a much higher submission rate compared to online.

Same trend observed here. We have a survey after completing a learning plan (there are three main learning plans so it’s not a death-by-survey kind of situation) and a survey at the end of face-to-face training. Face-to-face surveys are in the room, but not mandatory. Participation in those surveys is nearly 100%. Participation in learning plan surveys is around 35%.

Surveys have been really useful in identifying demands from our users and gaps in our curriculum. For example, we created a dedicated video team because a significant number of survey participants asked for more short tutorial videos. We’ve also gotten some nice soundbites out of them to remind upper management that people like having us around (“This is the best online training I’ve ever taken”, “This training saved my life”, etc.). :)

We’re not gonna get rid of surveys anytime soon, but overall I prefer the information we get out of our course beta testers, mapping training to business-related results (sales, technical support cases, etc.), and xAPI data (user behavior in online courses).
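For anyone unfamiliar with the xAPI side: the data is just structured statements about learner activity sent to an LRS. A minimal, hypothetical Python sketch of posting one statement, with placeholder endpoint, credentials, and IDs:

```python
# Hypothetical sketch of sending an xAPI statement to an LRS.
# The endpoint URL, credentials, and activity IDs are placeholders.
import requests

statement = {
    "actor": {"mbox": "mailto:learner@example.com", "name": "Example Learner"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/completed",
             "display": {"en-US": "completed"}},
    "object": {"id": "https://example.com/courses/onboarding-101",
               "definition": {"name": {"en-US": "Onboarding 101"}}},
}

resp = requests.post(
    "https://lrs.example.com/xapi/statements",
    json=statement,
    headers={"X-Experience-API-Version": "1.0.3"},
    auth=("lrs_key", "lrs_secret"),  # Basic auth, as most LRSs expect
)
resp.raise_for_status()
```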

Userlevel 7
Badge +3

Nice! Does the in-person survey focus on similar things as the learning plan one, or on different criteria? I imagine there’s more interest in the instructor/presenter at that point.

 

Userlevel 3

Nice! Does the in-person survey focus on similar things as the learning plan one, or on different criteria? I imagine there’s more interest in the instructor/presenter at that point.

 

We reuse almost all the questions from the online training survey for the in-person one. That way we can do some direct comparisons (for example, how motivated they are to apply what they’ve learned in an online experience vs. in-person training). And we’ve added a couple of questions about the instructor and the training environment.

Userlevel 3

I’m kind of wondering, @Bfarkas, @gstager (and anyone else): are you using the surveys built into Docebo? I’ve never had much luck with the reporting functionality on those, so we are iframing in Microsoft Forms.

Userlevel 7
Badge +3

I am not. I wanted to leverage our existing survey platform for a few reasons, but mainly:

  • Keeping all the survey data together and more easily funneled into dashboards/reports through official channels at the company, rather than dealing with exports.
  • Pulling other information about the user into the survey automatically that may not be in Docebo.
  • Using a trick with the OAuth app to trigger API calls off the survey embeds, both to identify the user and to trigger other actions based on the survey being used/submitted/processed/branched, etc. (rough sketch below).
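Roughly, the OAuth trick looks like this. This is a hypothetical Python sketch: the domain, credentials, grant type, user ID, and response fields are placeholders, and the endpoints should be verified against your own API browser:

```python
# Hypothetical sketch of the OAuth-app trick: exchange credentials for a
# bearer token, then call the Docebo API to identify/enrich the survey
# respondent. Domain, credentials, IDs, and field names are placeholders.
import requests

DOMAIN = "https://yourcompany.docebosaas.com"

# 1. Get a bearer token from the OAuth2 app configured in Docebo
#    (password grant shown; your app may use a different grant type).
token = requests.post(
    f"{DOMAIN}/oauth2/token",
    data={
        "grant_type": "password",
        "client_id": "your_oauth_app_id",
        "client_secret": "your_oauth_app_secret",
        "username": "api_service_account",
        "password": "api_service_password",
    },
).json()["access_token"]

# 2. Look up the user tied to the survey embed so extra profile fields
#    can travel with the survey record downstream.
user = requests.get(
    f"{DOMAIN}/manage/v1/user/12345",
    headers={"Authorization": f"Bearer {token}"},
).json()

# 3. Hand the enriched record to whatever processes the survey data.
print(user["data"])
```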

Honestly, based on some of the threads on here about the shortcomings of the built-in survey tool, I currently find them almost unusable for anything at scale.

Userlevel 7
Badge +3

Poking this back up to see if I can get some more discussion on it...

Userlevel 5
Badge +1

I’d like to do some user pre- and post-surveys in our e-learning courses. Any suggestions on third-party services that integrate well with Docebo? 

Userlevel 7
Badge +3

I guess it comes down to what you define as ‘integrated’. I do a trick with external surveys that embeds them into course widgets or custom page widgets and delivers the course ID, page ID, and user ID in the URL, so they can be brought into the survey results (sketch below). I do all the analysis over in that platform, which is better suited to it - not really integrated, but I pull enough data across to know what’s happening.
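The embed trick itself is just query-string assembly. A hypothetical sketch, with the base URL and parameter names standing in for whatever your survey platform expects:

```python
# Sketch of the embed-URL trick: append identifying query parameters to
# the external survey URL before it goes into the iframe widget. The
# base URL and parameter names are placeholders for your survey tool.
from urllib.parse import urlencode

def survey_embed_url(base_url: str, course_id: int, page_id: int, user_id: int) -> str:
    """Return the survey URL with Docebo context baked into the query string."""
    params = urlencode({
        "course_id": course_id,
        "page_id": page_id,
        "user_id": user_id,
    })
    return f"{base_url}?{params}"

# The resulting URL goes into the iframe/widget; most survey platforms
# can read query parameters into hidden fields on the response.
print(survey_embed_url("https://surveys.example.com/s/course-feedback", 101, 7, 12345))
```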

Userlevel 4
Badge +1

Hello @Bfarkas, what's the best way to create a survey in a learning plan?

Should I put one “course” with a survey object in the learning plan, or insert two survey objects in the last course of the learning plan - one for course evaluation and another for learning plan evaluation?

What's your opinion?

Userlevel 6
Badge +1

We use Qualtrics as our evaluation tool, as we have an organisational licence. We capture evaluation feedback for every single session (ILT). If it's in person, we have QR codes that people can scan with their phones to give feedback (we used to have paper forms, but then the admin team had to manually enter the data into Qualtrics).
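Generating the codes is simple. A hypothetical sketch using the qrcode Python package (pip install qrcode[pil]); the survey URL is a placeholder:

```python
# Sketch of generating a scan-to-feedback QR code for a session survey.
# The survey URL below is a placeholder for your actual Qualtrics link.
import qrcode

session_survey_url = "https://yourorg.qualtrics.com/jfe/form/SV_exampleId"

img = qrcode.make(session_survey_url)  # returns a PIL image
img.save("session_feedback_qr.png")    # drop this onto the last slide
```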

If people don't use the QR codes, then we also have an auto notification when a course is ‘completed’, which thanks them for attending and also has a link to the survey. We encourage our trainers to get them to use the QR codes whilst there, as it's very likely they will ignore the link in the notification.

For e-learning courses, we just have the auto completion notification and the link to the survey.

For ILT we have a generic form for each session, and the results are sent to the trainer a few days after the session date (when we think we are unlikely to get any more responses).

For both ILT and e-learning we periodically check the feedback and do an annual review.

For learning plans (or for programmes) we have bespoke surveys specific to the plan (again using Qualtrics) that we send out after the plan/programme has finished.

Userlevel 7
Badge +3


Love this! We do much the same, but at a Learning Plan level, to avoid some of the survey fatigue that sometimes happens at the every-course level.

@msantos I think the answer to your question is yes :)

It really comes down to what you are doing and how you want to achieve your goals. Both routes are doable and fine; the nuances come down to case specifics. Putting the survey in the last learning plan object also has the advantage that it can set progress on other learning plans automatically.
