
Hello!

This year my company has been looking into monitoring how much time users spend on our platform each month. Identifying who has logged on isn’t much of an issue, but we need to figure out how much total time our users are spending interacting with training materials in order to better demonstrate the ROI over our old training site. What we have done thus far is not producing accurate results, however, and I am hoping one of you kind folks might have a better idea of where and how we can get this information!

 

What we are doing now:

Currently we are utilizing the Users - Courses report to pull data, with the following filters:

• Enrollment Date [After / Absolute] - 11/3/2023

• Usage Statistics - Training Material Time (sec)

While this accurately reflects all users who were enrolled after that date, it does not reflect users who passed the course, or even briefly touched the training materials this month, but were enrolled in it four months ago.

 

Attempted Solution:

I attempted a different pull, utilizing the same report, but including the last access date as a reference point. I also used the following filter:

• All Users

This would pull every user’s interaction with the site. I then used a DATEDIF formula to identify who had accessed something in the last 30 days and deleted all other rows, hoping to reveal only those users who had actively signed in and used a training material at some point within the last 30 days. However, this approach has a flaw: it shows each user’s total training time, even time accrued more than 30 days ago.
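For what it’s worth, here is the same logic as a quick Python/pandas sketch (the column names are placeholders for whatever the export actually calls them), which makes the flaw obvious:

```python
import pandas as pd

# Load the Users - Courses export; column names here are placeholders.
df = pd.read_csv("users_courses_export.csv", parse_dates=["last_access_date"])

# Equivalent of the DATEDIF step: keep only users active in the last 30 days.
cutoff = pd.Timestamp.now() - pd.Timedelta(days=30)
active = df[df["last_access_date"] >= cutoff]

# The flaw: "training_material_time_sec" is a lifetime running total,
# so this sum still includes time accrued before the 30-day window.
total_hours = active["training_material_time_sec"].sum() / 3600
print(f"Training time of recently active users (lifetime total): {total_hours:.1f} h")
```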

 

Final Thoughts:

Effectively, I have been unable to find a method that reports only on what was interacted with in the last 30 days while also accurately reflecting the total number of hours spent on training materials within that same time frame.

 

Any ideas on how this could be accomplished, even if it involves the API, would be appreciated.

Have you tried the Users - Courses report in “New Reports”, utilizing the course enrollment fields?
The course first/last access dates are related to a specific course and could include the training material time if you select that field as well. I think this would work best with your approach of bringing the data into Excel and filtering out what you don’t need.

 

 


Please be aware when using “Training Material Time” that it records the time the browser is open on the training material. So, if someone starts a training material, pauses to go get lunch, then goes to a meeting, then comes back to the course, all of that time will be reported as time in the training material.


Subscribing to this because my experience has been the opposite of what Diane mentioned above: we routinely have people completing, for example, 45-minute courses with only a few seconds of training material time being recorded. I’ve just resigned myself to these numbers being complete garbage, now and forever, but if I’m wrong about that I’ll be delighted to learn how to ensure more accurate tracking.


@Matthew.Shumway this is a common use case for implementing a data warehouse-like solution.

The information you’re pulling from Docebo is a snapshot of reality as it was when you pulled the data - e.g. on Dec/1 my total learning time was 145 minutes. It doesn’t tell you how this breaks down across the specific days and months when I was accessing learning (was it all 145 minutes in November, or maybe 100 minutes in November and 45 in October, or ...).

To understand how much time I spent learning in a specific time window, e.g. November, you’d need to compare the above number with my total learning time that you pulled at the beginning of the period (on Nov/1).

To achieve this you’ll need to make regular snapshots and then calculate the time difference between the points in time.
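A minimal sketch of that diff in Python/pandas, assuming you save the same Users - Courses export on the first of each month (column names are placeholders):

```python
import pandas as pd

# Two snapshots of the same export, taken a month apart.
start = pd.read_csv("snapshot_2023-11-01.csv")  # beginning of the window
end = pd.read_csv("snapshot_2023-12-01.csv")    # end of the window

key = ["user_id", "course_id"]
merged = end.merge(start, on=key, how="left", suffixes=("_end", "_start"))

# Enrollments that are new this month have no earlier value; start them at zero.
merged["training_material_time_sec_start"] = merged[
    "training_material_time_sec_start"
].fillna(0)

# Time accrued during the window = cumulative total at the end minus at the start.
merged["delta_sec"] = (
    merged["training_material_time_sec_end"]
    - merged["training_material_time_sec_start"]
)

print(f"Hours accrued in November: {merged['delta_sec'].clip(lower=0).sum() / 3600:.1f}")
```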

 

That said, I have also observed what @Ian and @dianex.gomez mentioned: the reported learning time is highly inaccurate (in both directions), so building a sophisticated analytics solution to analyze unreliable data may not be worth the effort.

 

To track content consumption on the platform, I used the course-level last access date (again, captured every month) - this told me who accessed learning content over the last month, and also helped me spot courses that were not getting any attention for a longer period.
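If you’d rather not pull that export by hand every month, the capture can be scripted. Here is a rough sketch against the Docebo REST API - a password-grant token request plus an enrollments call; treat the endpoint path and field names as assumptions to verify in your own platform’s API browser:

```python
import requests
from datetime import date

BASE = "https://yourcompany.docebosaas.com"  # your platform URL

# OAuth2 password-grant token request; the client ID/secret come from an
# API credential registered in the platform.
token = requests.post(
    f"{BASE}/oauth2/token",
    data={
        "grant_type": "password",
        "client_id": "YOUR_CLIENT_ID",
        "client_secret": "YOUR_CLIENT_SECRET",
        "username": "api.user@yourcompany.com",
        "password": "********",
    },
).json()["access_token"]

# Assumed endpoint and parameters -- check them in your API browser first.
resp = requests.get(
    f"{BASE}/learn/v1/enrollments",
    headers={"Authorization": f"Bearer {token}"},
    params={"page_size": 200},
)
resp.raise_for_status()

# Store the monthly snapshot; diffing consecutive snapshots gives you
# per-month activity, as described above.
with open(f"snapshot_{date.today():%Y-%m-%d}.json", "w") as f:
    f.write(resp.text)
```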


@Ian That sounds like a major issue. I know the time in the personal summary can be unreliable, and that learning time can be overestimated, but I had no idea that it could be underestimated as well. Occasionally we’ll get a user who completes a 2-hour course in a matter of minutes, and I had always assumed they were gaming the system by fast-forwarding through all the videos. From what you say, it sounds like that might not be the case.


You prompted me to take a more holistic look at our figures for completed courses, @Daniel, and overall, it actually does skew more towards overestimates. Which is to say, when I looked at all the data, Diane’s comment holds true for us as well.

It’s not that we don’t have exceptions, but by describing the under-reporting as “routine” earlier, I realized I’ve likely been over-indexing on cases where the user is experiencing problems with completion tracking (I haven’t normally been monitoring it otherwise), and I guess in such cases it’s not surprising that the time spent would be under-reported as well.

That said, there are also examples of people completing the course without issues, and recording times that are faster than should even be possible, given some of the obstacles the course puts in front of them (un-skippable videos, for example).

And while this is certainly not great, I’ve actually not regarded it as a major issue, at least while we're still struggling to get 100% of completions to register properly in the first place. Ultimately, that’s the thing that end users notice - being reminded to complete a training they already completed, or resuming a training and finding that their progress was lost - rather than the time spent being recorded incorrectly.


@Ian if you are struggling with courses completing correctly, is it possible that admins are marking courses complete to solve the user issue? If so, you will see under-reported time. You may even see cases where the course is complete but the training materials are not started or only in progress. The good news is that if there is some under- and some over-reporting, averages should be viable as long as you can throw out the outliers.
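As a rough illustration of that averaging idea (the column name is hypothetical), trimming the extreme tails before taking the mean looks like this in Python/pandas:

```python
import pandas as pd

# "recorded_min" is a hypothetical column of per-completion times in minutes.
times = pd.read_csv("completions.csv")["recorded_min"]

# Throw out the outliers, e.g. anything outside the 5th-95th percentile.
lo, hi = times.quantile([0.05, 0.95])
trimmed = times[times.between(lo, hi)]

print(f"Raw mean: {times.mean():.1f} min, trimmed mean: {trimmed.mean():.1f} min")
```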


@Ian Thanks for the clarification. As you say, it makes sense that learning time might be under-reported for courses with completion tracking issues. I was wondering, are the tracking issues limited to a specific type of training material? SCORM files, for example? Our courses consist of videos, tests, and surveys, and we haven’t had any issues with users completing these materials and not being recognized by the system.



 

I’ve actually found that mobile users - those who are not using the Go.Learn app - frequently do not have their course activity tracked correctly. In fact, large numbers of training materials do not retain any interaction memory. This is often caused by mobile devices blocking tracking cookies by default, or limiting them in some capacity, leading to inaccurate results.

The intention here isn’t to get pinpoint accuracy (though that would most definitely be welcome), but to gauge engagement in general. Without the ability to get even limited data on how long someone has spent on the platform in the last month, we are left with little in the way of tracking engagement on the platform.

As to your first question, yes, that is what we are currently using; the trouble is that it only reports on users who were enrolled that month and doesn’t cover users who routinely interact with the training materials after completion in order to review that training.


dianex.gomez wrote:

@Ian if you are struggling with courses completing correctly, is it possible that admins are marking courses complete to solve the user issue? If so, you will see under-reported time. You may even see cases where the course is complete but the training materials are not started or only in progress. The good news is that if there is some under- and some over-reporting, averages should be viable as long as you can throw out the outliers.

All true, @dianex.gomez, and there are certainly some cases like this, but for the courses I was checking there are a very limited number of people who can mark them complete, and there’s a paper trail involved for requesting such intervention. It doesn’t explain all cases for us. But for my part I’m not agonizing over this at least; again, it’s further down the list of priorities than preventing the completion tracking failures in the first place.

Daniel wrote:

I was wondering, are the tracking issues limited to a specific type of training material? Scorm files for example? Our courses consist of videos, tests, and surveys, and we haven't had any issues with users completing these materials and not being recognized by the system.

Very fair point, @Daniel. Like you, I’ve not noticed these issues with videos, tests, or surveys: we use SCORM pretty much exclusively. We have a number of SCORM 1.2 packages in the system and we’re going to try using SCORM 2004 exclusively going forward. Some of our courses are a bit long (>60 minutes), so I hope the higher limit on cmi.suspend_data will help. We also recently increased our session time limit to three hours, but now I’m starting to get a bit off-topic...


Thanks @Ian. We’re hoping to add SCORM packages in the future, so this information is useful for us.


@Daniel We’ve been having problems with users completing courses in a few seconds also. Docebo Help did a check and found that if a course uses HTML pages, then even though there may be 20 minutes of embedded content on a page and the person engages with that page for 20 minutes, Docebo marks the time spent as 1 second per page. There doesn’t seem to be a workaround for that unless we create that content in a SCORM file.


I don’t think there will ever be a solution to get really accurate learning data in terms of time spent on learning activities. Sure, some issues are related to the format of the material, such as videos, slides, etc., but then there’s always the human factor as well. We had exactly the same problems as most people have mentioned here: trainings being completed after a few seconds, or the opposite, people spending dozens of hours on a training that should have taken them 30 minutes.

Some issues can be fixed with workarounds and settings on training materials, but we decided to fix the bigger issues outside of the platform, in our reporting. All our trainings have an estimated average training time. This is an estimation, and most completions should move around these estimates. Hence, we apply a tolerance to this estimate to include values that are above or below the estimate but still within a certain range, and anything way outside this tolerance gets a quick review and usually has the estimate assigned instead. This implies some work and is probably not ideal for everyone, but for us it works, since we can’t rely on the data from the platform.

Even SCORMs are not a 100% solution. We have learners, and some of you might relate, who start an e-learning and then go for lunch in the middle of it, leaving it open. This is just an example. We have even changed the default value for the automatic logout for inactivity to fix some of it, but in the end, there are still some cases.
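That tolerance rule is easy to automate, by the way. A minimal sketch, assuming a ±50% band around each course’s estimated time (the threshold and column names are made up for illustration):

```python
import pandas as pd

TOLERANCE = 0.5  # accept recorded times within +/-50% of the estimate (illustrative)

# Assumed columns: recorded_min (platform data), estimated_min (our estimate).
df = pd.read_csv("completions.csv")

low = df["estimated_min"] * (1 - TOLERANCE)
high = df["estimated_min"] * (1 + TOLERANCE)
within = (df["recorded_min"] >= low) & (df["recorded_min"] <= high)

# Keep plausible recorded times; fall back to the estimate for outliers
# and flag them for the quick review mentioned above.
df["reported_min"] = df["recorded_min"].where(within, df["estimated_min"])
df["needs_review"] = ~within
```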

