SCORM Mystery - Articulate Rise / Storyline 360 resolved

  • 4 October 2023
  • 9 replies
  • 436 views

Userlevel 7
Badge +6

For folks who went to Inspire 2023, there was a session about SCORM, xAPI, and content practices in general.

It was pretty good - we were walked through best practices for generating content and how to work with it.

At one point I raised my hand and asked how you get that fancy question-level type of CMI interactions into the system in a human-readable format for consumption (essentially a question/answer-level report on a training material). And @KMallette shouted out from the back of the room: SCORM 2004, v3. She stopped over and we met for the first time in person 🙂. I expressed that it had been eluding me since we started working with Docebo. We chatted about what the problem could be - and the conversation did not leave me.

I am here to say that @KMallette is right. But there are some things to take into consideration to trigger that report properly.

First and foremost, where was I going wrong? Do not set a trigger to complete your course. You want the completion criterion to be the person passing a quiz.

Now in practice I began doing that maneuver years ago, but I wish someone had walked me through it, or that I had looked into it a little more before adopting the approach. The short version is that Storyline and Rise both had to live through the transition out of Flash and fully into HTML5. There was a time (and I am sorry I can only speak to it anecdotally, as I was troubleshooting less and leading more) when it was a bit rocky to count on those CMI interactions - let alone to get completions successfully recorded in your LMS.

To help? I believe authoring tools like Articulate Storyline and others took a step back and said, “hey, if they want a single step to trigger a completion? Let's give it to them.”

Here is the thing - using that trigger? Transmits no deeper CMI interactions to the system... in essence? You have shortcut all of the data your course could be collecting and told the LMS the person is done with that click... and that is about it. In fact? The CMI interaction “payload” that is submitted looks like SCORM 1.2 gibberish.

So - hopefully someone will read this and gain from it. If you want those interactions? Question- and answer-level data? Then the moral of the story is: single trigger = bad. Completion when the learner passes a quiz = good.
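
To make the difference concrete, here is a rough sketch of the runtime calls as the LMS sees them - illustrative JavaScript against the SCORM 2004 API, not Storyline's actual generated code, and the interaction IDs are made up:

// (Assumes the LMS-provided API object, window.API_1484_11, has already been located.)
// Single completion trigger: one status call, and that is about it.
API_1484_11.SetValue("cmi.completion_status", "completed");
API_1484_11.Commit("");

// Quiz-based completion: question-level cmi.interactions ride along.
API_1484_11.SetValue("cmi.interactions.0.id", "Quiz1_Question01");
API_1484_11.SetValue("cmi.interactions.0.type", "choice");
API_1484_11.SetValue("cmi.interactions.0.learner_response", "choice_b");
API_1484_11.SetValue("cmi.interactions.0.result", "correct");
API_1484_11.SetValue("cmi.score.scaled", "0.9");
API_1484_11.SetValue("cmi.success_status", "passed");
API_1484_11.SetValue("cmi.completion_status", "completed");
API_1484_11.Commit("");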

 

Learn from my blunders….

Are there caveats to the approach? Well - there are a few:

  1. SCORM can be flaky about how it suspends a quiz mid-attempt and then picks the score back up on resume.
  2. When you transmit more data via SCORM? It is more verbose. And you can use that to your advantage (especially in Captivate, where a publishing option sends interaction communications, or something along those lines). The flaw is that SCORM actually counts on constant communication between the SCO and the LMS - or the course can be left in a “bad state”. See the keep-alive sketch after this list.
  3. Depending on the user's connectivity (and working from home/remotely can severely challenge this), the user could be taking the course somewhere that solid LMS-to-SCO communication doesn't stand a chance.
  4. For long-duration SCOs, you can find the LMS session expires before the course ever triggers the CMI interactions for its quiz.
  5. Editing a SCO that is “in flight” (live in production) can be hazardous to the health of the SCO and your learning campaign. Never mess with the structure of a published SCO without asking yourself, why is it getting hot in here? Structural changes are the ones that impact the XML manifest: deleting expected slides, changing question/answer order or answer counts, etc. In Docebo, you can hide the older SCO and import a newer training material... but that approach has some caveats too (another article, another story).
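
On caveats 2 through 4, here is a minimal keep-alive sketch - an assumption on my part, not something any authoring tool ships verbatim - that commits cached CMI data on an interval so a long-running SCO is less likely to lose its LMS session before the quiz interactions get sent:

// Simplified SCORM 2004 API discovery - a real finder walks every parent/opener frame.
var api = window.API_1484_11 || (window.parent && window.parent.API_1484_11);

// Flush cached CMI data to the LMS once a minute so the session stays warm.
setInterval(function () {
  if (api) {
    api.Commit("");
  }
}, 60 * 1000);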

So good luck with this - I hope someone gains from this post and from the conversation with another SCO enthusiast in the community.


9 replies

Userlevel 7
Badge +5

@dklinger AWESOME!!!! Thank you for the follow-up on the details. And it was so lovely to get to meet you in person in Nashville.

Userlevel 4
Badge +1

I’m not an expert, but it seems to be possible to achieve this outcome via Storyline 360 and SCORM 1.2 as well – at least judging from something I’ve found in one of our courses.

Is this the type of report you were describing, @dklinger?

I’ve been asked to try and consolidate all our technical specs for the e-learning partners we use to author our trainings, and having read this post a while ago, I was somewhat surprised to find that this level of tracking had been achieved with a package that uses SCORM 1.2 (the excerpt below is from the imsmanifest.xml file):

<?xml version="1.0" encoding="utf-8"?>
<manifest [...]>
<metadata>
<schema>ADL SCORM</schema>
<schemaversion>1.2</schemaversion>
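
For comparison - and this is the standard declaration, not an excerpt from this particular course - a SCORM 2004 3rd Edition package identifies itself in imsmanifest.xml like this:

<schema>ADL SCORM</schema>
<schemaversion>2004 3rd Edition</schemaversion>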

Furthermore, it was created using Storyline 360 (excerpt from the story.html file):

<!doctype html>
<html lang="en-US">
<head>
<meta charset="utf-8">
<!-- Created using Storyline 360 - http://www.articulate.com -->
<!-- version: 3.79.30834.0 -->

Personally, I’ve never used Storyline before, so I can’t really comment on how this was achieved in that software, beyond what I can see in “Advanced stats”.

Tagging @KMallette here as well:

  • Does this output align with what you’d expect to find from a SCORM 2004 v3 package, built as described in Storyline 360?
  • If so, is the advantage of exporting from Storyline as SCORM 2004 v3 that it makes this outcome easier or quicker to achieve?

Any other thoughts are certainly welcome! I’m technical enough to go poking around inside a SCORM package while understanding some of what I find inside, but a little knowledge is a dangerous thing… There’s a lot I have to learn about SCORM, so any insights are much appreciated.

Userlevel 7
Badge +5

@Ian I’m probably about as techy as you are on this, but one thing I think you’ll notice in your first screenshot is that the actual question information doesn’t appear. We tracked this down to Storyline, which means that this format is a no-go for us. Courses built in Rise do present the actual question, and we can put a Storyline block into the Rise course ‘shell’, which gets us the best of both worlds.

In the SCORM 1.2 courses that I’ve looked at, I’ve never seen responses come through. I’d recommend that you check the specs of the two protocols, as I’m nearly certain 1.2 doesn’t even offer that support. Meaning, somebody did something in that course, if the headers are to be believed.

I can’t provide any screenshots of my 1.2 courses at the moment given the outage/service interruption we’re experiencing, but I’ll try to add a couple of examples once the platform is back up.

Userlevel 7
Badge +5

@Ian UPDATE to my response earlier today

You pushed me to go look at this again, and I’m seeing the same thing you are. SCORM 1.2 is providing information into the Additional Stats tab for a Storyline 360 course. I’m not sure if that tab was always there, or if it’s newish.

I know that when my team looked at this issue, we were focused on what could be exported to Excel/CSV, as our goal is some KPI dashboards with exam info as a baseline (how well they perform their jobs vs. how well they did on the course exam). The issue of the question not appearing in the exports seems to be somewhat resolved when using a Rise quiz. When I export the training material information, I only get part of the question, so it’s not a solid solution.

My team and I are working on this issue today, so if we discover more info, I’ll post again.

Userlevel 4
Badge +1

Thanks so much for the follow-up, @KMallette! This is really helpful. Good spot that while the answers came through, the question itself didn’t.

I will have to bring this up with our e-learning partners, some of whom have licences for multiple authoring tools (Storyline, Captivate and iSpring, if I recall correctly). We’re looking to establish a narrower spec that works best for us, and while we had been focusing primarily on questions of SCORM 1.2 vs 2004 v3 vs xAPI/TinCan, etc., I’m now wondering if we should go so far as to insist on a specific authoring tool being used as well.

I might also poke around Articulate’s “E-Learning Heroes” community to see if there are any insights there on this topic. If I find anything interesting or new, I’ll share that as well.

Userlevel 4
Badge +1

Ah, OK… So I’ve managed to pinpoint one significant difference between SCORM 1.2 and SCORM 2004 v3:

cmi.interactions.n.description (localized_string_type (SPM: 250), RW) Brief informative description of the interaction

This seems good for storing additional context about the question, if we don’t want to use the ID for some reason. It’s part of the 2004 spec, but it’s not included in 1.2. Both specs include cmi.interactions.n.id, but with slightly different definitions:

SCORM 1.2:

cmi.interactions.n.id (CMIIdentifier, WO) Unique label for the interaction

SCORM 2004 v3:

cmi.interactions.n.id (long_identifier_type (SPM: 4000), RW) Unique label for the interaction

 

As far as I can tell, “SPM” stands for “Smallest Permitted Maximum”, i.e. I guess one can assume the character limit to be at least the SPM. “RW” is “read/write” and “WO” is “write-only”.

Certainly, this is making me think that SCORM 2004 v3 may be preferable to 1.2, regardless of the authoring tool. That, plus the much larger limit for cmi.suspend_data (4,096 characters in 1.2 vs. an SPM of 64,000 in 2004 3rd Edition), of course… albeit with the caveats in @dklinger’s original post.
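
To tie the two elements together, here is a hedged sketch of how a SCORM 2004 SCO could populate them at runtime - the identifier and question text are invented for illustration, and api stands for the LMS-provided API_1484_11 object:

// Record one question with both its unique label and its human-readable text.
api.SetValue("cmi.interactions.0.id", "Quiz1_Question01");
api.SetValue("cmi.interactions.0.type", "choice");
api.SetValue("cmi.interactions.0.description", "Which SCORM edition adds the description element?");
api.Commit("");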

 

Userlevel 7
Badge +5

@Ian I neglected to mention the Description field.

We found that if we are creating the quiz in Storyline, we need to change to the FORM view (of the question) and add the question text there. If we do this, then the question comes out in the cmi.interactions.n.description field as you described.

The Heroes community is where I got a lot of my understanding pre-Inspire 2023...several really good articles there. Scorm.com is another resource, run by Rustici Software.

As I mentioned on Friday, my team did meet. We ran through our tests again (publishing both Storyline and Rise with SCORM 1.2 and SCORM 2004/3) so that we could compare the results of both the Course Management > {coursename} > Enrollments > User Stats > Reports and the Training Materials > Answer Responses reports.

We got different results from our earlier tests, which has led to a change in our requirements. Both Storyline and Rise produced exports that contained the question and responses (assuming that the Storyline version used the FORM view to create the question).

 

Userlevel 7
Badge +6

@Ian - my apologies for not responding a little while back. The first report image you pulled up is what I was thinking about. Great to see it can work with SCORM 1.2.

Thanks for continuing the chat with @KMallette. I think between this article and the best practices one posted by @John - we are getting to a sweet spot. 

 

Userlevel 5
Badge +2

Appreciate the collaboration everyone! @dklinger, @KMallette, @Ian

Really insightful info, and I'm happy to see the topic of SCORM best practices being brought to light. Building and maintaining SCORM content is a very “to each their own” type of thing; however, I'm hoping that with your help and combined knowledge we can walk away with some common tenets to live by when creating, updating, and managing this content in Docebo.
