Question

Companion vs. Harmony AI Tutor: Is this the end of complex SCORM modules?

  • May 12, 2026
  • 3 replies
  • 26 views

Moshe.Machlav
Guide I

Hi everyone 👋

With the release of Harmony AI Tutor and Docebo Companion, I’ve been analyzing how this changes our L&D architecture. It essentially splits our focus into two tracks:

  1. Docebo Companion (Performance Support): Bringing targeted LMS content directly to specific URLs (CRM, Ticketing, etc.) via the browser extension.

  2. Harmony AI Tutor (Deep Learning): The side-panel AI inside the course player, strictly querying our internal content for summaries and quizzes.

I put together a full breakdown of the differences, including how they compare technically and where I see platforms like Spekit or Sana fitting in comparison.

(You can read my full deep dive here: https://digitalstep.biz/guides/docebo-harmony-ai-tutor-vs-companion/)

My question for the community: If Harmony AI Tutor can read, summarize, and test learners on any topic, do we still need to build complex, heavily designed SCORM/Rise modules?

Are you shifting your content strategy towards simpler, text-rich files (PDFs, Knowledge Base articles) purely optimized for "AI readability"?

Would love to hear how this impacts your course authoring workflows moving forward!

3 replies

dklinger
Hero III
  • May 12, 2026

Hi @Moshe.Machlav,

This is a highly nuanced discussion.

Can we start with the word security and work our way outward?

Seeing Harmony as a part of the learning moment sounds good. Exposing documentation and Docebo authoring to enhance the outcome of Harmony does build an AI knowledge base for us in the best case.

And then the word security becomes the challenge - who else has access to the repository that we build? Where is it stored? Is it purely at the data centers that host our instance? Or is it somewhere else if Harmony calls on an LLM to execute a routine (sorry - I am lacking the terminology)?

IMHO - I think building SCORM courses is going to hang around as long as portability is still an issue with content migration. And even though other outputs from learning systems are available, the best bet for compatibility still remains authoring in SCORM.
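For anyone newer to why SCORM stays so portable: every package ships a standard imsmanifest.xml at its root that any compliant LMS can read. A minimal SCORM 1.2 manifest looks roughly like this (course titles and identifiers here are made up for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<manifest identifier="com.example.course" version="1.2"
    xmlns="http://www.imsglobal.org/xsd/imscp_rootv1p1p2"
    xmlns:adlcp="http://www.adlnet.org/xsd/adlcp_rootv1p2">
  <organizations default="org1">
    <organization identifier="org1">
      <title>Example Course</title>
      <!-- Each item points at a resource the player launches -->
      <item identifier="item1" identifierref="res1">
        <title>Module 1</title>
      </item>
    </organization>
  </organizations>
  <resources>
    <!-- scormtype="sco" marks launchable, tracked content -->
    <resource identifier="res1" type="webcontent"
        adlcp:scormtype="sco" href="index.html">
      <file href="index.html"/>
    </resource>
  </resources>
</manifest>
```

That shared contract is exactly what AI configurations lack today - there is no equivalent manifest you can hand to a different platform.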

Further, I believe investing in AI knowledge bases faces the same exact problem as SCORM courses - the ability to port everything out of Harmony in a way that would let us transfer all of that knowledge to another LMS just isn't there. Unless buildouts of learning instances also start asking teams, as part of onboarding, to upload documentation as part of their approach. It would be better if we could move a flavor of AI from one platform to the next at the right time.


Moshe.Machlav
Guide I

Thank you @dklinger for such a thoughtful response! You hit on the two most critical hurdles L&D faces right now: Security and Portability.

On the Security front: You are 100% right that Infosec is usually the first battle. From my understanding of Docebo's architecture, Harmony operates within a secure, sandboxed Azure OpenAI environment. They do not use our tenant's proprietary data to train public LLMs. It remains within our instance's boundary. But getting stakeholders to trust that is definitely a process!

On Portability & SCORM: This is where the paradigm shift gets really fascinating. I actually believe that optimizing for "AI-readable" content makes us more portable, not less.

While a SCORM file is technically portable between platforms, the content inside it is a compiled "black box." An AI cannot easily read the text trapped inside a SCORM interactive slide. However, if your learning architecture shifts toward clean, well-structured text, Markdown, or standard PDFs, you aren't locked into Docebo at all.

If you ever migrate to another LMS (like Sana, Workday, or a future platform), you simply take your repository of clean documents with you. You aren't trying to export Harmony’s "AI brain"—you are exporting a perfectly clean data layer that the new platform's AI can ingest and understand instantly.
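To make the "clean data layer" idea concrete, here is a rough sketch (my own illustration, not anything Docebo-specific) of just how portable a folder of Markdown docs is - a few lines of Python can chunk the whole repository for whatever ingestion pipeline the next platform uses:

```python
from pathlib import Path

def load_clean_docs(repo_dir: str) -> list[dict]:
    """Collect Markdown files from a docs repo into plain-text chunks
    that any platform's AI ingestion pipeline could consume."""
    chunks = []
    for path in sorted(Path(repo_dir).rglob("*.md")):
        text = path.read_text(encoding="utf-8")
        # Split on blank lines into paragraph-level chunks,
        # keeping the source file so answers stay traceable.
        paragraphs = (p.strip() for p in text.split("\n\n"))
        for i, para in enumerate(p for p in paragraphs if p):
            chunks.append({"source": str(path), "chunk": i, "text": para})
    return chunks
```

Contrast that with a published SCORM zip, where the same text sits inside compiled JavaScript and slide runtimes that no ingestion pipeline can reliably parse.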

In that sense, moving away from heavily authored SCORM actually future-proofs your content for any AI system, not just Docebo's.

Does looking at the "clean data layer" approach make the vendor lock-in feel a bit less risky to you? I'd love to hear your thoughts on this!


dklinger
Hero III
  • May 12, 2026

The clean data layer approach sounds elegant.

But what we are finding as we configure different AI tools is that certain documents are used to constrain the outcomes and refine a prompt - in essence, they construct a guardrail. Dumping loads of documentation as a clean layer simply isn't enough. No one wants to be in the business of retraining the next set of AI tools; analogously, no one wants to be in the business of reconstructing those guardrails. Ergo - we end up with a content dump. Somewhat different than an intranet…maybe.

Take, for example, an AI Coach constructed in Docebo today: not portable. Actually, no AI Coach is portable unless you iframe the tool into somewhere else. Our configurations may be portable through strong documentation, but AI Coach configuration tools do not speak a common language yet. Does this sound familiar? It is literally the same problem that SCORM solves for: portability.

Purely an opinion (and I have very little of a clue about what I am talking about here): we need to be considering how to do some fancy things with the MCP Server, tying in our local business AI builds that are already walking our documentation on our intranets, SharePoint sites, and safe repositories. At a meetup I heard of some really elegant things being done with corporate localized AI builds with Docebo in the mix. That, my dear colleague, is a much better spend of cycles. Using Harmony may fit someday with the MCP Server. But for now? I think what seems like a clean data layer is much more work than it appears to curate from one learning system to the next.
Purely an opinion - (and I have very little of a clue about what I am talking about here) we need to be considering how to do some fancy things with the MCP Server and tying in our local business AI builds that are already walking our documentation on our intranets, share point sites and safe repositories. At a meetup I heard of some really elegant things that are being done with corporate localized AI builds with Docebo in the mix. That - my dear colleague - is a much better spend in cycles. Using Harmony may fit someday with the MCP Server. But for now? I think what seems like a clean data layer is much more work than not to curate from one learning system to the next.