
Hello Community,

Does anyone here have experience using the new product called Learn Data? Basically, it connects all the relevant data in the learning platform to the organization’s data warehouse, which then helps the organization create and manage its own learning dashboards with in-house tools, e.g. Power BI.

  1. What’s your experience with this product so far?
  2. Do you need a sandbox in order for it to function properly? We were told that having a sandbox is compulsory. As for the sandbox, we have been using the platform since 2017 and have survived without one.

Looking forward to some responses!

Good morning,

I believe there is a tool called Docebo Connect that could do what you are suggesting.

But with API calls, in theory you can do this pretty directly with a developer, depending on how frequently you make calls.


My understanding is that Learn Data gives you a replicated database that allows unlimited API calls - but only for pulling data OUT, not for putting data IN.

Basically, unless you need more than 1,000 API calls per hour, you’re better off writing the API calls yourself or hiring someone to do it. You can do it yourself for free, or you can pay for Docebo Connect, which is essentially pre-written API calls.
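To give a sense of what “writing the API calls yourself” involves, here is a minimal Python sketch of paging data out of the platform. The endpoint path, pagination parameters, and response fields are assumptions - verify them against your platform’s API browser before relying on this.

```python
# Minimal sketch: page through a Docebo REST endpoint and collect every record.
# Endpoint path, pagination params, and response shape are ASSUMPTIONS; check
# them in your platform's API browser.
import requests

BASE_URL = "https://yourcompany.docebosaas.com"  # your platform URL
TOKEN = "..."                                    # OAuth2 access token, obtained separately

def fetch_all(endpoint: str, page_size: int = 200) -> list[dict]:
    """Collect all records from a paginated endpoint."""
    headers = {"Authorization": f"Bearer {TOKEN}"}
    records, page = [], 1
    while True:
        resp = requests.get(
            f"{BASE_URL}{endpoint}",
            headers=headers,
            params={"page": page, "page_size": page_size},
            timeout=30,
        )
        resp.raise_for_status()
        payload = resp.json()["data"]          # assumed response envelope
        records.extend(payload["items"])
        if not payload.get("has_more_data"):   # assumed pagination flag
            return records
        page += 1

courses = fetch_all("/learn/v1/courses")       # assumed endpoint path
print(f"Pulled {len(courses)} course records")
```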


I am getting schooled...thank you.


Learn Data is a Snowflake-to-Snowflake ELT tool. If your data warehouse is Snowflake, this is great news for you: it eliminates the need for API calls to move data to your data warehouse. Given that API calls are rate-limited, this could be very powerful if you have a lot of other integrations and automation using the API on your system. Once you have the data in your data warehouse, you can use your BI tools of choice, combine learning data with business data for impact metrics, and much more.
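To make the “combine learning data with business data” part concrete, here is a hedged sketch using the snowflake-connector-python package. The database, table, and column names are hypothetical; substitute the objects Learn Data actually provisions in your account.

```python
# Hypothetical example: join Learn Data's replicated learning data with an HR
# table already in your warehouse to get completions per department.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",
    user="your_user",
    password="...",
    warehouse="ANALYTICS_WH",
)

SQL = """
SELECT h.department,                            -- business data (HR schema)
       COUNT(*) AS courses_completed            -- learning data (Learn Data)
FROM   LEARNDATA.PUBLIC.COURSE_ENROLLMENTS e    -- hypothetical table names
JOIN   HR.PUBLIC.EMPLOYEES h ON h.user_id = e.user_id
WHERE  e.status = 'completed'
GROUP  BY h.department
"""

for department, completed in conn.cursor().execute(SQL):
    print(department, completed)
```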



Thank you for sharing!



This is helpful, thank you for sharing.


@dianex.gomez nailed it. We are using Learn Data for exactly this purpose and not going through direct API calls on the platform, due to the size of our user base. 1,000 calls per hour is sufficient for most use cases, but if you’re in the millions of users with multiple domains, the limitations will bring the platform to a halt. Additionally, the reporting options within the platform do not provide all the data available in Snowflake.
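For scale, a rough back-of-envelope shows why the limit bites at that size (the user count and the 200-records-per-page figure are assumptions):

```python
# Back-of-envelope: how long a full user export would take over the rate-limited API.
users = 2_000_000              # hypothetical "millions of users"
page_size = 200                # assumed records returned per call
rate_limit = 1_000             # calls per hour

calls_needed = -(-users // page_size)   # ceiling division -> 10,000 calls
hours = calls_needed / rate_limit       # -> about 10 hours per full export
print(f"{calls_needed:,} calls, roughly {hours:.0f} hours for one full export")
```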



Hi Jenna! Thanks for sharing!

Could you please let me know what your experience has been so far with Learn Data? We are studying the possibility of implementing it, but I am not sure if the solution is what we are looking for.

Thank you!


@roni.karbuzrosas It has been slow going on our end creating the dashboards and data stories that we intend to use Learn Data for. However, I can say that the setup to import the data from Docebo’s Snowflake to our Snowflake was very straightforward, and we received good support from Docebo any time we had questions.

The data is imported in predefined tables. Initially there were about 120 or so tables, and we were provided a Data Dictionary of what is included in each table, but the dictionary left much to be desired: the fields are somewhat inconsistent, so it took a good understanding of the LMS data to help our teams figure out which tables to join together to get what we needed. Since we have multiple domains and the data is all imported together, we needed to be able to separate by domain, which took joining multiple tables. There are also a lot of fields that use IDs, so you’re not necessarily going to see a learning plan name, but rather the numeric ID. Again, not a deal breaker; it just takes cleanup to replace those fields with something that makes sense to the end user. A hypothetical join along those lines is sketched below.
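Every table and column name below is made up - consult the Data Dictionary for the actual Learn Data schema.

```python
# Hypothetical query: replace numeric IDs with names and split multi-domain data.
RESOLVE_IDS_SQL = """
SELECT d.domain_name,                    -- separate multi-domain data
       u.username,
       lp.learning_plan_name,            -- human-readable name instead of the numeric ID
       e.completion_date
FROM   LEARNDATA.PUBLIC.LP_ENROLLMENTS e
JOIN   LEARNDATA.PUBLIC.LEARNING_PLANS lp ON lp.learning_plan_id = e.learning_plan_id
JOIN   LEARNDATA.PUBLIC.USERS u            ON u.user_id = e.user_id
JOIN   LEARNDATA.PUBLIC.DOMAINS d          ON d.domain_id = u.domain_id
WHERE  d.domain_name = 'your-subdomain'
"""
# Run this through the snowflake-connector pattern shown earlier, or paste the
# query into your BI tool's data source definition.
```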

There are updates coming, though, starting 10/30/23: they add another 400 tables, improve what is available in the legacy tables, provide delta updates (currently we have to import the entire database every day, but going forward we should just get the changes from day to day), and make refreshes more frequent - every couple of hours rather than once daily. I haven’t had a chance to actually see the updates yet, but I’m hopeful it will be a solid improvement!
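For context, a sketch of what that change means on the consumer side, with hypothetical table and column names:

```python
# Today: full reload, re-copying the whole replicated table every day.
FULL_RELOAD_SQL = """
CREATE OR REPLACE TABLE REPORTING.COURSE_ENROLLMENTS AS
SELECT * FROM LEARNDATA.PUBLIC.COURSE_ENROLLMENTS;
"""

# With delta updates: merge only the rows that changed since the last refresh.
# The updated_at column and the two-hour window are assumptions.
DELTA_MERGE_SQL = """
MERGE INTO REPORTING.COURSE_ENROLLMENTS t
USING (
    SELECT * FROM LEARNDATA.PUBLIC.COURSE_ENROLLMENTS
    WHERE  updated_at >= DATEADD(hour, -2, CURRENT_TIMESTAMP)
) s
ON t.enrollment_id = s.enrollment_id
WHEN MATCHED THEN UPDATE SET status = s.status, updated_at = s.updated_at
WHEN NOT MATCHED THEN INSERT (enrollment_id, user_id, status, updated_at)
                      VALUES (s.enrollment_id, s.user_id, s.status, s.updated_at);
"""
```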

I hope that helps somewhat. I’ll try to provide another update once we get our dashboards built out, covering how robust the data truly is and the extent of the lift.



Hi Jenna, thank you very much for the helpful and insightful information!

Can we use Learn Data to extract course and session-related data, including course additional fields? And can it fetch live data?

