Tuesday, 11 November 2008

Per Question Reports

I received a good question from a reader:
Our school district uses Blackboard. We have recently been exploring putting more coursework online and have experimented with the Articulate line of products. Blackboard has done well for us although I am not pleased with their support of the SCORM standards.

I would like to be able to export SCORM test results by user/by question, but all Blackboard seems to support is general pass/fail records. Now, I can drill down to the individual results via the Bb gradebook, but that just won't work for exams that are being given to hundreds of students.

I have spent many hours researching this issue related to SCORM compliance and, more specifically, how to get the test results exported out of my LMS for more in-depth analysis in Excel or perhaps some type of statistics package.

Would you be able to point me in the right direction to a resource to help me get this done, or please let me know if I am wasting my time. Perhaps the world has moved on to something else, as most of the conversations online I am able to find regarding SCORM matters are 2-4 years old, which in Net time, as you know, may as well be decades.
This brings up a bunch of initial thoughts. First, having worked on custom LMS, LCMS, and authoring projects, and being pretty deeply familiar with SCORM, I know it's easy to blame the LMS for poor per-question reporting, but it's very hard for an LMS to create generic reports that work well across all the variations you run into. So, if the statement "I'm not happy with their support for SCORM" stems only from reporting issues, that's a bit unfair.

I'd be curious what they plan to do with question-level reporting. I've had a lot of clients initially say they want it, and I always drill down on why. The most common answer is that they want to know if there are particular questions that lots of people are missing. But in many cases (in corporate eLearning), there will be no attempt to go back and fix the course or update the test. So, if you really aren't going to do anything with the data, don't bother. It's a lot of work to get the data, look at it, decipher what it means, and do something with it.

So let's get to the more helpful pieces of information. First, it appears that they are getting question-level data in the LMS. That's good, and not always the case. In many cases, people will implement a SCORM course and/or LMS to handle only SCO/module-level reporting, or even just course-level reporting. But in this case, they have the data. Good job, Articulate and Blackboard!

So, the easiest thing to do is to get a data extract from the LMS into a CSV that you can then manipulate in Excel. You should absolutely start with this to figure out what you really want to do with the data. You can do this in Blackboard by using "Download Results" with the "By Question and User" option. Here's a good page showing this:

http://kb.blackboard.com/pages/viewpage.action?pageId=14551263
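If Excel alone gets unwieldy for hundreds of students, a few lines of script can answer the most common question ("which questions are lots of people missing?") directly from the export. Here's a minimal sketch; the column names (`Username`, `Question`, `Possible Points`, `Auto Score`) and the sample rows are hypothetical, since the actual headers in a Blackboard export will vary, so adjust them to match your file.

```python
import csv
import io
from collections import defaultdict

# Hypothetical sample of a "By Question and User" export. Real column
# names differ by LMS version, so rename the fields to match your CSV.
sample = """Username,Question,Possible Points,Auto Score
alice,Q1,10,10
alice,Q2,10,0
bob,Q1,10,10
bob,Q2,10,5
carol,Q1,10,0
carol,Q2,10,0
"""

def miss_rates(csv_text):
    """Return {question: fraction of students who scored below full points}."""
    attempts = defaultdict(int)
    misses = defaultdict(int)
    for row in csv.DictReader(io.StringIO(csv_text)):
        q = row["Question"]
        attempts[q] += 1
        if float(row["Auto Score"]) < float(row["Possible Points"]):
            misses[q] += 1
    return {q: misses[q] / attempts[q] for q in attempts}

for question, rate in sorted(miss_rates(sample).items()):
    print(f"{question}: {rate:.0%} missed")
```

For a real file you'd read it with `open(path, newline="")` instead of the inline string; the point is just that once the data is in a flat CSV, per-question analysis is a short step, with or without a statistics package.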

Once you've done this a few times and have figured out what you want your reports to do, you can define a custom report using Crystal Reports. Sometimes it's a bit hard to get at the data on a per-question basis, but I'm pretty sure there's a good view in Blackboard.

Let me say again, do not bother to try to define your custom report until you've done this manually across several courses/tests that will expose all the different ways that the data may come out. Then you will likely begin to appreciate the complexity. You may end up deciding to just look at the data via CSVs.

With all that, I'd love to hear your thoughts/ideas around:

* Do you use question level data / reports? If so, what do you do with it?
* What generic LMS reports have you found useful at the question level?
