Question Set xAPI statements

xAPI coverage in Question Set seems a bit screwed up / meaningless currently.


  • If there are 4 questions, each interaction generates 4 'interacted' verb statements
  • If you configure it not to show solutions, it still does
  • If you click Show solutions, it sends the 'answered' verb statement, but "response" is empty no matter what is selected
  • The 'completed' statement at the end doesn't include any info about what was in the set, just that it was completed

I'm using the latest version of the H5P library (0929) and Drupal (1.18)
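For anyone who wants to inspect exactly which statements Question Set emits, here's a minimal sketch using H5P's standard `externalDispatcher` xAPI event. The `summarize` helper is hypothetical (just for logging); only the dispatcher wiring is H5P's actual API.

```javascript
// Hypothetical helper: pull out the two fields discussed above,
// the verb id and the (possibly empty) response.
function summarize(statement) {
  return {
    verb: statement.verb && statement.verb.id,
    response: statement.result ? statement.result.response : undefined
  };
}

// Browser-only wiring, guarded so the helper can run anywhere.
// H5P fires this event for every xAPI statement its content emits.
if (typeof H5P !== 'undefined' && H5P.externalDispatcher) {
  H5P.externalDispatcher.on('xAPI', function (event) {
    console.log(summarize(event.data.statement));
  });
}
```

Logging statements this way makes it easy to confirm which verbs fire and whether "response" is actually populated.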

falcon's picture

That is true, or at least I would say it is more meaningless than screwed up. I don't know of anything that is actually wrong; the statements from H5P just don't include all possible data yet, and we could add more statements as well. The current implementation can be used to track completion, scoring and more, but it doesn't give you the full picture. We don't have full xAPI support for all content types yet. Regarding your specific comments:

  • 'interacted' is useful in many cases: it tells you the user has done something with one of the content types. It is used by both Question Set and Course Presentation to track which questions a user has done something with, and will be used by the Drupal Quiz module as well to track results from questions.
  • I didn't understand the point about not showing solutions. Has this got anything to do with xAPI?
  • 'answered' should be sent each time a question is checked, and 'completed' when the entire question set is completed and you're at the end screen. It seems to work correctly in the example, at least. Where do you press "Show solution" and get an answered statement? The correct answer and the user's answer are part of the answered statements from Fill in, Multiple Choice and Drag and Drop; the other question types don't have them yet.
  • What extra information are you expecting here? I don't know of any best practice for including, for instance, all questions and alternatives in a statement for a question set, but there might be a common practice I'm not aware of.

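For reference, an 'answered' statement for a multiple-choice question looks roughly like the sketch below. The field names come from the xAPI specification; the concrete values (ids, response format) are illustrative, not copied from H5P's actual output.

```javascript
// Sketch of an 'answered' statement for a multiple-choice question.
// Field names follow the xAPI spec; values are illustrative only.
var answered = {
  verb: { id: 'http://adlnet.gov/expapi/verbs/answered' },
  object: {
    definition: {
      interactionType: 'choice',
      correctResponsesPattern: ['1']  // id of the correct alternative
    }
  },
  result: {
    response: '1',                    // the user's answer; an empty string
                                      // here would match the bug reported above
    success: true,
    score: { raw: 1, max: 1 }
  }
};
```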
I don't think the randomizer works correctly in the Question bank. I set up a test example, StratQbank. It seems to work well in that I can select any number of questions as a subset from the bank, and I love this idea. But when I play the widget, it always seems to select the same subset. I had hoped these would randomize again when you click the Retry button; they don't. And I seem to get the same set each time I try, even when I relaunch the widget. Do you need a fresh browser session for the randomizer to kick in? I hope not.

I'm experiencing this same problem (using the H5P labs site). Is this a caching issue with WordPress or something? I'm not even seeing the subset of questions change after a page refresh or browser cache clearing. I need to have a new subset presented each time the set is retried, as well. 

timothylim23's picture

Thanks for noticing this. The subsets weren't originally designed to randomize on retries. I'm working on a fix for this now.

timothylim23's picture

Hi, I've had a think about randomizing subsets and wanted your input before I proceed. 

The one point against having randomized subsets on retry is that it may be confusing for the learner. As a learner, I would press the Retry button because I wish to improve and if I see new questions every time I retry I may never get that opportunity. This is particularly true with, say, a large question bank with 50 questions and a subset of 5 questions. It may take the learner many retries to see the same question. 

Perhaps there is a middle ground with a new subset only being created on a page refresh at the beginning of the question set.

What are your thoughts?

Hmm, that's a good point. And I think that is a good suggestion: don't re-randomize when the [Retry] button is clicked, but do re-randomize if the page is refreshed. That would also make a good compromise for testing. Additionally, in our own environment (OpenLabyrinth, where we also have H5P embedded), we generally discourage students from using the Back button or Refresh because it can spoil the tracking of other internal scores. This would provide one more incentive.

Well, our plan is to use Question Sets for little mastery-based learning quizzes. We do not want them seeing the exact same quiz each time. This is common in mastery-based assessment where you provide the learner with parallel versions of a quiz when they have to retry it. We could also let the learner view the solutions to the quiz they just attempted before retrying it. I would not want to require the learner to do a page refresh to get a new version of the quiz (because that would also be pretty confusing to them and provide an awkward user experience).

I think any potential confusion could be mitigated with instructions at the beginning of the quiz (e.g., "If you do not earn a passing score, you will need to retry the quiz. You will not see the exact same questions each time you try the quiz." or something like that). 

Maybe make it a configurable option? Allow the pool to be refreshed upon retry or only on page refresh?
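A configurable option like that could be sketched as follows, in plain JavaScript. The option name `randomizeOnRetry` and the `onRetry` handler are hypothetical, not H5P's actual implementation; the subset draw itself is a standard Fisher-Yates shuffle.

```javascript
// Fisher-Yates shuffle; returns a new array and leaves the pool untouched.
function shuffle(pool) {
  var a = pool.slice();
  for (var i = a.length - 1; i > 0; i--) {
    var j = Math.floor(Math.random() * (i + 1));
    var tmp = a[i]; a[i] = a[j]; a[j] = tmp;
  }
  return a;
}

// Draw a subset of n questions from the bank.
function drawSubset(pool, n) {
  return shuffle(pool).slice(0, n);
}

// Hypothetical retry handler: re-draw only when the author enabled it,
// otherwise keep the current subset so the learner retries the same questions.
function onRetry(state, options) {
  if (options.randomizeOnRetry) {
    state.current = drawSubset(state.pool, state.subsetSize);
  }
  return state.current;
}
```

With `randomizeOnRetry` off, Retry keeps the existing subset (the middle-ground behaviour); with it on, each retry draws a fresh parallel version from the bank.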

tim's picture

Thanks for the response, it makes complete sense.

I'll go ahead and develop an option to enable the randomization of the question pool on retry. The content creator will be able to specify in the description the message you posted above:

"If you do not earn a passing score, you will need to retry the quiz. You will not see the exact same questions each time you try the quiz."

timothylim23's picture

Hi, would you mind downloading and attaching the H5P to this conversation and I can have a go at testing it?

I would be happy to share. But I don't seem to have the right privileges to do this. On the H5P Labs site, the widget is here: and the shortcode is [h5p id="162"]

However, the Download link that I see on my own H5P sites is not visible on the Labs site, so I probably don't have the right privileges to do this. 


timothylim23's picture

I have privileges to labs.h5p and I'll check it out there. Thanks :)