Problem with extracting interactive course summary slide score

Miki's picture


I am using WordPress; the H5P content is mostly viewed through small devices. Latest content and plugin versions.

Is there a way through the API to get either:
- The final score (the one shown on the summary slide) at any given moment, so that no matter when the user closes the H5P, we have the last final score.
- The number of activities and the total number of questions (and therefore the points to be awarded) for the whole H5P file.
Our developers report:

"H5P sends data only for slides that have been browsed. If the user browses slide 1, only that slide's data is sent to the server.
Consider content with 10 slides, of which 3 are activities. We get data for all browsed slides, but if the user performs only the first activity and then closes the content, we have no means of detecting that 2 activities remain. This makes the score calculation wrong.
Below is the calculation being performed. Consider that the user has scored 66% on the first slide and has performed only 1 activity, which is on that first slide.
all_activities_count = all activities performed by the user (1, as recorded by the server)
received_score = sum of all scores achieved by the user across all activities (66%)
percentage = received_score / all_activities_count = 66% / 1 = 66%
We need some means to check the number of questions in the other activities that were not browsed."
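As an editor's illustration of the problem the devs describe (the data shape below is hypothetical, not an H5P API), a single browsed activity yields roughly 66% regardless of how many activities were never attempted:

```javascript
// Illustrative sketch of the calculation described above; the data
// shape is hypothetical, not an H5P API. Only browsed activities
// reach the server, so the unbrowsed ones are simply missing here.
const reportedActivities = [
  { raw: 4, max: 6 } // slide 1, the only activity the user performed
];

// Average of per-activity percentages over the activities the
// server knows about -- the calculation the devs describe.
function averageOfPercentages(activities) {
  const sum = activities.reduce((acc, a) => acc + (a.raw / a.max) * 100, 0);
  return sum / activities.length;
}

// With only one activity reported, this is 4/6, even though two
// more activities (and their points) were never attempted.
console.log(averageOfPercentages(reportedActivities).toFixed(1)); // "66.7"
```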
There has to be an easier way to do this. Hopefully it is possible to just get the final score that shows up on the summary slide at any moment (or whenever the user moves from one slide to the next, as the developers said).
Thank you!


tim's picture

Hi Miki, 

You can use the getSlideScores() function in H5P.CoursePresentation to determine the full score of the CP.

Content types that support getXAPIData() expose their full score data through that function, but it is not implemented for Course Presentation yet. In the meantime, you may have to craft a custom solution. 

Feel free to have a look at the following documentation for additional information on how scores are passed between content types:
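A hedged sketch of how the entries returned by getSlideScores() could be totalled. Note that the exact shape of the returned objects (the score and maxScore fields below) is an assumption and should be checked against the H5P.CoursePresentation version you are running:

```javascript
// Hedged sketch: total a Course Presentation score from per-slide
// entries. The { score, maxScore } shape is an assumption -- inspect
// what getSlideScores() actually returns in your H5P version.
function totalScore(slideScores) {
  const achieved = slideScores.reduce((acc, s) => acc + s.score, 0);
  const max = slideScores.reduce((acc, s) => acc + s.maxScore, 0);
  return { achieved, max, percentage: max ? (achieved / max) * 100 : 0 };
}

// In the browser you would get the entries from the running instance,
// roughly like this (untested assumption):
//   const cp = H5P.instances
//     .find(i => i.libraryInfo && i.libraryInfo.machineName === 'H5P.CoursePresentation');
//   const entries = cp.getSlideScores();

// Simulated entries for a presentation with three activities:
const entries = [
  { score: 4, maxScore: 6 },
  { score: 0, maxScore: 1 },
  { score: 0, maxScore: 1 }
];
console.log(totalScore(entries)); // { achieved: 4, max: 8, percentage: 50 }
```

Totalling points this way (rather than averaging per-activity percentages) matches how the summary slide weights activities by their maximum score.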



Miki's picture

Hi again,

Thank you, Tim, for your answer. We have made some progress with this, but we still have some issues. We are not sure if this is a problem on our side at this point. This is what the devs have found:

Taking this activity as an example:

There is something wrong, or something we don't know, about the score calculation on H5P content. Below are the details for the lesson in this ticket.

1. On the first activity, the response of the content is {"min"=>0, "max"=>6, "raw"=>4, "scaled"=>0.6667}, meaning the user gave 4 correct answers out of 6, which calculates to 66.6%.

2. On the next activity, the response is {"min"=>0, "max"=>1, "raw"=>0, "scaled"=>0}, which calculates to 0%.

3. On the last activity, the response is {"min"=>0, "max"=>1, "raw"=>0, "scaled"=>0}, which also calculates to 0%.

Now if we calculate the average of these scores, we get (66 + 0 + 0) / 3, which is 22%. But the summary slide at the end shows 29%. Are there any other considerations when calculating the scores?
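For reference, the two candidate formulas give different results with these numbers, and neither is 29% (a minimal editor's sketch, not H5P code):

```javascript
// Minimal sketch comparing two ways of combining the three
// reported responses above (not H5P code).
const responses = [
  { raw: 4, max: 6 },
  { raw: 0, max: 1 },
  { raw: 0, max: 1 }
];

// Average of per-activity percentages: (66.7 + 0 + 0) / 3
function averagePercentage(rs) {
  return rs.reduce((acc, r) => acc + (r.raw / r.max) * 100, 0) / rs.length;
}

// Total points achieved over total points possible: 4 / 8
function totalPercentage(rs) {
  const raw = rs.reduce((acc, r) => acc + r.raw, 0);
  const max = rs.reduce((acc, r) => acc + r.max, 0);
  return (raw / max) * 100;
}

// Neither result matches the 29% shown on the summary slide.
console.log(Math.round(averagePercentage(responses))); // 22
console.log(Math.round(totalPercentage(responses)));   // 50
```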

tim's picture

Hi Miki, 

Thanks for your thorough response. I'll be working on the summary slide today and will get back to you. 

tim's picture

Hi Miki, 

This is because the summary slide in Course Presentation sums the total score of each question instead of averaging their percentage results:

I can see how this can be quite confusing. I've made an issue to improve it here:

otacke's picture

Personally, I think the current implementation makes much more sense. If you compute the average result based on the percentage of each question, then you neglect the different levels of "difficulty" that are reflected by the maximum score.

If you want the percentage based on the number of completed tasks instead, you can check "Give one point for the whole task" in the tasks' settings. However, in this case you will have only results of 0 (0 %) or 1 (100 %) for each task. Not quite what you are looking for.

Anyway, I am rather wondering why the result in the mentioned example would be 29 %.

(4 + 0 + 0) / (6 + 1 + 1) * 100 = 50

tim's picture

Yes, the math doesn't check out. Miki, do you mind attaching the h5p to this conversation so I can download it and debug it directly?

Miki's picture

Hi, the h5p file is attached.

tim's picture

Hi Miki,

I had a look at the h5p you uploaded and figured out what is happening. 

For activity 2 that you pointed out above, the response actually refers to a sub-question within the Single Choice Set. The entire Single Choice Set has a max score of 7, so you should try to get the xAPI response for the whole Single Choice Set. 

(4 + 0 + 0) / (6 + 7 + 1) ≈ 28.6%, and rounding in the code brings it to the 29% shown on the summary slide. 
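With the Single Choice Set's real max of 7, the summary-slide figure can be verified in a few lines (an editor's sketch; simple rounding is assumed here):

```javascript
// With the Single Choice Set's real max of 7, the summary-slide
// percentage follows from total points over total max.
const raw = 4 + 0 + 0;   // points achieved
const max = 6 + 7 + 1;   // points possible
const percentage = (raw / max) * 100;

console.log(percentage.toFixed(1));  // "28.6"
console.log(Math.round(percentage)); // 29
```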

Miki's picture

Thank you. That clarifies everything. I also agree with the current way of calculating the final score, since it preserves the weight, or difficulty, of each activity in the final result.

We will update our code accordingly. Thank you both for your help.

Miki's picture

Hi Tim, thank you for looking into it. We are looking forward to an update regarding this.