Interactive Video - inconsistent trigger of completion event

Michael Dajewski:

Interactive Video should always trigger the completion event.

If any H5P interaction is capable of triggering the 'completion' event, it should trigger that event regardless of its settings.
In the case of IV, the completion event is not triggered if the editor decides, for example, to remove the 'submit screen'.
Unfortunately, neither this nor the fact that IV will not trigger a completion event when it contains no interactions is clearly documented anywhere.
By the way: who would want to create an interactive video without interactions? Believe it or not, there are users who go that route too.

The main reason for reporting this issue as a bug is that an application which registers a listener for xAPI events may wait forever for a completion event that never comes.
There is no common method to detect, when an H5P interaction is loaded, whether it will ever trigger a completion event.
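To illustrate the problem, here is a minimal defensive sketch an embedding application could use. It assumes the standard `H5P.externalDispatcher` event bus (the documented way to receive xAPI events from embedded content); the timeout fallback and the helper names (`isCompletion`, `waitForCompletion`) are our own workaround for content that never fires completion, not an H5P API.

```javascript
// The xAPI verb that signals completion.
const COMPLETED_VERB = 'http://adlnet.gov/expapi/verbs/completed';

// Pure helper: does an xAPI statement signal completion?
function isCompletion(statement) {
  return statement.verb?.id === COMPLETED_VERB ||
         statement.result?.completion === true;
}

// Wait for a completion statement from the dispatcher, but give up
// after `timeoutMs` so the host application is never stuck forever
// on content (like an IV without a submit screen) that will not
// emit one.
function waitForCompletion(dispatcher, timeoutMs) {
  return new Promise((resolve) => {
    const timer = setTimeout(() => resolve({ completed: false }), timeoutMs);
    dispatcher.on('xAPI', (event) => {
      const statement = event.data.statement;
      if (isCompletion(statement)) {
        clearTimeout(timer);
        resolve({ completed: true, statement });
      }
    });
  });
}

// Typical use in a page embedding H5P content (sketch):
// waitForCompletion(H5P.externalDispatcher, 60000)
//   .then((result) => { if (!result.completed) { /* fall back */ } });
```

The timeout is a heuristic, which is exactly the point of the report: without a uniform way to ask the content "will you ever complete?", every host is forced into guesswork like this.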

I have been mulling over submitting the above as a bug for a long time.
I have looked at many posts on how to examine the IV as it loads; they give great information and prove the H5P team has deep knowledge.
However, all of this applies only to IV, not to any other H5P content type that implements xAPI.
Using H5P interactions should not require the LRS/LMS to examine event statements in retrospect, let alone do so separately for each content type.
This breaks the principle of making things work together.

I do not want to open Pandora's box again.
But I think that IV, which has long been listed as 'Featured' on the examples and downloads page, should be more consistent.


BV52:

Hi Michael,

Thank you for posting your observation and suggestion. I will inform the H5P core team regarding this.


Michael Dajewski:

Hi BV, 
This is not an observation or a suggestion. This is a bug!
If it were a suggestion, I would have posted it in the 'feature requests' forum.
Best regards, 

BV52:

Hi Michael,

I agree with your point above, but I think we just have different definitions of what a bug is (which is why I did not transfer this post to a different forum). In any case, I have already reached out to the core team about your report, and we just need to wait for their feedback.


falcon:

Hi MichaelD,

Thank you for reporting. Next year the Core Team will be working on ways to make completion and limiting the number of attempts easy and consistent, as well as making it much easier to detect whether or not content is scoreable.