Construction of a quiz: the check button


Hello, in the construction of a quiz, is it possible to remove the check button after each question?

fnoks:


Currently there is no such option in the editor, since the check button is used by many question types to give feedback.

The "check" button shows the answer to the question. Anyone trying to simulate a real exam needs to be able to disable this. If you take any test in real life (e.g. PMP, NCLEX, LSAT) at any testing center, you are not allowed to check your solutions as you go along. They are either revealed at the end of the exam or not at all.

I do see the benefit of having the "check" button in the quizzes.  However, to make it fully useful for anyone trying to simulate any real exam, there must be an option to disable this. 

What are the options to get this done through github or otherwise?  Can I pay someone to add the option to disable it?

otacke:

This could become a big sweep for Question Set, right?

  1. We have a feature request to optionally change the feedback animation (could as well mean removing it including the feedback text).
  2. We have a feature request to remove the retry option for the complete set of questions.
  3. We now have a feature request to remove the instant feedback.

If 1) is interpreted as "removing", this would merely mean adding an option in semantics.json that can be used to toggle the visibility of the feedback object [edit: it seems there was already a request for this, and there's even a highly prioritized task in Jira]. A more elaborate version might offer at least one alternative progress bar CSS design to choose from, e.g. one without the star and without rounded edges.
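For illustration, a minimal sketch of what such a toggle could look like as a semantics.json field. The field name `disableFeedback` is my invention, not an existing H5P option; the attribute names (`name`, `type`, `label`, `description`, `default`, `optional`) follow the standard H5P semantics format:

```json
{
  "name": "disableFeedback",
  "type": "boolean",
  "label": "Hide feedback",
  "description": "Hypothetical option: when enabled, the feedback object (star animation and feedback text) is not shown.",
  "default": false,
  "optional": true
}
```

The editor would render this as a simple checkbox, and the view code would skip building the feedback object when it is set.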

2) basically means the same as 1), but for hiding the "retry" button at the end; this change would allow one-time-only tests.

3) would require more work, since it involves touching several content libraries, but I guess the override principle used for "show solution" and "retry" could be used to trigger 1) in each question inside a question set and also deactivate other feedback [edit: it seems there was already a request for this, and there's even a highly prioritized task in Jira]. However, there are probably other things to consider: e.g. the label "check" on the button might no longer make sense and would have to be changed manually (or automatically, using a new label), the "show solution" button might optionally be disabled, etc.
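To make the override principle concrete, here is a hedged sketch of how Question Set's existing override group in semantics.json could be extended. The `showSolutionButton` entry reflects (roughly) how the existing override options are structured; the `checkButton` entry is purely hypothetical and does not exist in H5P today:

```json
{
  "name": "override",
  "type": "group",
  "label": "Override settings",
  "fields": [
    {
      "name": "showSolutionButton",
      "type": "select",
      "label": "Show \"Show solution\" button",
      "options": [
        { "value": "on", "label": "Enabled" },
        { "value": "off", "label": "Disabled" }
      ],
      "optional": true
    },
    {
      "name": "checkButton",
      "type": "select",
      "label": "Show \"Check\" button (hypothetical)",
      "options": [
        { "value": "on", "label": "Enabled" },
        { "value": "off", "label": "Disabled" }
      ],
      "optional": true
    }
  ]
}
```

The idea is that the question set would pass these override values down to each contained question, the same way "show solution" and "retry" overrides already propagate.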

Does this sound plausible and sensible?

For #3 (trying to hide the "check" button on quiz sets), I think all we need is to add it as a third option in the button settings, or to have the button disappear if the user deletes its label text. I'm not sure what is required on the backend, but if my webmaster can provide some coding, let me know and I can get her involved.

Kiosa Coup:

In case you're not getting all the positive feedback that you so richly deserve:

We high-frequency, low-stakes quizzing folk LOVE the Question Set, Check & Retry options.

Thank you very much.

Our students love that Gold Star too! Especially (weirdly?) after they've had to engage with a detected error and adjust their answer: having to hesitate, and venture even a few tries, seems to make that Gold Star even shinier.

See Moser et al. (2011) for more on adaptive post-error adjustments.

Apparently it's much more gratifying than just getting the correct answer on the first try.

If you're going under the hood, how difficult would it be to give us the option to vary the "you did very well" text?

We are using the WordPress version.



tim:

It depends on what you wish to vary it by :)

Would you like it to cycle through different statements when students do 'very well'?


Kiosa Coup:

We are interested in the topic, and mindful that it may not be that simple.

What do others think?

We are not using H5P to simulate high-stakes testing; we are using it strictly for high-frequency, low-stakes retrieval testing, to strengthen semantic encoding as a prequel to elaboration and synthesis. We like the instant feedback because, in the United States in particular, a game-show mentality has set in and we need to collectively learn how to slow down and think before we answer. So we need to run interference on the habit of jumping for an answer too quickly, by making engagement with post-error adjustment strategies a pain-free no-brainer.

In light of everything else, "You did very well" seems to be doing very well.

When it comes to feature requests, I'll admit I hesitate to single-handedly flag priorities, since we have only been working with the app for two months and beta testing for one. Before that we were mostly working with Anki, with pit stops at Memrise, neither of which turned out to be a workable option for our beta testers.


tim:

I'm curious about your project :) What were the main differences between the platforms and why have you landed on H5P?


Kiosa Coup:

Because it's built on a distributed model. Mind you, PHP is SLOOOOOOOOOOOOOOOOW. But once you have those quizlets up and running, it's Autobahn time. You guys are literally halfway between Memrise and Anki (both really great projects). It's almost funny: they're both siloed at either end of the spectrum, and it's not that there isn't a place for that. It's just not as wide-open a place.

Plus: it's not clear that Wozniak's Algorithm is the holy grail of secret sauces.

I found you through the Moodle seminar. Someone said something smart in the forum and I clicked through to their profile. Moodle runs that course twice a year, it's a very good course, and every year vets show up. Last January at least one of them mentioned you in their profile.

tim:

Great, thanks for the feedback! I'm a heavy user of Memrise and Anki myself. I do like their algorithmic approach to learning (even if faith in it may be misplaced), but it's quite refreshing to see content created in a more free-form way using H5P :)

Let's make sure everyone gets a chance to use this Gutenberg press of interactive content.