Websites for crowdsourcing cultural heritage support contribution quality by minimizing user error.
Explanation: Where feasible, the website should alert users to actual errors, such as incorrectly formatted data, incomplete required fields, or spelling mistakes. Websites that assign the same task to multiple users and rely on computational algorithms for quality control should explain this to users to allay concerns about potential error.
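The kinds of alerts described above amount to simple field-level validation with informative, field-specific messages. The sketch below is a minimal illustration under assumed conditions, not drawn from any particular project; the `Contribution` fields and the `validateContribution` function are hypothetical.

```typescript
// Hypothetical contribution form for a transcription task.
interface Contribution {
  transcription: string;
  date: string; // expected format: YYYY-MM-DD
}

// Validate a contribution and return informative, field-specific messages
// rather than a generic "invalid input" error.
function validateContribution(c: Contribution): string[] {
  const errors: string[] = [];
  if (c.transcription.trim().length === 0) {
    errors.push("Transcription is required: enter the text you can read, or skip this asset.");
  }
  if (!/^\d{4}-\d{2}-\d{2}$/.test(c.date)) {
    errors.push(`Date "${c.date}" is not in YYYY-MM-DD format, e.g. 1887-03-02.`);
  }
  return errors; // an empty array means the contribution can be submitted
}
```

Returning all messages at once, rather than stopping at the first problem, lets the interface highlight every field that needs attention before submission.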
Users should also be able to flag potential errors, which may stem from the low quality of the digitized asset being worked on, or from insufficient knowledge or skill to complete the task with confidence (this may be of limited relevance when users are recording/creating content, providing contextual information, or correcting/modifying content).
Benefits: Minimizing task error reduces the need for editorial intervention by the project team and gives contributors confidence in the contributions they submit.
Examples of compliance with this principle:
- Diagnostic tasks or sandboxes for new contributors.
- Tools for standardization such as calendars for date formatting, automated capitalization, and authority control/controlled vocabulary in the form of drop-down lists or predictive text.
- Automatic reminders.
- Informative error messages.
- Summaries or previews of contributions that encourage review prior to submission.
- Options to skip or flag difficult or ambiguous assets.
- Navigation back to a previous step in the task for editing.
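The authority control/controlled vocabulary tools listed above can be sketched as predictive text over a fixed term list, steering free-text input toward standardized values. This is a hedged illustration: the `OBJECT_TYPES` vocabulary and the `suggestTerms` function are invented for the example, not taken from any real project.

```typescript
// Hypothetical controlled vocabulary of object types for a cataloguing task.
const OBJECT_TYPES = ["Daguerreotype", "Drawing", "Engraving", "Etching", "Photograph"];

// Suggest vocabulary terms matching what the user has typed so far,
// so contributors pick standardized values instead of free text.
function suggestTerms(input: string, vocabulary: string[]): string[] {
  const prefix = input.trim().toLowerCase();
  if (prefix.length === 0) return vocabulary; // empty input: offer the full list
  return vocabulary.filter((term) => term.toLowerCase().startsWith(prefix));
}
```

Typing "d" would narrow the drop-down to "Daguerreotype" and "Drawing"; the same filter can back either a drop-down list or inline predictive text.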