Quality Assurance for Crowdsourced Content

As social elements have become woven into technology and product development, crowdsourcing has emerged as one of the major ways the community gets involved.

Crowdsourcing is especially handy in software quality assurance, where it leverages a community's skills and knowledge as end users, its domain expertise, and its testing background.

While crowdsourced testing continues to help ship quality products in today's agile world, it has become equally popular to have the crowd create content across domains, despite the inherent risks that poses.

Typically, resident content teams have been an integral part of larger product teams, owning content creation because of their subject matter expertise. Online content continues to be very dynamic: it is strategic in driving business and keeps content teams engaged across milestones and releases.

One of the more popular methods of creating content in recent times has been to use the crowd. Such crowdsourced content tends to be more comprehensive and diverse. An early and successful example of crowdsourced content familiar to all of us is Wikipedia, and Twitter is increasingly being used to assimilate the crowd's input on varied topics.

While all of this contributes to the content's richness, how do you verify the authenticity of crowd-generated content, especially when it is to be used commercially? Below are the core approaches and best practices for generating and verifying crowdsourced content.

Understand where to use crowdsourced content. Use the crowd in areas where content is difficult to create internally, a core subject matter expert team is expensive, or the diversity of the crowd adds to the content's richness. Good examples include Swype and Livemocha.

Work with a selected crowd chosen for its subject matter expertise. When such selective hiring happens, the crowd is quite similar to an internal resident content team, except that its members are not officially part of the product team. This strengthens confidence in the content they create, minimizing the need for rigorous content verification while still offering the benefits of crowdsourced content.

Leverage the crowd to verify the content. If the application or product is an open forum such as Wikipedia, some of the content verification will be self-imposed, where the crowd itself validates and verifies the content created by the community. Depending on what the product is, you can leverage this technique as a standalone measure or as a supplement to internal verifications.
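To make this concrete, here is a minimal sketch in Python of how crowd verification might be routed. The thresholds, field names, and triage states are illustrative assumptions, not the workflow of any particular platform:

```python
from dataclasses import dataclass

# Hypothetical thresholds; real values would be tuned per product.
MIN_REVIEWS = 3     # don't trust an item until enough peers have looked at it
MIN_APPROVAL = 0.8  # fraction of positive reviews required to auto-accept

@dataclass
class ContentItem:
    item_id: str
    body: str
    upvotes: int = 0
    downvotes: int = 0

def triage(item: ContentItem) -> str:
    """Route a crowd-reviewed item: accept, reject, wait, or escalate internally."""
    total = item.upvotes + item.downvotes
    if total < MIN_REVIEWS:
        return "pending"    # not enough community signal yet
    approval = item.upvotes / total
    if approval >= MIN_APPROVAL:
        return "accepted"   # the crowd agrees it is sound
    if approval <= 1 - MIN_APPROVAL:
        return "rejected"   # the crowd agrees it is not
    return "escalated"      # no consensus: send to internal verification
```

Used as a standalone measure, only the "accepted" and "rejected" outcomes matter; used as a supplement, the "escalated" bucket becomes the work queue for internal reviewers.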

Interestingly, a study comparing Encyclopaedia Britannica and Wikipedia showed that Wikipedia was not very far behind on content accuracy, thanks to the self-correcting techniques that the community offers.

Promote quality assurance. You can achieve this by defining clear content acceptance criteria up front to gate the quality of the content flowing in, as sketched below.
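One way to express such a gate is as a checklist of named predicates run at intake. The specific criteria here (a minimum word count, a citation requirement, a placeholder-text check) are placeholder assumptions; a real team would substitute its own content guidelines:

```python
import re

# Illustrative acceptance criteria: each is a (name, predicate) pair.
CRITERIA = [
    ("has minimum length", lambda text: len(text.split()) >= 50),
    ("cites a source", lambda text: "http" in text or "[ref]" in text),
    ("no placeholder text",
     lambda text: not re.search(r"\b(TODO|lorem ipsum)\b", text, re.I)),
]

def gate(text: str) -> list[str]:
    """Return the names of all criteria a submission fails; empty means it passes."""
    return [name for name, check in CRITERIA if not check(text)]

failures = gate("lorem ipsum ...")
if failures:
    print("Rejected at intake:", ", ".join(failures))
```

Naming each criterion pays off in practice: contributors get actionable feedback on why a submission was rejected rather than a silent drop.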

Maintain a core resident content team. It is important to have a core team that performs in-depth reviews and spot checks on crowdsourced content to verify existing material and defines ongoing corrective measures as needed to maintain the desired level of quality.
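For the spot checks themselves, a simple random sample is often enough to start with. This sketch assumes a flat list of item identifiers and a 5 percent sampling rate, both of which are placeholders:

```python
import random

def sample_for_spot_check(item_ids: list[str], rate: float = 0.05,
                          seed: int = 42) -> list[str]:
    """Pick a random fraction of crowdsourced items for in-depth review
    by the resident content team. The 5% rate and fixed seed are
    illustrative defaults, not recommendations."""
    if not item_ids:
        return []
    rng = random.Random(seed)       # seeded so an audit can be reproduced
    k = max(1, round(len(item_ids) * rate))
    return rng.sample(item_ids, k)
```

If spot checks keep surfacing the same class of defect, that is a signal to raise the sampling rate or tighten the intake criteria rather than to review more items by hand.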
