Theknowledgecore's Blog

Complexity and Knowledge Management Navigators…

Caveat emptor! Crowd Sourcing and the answer to the million dollar question

This story starts with a conference Twitter feed and some Tweets responding to a conference discussion on knowledge sharing in a police force:

Participant A:  The rank structure in the police is a barrier to knowledge sharing: worrying

Participant B:  same in any hierarchy of course which is why the democratic nature of social media is compelling

Participant B:  Crowdsourcing solving crime suggested by @xxxx – what wld sherlock say? Baker St irregulars?

Now, let’s think about this for a moment… the democratic nature of social media and crowd sourcing…

Picture the game show, Who Wants to Be a Millionaire?…

Five hundred people in the audience, one lifeline left (ask the audience) and one question between you and one million [fill in the most valuable currency at this moment in time].  You look at the question on the screen; sweat trickles down your back; you don’t have a clue what the answer is.  You decide it’s time to ‘ask the audience’.  Five hundred people give you their opinions: 26% say ‘A’; 22% ‘B’; 28% ‘C’; and 24% ‘D’.  What do you do?  This is a question with one correct answer, a simple problem, but who knows the correct answer?  In this scenario, between 72% and 78% of the audience are incorrect.  Wouldn’t you prefer to ‘phone a friend’, the one person you know would be able to answer this question, so that you could make the right decision?  Not having that choice, you can refuse to answer and leave with half a million of the currency of your choice, go with the majority (democracy in action), or gamble on a signal from an inner sense… The half-a-million decision!
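As a sketch of why ‘ask the audience’ can go either way, here is a small hypothetical simulation (all parameter values are illustrative, not taken from the post). When voters who don’t know the answer guess uniformly, even a small knowledgeable minority is enough for the plurality to find the right answer; when the non-knowers share a plausible misconception, the plurality locks onto the wrong one:

```python
import random

def poll(n_voters, p_know, bias_to_wrong, seed=None):
    """Simulate an 'ask the audience' poll with four options.

    Option 0 is the (hypothetical) correct answer. Each voter either
    knows it (probability p_know), falls for a shared plausible-but-wrong
    distractor, option 1 (probability bias_to_wrong), or guesses
    uniformly among the four options.
    """
    rng = random.Random(seed)
    votes = [0, 0, 0, 0]
    for _ in range(n_voters):
        r = rng.random()
        if r < p_know:
            votes[0] += 1                  # knows the answer
        elif r < p_know + bias_to_wrong:
            votes[1] += 1                  # shared misconception
        else:
            votes[rng.randrange(4)] += 1   # pure guess
    return votes

def plurality_correct_rate(trials, **kwargs):
    """Fraction of polls in which the plurality picks option 0."""
    wins = 0
    for t in range(trials):
        votes = poll(seed=t, **kwargs)
        if votes.index(max(votes)) == 0:
            wins += 1
    return wins / trials

# Unbiased crowd: a 10% knowledgeable minority is amplified by the vote.
print(plurality_correct_rate(500, n_voters=500, p_know=0.10, bias_to_wrong=0.0))
# Shared misconception: the plurality reliably converges on the wrong answer.
print(plurality_correct_rate(500, n_voters=500, p_know=0.10, bias_to_wrong=0.20))
```

The design point is that the crowd’s value depends less on its size than on whether its errors are independent; a shared wrong intuition is exactly what the audience split above suggests.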

Okay, you could argue that we are forcing people to participate, to make a choice, and therefore this is not a fair reflection of the crowd sourcing process.  So, what about a real-world scenario in a real-world organisation?

First, let’s look at ‘A review of vital best practices in open innovation for enterprises, governments, universities and other organizations’ and its 8 principles for successful crowd sourcing:

1.  Right Purpose:  Call to the crowd for insight you will act on

2.  Right Call:  Tell the crowd what you want from them

3.  Right Crowd:  Crowd needs to be diverse and qualified

4.  Right Incentives:  Glory, ego, altruism, greed – feed the crowd’s needs

5.  Right Model:  Define the output needed from the crowd

6.  Right Promotion:  Get people in and get them spreading the word

7.  Right Community Management:  Nurture crowd participation and know when to chime in

8.  Right Technology:  Features vary, beware one size fits all

This is interesting, as the conference Twitter feed also produced this earlier Tweet from the people discussing crowd sourcing:

Participant B:  Very worrying talk about models – sound like recipe based plans rather frameworks for analysis

It seems crowd sourcing has its own models for success, but I digress… back to my example:

Employees interviewed in a large multinational organisation stated that if they had a problem where they didn’t know the answer, and didn’t know the right person to call, they would send out a ‘general’ organisation-wide email asking for assistance.  Sometimes, if it was a simple problem based on standard operating procedures, the response was overwhelmingly in favour of one solution: “For £100, what colour means ‘go’ when you are waiting at a traffic light?”  Easy!  However, when the questions were more complicated or complex, the responses became more varied.  Now, what is interesting is that in this scenario the respondents were not required to respond, and those who did believed they knew the answer.  The ‘crowd’ had an equal say and chose to participate through self-selection, yet the responses showed the same variation as the game-show example.  In one such case the correct answer was ‘D’, but the employee went with the majority and chose ‘C’.  The result was an upset client and a post-project refund of $200,000.

You could argue that the ‘model’ applied in this case did not meet the preconditions set out in the 8 principles listed above, but the bottom line is that the problem was addressed via crowd sourcing, the solution was chosen by democratic means and, ultimately, it was wrong.

To be clear, I believe crowd sourcing does have its advantages, for example:

The Vancouver Police Department has put up a website entitled Hockey Riot 2011, informing people about the VPD’s investigations into the 2011 Stanley Cup Riot. It also asks people to contribute any pictures or video that they may have taken during the riot, with the goal of identifying people who may have participated in the rioting. The site also reminds people to not use social media to take justice into their own hands, instead leaving it to the police. As of July 1, 2011, 101 arrests have been made.

A simple problem with a simple solution.  Or what about this:

In 2009, UK newspaper The Guardian was in trouble. Its rival broadsheet The Daily Telegraph had been publishing a series of spectacular accounts from leaked documents that exposed expenses fraud by some of the country’s politicians. It was the political story of the decade and the Telegraph had an exclusive that enabled it to grip the nation alone — at least until the government spoiled its party by making all of the documents available to the public.

When that happened though, The Guardian, like the rest of the press, was faced with the task of reading 457,153 pages of dumped expenses reports in order to separate MPs’ bills for paperclips and printer ink from claims for private moats and duck houses.

Unlike other media outlets though, The Guardian turned to its readers. As Michael Anderson of the Nieman Journalism Lab at Harvard University explains, the newspaper placed all of the documents on its servers then invited volunteers to sort through them, marking each page as “not interesting,” “interesting but known,” “interesting” or, most urgently,  “investigate this!”

Within 80 hours, Guardian readers had sorted through 170,000 documents with a visitor participation rate of 56 percent.

It was a perfect act of crowdsourcing, completing an important task that the newspaper could not have done alone. And it worked because the project combined all of the elements necessary for a successful appeal to the masses.

The bottom line: caveat emptor!  I would also argue that the more complicated or complex the problem posed, the greater the variety of response, the greater the risk to the decision-making process, and the greater the challenge for crowd-sourced solutions.
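This closing argument has a classical counterpart in Condorcet’s jury theorem: for a binary question, if each voter is independently correct with probability p above 0.5, the majority becomes almost surely right as the crowd grows, but on problems hard enough to push p below 0.5, the same mechanism makes the majority almost surely wrong. A minimal sketch (function name and parameter values are mine, for illustration):

```python
from math import comb

def majority_accuracy(n, p):
    """Probability that a majority of n independent voters, each correct
    with probability p, picks the right binary answer (Condorcet's jury
    theorem; n should be odd to avoid ties)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

# Slightly-better-than-chance voters: the crowd amplifies their edge.
# At p = 0.5 the crowd adds nothing; below 0.5 it amplifies the error.
for p in (0.6, 0.5, 0.4):
    print(p, round(majority_accuracy(101, p), 3))
```

In other words, aggregation amplifies whatever competence (or incompetence) the individual respondents bring, which is why harder questions make crowd-sourced answers riskier rather than safer.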

This entry was posted on November 19, 2011 in Knowledge Management and Learning organisations.