In the previous post I argued that, to really take off, citizen intelligence needs a good marketplace to create and stimulate a strong market. There are many different kinds of marketplace, and I briefly discussed two: Airbnb and StackOverflow. These embody an important contrast, with the former driven by supply and the latter by demand. Citizen intelligence needs the demand-driven kind, but how exactly that might work remains to be explored.
The demand-driven crowdsourcing approach I regard as most promising is the sponsored challenge. The idea is that an organisation specifies, on a suitable platform, a problem to be solved and a prize pool. Individuals or teams enter solutions into the contest, and an evaluation process determines the prizewinner(s).
Sponsored challenges are already used to crowdsource sophisticated work of many different kinds. One example is the data analytics crowdsourcing platform Kaggle. An organisation posts on Kaggle a dataset, a description of the kind of analytics they’d like performed on it, and a prize amount, e.g., $20,000. Individuals or self-organised teams of actual or aspiring data scientists submit their solutions as code, which Kaggle runs, in a fully automated manner, to determine how well each solution performs. When the challenge period ends, the best performer takes home the prize and the bragging rights.
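The automated evaluation described above can be sketched in a few lines. This is a minimal illustration in the spirit of how such a platform ranks submissions, not Kaggle’s actual pipeline; the function names, the accuracy metric, and the toy data are all assumptions for the sake of the example.

```python
def accuracy(preds, truth):
    """Fraction of predictions that match the held-out labels."""
    return sum(p == t for p, t in zip(preds, truth)) / len(truth)

def leaderboard(submissions, hidden_truth):
    """Run each submitted model against the hidden test set and rank by score."""
    scores = {team: accuracy(model(), hidden_truth)
              for team, model in submissions.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Each lambda stands in for submitted code that produces predictions.
hidden_truth = [1, 0, 1, 1, 0]
submissions = {
    "team_a": lambda: [1, 0, 1, 0, 0],  # 4 of 5 correct
    "team_b": lambda: [1, 1, 1, 1, 1],  # 3 of 5 correct
}
print(leaderboard(submissions, hidden_truth))
# [('team_a', 0.8), ('team_b', 0.6)]
```

The key design point is that scoring requires no human judgement: because the metric is computable, the platform can evaluate thousands of entries and maintain a live leaderboard automatically. That is precisely what is harder for intelligence reports, as discussed below.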
A citizen intelligence sponsored challenge site would work in much the same way. A “customer” organisation needing intelligence on a particular topic would post a description of what they want, specify a prize pool, and set a deadline. Individuals or self-organised teams of citizen analysts would submit reports or “products.” The best reports are forwarded to the customer. (How these are selected is one of many tough problems I’ll address later.) The customer would indicate how the prize pool should be distributed to the report authors. At any time there would be many challenges on the platform, of varying durations.
Note that this is not a freelancer model. The customer does not contract citizen analysts to deliver a product for a specified sum of money. That said, a successful platform built on a sponsored challenge model might also come to support contract work. An analyst team might so impress a customer in a challenge that the customer then commissions products directly from that team.
Kaggle has been quite successful, which gives some basis for modest optimism regarding the challenge approach. At the time of writing, Kaggle has twenty active challenges, or “competitions” in its lingo. At the head of the list is a challenge with a $100,000 prize, at which 2,927 teams (individuals or groups) are having a crack. One Renjie Quan, a graduate student at Fudan University, tops the leaderboard. Kaggle began as a humble startup in Melbourne, Australia, rose to global prominence, and was acquired by Google in 2017.
Thus a citizen intelligence crowdsourcing site can be seen as “Kaggle for intelligence.” But intelligence, even intelligence analysis, is not data analytics. These are deeply different activities (though intelligence analysis can and increasingly does include data analytics). A model that works for one may not work for the other.
In a following post, I’ll discuss some of the challenges involved in getting a kind of Kaggle for citizen intelligence up and running. Then, in a subsequent post, I’ll discuss some of the serious problems that might arise if such a platform achieved serious scale.
Interested in a weekly digest of content from this blog? Sign up to the Analytical Snippets list. This blog, and the list, is intended to keep anyone interested in improving intelligence analysis informed about new developments in this area.
Thanks to Nick Youngson for the image.