In 1714, the British government held a contest.
They offered a large cash prize to anyone who could solve the vexing “longitude problem” — how to determine a ship’s east/west position on the open ocean — since none of their naval experts had been able to do so.
Lots of people gave it a try. One of them, a self-educated carpenter named John Harrison, invented the marine chronometer, a rugged and highly precise clock that did the trick. For the first time, sailors could accurately determine their location at sea.
A centuries-old problem was solved. And, arguably, crowdsourcing was born.
Crowdsourcing is basically what it sounds like: posing a question or asking for help from a large group of people. The term was coined in 2006, and the practice has taken off in the internet era. Think of Wikipedia, built by thousands of unpaid contributors and now vastly larger than the Encyclopedia Britannica.
Crowdsourcing has allowed many problems to be solved that would be impossible for experts alone. Astronomers rely on an army of volunteers to scan for new galaxies. At climateprediction.net, citizens have linked their home computers to yield more than a hundred million hours of climate modeling, making it the world’s largest forecasting experiment.
But what if experts didn’t simply ask the crowd to donate time or answer questions? What if the crowd was asked to decide what questions to ask in the first place?
Could the crowd itself be the expert?
That’s what a team at the University of Vermont decided to explore — and the answer seems to be yes.
Prediction from the people
Josh Bongard and Paul Hines, professors in UVM’s College of Engineering and Mathematical Sciences, and their students set out to discover whether volunteers visiting two different websites could pose, refine, and answer one another’s questions in a way that effectively predicted the volunteers’ body weight and home electricity use.
The experiment, the first of its kind, was a success: the self-directed questions and answers posed by visitors to the websites led to computer models that effectively predict users’ monthly electricity consumption and body mass index.
Their results, “Crowdsourcing Predictors of Behavioral Outcomes,” were published in a recent issue of IEEE Transactions on Systems, Man, and Cybernetics: Systems, a journal of the Institute of Electrical and Electronics Engineers.
“It’s proof of concept that a crowd actually can come up with good questions that lead to good hypotheses,” says Bongard, an expert on machine science.
In other words, the UVM project shows that the wisdom of the crowd can be harnessed to determine which variables to study, while at the same time the crowd provides a pool of data by responding to the questions its members ask of each other.
“The result is a crowdsourced predictive model,” the Vermont scientists write.
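The paper’s actual modeling pipeline isn’t reproduced here, but the general idea can be sketched in a few lines of Python. In the hypothetical example below, crowd-posed questions serve as candidate predictor variables, the crowd’s answers serve as the training data, and a sparse regression (scikit-learn’s LassoCV) keeps only the questions that help predict the outcome. Every question, answer, and outcome value shown is invented for illustration.

```python
# Minimal sketch (not the authors' code) of a crowdsourced predictive model:
# crowd-posed questions become candidate variables, and the crowd's answers
# to those questions become the training data for a predictive model.

import numpy as np
from sklearn.linear_model import LassoCV

# Hypothetical crowd-posed questions (candidate predictors).
questions = [
    "How many hours of TV do you watch per day?",
    "How many sugary drinks do you have per week?",
    "Do you usually take the stairs? (0 = no, 1 = yes)",
]

# Hypothetical answers: one row per participant, one column per question.
X = np.array([
    [4.0, 10, 0],
    [1.0,  2, 1],
    [2.5,  5, 0],
    [0.5,  1, 1],
    [3.0,  7, 0],
    [1.5,  3, 1],
])

# Hypothetical outcome reported by the same participants (e.g., body mass index).
y = np.array([31.2, 22.4, 27.8, 21.0, 29.5, 23.3])

# A sparse linear model zeroes out questions that don't help predict the
# outcome, which is one simple way to let the data pick the "good" questions.
model = LassoCV(cv=3).fit(X, y)

for question, weight in zip(questions, model.coef_):
    print(f"{weight:+.2f}  {question}")
```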
via University of Vermont