Wikipedia pages about big news events are edited so fast and furiously that it's hard to keep track of why decisions were made, whether the editing is actually efficient, and whether the most trustworthy sources are the ones being cited. A new experiment with crisis-mapping organization Ushahidi aims to change that. Ushahidi announced this morning that it has launched a project called WikiSweeper, a version of Sweeper, its open source hybrid of TweetDeck and Google Reader, built specifically for Wikipedia editors.
Sweeper aggregates all kinds of multimedia streams into an analytics-rich, real-time curation interface. As Wikipedians use the tool, it will gather analytics to help Wikipedia study how high-pressure editing gets done and to help Ushahidi further build out Sweeper.
“Sweeper can also be configured to be a passive filter for data,” Ushahidi explains, “meaning you can set it to aggregate content, then automatically perform certain tasks around that. ex. Aggregate all tweets from #hashtag tagged in the state of Maine and send only that data to another platform. When used in this way, Sweeper essentially becomes a smart cron tool equipped with geo-tagging, natural language processing and other powerful contextual features.” That sounds awesome. Check out the screenshots of Sweeper below and imagine that tool in the hands of Wikipedia editors, instead of just their own cobbled-together research systems.
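To make that "passive filter" idea concrete, here is a minimal sketch in Python of the kind of aggregate-filter-forward job Ushahidi is describing. To be clear, this is not Sweeper's actual API or configuration format, which isn't shown here; the tweet field names, the rough Maine bounding box, the #mainestorm hashtag, and the FORWARD_URL endpoint are all illustrative assumptions.

```python
# Hypothetical sketch of a Sweeper-style passive filter job.
# None of these names come from Sweeper itself; they are placeholders.
import json
import urllib.request

FORWARD_URL = "https://example.org/ingest"  # hypothetical downstream platform

# Rough bounding box for the state of Maine (illustrative, not exact borders).
MAINE = {"lat": (42.9, 47.5), "lon": (-71.1, -66.9)}


def matches(tweet, hashtag):
    """True if the tweet carries the hashtag and is geotagged inside Maine."""
    geo = tweet.get("geo")  # assumed shape: {"lat": float, "lon": float}
    if hashtag not in tweet.get("text", "") or not geo:
        return False
    return (MAINE["lat"][0] <= geo["lat"] <= MAINE["lat"][1]
            and MAINE["lon"][0] <= geo["lon"] <= MAINE["lon"][1])


def forward(tweet):
    """Send one filtered tweet to the downstream platform as JSON."""
    req = urllib.request.Request(
        FORWARD_URL,
        data=json.dumps(tweet).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)


def sweep(tweets, hashtag="#mainestorm"):
    """Aggregate incoming tweets, keep only the geo + hashtag matches, pass them on."""
    for tweet in tweets:
        if matches(tweet, hashtag):
            forward(tweet)
```

In Sweeper itself, this kind of filtering would presumably be configured through the interface rather than hand-coded, and run on a schedule, which is exactly what makes the "smart cron tool" framing apt.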
Below, the stream of news in Sweeper (screenshot).
The idea of pared-down old Wikipedia getting some custom-built, self-aware news power tools from Ushahidi, and of Ushahidi learning from Wikipedia's thriving community of news editors, is really exciting. If they can get the user experience right (something both organizations have struggled with), then I think the combined effort could make both of them even more powerful sources of real-time international news. CNN just might eat their dust.