Transparency bots hint at better uses for news algorithms

Editor’s note: This is a preview of a story that will be published in the summer 2014 print issue of Gateway Journalism Review.

Monitoring Wikipedia edits made from Russian government addresses, an automated tool caught controversial changes in the wake of the Malaysia Airlines Flight 17 crash in Ukraine this July. Someone at VGTRK, a state-run TV and radio network, anonymously removed references to missiles supplied by the Russian Federation and named Ukrainian soldiers as the culprits instead.

The program, or bot, that discovered the edit then automatically posted it to Twitter from its account, @RUGovEdits. A similar bot, watching for changes from Boeing IP addresses, discovered edits to the article on Israel’s Iron Dome air defense system. The Boeing additions cast doubt on an analysis that found the system’s intercept rate to be low.

Both monitoring tools were created from the blueprints for @congressedits, which watches for changes made on Capitol Hill.

These simple automated alerts could be the beginning of a brighter future for journalist-algorithm relations. Rather than spitting out financial reports or sports recaps, algorithms put to this use reach for the profession’s highest ideal: bringing more transparency to the actions of government.

Perhaps more important, the transparency bots have been built from open source code, which has allowed others to quickly adapt them to their own aims. Although most adaptations have simply shifted the watch from one nation’s government to another’s, other possibilities abound.
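The underlying recipe is simple enough to sketch. The Python example below is not the published bots’ source code, but it illustrates the basic idea: listen to Wikipedia’s public recent-changes feed, keep only anonymous edits whose IP address falls inside a watched block, and report them. The feed URL is Wikimedia’s real EventStreams endpoint; the watched IP range is a placeholder (a reserved documentation block, not a real government network), the `requests` library is an assumed dependency, and the final step prints to the console where a real bot would post to Twitter.

```python
# Minimal sketch of a Wikipedia "transparency bot" (not the actual
# @congressedits or @RUGovEdits code): watch the public recent-changes
# feed and flag anonymous edits made from a watched IP range.
import ipaddress
import json

import requests  # third-party HTTP library: pip install requests

# Placeholder range to watch; a real bot would load the IP blocks of a
# particular government or company from a configuration file.
WATCHED_RANGES = [ipaddress.ip_network("192.0.2.0/24")]

# Wikimedia's public server-sent-events feed of recent changes.
STREAM_URL = "https://stream.wikimedia.org/v2/stream/recentchange"


def is_watched(editor: str) -> bool:
    """Return True if the editor string is an IP inside a watched range."""
    try:
        ip = ipaddress.ip_address(editor)
    except ValueError:
        return False  # registered usernames are not IP addresses
    return any(ip in net for net in WATCHED_RANGES)


def watch() -> None:
    # The feed is a long-lived response; each "data:" line is one change,
    # encoded as JSON.
    headers = {"User-Agent": "transparency-bot-sketch"}
    with requests.get(STREAM_URL, stream=True, headers=headers) as resp:
        for line in resp.iter_lines(decode_unicode=True):
            if not line or not line.startswith("data: "):
                continue
            change = json.loads(line[len("data: "):])
            # Anonymous edits report the editor's IP address in "user".
            if change.get("type") == "edit" and is_watched(change.get("user", "")):
                print(f'"{change["title"]}" edited anonymously from {change["user"]}')
                # A real bot would build a diff link here and post it to Twitter.


if __name__ == "__main__":
    watch()
```

Adapting such a bot to a different government, in other words, amounts to swapping one list of IP ranges for another, which is why copies spread so quickly once the original code was public.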
