Journalists and computer scientists are increasingly working together to develop innovative methods of reporting and telling news stories.
- Journalists are mashing up drones with GPS-equipped cameras to automatically create 3D models of newsworthy structures.
- Bots are programmatically writing news stories that the public views as just as credible as those written by humans, according to one study.
- Computer scientists are creating software programs to help journalists identify and correct false rumors spreading on Twitter.
- Journalism students are using electronic sensors that monitor dust and noise to investigate construction sites.
- Journo-hackers are developing tools that use artificial intelligence to pull story ideas from big, complicated data sets.
All of these projects are among those being presented at the two-day Computation + Journalism Symposium 2014, which runs Friday and Saturday at Columbia University’s Brown Institute for Media Innovation.
The conference convenes academics and practitioners from journalism, computer science and data science who are exploring new techniques for finding and presenting important news stories.
American Journalism Review invited people presenting academic papers detailing cutting-edge research at the conference to write user-friendly summaries of their work. Their projects, which include a wide variety of computational journalism prototypes, tools, experiments and ideas, are described below, along with links to a full summary of each project.
AJR will also provide real-time coverage of the symposium in a live blog.
Many journalists are looking to drones as cheap alternatives to news helicopters. But Matt Waite and Ben Kreimer of the University of Nebraska are most excited about the use of drones for data journalism. They used drones and a GPS-equipped camera to create a 3D model of an archeological dig site in Turkey — on deadline.
The logic behind rankings published by news organizations — of colleges, cars or the best cities in which to live — is often mysterious. Nick Diakopoulos, a computational journalist and assistant professor at the University of Maryland, and his colleagues built an interface that could make the ranking systems more transparent for users.
The fact-checkers at PolitiFact and FactCheck.org spend hours each day combing through news articles, transcripts of talk shows and campaign advertisements to find factual claims made by politicians. Finding and fact-checking those claims is hard work. Jun Yang, Bill Adair and others at Duke University are among those who have developed a pair of tools that automatically find and fact-check claims.
Investigative reporting projects that use sensors to gather data are becoming more common. To figure out how to teach sensor journalism, Fergus Pitt, a senior research fellow at the Tow Center for Digital Journalism at Columbia University, ran a pilot workshop for student journalists to investigate the environmental impact of a building site in New York.
Meredith Broussard, a data journalist and assistant professor at Temple University, built an artificial intelligence-based “Story Discovery Engine” to uncover a shortage of textbooks in Philadelphia. The concept could be used by reporters on other beats to find stories in large data sets, she explains.
Sometimes it’s hard to know when to retweet a juicy bit of information that may or may not be true. TRAILS, a system developed by Takis Metaxas, Eni Mustafaraj and Samantha Finn at Wellesley College, analyzes the tweet and retweet networks of a topic on Twitter to help users determine if a bit of information is true or false.
Information spreads on social media. So does a lot of misinformation. It’s hard for journalists to find — and shoot down — false rumors when 400 million tweets are sent on a typical day. Paul Resnick, a professor at the University of Michigan School of Information, describes tools developed by his research team that find potential rumors on Twitter and allow citizen journalists to screen them.
Some news organizations, like BBC News, have opened labs where engineers and computer scientists produce new tools to help journalists find and tell stories more effectively. Basile Simon, a “hacker journalist” at BBC News Labs, details two projects he and his colleagues developed.
Eva Constantaras, a data journalism consultant at Internews, describes an effort by her organization to make data-driven investigative journalism more common in developing countries like Kenya. The organization built a site, the Data Dredger, to give Kenyan journalists easier access to data.
Robot journalism has as much credibility with the public as news produced by human journalists, according to research conducted by Hille van der Kaa and Emiel Krahmer at Tilburg University in The Netherlands. But the researchers found journalists don't hold the same view: they trust their own work more than computer-written news.
It took longer for data journalism to establish a foothold in the U.K. than in the U.S., but the practice is growing, says Jonathan Hewett, director of Interactive and Newspaper Journalism at City University London.
ABOUT THE CONFERENCE
The Computation + Journalism Symposium brings together journalists and technologists to collaborate on the development of new methods of finding and telling stories, explains Nick Diakopoulos, an assistant professor of computational journalism at the University of Maryland Philip Merrill College of Journalism.
AJR co-editor Sean Mussenden will provide live coverage of the ideas and issues discussed at the Computation + Journalism Symposium on Friday and Saturday.