The NewsLynx impact tracker produced these key ideas
An Executive Summary of Keller and Abelson's work, and a link to the full report.
“What’s working?” is, to put it simply, the question at the heart of the news industry’s push toward impact assessment. Almost two years ago, a pair of prescient Tow Center research fellows, Michael Keller and Brian Abelson, set out to build a tool that would help journalists answer that question. The NewsLynx platform is one result; the other is a new realm of knowledge. In the post below, the authors summarize their top-level lessons. You can download the full report as a PDF to read in detail.
With the rise of non-profit, foundation-funded newsrooms, the field of Measurement and Evaluation (M&E), which emerged in the international development community, has gained a strong foothold in journalism. As non-profit newsrooms apply for grants and appeal to donors for funding, they often need to explain in formal reports “how well” their stories performed – not just in terms of impressive traffic, but in qualitative evaluations of the “impact” their reporting had on the world: Did it change a law? Did it move the needle in the conversation? Did it meet the expectations — however defined — the organization had for it?
Based on survey research and interviews with newsrooms about current impact-measurement practices, the researchers designed and built a new analytics platform, NewsLynx, to improve on existing methods of displaying quantitative metrics and to add qualitative information previously absent from such tools. Many newsrooms found current analytics tools insufficient for fully capturing their output’s performance: they had trouble comparing audience reactions across stories, or gauging the effects of their social media and promotional efforts. And while they often had multiple data sources — Google Analytics, Omniture, etc. — putting those numbers into context was difficult.
The NewsLynx project implements three significant ideas.
- NewsLynx seeks to augment metrics with context.
It shows how an article performs relative to the average of all a publication’s articles, and allows comparisons within subsets – all immigration articles, for example, or any user-defined category.
- NewsLynx also provides efficient tools for tracking, categorizing and assessing indicators of impact aside from audience reach.
Such impact indicators might include legislative reform or community action. Tracking them has previously proved extremely difficult and time-consuming. NewsLynx’s “Approval River” functionality aims to reduce the effort of managing the traditional clip searches and social media searches that newsrooms use to monitor impact. Crucially, it allows users to apply consistent (and therefore comparable) metadata to impact indicators.
- The NewsLynx developers propose an impact framework that allows for the fact that real-world impact measures are often messy and hard to categorize.
NewsLynx implements a framework that gives newsrooms enough structure to categorize “impactful events” along consistent boundaries, while leaving enough freedom for a newsroom to create its own impact definition matching its particular goals. Importantly, the researchers believe that successful, long-term impact measurement can only result from identifying such organizational goals.
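The first of these ideas, comparing an article against site-wide and per-category averages, can be sketched in a few lines of Python. The article records, field names, and numbers below are hypothetical illustrations; the summary does not describe NewsLynx’s actual data model.

```python
from statistics import mean

# Hypothetical article records; NewsLynx's real schema is not shown in this summary.
articles = [
    {"title": "Border policy shift", "category": "immigration", "pageviews": 12000},
    {"title": "Visa backlog grows",  "category": "immigration", "pageviews": 8000},
    {"title": "City budget passes",  "category": "local",       "pageviews": 5000},
    {"title": "School board vote",   "category": "local",       "pageviews": 3000},
]

def relative_performance(article, corpus):
    """Ratio of an article's pageviews to the corpus average (1.0 = average)."""
    return article["pageviews"] / mean(a["pageviews"] for a in corpus)

story = articles[0]
# Site-wide context: how the story compares to every published article.
site_wide = relative_performance(story, articles)
# Subset context: how it compares within its own category.
in_category = relative_performance(
    story, [a for a in articles if a["category"] == story["category"]]
)
print(round(site_wide, 2), round(in_category, 2))  # prints: 1.71 1.2
```

The same subsetting works for any user-defined category, which is the point of the “comparisons within subsets” feature described above.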
Key observations and recommendations
- Effective impact measurement has to be tied to an organization’s goals.
No amount of technology can help an organization measure what it hasn’t defined as important. Should a newsroom’s reporting seek to change the narrative around an issue? Does it want to reach certain stakeholders or effect lasting reform? Only after an organization has understood what it wants to achieve can quantitative and qualitative tools assess how close the organization is to that goal.
- Both quantitative and qualitative metrics have a place in impact measurement.
Quantitative metrics are often vilified for leading journalism astray from its true purpose, but the authors found they do help tell the story of a newsroom’s performance. Although they started the project as a way to give more visibility to qualitative measurements, they repeatedly heard from newsrooms that quantitative measurements played an important role for organizations wanting to tell a long-term story of audience growth.
- Newsrooms should better tag their articles.
Newsrooms that want to properly understand their own performance over time should put more care into tagging and cataloging their stories. These practices give an organization a better understanding of its operations and of how much space it devotes to each subject. They also let staff perform myriad analyses comparing stories and packages. Without differentiating and labeling content, it is difficult to see patterns in traffic or impact.
- Newsrooms have metrics but still many questions, particularly about audience.
As one newsroom put it, “Google Analytics feels both too complicated and not powerful enough for the questions we want to answer about readers.” Many existing tools aren’t designed to analyze metrics from a reader’s perspective — what did readers think about a story? Did they leave after the fifth graf because they understood the newsy part and didn’t need any more, or was the site design wrong, or the prose too dense? Nor do common tools provide enough insight into the relationship between the news organization and its audience.
- Custom analytics solutions have recently become more feasible.
With the continuing maturation of open-source analytics pipelines, it is now possible for news organizations to own their entire analytics stack and not rely on third-party vendors for the data-collection portion of their metrics. In other words, the next few years could see newsrooms access much more diverse offerings that provide faster analysis, greater detail, and more relevance to journalism. That said, these pipelines are largely for data collection, so most newsrooms would need to design and implement their own custom interfaces to interpret the data for the average reporter and editor.
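The tagging recommendation above pays off quickly in practice: once stories carry consistent labels, grouped analyses become trivial. A minimal sketch, using invented story slugs, tags, and traffic numbers (nothing here comes from the report itself):

```python
from collections import defaultdict

# Illustrative only: consistently tagged stories make grouped analysis easy.
stories = [
    {"slug": "detention-investigation", "tags": ["immigration", "investigation"], "pageviews": 9000},
    {"slug": "court-backlog",           "tags": ["immigration"],                  "pageviews": 4000},
    {"slug": "stadium-deal",            "tags": ["local", "investigation"],       "pageviews": 6000},
]

def traffic_by_tag(stories):
    """Sum pageviews per tag; a story counts toward every tag it carries."""
    totals = defaultdict(int)
    for story in stories:
        for tag in story["tags"]:
            totals[tag] += story["pageviews"]
    return dict(totals)

print(traffic_by_tag(stories))
# prints: {'immigration': 13000, 'investigation': 15000, 'local': 6000}
```

The same grouping could just as easily total impact indicators instead of pageviews, which is why consistent metadata matters for both kinds of measurement the report discusses.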
You can download the full report as a PDF to read in detail.