
Sensors and Certification


This is a guest post from Lily Bui, a sensor journalism researcher from MIT’s Comparative Media Studies program.

On October 20, 2014, Creative Commons Science convened a workshop that brought together open hardware/software developers, lawyers, funders, researchers, entrepreneurs, and grassroots science activists to discuss the certification of open sensors.

To clarify some terminology, a sensor can either be closed or open. Whereas closed technologies are constrained by an explicitly stated intended use and design (e.g., an arsenic sensor you buy at Home Depot), open technologies are intended for modification and not restricted to a particular use or environment (e.g., a sensor you can build at home based on a schematic you find online).

Over the course of the workshop, attendees listened to sessions led by practitioners who are actively thinking about whether and how a certification process for open hardware might mitigate some of the tensions that have arisen within the field, namely around the reliability of open sensor tools and the current challenges of open licensing. As we may gather from the Tow Center’s Sensors and Journalism report, these tensions become especially relevant to newsrooms thinking of adopting open sensors to collect data in support of journalistic inquiry. Anxieties about data provenance, sensor calibration, and best practices for reporting sensor data also permeate this discussion. The workshop provided a space to begin articulating what sensor journalism needs in order to move forward.

Below, I’ve highlighted the key points of discussion around open sensor certification, especially as they relate to the evolution of sensor journalism.

Challenges of Open Sensors

How, when, and why do we trust a sensor? For example, when we use a thermometer, do we think about how well or how often it has been tested, who manufactured it, or what standards were used to calibrate it? Most of the time, the answer is no. The division of labor that brings the thermometer to you is mostly invisible, yet you inherently trust that the reading it gives is an accurate reflection of what you seek to measure. So what establishes this automatic trust, and what needs to happen around open sensors for people to have the same confidence in them?

At the workshop, Sonaar Luthra of Water Canary led a session about the complexities and challenges that accompany open sensors today. Most concerns revolve around accuracy, both of the sensor itself and of the data it produces. One reason for this is that the manufacture and integration of sensors are separate processes (for example, InvenSense manufactures an accelerometer and Apple integrates it into the iPhone). Similarly, within the open source community, the development and design of sensors and their software are often separate from an end user’s assembly—a person looks up the open schematic online, buys the necessary parts, and builds the sensor at home. This division of labor erodes the boundaries between hardware, software, and data, creating a need to recast how trust is established in sensor-based data.

For journalists, a chief concern around sensor data is ensuring, with some degree of confidence, that the data collected from the sensor is not erroneous and won’t add misinformation to the public sphere if published. Of course, this depends entirely on how and why the sensor is being used. If we think of accuracy as a continuum, the degree of accuracy required varies with the context. If the intent is to gather a lot of data and look at general trends—as was the case with the Air Quality Egg, an open sensor that measures air quality—point-by-point accuracy matters less, because engagement is the end goal. Different purposes and paradigms, however, require different metrics. In the case of StreetBump, a mobile app that uses accelerometer data to help identify potential potholes, accuracy is a much more salient issue, since direct intervention from the city means allocating resources and labor to the locations the sensor data suggests. Thus, creating a model to work toward shared parameters, metrics, resources, and methods might be useful for generating consensus and alleviating factors that threaten data integrity.
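To make the accuracy trade-off concrete, here is a minimal sketch of the kind of spike detection a StreetBump-style app might perform on accelerometer traces. The data format and threshold below are illustrative assumptions, not StreetBump’s actual algorithm; the point is that a naive detector will also flag speed bumps and hard braking, which is exactly why accuracy matters when a city is deciding where to send crews.

```python
# Minimal sketch of flagging candidate potholes from accelerometer traces.
# The threshold and data format are illustrative assumptions, not StreetBump's
# actual algorithm: real systems also filter for vehicle speed, repeat
# detections, and road context before a reading is treated as actionable.

from dataclasses import dataclass

@dataclass
class Reading:
    timestamp: float   # seconds since start of trip
    z_accel: float     # vertical acceleration in g
    lat: float
    lon: float

def flag_candidate_potholes(readings, spike_threshold=1.5):
    """Return readings whose vertical jolt exceeds the (assumed) threshold."""
    candidates = []
    for r in readings:
        # A sharp vertical jolt well above 1 g *may* indicate a pothole,
        # but could also be a speed bump, railroad crossing, or hard braking.
        if abs(r.z_accel) > spike_threshold:
            candidates.append(r)
    return candidates

if __name__ == "__main__":
    trace = [
        Reading(0.0, 1.02, 42.3581, -71.0636),
        Reading(0.5, 1.91, 42.3582, -71.0639),  # jolt: candidate pothole
        Reading(1.0, 0.98, 42.3583, -71.0642),
    ]
    for hit in flag_candidate_potholes(trace):
        print(f"candidate pothole near ({hit.lat}, {hit.lon})")
```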

There may also be alternative methods for verifying sensor data and accounting for known biases. Ushahidi’s Crowdmap is an open platform used internationally to crowdsource crisis information; its reports are assessed for accuracy through verification by other users. One can imagine a similar system for sensor data, applied before publication or even in real time. And if a sensor has a known bias in a certain direction, it is possible to compare its data against an established standard (e.g., EPA data) and account for that bias when reporting on the data.
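As a rough illustration of that last idea, the sketch below estimates a constant offset from co-located open-sensor and reference readings (say, a nearby regulatory monitor) and applies it to new data. The pollutant, the numbers, and the assumption that the bias is a simple constant offset are all hypothetical; real calibration against reference instruments is considerably more involved.

```python
# Minimal sketch of accounting for a known sensor bias by comparison with a
# reference standard (e.g., a nearby regulatory monitor). Assumes the bias is
# a roughly constant offset; real calibration may need a fitted model,
# temperature/humidity terms, and far more reference data.

def estimate_offset(sensor_values, reference_values):
    """Mean difference between co-located sensor and reference readings."""
    if len(sensor_values) != len(reference_values) or not sensor_values:
        raise ValueError("need paired, non-empty readings")
    diffs = [s - r for s, r in zip(sensor_values, reference_values)]
    return sum(diffs) / len(diffs)

def correct(sensor_values, offset):
    """Apply the estimated offset to new sensor readings."""
    return [v - offset for v in sensor_values]

if __name__ == "__main__":
    # Hypothetical co-located readings (e.g., PM2.5 in micrograms per cubic meter).
    open_sensor = [14.2, 18.9, 22.4, 16.7]
    reference   = [11.0, 15.5, 19.1, 13.6]

    offset = estimate_offset(open_sensor, reference)
    print(f"estimated bias: {offset:+.2f}")   # sensor reads high by about 3.25
    print(correct([20.0, 25.0], offset))      # corrected new readings
```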

To further investigate these questions, we can look toward extant models of verification in open science and technology communities. The Open Geospatial Consortium provides a way of thinking about interoperability among sensors, which requires that a consensus around standards or metrics be established. Alternatively, the Open Sensor Platform suggests ways of thinking about data acquisition, communication, and interpretation across various sensor platforms.

Challenges of Open Licensing for Sensors

A handful of licensing options exist for open hardware, including the CERN Open Hardware License, the Open Compute Project License, and the Solderpad License. Other intellectual property strategies include copyright (which can be easily circumvented and is sometimes questionable when it comes to circuits), patents (which are difficult and costly to obtain), and trademark (a lower barrier to entry that would best meet the needs of open source approaches). However, whether formal licensing should be applied to open hardware at all remains an open question, since licensing would inevitably impose restrictions on a design or version of hardware that—within the realm of open source—remains open to modification by the original developer or the open source community writ large. In other words, a licensing or certification process would turn what is now an ongoing project into a final product.

Also, in contrast to open software, where the use of open code is clearly demarcated and tracked through copying and pasting, it is less clear at what point a user actually agrees to use open hardware (e.g., upon purchase or assembly), since designs often involve a multitude of components and are sometimes accompanied by companion software.

A few different approaches to assessing open sensors emerged during the workshop:

  1. Standards. A collaborative body establishes interoperable standards among open sensors, allowing for independent but overlapping efforts. (Targeted toward the sensor.)
  2. Certification/Licensing. A central body controls a standard, facilitates testing, and manages intellectual property. (Targeted toward the sensor.)
  3. Code of conduct. A suggested set of uses and contexts for the sensor, i.e., how to use it and how not to use it. (Targeted toward people using the sensor.)
  4. Peer assessment. Self-defined communities test and provide feedback on sensors, as seen in the Public Lab model. (Targeted toward the sensor but facilitated by people using it.)

In the case of journalism, methods of standardization would depend on how much (or little) granularity of data is necessary to effectively tell a story. In the long run, it may be that the means of assessing a sensor will be largely contextual, creating a need to develop a multiplicity of models for these methods.

Preliminary Conclusions

While there is certainly interest from newsrooms and individual journalists in engaging with sensor tools as a valid means of collecting data about their environments, it is not yet apparent what newsrooms and journalists expect from open sensors, or in which contexts open sensor data is most appropriate. The products of this workshop are relevant to evaluating what standards—if any—might need to be established before sensors can be more widely adopted in newsrooms.

In the future, we must keep some important questions in mind: What matters most to newsrooms and journalists when it comes to trusting, selecting, and using a sensor tool for reporting? Which sensor assessment models would be most useful, and in which context(s)?

With regard to the certification of open sensors, it would behoove all stakeholders—sensor journalists included—to determine a way to move the discourse forward.

References

  1. Fergus Pitt, Sensors and Journalism, Tow Center for Digital Journalism, May 2014.
  2. N. Bourbakis and A. Pantelopoulos, “A Survey on Wearable Sensor-Based Systems for Health Monitoring and Prognosis,” IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, Vol. 40, Iss. 1, Jan. 2010.
  3. Open Source Hardware Association (OSHWA), Definition page.


Upcoming Events


All-Class Lecture: The New Global Journalism

Tuesday, Sep. 30, 2014, 6:00pm

(Lecture Hall)

Based on a new report from the Tow Center, a panel discussion on how digital technology and social media have changed the work of journalists covering international events. #CJSACL

Panelists include report co-authors: 

Ahmed Al Omran, Saudi Arabia correspondent at The Wall Street Journal

Burcu Baykurt, Ph.D. candidate in Communications at Columbia Journalism School

Jessie Graham, Senior Multimedia Producer at Human Rights Watch

Kelly Golnoush Niknejad, Editor-in-Chief at Tehran Bureau

The program will be moderated by Dean of Academic Affairs Sheila Coronel.

Event begins at 6 PM

RSVP is requested at JSchoolRSVP@Columbia.edu


Upcoming Events


Mapping Issues with the Web: An Introduction to Digital Methods

How can digital traces be used to understand issues and map controversies? Presenters: Liliana Bounegru and Jonathan Gray

Tuesday, September 23, 5-6:30

RSVP Required Via Eventbrite


On the occasion of Bruno Latour’s visit to Columbia University, this presentation will show participants how to operationalize his seminal Actor-Network Theory using digital data and methods in the service of social and cultural research.

Participants will be introduced to some of the digital methods and tools developed at the University of Amsterdam and Sciences Po over the past decade and how they have been used to generate insights around a wide variety of topics, from human rights to extremism, global health to climate change.

Professor Bruno Latour will provide a short response to this presentation and join the subsequent discussion.

Liliana Bounegru and Jonathan Gray collaborated to produce the popular, prescient ‘Data Journalism Handbook’, published in 2012. They are currently working on a new project exploring how journalists can use new digital tools and methods developed by social science researchers to transform coverage of complex issues and events, using the Paris 2015 climate negotiations as a case study.

Please RSVP via Eventbrite


Upcoming Tow Event: Just Between You and Me?


Just between you and me?

(Pulitzer Hall – 3rd Floor Lecture Hall)

In the wake of the Snowden disclosures, digital privacy has become more than just a hot topic, especially for journalists. Join us for a conversation about surveillance, security and the ways in which “protecting your source” means something different today than it did just a few years ago. And, if you want to learn some practical, hands-on digital security skills—including tools and techniques relevant to all journalists, not just investigative reporters on the national security beat—stick around to find out what the Tow Center research fellows have in store for the semester.

The event will be held at 6 p.m. on Monday, August 25th in the 3rd Floor Lecture Hall of Pulitzer Hall. We welcome and encourage all interested students, faculty and staff to attend.


Digital Security and Source Protection For Journalists: Research by Susan McGregor


EXECUTIVE SUMMARY

The law and technologies that govern the functioning of today’s digital communication systems have dramatically affected journalists’ ability to protect their sources.  This paper offers an overview of how these legal and technical systems developed, and how their intersection exposes all digital communications – not just those of journalists and their sources – to scrutiny. Strategies for reducing this exposure are explored, along with recommendations for individuals and organizations about how to address this pervasive issue.

 

DOWNLOAD THE PDF




Order a (bound) printed copy.

Comments, questions & contributions are welcome on the version-controlled text, available as a GitBook here:

http://susanemcg.gitbooks.io/digital-security-for-journalists/

DIGITAL SECURITY AND SOURCE PROTECTION FOR JOURNALISTS

Preamble

Digital Security for Journalists: A 21st Century Imperative

The Law: Security and Privacy in Context

The Technology: Understanding the Infrastructure of Digital Communications

The Strategies

Looking Ahead

Footnotes

 


Tow Center Program Defends Journalism From the Threat of Mass Surveillance


Knight Foundation supports Journalism After Snowden to ensure access to information and promote journalistic excellence. Below, Jennifer Henrichsen, a research fellow at the Tow Center for Digital Journalism at Columbia Journalism School, and Taylor Owen, research director, write about the expansion of the program.

We’ve long known that it’s easy to kill the messenger. Journalists are murdered all around the world for speaking truth to power. But, it wasn’t until recently that we realized how mass surveillance is killing source confidentiality, and with it, the very essence of journalism. By taking away the ability to protect sources—the lifeblood of journalism—surveillance can silence journalists without prosecutions or violence. Understanding the implications of state surveillance for the practice of journalism is the focus of our project, Journalism After Snowden.

We’re in an age of mass surveillance, and it’s expanding. Metadata can reveal journalists’ sources without requiring officials to obtain a subpoena. Intelligence agencies can tap into undersea cables to capture encrypted traffic. Mobile devices, even when powered off, can be remotely accessed to record conversations. The extent of manipulation and penetration of the technology that journalists rely on to communicate with their sources makes it difficult—if not impossible—for journalists to truly protect them. And without reasonable assurances of protection, sources will invariably dry up, cutting off a supply of information about government wrongdoing that for more than a century has been critical to the balance of power in democratic governance. And journalism without sources is not journalism at all; it’s public relations for the powerful.

So what can we do? With generous funding from The Tow Foundation and Knight Foundation, the Tow Center for Digital Journalism at Columbia Journalism School seeks to address what we think are three core challenges facing journalism in the age of state surveillance.

First, more journalists and news organizations need to take source protection seriously. They need to conduct risk assessments and embrace digital security tools and techniques. They need to arm themselves with knowledge of their legal rights—or lack thereof—and conduct a thorough audit of how the technology platforms they use retain and release data. And more news organizations should consider implementing technologies like SecureDrop, an open-source whistleblower submission system, which enables media organizations to more securely accept documents from anonymous sources.

Second, we need to strengthen collaboration between journalists and technologists. Bridging this professional divide is critical to ensuring journalists can reach out to trusted technologists for expertise and technologists can better understand the challenges that journalists face and create more user-friendly tools that address their needs. Journalists also need to be more skeptical when problems with their devices arise. Rather than immediately running to the Apple store to wipe their devices (which can actually hide the problem), journalists should enlist technologists to help determine if there is a more sinister cause than simple equipment malfunction. Researchers and technologists also need to join together to develop a system to collect and anonymize data showing digital attacks against journalists so researchers can analyze these attacks, ascertain potential trends and identify possible solutions.

Third, journalism educators and journalism schools need to discuss how to integrate digital security curricula into their classrooms. Currently, most journalism professors provide ad hoc digital security education—if they do at all. Digital security education needs to become more mainstream in journalism classrooms to ensure emerging journalists are cognizant of the real risks they and their sources face in this changing environment, and to foster the confidence they need to better protect both.

The Journalism After Snowden Project seeks to contribute high-quality conversations and research to strengthen the national debate around state surveillance and freedom of expression. The initiative will feature a yearlong series of events, research projects, and articles that we will publish in coordination with Columbia Journalism Review, and it will forge new partnerships with the individuals and organizations that are already doing great work in this space. These will include: a workshop bringing together technologists and journalists in San Francisco; a public lecture by Glenn Greenwald; a lecture series in partnership with the Yale Information Society Project; an edited volume likely to be published by Columbia University Press; a poll on the digital security practices of investigative journalists to be published with Pew Research Center; several research reports on digital security teaching and training for journalists; and a conference on national security reporting in Washington, D.C.

By tackling these challenges together, we’ll help to prevent the death of journalism at the hands of mass surveillance and ensure journalism after Snowden is stronger, not weaker.


Summer Sensor Newsroom Opens at the Tow Center


The Tow Center has launched the Tow Center Sensor Newsroom, a new seminar exploring how to use sensor data in journalism. Led by Tow Fellow Fergus Pitt, the intensive, 9am to 5pm, four-week lab is co-taught by industry professionals. The following are this year’s participants:

  • Seth Berkman
  • Matt Collette
  • Olivia Feld
  • Julien Alexandre Gathelier
  • Robert Helmut Hackett
  • Salma Magdy Amer
  • Elizaveta Malykhina
  • Nicholas Smith

Specialist topics are being taught by:

  • Mike Dewar (The New York Times R&D Lab)
  • John Keefe (WNYC Data News)
  • Ben Lesser (Daniel Pearl Award winner, New York Daily News, True Politics)
  • Lela Prashad (Nijel.org & NASA Satellite Sensing)
  • Kio Stark (WNYC Data News, OpenNews Source)
  • Julie Steele & Kipp Bradford (Data Sensing Lab)

Participants will report stories using a variety of sensor reporting methods, including custom prototypes, satellites, and official sources. The course will combine practice, theory, and collaboration, drawing on techniques and ideas from recent stories run by innovative digital newsrooms. We’ll also cover the legal and ethical issues triggered by sensor reporting, from intellectual property and uncertainty questions to privacy and surveillance considerations.

As data journalism rapidly becomes mainstream, more of our world is covered in sensors and the technical ability to use them permeates newsrooms. Sensor-reported stories have already won grant funding and Pulitzer Prizes. But it’s a small, specialist world at the moment, and people thinking about this area now will be positioned to make breakthroughs in the coming years.

At the end of the lab, participants will have produced innovative work. They will gain experience across the current landscape of sensors and understand the various kinds of stories that journalists have produced using sensors. We will be providing updates throughout the summer. Follow the class on Twitter #TowSense.

 


Tow Center and Brown Institute Award Anne Ponton Waldman the Brown/Tow Award for Excellence in Computational Journalism


The Tow Center and Brown Institute for Media Innovation have awarded Columbia Journalism School student Anne Ponton Waldman their first-ever Brown/Tow Award for Excellence in Computational Journalism at Journalism Day, May 20, 2014.

Sponsored by the  Tow Center for Digital Journalism and Brown Institute for Media Innovation, the annual award honors work that makes exceptional use of computation in the service of journalism or makes an extraordinary contribution to our understanding of how data, code and algorithms change the nature of reporting. Waldman won for her master’s project, Between the Walls, which examined parole. 

Waldman created R code to scrape and clean data from the Parole Board website, and merge it with other data sets from the Department of Corrections.
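Her code is not reproduced here, and her analysis was written in R; the Python sketch below only illustrates the general scrape, clean, and merge pattern the project describes. The URL, table layout, column names, and join key are hypothetical placeholders rather than the real Parole Board site or data.

```python
# Generic sketch of the scrape -> clean -> merge workflow described above.
# Waldman's actual work was done in R; the URL, table layout, and column
# names below are hypothetical placeholders, not the real parole-board site.

from io import StringIO

import pandas as pd
import requests
from bs4 import BeautifulSoup

def scrape_table(url):
    """Fetch a page and parse its first HTML table into a DataFrame."""
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    return pd.read_html(StringIO(str(soup.find("table"))))[0]

def clean(df):
    """Normalize column names and drop rows missing the (hypothetical) ID column."""
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    return df.dropna(subset=["inmate_id"])  # hypothetical join key

if __name__ == "__main__":
    parole = clean(scrape_table("https://example.org/parole-board/decisions"))
    corrections = pd.read_csv("corrections_data.csv")  # hypothetical DOC extract
    merged = parole.merge(corrections, on="inmate_id", how="left")
    print(merged.head())
```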

“With my comprehensive data set, I provided an in-depth look at the parole board that had not been publicly analyzed before, as the New York State Parole Board officially states that they do not track statistics on parole releases related to variables such as age, race, or sentence,” wrote Waldman in her submission to the award committee.

The $2,000 award goes to a graduating M.S. or M.A. student at Columbia Journalism School. Waldman is in the M.S. dual-degree program with Columbia Journalism School and Columbia University’s School of International and Public Affairs.

“Working on this project and more generally learning about data journalism at Columbia has not only provided me with some pretty righteous coding skills, it has redefined the way I approach stories. In our quantified world, where our digital footprint is producing data with every email, phone call, and movement, I believe it’s irresponsible not to try to open and understand the data that we are creating. In data, we could find some answers and of course uncertainty, but ultimately, in that data, we as journalists will find stories.”

Lauren Mack is the Research Associate at the Tow Center. Follow her on Twitter @lmack.


We’re Hiring: Tow Center Seeks Tow Center Administrator


The Tow Center for Digital Journalism at Columbia Journalism School is accepting applications for an Administrator.

The incumbent will work closely with Tow Center Director Emily Bell and the Associate Research Director on the management of grant funds, research support for Tow projects, and day-to-day administrative tasks related to the grants, the Tow website, and Tow events.

Duties include:

  • Manage grant funds and processes. Manage spending and ensure compliance with terms and restrictions for ongoing, proposed, and new initiatives. Work closely with PIs, funders, and finance and budget managers to review budget proposals, submissions, and revisions. Write and review grant proposals and reports.
  • Conduct research and support the research efforts of the Center. Organize and manage research files.
  • Manage planning and logistics for Tow Center events.
  • Manage website updates and functionality.
  • Perform related administrative duties for the Center.

Read more about the position and apply here.

Columbia University is an equal opportunity employer committed to creating and supporting a community diverse in every way: race, ethnicity, geography, religion, academic and extracurricular interest, family circumstance, sexual orientation, socio-economic background and more.

Minimum qualifications include a bachelor’s degree; experience in administration; knowledge of journalism, digital media, or emerging technologies; strong organizational skills; and a desire to engage in the research process. Experience with budget software, spreadsheets, and financial management is preferred.