
Upcoming Events: Journalism After Snowden


Journalism After Snowden Closing Event

On February 5th, 2015, the Tow Center will host an event, "National Security Reporting in the Age of Surveillance: A Conversation About Reporting Post-Snowden," with Dean Baquet of The New York Times, Marty Baron of The Washington Post, Susan Glasser of Politico, and Steve Coll of Columbia Journalism School. The event will be held at the Knight Conference Center at the Newseum in Washington, D.C., and will include the launch of a new Pew study.

More information about how to RSVP will follow.


Understand Your Internet: The Five W’s of Information Online


The Tow Center is pleased to announce the launch of its latest resource for newsrooms interested in improving their digital security and source protection practices.

“The What, When, Where, Why and How of Who Can See Your Information Online” is an illustrated overview of the mechanics of the Internet, including everything from how your computer connects to a wifi hotspot, to what is (and isn’t) protected by an https connection. The detailed illustration also includes explanations of what “metadata” is typically stored by digital companies and available to (U.S.) authorities, as well as demonstrating the functions of digital security technologies like Tor and encrypted email.

High-quality printed versions of the illustration are available for order (at cost) online. It is available in both a 24″x36″ (61cm x 91cm) wall-sized version, and an individual 12″x18″ (30cm x 45cm) size.

Order the poster online

A high-resolution version of the poster can also be downloaded via the GitBook here.


Tow Tea: Understanding the Role of Algorithms and Data at BuzzFeed


Tow Tea: Understanding the Role of Algorithms and Data at BuzzFeed
Thursday, Dec. 4, 2014
4:00 pm – 6:00 pm
The Brown Institute for Media Innovation
RSVP encouraged via Eventbrite

Ky Harlin, Director of Data Science at BuzzFeed, will join us along with an editor and a reporter from BuzzFeed. Together, they will help us understand the relationship between content and data: How does BuzzFeed predict whether a story will go viral? What is shareability? Do reporters and editors at BuzzFeed make editorial decisions based on input from data scientists who track traffic and social networks? What is the day-to-day workflow like at BuzzFeed, and how do its methods differ from those used in traditional newsroom settings?

Read more about Ky here: http://bit.ly/1n3xc3B

RSVP encouraged. Tea, coffee, and dessert snacks will be served.

For questions about this event, please contact Smitha Khorana, Tow Center DMA, at sk3808@columbia.edu.


Tow Tea: FOIA Workshop


Tow Tea: FOIA Workshop
Wednesday, Nov. 12
4:00 pm – 5:00 pm
Pulitzer Hall – Room 601B

RSVP Recommended via Eventbrite: http://bit.ly/1sjtBA2

The Tow Center will conduct an hour-long workshop on how to file successful Freedom of Information Act (FOIA) requests with Shawn Musgrave, who works for MuckRock.

MuckRock is an open news tool powered by state and federal Freedom of Information laws. Shawn Musgrave is a student in the Lede program at Columbia Journalism School.

The workshop will cover tips and tricks to getting a quick return on FOIA requests—what to do when requests are stalled, how to address controversial or sticky subjects, and some of the legal concerns to be aware of when filing.


Sensors and Certification


This is a guest post from Lily Bui, a sensor journalism researcher from MIT’s Comparative Media Studies program.

On October 20, 2014, Creative Commons Science convened a workshop involving open hardware/software developers, lawyers, funders, researchers, entrepreneurs, and grassroots science activists around a discussion about the certification of open sensors.

To clarify some terminology, a sensor can either be closed or open. Whereas closed technologies are constrained by an explicitly stated intended use and design (e.g., an arsenic sensor you buy at Home Depot), open technologies are intended for modification and not restricted to a particular use or environment (e.g., a sensor you can build at home based on a schematic you find online).

Over the course of the workshop, attendees listened to sessions led by practitioners who are actively thinking about whether and how a certification process for open hardware might mitigate some of the tensions that have arisen within the field, namely around the reliability of open sensor tools and the current challenges of open licensing. As we may gather from the Tow Center’s Sensors and Journalism report, these tensions become especially relevant to newsrooms thinking of adapting open sensors for collecting data in support of journalistic inquiry. Anxieties about data provenance, sensor calibration, and best practices on reporting sensor data also permeate this discussion. This workshop provided a space to begin articulating the needs required for sensor journalism to move forward.

Below, I’ve highlighted the key points of discussion around open sensor certification, especially as they relate to the evolution of sensor journalism.

Challenges of Open Sensors

How, when, and why do we trust a sensor? For example, when we use a thermometer, do we think about how well or often it has been tested, who manufactured it, or what standards were used to calibrate it? Most of the time, the answer is no. The division of labor that brings the thermometer to you is mostly invisible, yet you inherently trust that the reading it gives is an accurate reflection of what you seek to measure. So, what is it that instantiates this automatic trust, and what needs to happen around open sensors for people to likewise have confidence in them?

At the workshop, Sonaar Luthra of Water Canary led a session about the complexities and challenges that accompany open sensors today. Most concerns revolve around accuracy, both of the sensor itself and of the data it produces. One reason for this is that the manufacture and integration of sensors are separate processes (for example, InvenSense manufactures an accelerometer and Apple integrates it into the iPhone). Similarly, within the open source community, the development and design of sensors and their software are often separate processes from an end user's assembly: a person looks up the open schematic online, buys the necessary parts, and builds the device at home. This division of labor erodes the boundaries between hardware, software, and data, creating a need to rethink how trust is established in sensor-based data.

For journalists, a chief concern around sensor data is ensuring, with some degree of confidence, that the data collected from the sensor is not erroneous and won't add misinformation to the public sphere if published. Of course, this depends entirely on how and why the sensor is being used. If we think of accuracy as a continuum, then the required degree of accuracy varies with context. If the intent is to gather a lot of data and look at general trends, as was the case with the Air Quality Egg, an open sensor that measures air quality, point-by-point accuracy matters less because engagement is the end goal. Different purposes and paradigms, however, require different metrics. In the case of StreetBump, a mobile app that uses accelerometer data to help identify potential potholes, accuracy is a much more salient issue, since direct intervention from the city would mean allocating resources and labor toward the locations the sensor data suggests. Thus, a model that works toward shared parameters, metrics, resources, and methods might help generate consensus and alleviate factors that threaten data integrity.

There may also be alternative methods for verification and accounting for known biases in sensor data. Ushahidi’s Crowdmap is an open platform used internationally to crowdsource crisis information. The reports depend on a verification system from other users for an assessment of accuracy. One can imagine a similar system for sensor data, pre-publication or even in real time. Also, if a sensor has a known bias in a certain direction, it’s also possible to compare data against an established standard (e.g., EPA data) and account for the bias in reporting on the data.
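The bias-correction idea above can be sketched in a few lines. This is a hypothetical illustration, not a method from the workshop: it assumes a low-cost sensor with a roughly constant offset that can be co-located with a trusted reference monitor (such as an EPA station) for a calibration period; all numbers and function names are invented for the example.

```python
# Hypothetical sketch: correcting a known, roughly constant sensor bias by
# comparing co-located readings against a trusted reference (e.g., EPA data).
# All values below are illustrative, not from any real deployment.

def estimate_bias(sensor_readings, reference_readings):
    """Mean offset between the low-cost sensor and the reference monitor."""
    pairs = list(zip(sensor_readings, reference_readings))
    return sum(s - r for s, r in pairs) / len(pairs)

def correct(readings, bias):
    """Subtract the estimated bias from raw sensor readings."""
    return [x - bias for x in readings]

# Calibration period: sensor and reference measured side by side.
sensor_cal = [52.0, 48.5, 55.0, 50.5]
reference_cal = [47.0, 43.5, 50.0, 45.5]

bias = estimate_bias(sensor_cal, reference_cal)  # sensor reads ~5 units high
field_data = [60.0, 58.0, 61.5]
adjusted = correct(field_data, bias)  # bias-corrected readings for reporting
```

A real deployment would need to check that the bias actually is stable over time and conditions; if it drifts, a single offset is not enough, which is exactly the kind of calibration question a certification or peer-assessment process might address.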

To further investigate these questions, we can look toward extant models of verification in open science and technology communities. The Open Geospatial Consortium provides a way of thinking about interoperability among sensors, which requires that a consensus around standards or metrics be established. Alternatively, the Open Sensor Platform suggests ways of thinking about data acquisition, communication, and interpretation across various sensor platforms.

Challenges of Open Licensing for Sensors

A handful of licensing options exist for open hardware, including the CERN Open Hardware License, Open Compute Project License, and Solderpad License. Other intellectual property strategies include copyright (which can be easily circumvented and is sometimes questionable when it comes to circuits), patenting (which is difficult and costly to attain), and trademark (an option that offers a lower barrier to entry and would best meet the needs of open source approaches). However, whether formal licensing should be applied to open hardware at all remains an open question, as it would inevitably impose restrictions on a design or version of hardware that, within the realm of open source, is still susceptible to modification by the original developer or the open source community writ large. In other words, a licensing or certification process would turn what is now an ongoing project into a final product.

Also, in contrast to open software, wherein the use of open code is clearly demarcated and tracked by the process of copying and pasting, it is less clear at what point a user actually agrees to using open hardware (i.e., upon purchase or assembly, etc.) since designs often involve a multitude of components and are sometimes accompanied by companion software.

A few different approaches to assessing open sensors emerged during the workshop:

  1. Standards. A collaborative body establishes interoperable standards among open sensors, allowing for independent but overlapping efforts. (Targeted toward the sensor.)
  2. Certification/Licensing. A central body controls a standard, facilitates testing, and manages intellectual property. (Targeted toward the sensor.)
  3. Code of conduct. There exists a suggestion of uses and contexts for the sensor, i.e., how to use it and how not to use it. (Targeted toward people using the sensor.)
  4. Peer assessment. Self-defined communities test and provide feedback on sensors as seen in the Public Lab model. (Targeted toward the sensor but facilitated by people using it.)

In the case of journalism, methods of standardization would depend on how much (or little) granularity of data is necessary to effectively tell a story. In the long run, it may be that the means of assessing a sensor will be largely contextual, creating a need to develop a multiplicity of models for these methods.

Preliminary Conclusions

While there is certainly interest from newsrooms and individual journalists in engaging with sensor tools as a valid means for collecting data about their environments, it is not yet apparent what newsrooms and journalists expect from open sensors, nor for which contexts open sensor data is most appropriate. The products of this workshop are relevant to evaluating what standards, if any, might need to be established before sensors can be more widely adopted in newsrooms.

In the future, we must keep some important questions in mind: What matters most to newsrooms and journalists when it comes to trusting, selecting, and using a sensor tool for reporting? Which sensor assessment models would be most useful, and in which context(s)?

With regard to the certification of open sensors, it would behoove all stakeholders—sensor journalists included—to determine a way to move the discourse forward.

References

  1. F. Pitt, Sensors and Journalism (Tow Center for Digital Journalism, May 2014).
  2. N. Bourbakis and A. Pantelopoulos, "A Survey on Wearable Sensor-Based Systems for Health Monitoring and Prognosis," IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, Vol. 40, No. 1 (Jan. 2010).
  3. Open Source Hardware Association (OSHWA), Definition page.


Upcoming Events


All-Class Lecture: The New Global Journalism

Tuesday, Sep. 30, 2014, 6:00pm

(Lecture Hall)

Based on a new report from the Tow Center, a panel discussion on how digital technology and social media have changed the work of journalists covering international events. #CJSACL

Panelists include report co-authors:

Ahmed Al Omran, Saudi Arabia correspondent at The Wall Street Journal

Burcu Baykurt, Ph.D. candidate in Communications at Columbia Journalism School

Jessie Graham, Senior Multimedia Producer at Human Rights Watch

Kelly Golnoush Niknejad, Editor-in-Chief at Tehran Bureau

The program will be moderated by Dean of Academic Affairs Sheila Coronel.

Event begins at 6 PM

RSVP is requested at JSchoolRSVP@Columbia.edu


Upcoming Events


Mapping Issues with the Web: An Introduction to Digital Methods

How can digital traces be used to understand issues and map controversies? Presenters: Liliana Bounegru and Jonathan Gray

Tuesday, September 23, 5:00–6:30 pm

RSVP Required Via Eventbrite


On the occasion of Bruno Latour’s visit to Columbia University, this presentation will show participants how to operationalize his seminal Actor-Network Theory using digital data and methods in the service of social and cultural research.

Participants will be introduced to some of the digital methods and tools developed at the University of Amsterdam and Sciences Po over the past decade and how they have been used to generate insights around a wide variety of topics, from human rights to extremism, global health to climate change.

Professor Bruno Latour will provide a short response to this presentation and join the subsequent discussion.

Liliana Bounegru and Jonathan Gray collaborated to produce the popular, prescient Data Journalism Handbook, published in 2012. They are currently working on a new project exploring how journalists can use new digital tools and methods developed by social science researchers to transform coverage of complex issues and events, using the Paris 2015 climate negotiations as a case study.

Please RSVP via Eventbrite