Past Events

Ta-Nehisi Coates at the Tow Center


By Alexandria Neason

Two weeks ago, students, alumni, faculty, and others packed the lecture hall at Columbia Journalism School to see The Atlantic’s Ta-Nehisi Coates, perhaps America’s foremost writer on race, speak about the media’s often contentious relationship with it.  Lately, the news has been more saturated with conversations about race than I can ever remember. Coverage of the policing and killing of black boys and men, of the burden of raising black children, of reparations and so-called “post-racial” multiculturalism has brought to the mainstream what people of color have long known.  America (still) has a race problem.

Photo by Rhon Flatts

When I think about the history of American journalistic writing on race, it is difficult to separate the writing from the activism that it often highlights. It’s hard to imagine white, Northern journalists traveling to the unapologetically violent 1960s-era South without some fledgling belief that what was happening was wrong. It is hard to imagine that a journalist could cover the killings of Sean Bell, of Trayvon Martin and Oscar Grant and Jordan Davis and Mike Brown and all the others – without an understanding that something, somewhere, is deeply wrong.

I’d come to hear Coates speak because I wanted to know how he did it. I wanted to know how he confronted the ongoing legacy of American racism every day on his blog, subject to anonymous “keyboard commanders,” as he referred to them. I wanted to know how he dealt with what I can only assume were vile emails in response to his June cover story on the case for African-American reparations. I wanted to know how he wrote about racist housing policies and the constant loss of young, black life without becoming disempowered by the lack of change. I’d come to hear how he kept going even when the ideal of journalism – to effect change – proved elusive.

Coates talked at length about how he got his start in journalism. He spoke about dropping out of Howard University, and about his lack of interest in romanticizing that. He spoke about the importance of learning to report and to write. He spoke of the difference between the two. He spoke about waking up with aching questions and going to bed still bothered by them. He spoke about reading, about writing constantly (even if it’s bad) as a means of practicing. He talked about the absolute need to practice. He told us that writing needs to be a top priority, below only family and health, if we hope to make a career out of it. He told us that if we didn’t love it, to leave. “It’s just too hard,” he said.

And then he contradicted much of what I’d been taught about journalism. He told us not to expect to change anything with our writing.

I was startled. Wasn’t that the point? To educate, to inform, and ultimately, to change?

No. Coates doesn’t write to change the world. He doesn’t write to change the minds of white people (and he warned both white writers and writers of color of the dangers of doing so). “Don’t write to convince white people,” he said. In that statement I found perhaps the very advice I needed to keep writing.

For a black man, writing about race in a country hard-pressed to acknowledge its long marriage to it, and doing so with precision and integrity and without apology, is an act of defiance in and of itself. Writing to speak, unburdening yourself of the responsibility of educating your opponents (and, in doing so, inadvertently educating a great many people), is how you keep touching on the untouchable subjects.

After the lecture, a small group of students gathered with Coates in the Brown Institute for a writing workshop sponsored by the Tow Center for Digital Journalism. We were treated to a detailed walkthrough of Coates’s explosive June cover story on reparations for African-American descendants of slaves. We learned how he began his research, how he stayed organized, how he developed his argument, and how it evolved over the year and a half that he worked on the piece. It was clear that he was quite proud of it, not because he changed the minds of readers or because it had drawn so much attention to an issue often brushed off as impossible, but because he’d buried himself in the research, because he’d found a way to put a living, human face on the after-effects of policies that we often discuss as though they have none. The piece is heavily reliant on data, but littered with human faces, human stories, human consequences. It is deeply moving. To me, it was convincing. Overwhelmingly, undeniably convincing. And yet, his motivation was not to convince me, or anyone else, of anything. He wrote to speak.

And speak he did.

Alexandria Neason is an education writer for the Teacher Project at Columbia Journalism School and a 2014 graduate of the M.S. in Journalism program.


Journalism After Snowden Lecture Series


Presented by: Columbia University Graduate School of Journalism and the Information Society Project at Yale Law School

 

September 29, 2014

6:00pm-7:30pm, with reception to follow

Brown Institute for Media Innovation at Columbia University Graduate School of Journalism

 

Source Protection: Rescuing a Privilege Under Attack

Speaker: David A. Schulz, Partner at Levine Sullivan Koch & Schulz, LLP

Moderator: Emily Bell, Director of the Tow Center for Digital Journalism

 

Watch Full Lecture 

David Schulz | Outside Counsel to The Guardian; Lecturer, Columbia Law School; Partner, Levine Sullivan Koch & Schulz LLP | @LSKSDave
David Schulz heads the New York office of Levine Sullivan Koch & Schulz, LLP, a leading media law firm with a national practice focused exclusively on the representation of news and entertainment organizations in defamation, privacy, newsgathering, access, copyright, trademark, and related First Amendment matters. Schulz has been defending the rights of journalists and news organizations for nearly 30 years, litigating in the trial courts of more than 20 states and regularly representing news organizations on appeals before both state and federal tribunals. He successfully prosecuted access litigation by the Hartford Courant to compel the disclosure of sealed dockets in cases being secretly litigated in Connecticut’s state courts, as well as the challenge by 17 media organizations to the closure of jury selection in the Martha Stewart criminal prosecution. He successfully defended against invasion of privacy claims brought by Navy SEALs whose photos with injured Iraqi prisoners were discovered online by a reporter, and has prevailed in Freedom of Information Act litigation pursued by the Associated Press to compel the release of files relating to detainees held by the Department of Defense at Guantanamo Bay and of records of the military service of President George W. Bush.

Schulz is described as an “incredibly skilled” litigation strategist and a “walking encyclopedia” of media law by Chambers USA (Chambers & Partners, 2006), and is recognized as one of the nation’s premier First Amendment lawyers by The Best Lawyers in America (Woodward/White, 2006). He regularly represents a broad range of media clients, including The New York Times, the Associated Press, CBS Broadcasting, Tribune Company, and The Hearst Corporation, along with other national and local newspapers, television networks and station owners, cable news networks, and Internet content providers.

Schulz is the author of numerous articles and reports, including Policing Privacy, 2007 MLRC Bulletin 25 (September 2007); Judicial Regulation of the Press? Revisiting the Limited Jurisdiction of Federal Courts and the Scope of Constitutional Protection for Newsgathering, 2002 MLRC Bulletin 121 (April 2002); Newsgathering as a Protected Activity, in Freedom of Information and Freedom of Expression: Essays in Honour of Sir David Williams (J. Beatson & Y. Cripps eds., Oxford University Press 2000); and Tortious Interference: The Limits of Common Law Liability for Newsgathering, 4 Wm. & Mary Bill Rts. J. 1027 (1996) (with S. Baron and H. Lane). He received a B.A. from Knox College in Galesburg, Illinois, where he has served for more than twenty years on the Board of Trustees, and his law degree from Yale Law School, and he holds a master’s degree in economics from Yale University.

 

 


Sensors and Certification


This is a guest post from Lily Bui, a sensor journalism researcher from MIT’s Comparative Media Studies program.

On October 20, 2014, Creative Commons Science convened a workshop that brought together open hardware/software developers, lawyers, funders, researchers, entrepreneurs, and grassroots science activists to discuss the certification of open sensors.

To clarify some terminology, a sensor can either be closed or open. Whereas closed technologies are constrained by an explicitly stated intended use and design (e.g., an arsenic sensor you buy at Home Depot), open technologies are intended for modification and not restricted to a particular use or environment (e.g., a sensor you can build at home based on a schematic you find online).

Over the course of the workshop, attendees listened to sessions led by practitioners who are actively thinking about whether and how a certification process for open hardware might mitigate some of the tensions that have arisen within the field, namely around the reliability of open sensor tools and the current challenges of open licensing. As we may gather from the Tow Center’s Sensors and Journalism report, these tensions become especially relevant to newsrooms thinking of adopting open sensors to collect data in support of journalistic inquiry. Anxieties about data provenance, sensor calibration, and best practices for reporting on sensor data also permeate this discussion. The workshop provided a space to begin articulating what sensor journalism needs in order to move forward.

Below, I’ve highlighted the key points of discussion around open sensor certification, especially as they relate to the evolution of sensor journalism.

Challenges of Open Sensors

How, when, and why do we trust a sensor? For example, when we use a thermometer, do we think about how well or often it has been tested, who manufactured it, or what standards were used to calibrate it? Most of the time, the answer is no. The division of labor that brings the thermometer to you is mostly invisible, yet you inherently trust that the reading it gives is an accurate reflection of what you seek to measure. So, what is it that instantiates this automatic trust, and what needs to happen around open sensors for people to likewise have confidence in them?

At the workshop, Sonaar Luthra of Water Canary led a session about the complexities and challenges that accompany open sensors today. Most concerns revolve around accuracy, both of the sensor itself and of the data it produces. One reason for this is that the manufacture and integration of sensors are separate processes (for example, InvenSense manufactures an accelerometer and Apple integrates it into the iPhone). Similarly, within the open source community, the development and design of sensors and their software are often separate processes from an end user’s assembly—a person looks up the open schematic online, buys the necessary parts, and builds the sensor at home. This division of labor erodes the boundaries between hardware, software, and data, creating a need to recast how trust is established in sensor-based data.

For journalists, a chief concern around sensor data is ensuring, with some degree of confidence, that the data collected from the sensor is not erroneous and won’t add misinformation to the public sphere if published. Of course, this depends entirely on how and why the sensor is being used. If we think of accuracy as a continuum, then the degree of accuracy required can vary with the context. If the intent is to gather a lot of data and look at general trends—as was the case with the Air Quality Egg, an open sensor that measures air quality—point-by-point accuracy is less of a concern, because engagement is the end goal. However, different purposes and paradigms require different metrics. In the case of StreetBump, a mobile app that uses accelerometer data to help identify potential potholes, accuracy is a much more salient issue, since direct intervention from the city would mean allocating resources and labor to the locations the sensor data suggests. Thus, creating a model to work toward shared parameters, metrics, resources, and methods might be useful for generating consensus and alleviating factors that threaten data integrity.
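As a rough illustration of how the accuracy bar shifts with the stakes, here is a minimal sketch of the kind of corroboration rule a city-facing project might apply before acting on accelerometer data. The thresholds, field layout, and coordinates are hypothetical, and this is not StreetBump’s actual algorithm; it simply shows that when intervention is costly, a single noisy reading should not be enough to trigger it.

```python
from collections import defaultdict

SPIKE_G = 1.5          # hypothetical vertical-acceleration spike threshold (g)
MIN_REPORTS = 3        # corroboration required before asking the city to act

def flag_candidate_potholes(reports):
    """reports: iterable of (lat, lon, peak_acceleration_g) tuples.
    Returns locations with enough corroborating spikes to justify
    dispatching a crew; coordinates are bucketed into ~10 m cells."""
    counts = defaultdict(int)
    for lat, lon, peak in reports:
        if peak >= SPIKE_G:
            cell = (round(lat, 4), round(lon, 4))   # crude spatial bucket
            counts[cell] += 1
    return [cell for cell, n in counts.items() if n >= MIN_REPORTS]

# Hypothetical accelerometer reports from different phones:
reports = [
    (40.8075, -73.9626, 1.8),
    (40.8075, -73.9626, 1.6),
    (40.8075, -73.9626, 1.7),   # three corroborating spikes -> flag
    (40.8100, -73.9600, 1.9),   # single spike -> ignore for now
]
print(flag_candidate_potholes(reports))
```

An engagement-oriented project like the Air Quality Egg could reasonably skip this kind of gate and publish every reading, because the cost of an individual false positive is low.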

There may also be alternative methods for verifying sensor data and accounting for known biases in it. Ushahidi’s Crowdmap is an open platform used internationally to crowdsource crisis information; its reports depend on verification by other users for an assessment of accuracy. One can imagine a similar system for sensor data, applied pre-publication or even in real time. And if a sensor has a known bias in a certain direction, it is possible to compare its data against an established standard (e.g., EPA data) and account for the bias when reporting on the data.
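To make the bias-accounting idea concrete, here is a minimal sketch of how a newsroom might estimate a constant offset by co-locating an open sensor with a reference monitor (such as an EPA station) and then correct new readings before reporting on them. The numbers and field names are hypothetical, and real calibration would also need to consider drift and scale errors.

```python
from statistics import mean

def estimate_offset(sensor_readings, reference_readings):
    """Estimate a constant bias as the mean difference between
    co-located sensor and reference readings taken at the same times."""
    if len(sensor_readings) != len(reference_readings):
        raise ValueError("readings must be paired one-to-one")
    return mean(s - r for s, r in zip(sensor_readings, reference_readings))

def correct(readings, offset):
    """Apply the estimated bias correction to new sensor readings."""
    return [r - offset for r in readings]

# Hypothetical co-location period: our sensor vs. a nearby EPA monitor.
sensor_colocated = [14.2, 15.1, 13.8, 16.0, 15.5]
epa_colocated    = [11.9, 12.8, 11.5, 13.6, 13.2]

offset = estimate_offset(sensor_colocated, epa_colocated)   # reads ~2.3 units high
field_data = [18.4, 17.9, 19.2]
print(correct(field_data, offset))  # bias-adjusted values for reporting
```

Documenting how such a correction was derived is part of the provenance story that makes the resulting data defensible when it is published.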

To further investigate these questions, we can look toward extant models of verification in open science and technology communities. The Open Geospatial Consortium provides a way of thinking about interoperability among sensors, which requires that a consensus around standards or metrics be established. Alternatively, the Open Sensor Platform suggests ways of thinking about data acquisition, communication, and interpretation across various sensor platforms.
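One concrete way to picture what interoperability would demand is a shared reading format that carries the provenance and calibration metadata the accuracy discussion above calls for. The record below is a hypothetical minimal example, not the OGC’s or the Open Sensor Platform’s actual schema; the point is that agreeing on fields like these is what a standards effort would have to do.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class SensorReading:
    sensor_id: str           # stable identifier for the physical device
    hardware_rev: str        # which open-hardware design/version was built
    quantity: str            # what is being measured
    value: float
    unit: str
    timestamp_utc: str       # ISO 8601
    lat: float
    lon: float
    calibration_date: str    # when it was last checked against a reference
    calibration_method: str  # how that check was done

reading = SensorReading(
    sensor_id="aqe-0042", hardware_rev="air-quality-egg-v2",
    quantity="NO2", value=21.4, unit="ppb",
    timestamp_utc="2014-10-20T14:05:00Z", lat=40.8075, lon=-73.9626,
    calibration_date="2014-10-01", calibration_method="co-location with EPA monitor",
)
print(json.dumps(asdict(reading), indent=2))
```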

Challenges of Open Licensing for Sensors

A handful of licensing options exist for open hardware, including the CERN Open Hardware License, Open Compute Project License, and Solderpad License. Other intellectual property strategies include copyright (which can be easily circumvented and is sometimes questionable when it comes to circuits), patenting (which is difficult and costly to attain), and trademark (an option that offers a lower barrier to entry and would best meet the needs of open source approaches). However, whether formal licensing should be applied to open hardware at all remains an open question, as it would inevitably impose restrictions on a design or version of hardware that—within the realm of open source—is still susceptible to modification by the original developer or the open source community writ large. In other words, a licensing or certification process would turn what is now an ongoing project into a final product.

Also, in contrast to open software, where the use of open code is clearly demarcated and tracked through copying and pasting, it is less clear at what point a user actually agrees to use open hardware (upon purchase, upon assembly, etc.), since designs often involve a multitude of components and are sometimes accompanied by companion software.

A few different approaches to assessing open sensors emerged during the workshop:

  1. Standards. A collaborative body establishes interoperable standards among open sensors, allowing for independent but overlapping efforts. (Targeted toward the sensor.)
  2. Certification/Licensing. A central body controls a standard, facilitates testing, and manages intellectual property. (Targeted toward the sensor.)
  3. Code of conduct. A suggested set of uses and contexts for the sensor, i.e., how to use it and how not to use it. (Targeted toward people using the sensor.)
  4. Peer assessment. Self-defined communities test and provide feedback on sensors as seen in the Public Lab model. (Targeted toward the sensor but facilitated by people using it.)

In the case of journalism, methods of standardization would depend on how much (or little) granularity of data is necessary to effectively tell a story. In the long run, it may be that the means of assessing a sensor will be largely contextual, creating a need to develop a multiplicity of models for these methods.

Preliminary Conclusions

While there is certainly interest from newsrooms and individual journalists in engaging with sensor tools as a valid means of collecting data about their environments, it is not yet apparent what newsrooms and journalists expect from open sensors, or in which contexts open sensor data is most appropriate. The products of this workshop are relevant to evaluating what standards—if any—might need to be established before sensors can be more widely adopted by newsrooms.

In the future, we must keep some important questions in mind: What matters most to newsrooms and journalists when it comes to trusting, selecting, and using a sensor tool for reporting? Which sensor assessment models would be most useful, and in which context(s)?

With regard to the certification of open sensors, it would behoove all stakeholders—sensor journalists included—to determine a way to move the discourse forward.

References

  1. F. Pitt, Sensors and Journalism, Tow Center for Digital Journalism, May 2014.
  2. N. Bourbakis and A. Pantelopoulos, “A Survey on Wearable Sensor-Based Systems for Health Monitoring and Prognosis,” IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 40, no. 1, Jan. 2010.
  3. Open Source Hardware Association (OSHWA), Definition page.


The Tow Responsive Cities Initiative

Workshop with Susan Crawford
Friday, 10/31 – 9:00 am

By invitation only

The extension of fiber optic high-speed Internet access connections across cities in America could provide an opportunity to remake democratic engagement over the next decade. Cities would have the chance to use this transformative communications capacity to increase their responsiveness to constituents, making engagement a two-way, nuanced, meaningful part of what a city does. The political capital that this responsiveness would generate could then be allocated to support big ideas that could address the problems facing many American cities, including growing inequality, diminishing quality of life, and movement of jobs outside the city’s borders.


Recap: Source Protection in the Information Age


“Assert the right to report.” That was the mandate Columbia’s Sheila Coronel gave our group of journalists and online privacy and security advocates this past Saturday morning, kicking off a day full of panels and workshop activities on the theme of “Source Protection in the Information Age.” In this post-Snowden age, we were reminded that, as scrutiny from the government and other authority structures intensifies, simple source protection becomes something more. As Aaron Williamson put it succinctly in the morning’s first panel: “Using encryption is activism. It’s standing up for your right to keep communications private.”

How to be an effective activist, then? The day’s emphasis was intensely practical: know your tools. We each had the opportunity to cycle through 6 of the 14 available workshops. The spread effectively covered the typical activities journalists engage in: research, communication, and writing. That translated into a focus on encrypted messaging via chat and email, location-anonymous browsing via Tor, and desktop tools like the portable Tails operating system, which enables journalists to securely develop and store their research and writing. Snowden himself used Tails to evade the NSA’s scrutiny. We also received timely reminders about creating secure passwords and remembering that third parties are aware of our every move online.

Throughout, we were reminded of an important fact: You’re only as strong as your weakest participant. So journalists need not only to embrace these tools, they also need to educate their sources in how to use them effectively. They also need to learn how to negotiate the appropriate means and levels of security for communication with sources.

That’s where the user experience of these tools becomes so important. The most successful tools are bound to be those that are quick to install and intuitive to use. While some of the tools were as easy to download and install as a browser or plugin (Tor, Ghostery), others involved complex steps and technical knowledge that might intimidate some users. That fact underlines the need to apply user-centered design principles to these excellent tools if they are to be universally adopted. We have to democratize access to them.

Another tension point was the concern that using secure tools actually draws attention to the individual. A valid fear, perhaps, but the answer isn’t to abandon the tools; it’s to employ them more often, even when security isn’t a concern, so that encrypted traffic becomes too commonplace to single anyone out. On that note, the day was a success. Many of us who were more or less aware of this issue left not just enriched with more knowledge, but with laptops sporting a few more tools to empower us as activists.

Robert Stribley is the Associate Experience Director at Razorfish. You can follow him on Twitter at @stribs.

For resources and information about this and future events, visit our Source Protection: Resources page, and follow organizers/hosts Sandy Ordonez, Susan McGregor, Lorenzo Franceschi-Bicchierai, and the Tow Center on Twitter.


Source Protection: Resources

We are happy to report that many of the attendees of our October 11 workshop on Source Protection in the Information Age left with a good foundation in digital security, and trainers gained a better understanding of the challenges journalists face in becoming more secure. 
This was a collaboratively organized event that brought together organizations and individuals passionate about the safety and security of journalists. We remain committed to supporting this collaboration and will be planning future workshops.
If you weren’t able to attend the event, we recommend starting with this brief recap. In addition, we would like to share some resources that you may find useful for continuing to develop your skills and understanding in this area.
Enjoy!
The organizers
(Lorenzo, Susan, Sandy & George)

Workshop Panel Videos

Panel 1: How technology and the law put your information at risk

Runa Sandvik, James Vasile, Aaron Williamson | Moderated by Jenn Henrichsen

Panel 2: Source protection in the real world – how journalists make it work

Online Resources

Workshop Resources

Online Library

Tactical Tech Collective

Tactical Tech’s Privacy & Expression program builds the digital security awareness and skills of independent journalists and anyone else concerned about the security risks and vulnerabilities of digital tools. On their website you can find manuals, short films, interactive exercises, and well-designed how-tos.

Upcoming Privacy & Security Events

October 20 | 6:30pm | Tracked Online: How it’s done and how you can protect yourself
Techno-Activism 3rd Mondays (TA3M) is a community-run monthly meetup that happens in 21 cities around the world. It is a good place to meet and learn from individuals who work on anti-surveillance and anti-censorship issues. The October edition of NYC TA3M will feature the former product lead of Ghostery, who will explain how third parties track you online, what information they collect, and what you can do to protect yourself. If you would like to be alerted about upcoming TA3M events, contact Sandra Ordonez at sandraordonez@openitp.org.
RSVP: 

Circumvention Tech Festival

The Circumvention Tech Festival will take place March 1-6 in Valencia, Spain. The festival gathers the community fighting censorship and surveillance for a week of conferences, workshops, hackathons, and social gatherings, featuring many of the Internet Freedom community’s flagship events. This includes a full day of journalism security events, conducted in both English and Spanish. It is a great opportunity to meet digital security pioneers.
RSVP: 

 


Journalism After Snowden – Upcoming events and activities


The recent beheadings of journalists Steven Sotloff and James Foley at the hands of the Islamic State of Iraq and the Levant (ISIL) are a horrific reminder that journalists are still murdered brutally by those seeking power and control.

In the United States, journalism faces less viscerally horrific realities, yet critical and timely questions remain for the future of journalism in an age of big data and surveillance. How can journalists protect their sources in an information age where metadata can reveal sources without a subpoena and where prosecutions of unsanctioned leakers are at their highest level in years? What should journalists do when the tools they rely on for their news reporting facilitate data collection and surveillance?

We are seeking to address these questions in our yearlong Journalism After Snowden (JAS) initiative at the Tow Center for Digital Journalism, in collaboration with the Columbia Journalism Review. Read on to learn how you can get involved and contribute your voice to this important debate.

Attend lectures co-presented by the Tow Center and the Information Society Project at Yale Law School

In partnership with the Information Society Project at Yale Law School, we are hosting a fall lecture series looking at different challenges and opportunities facing journalism.

The first lecture in the series kicked off on Monday, September 29, and featured esteemed lawyer David A. Schulz, who lectured on Source Protection: Rescuing a Privilege Under Attack. Schulz discussed the history as well as the current state and possible future of the reporter’s privilege – an urgent topic in the wake of US courts’ decisions rejecting journalistic privilege for New York Times reporter James Risen. Watch the archived live-stream recording here.

We will continue the lecture series with events every month this fall. These include:

Investigative Reporting in a Time of Surveillance and Big Data – Steve Coll, Dean & Henry R. Luce Professor of Journalism at Columbia University Graduate School of Journalism

Tuesday, October 21, 12-1:30pm, Yale Law School, Room 122, 127 Wall Street, New Haven

Steve Coll, author of seven investigative journalism books and two-time Pulitzer Prize winner, will discuss the new environment for journalists and their sources. Register here.

Normalizing Surveillance – Ethan Zuckerman, Director, Center for Civic Media at MIT

Tuesday, November 18, 12-1:30pm, World Room, Columbia University

The default online business model – advertising-supported services and content – has normalized mass surveillance. Does that help explain the mixed public reaction to widespread surveillance by governments? Register here.

 

Journalism After Snowden – Jill Abramson, former Executive Editor of the New York Times

Tuesday, December 2, 12-1:30pm, Yale Law School, Room 122, 127 Wall Street, New Haven

Abramson will conclude the lecture series with a Journalism After Snowden discussion at Yale University. Click here to reserve your spot.

All lectures are free and open to the public, but you must RSVP to attend. All events will be live streamed to allow for remote participation.

Educate yourself about digital security and source protection

 

Workshop: Source Protection in the Information Age

Saturday, October 11, 8:30am-5pm, Pulitzer Hall, Columbia University

On October 11, the Tow Center, OpenITP, Mashable and Columbia Law School will host a one-day workshop on the essentials of source protection for journalists in the information age. The workshop will aim to answer practical and theoretical questions facing journalists who wish to implement digital security practices in their workflow.

The morning half of the workshop will feature panels of professional journalists who will discuss how they strategically use technology to both get the story and protect their sources. In the afternoon, attendees will attend small-group trainings on the security tools and methods that make the most sense for their particular publication and coverage area. Click here to register.

National Poll with Pew Research Center

In partnership with Pew, Columbia will conduct a survey of investigative journalists and their use of digital security tools, including which tools journalists do and do not use, how they conduct threat assessments, and what institutional support they receive.

 

In 2015, the Tow Center’s Journalism After Snowden program continues.

 

Book: Journalism After Snowden: The Future of Free Press in the Surveillance State

In fall 2015, Columbia University Press will publish a book of essays on the implications of state surveillance for the practice of journalism. The book, titled Journalism After Snowden: The Future of Free Press in the Surveillance State, will seek to be the authoritative volume on the topic and to foster intelligent discussion and debate on the major issues raised by the Snowden affair. Confirmed contributors include Jill Abramson, Julia Angwin, Susan Crawford, Glenn Greenwald, Alan Rusbridger, David Sanger, Clay Shirky, Cass Sunstein, Trevor Timm, and Ethan Zuckerman, among others. Topics explored will include digital security for journalists, new forms of journalistic institutions, the role of the telecom and tech sectors, emerging civic media, and source protection and the future of investigative journalism.

 

Conference: Journalism After Snowden: The Future of Free Press in the Surveillance State

Thursday, February 5, 2015, Newseum, Washington, D.C.

On February 5, 2015, the Tow Center will host a one-day conference at the Newseum in Washington, D.C., with a particular focus on the future of national security reporting in a surveillance state. Structured around the book of essays, the conference will bring together globally recognized panelists to debate the shifting place of journalism in democratic societies, and will reveal fresh findings from the Pew Research Center about the digital security practices of journalists and the impact of surveillance on journalism.