Sensors / Mark Hansen on a Landscape of Historical Practice

June 5, 2013

Mark Hansen is the Head of The Brown Institute for Media Innovation at Columbia University and has spent much of the past year setting up the Institute's operations. He also teaches journalism courses that integrate data, algorithms and computation.

The Tow Center asked Mark Hansen to start the weekend by surveying some of what we've already learned about sensing. As the Tow Center's research director Taylor Owen noted in introducing Hansen, "we're just catching up with him."

Mark Hansen opened his talk with a caveat: "I want to emphasize that this isn't some crazed techno-positivist talk about sensing curing the world's ills," he said, "but instead an attempt to highlight the creative potential of these technologies."

Hansen served as co-Principal Investigator of the Center for Embedded Networked Sensing, an NSF Science and Technology Center devoted to sensor network research. He pointed to earlier work carried out at the center, specifically a presentation by Professor Deborah Estrin, "Wireless sensing systems: From ecosystems to human systems." It posited that many critical issues facing science, government and the public call for high-fidelity, real-time observations of the physical world, and that the answer may lie in embedded sensing systems that reveal the previously unobservable. In this utopian, "sensored" world, for instance, our cell phones might alert us when we were exposing ourselves to unhealthy environmental conditions, just as they can today alert us to traffic jams on the highway. Sensors would be embedded in the physical environment and networked so they could share information and adapt, and humans would participate by verifying the data with reality checks.

Touchstone 1: Turing

Hansen pointed to the precedents for this thinking. First there was Alan Turing, who introduced the idea of distributing human sensing out into the world: perhaps using microphones to hear, cameras to replace human eyes, and so forth. Perhaps, Turing suggested, these sensing instruments could even amount to a network so large that the "brain" portion would have to be left in one stationary spot while the sensors roamed the countryside, where they might pose a danger to humans.

Touchstone 2: Igloo White in the Vietnam War

Deployment of actual sensors, according to Hansen, occurred as early as the 1930s, when atmospheric scientists launched radiosondes and other balloon-borne instruments. Then in the 1960s, during the Vietnam War's Igloo White operation, the US Air Force dropped sensors from airplanes onto the Ho Chi Minh Trail to track the movement of men and supplies. But this ad-hoc deployment method had a few problems: there were power constraints (the sensors' batteries died after a time), there was no rhyme or reason to where the sensors landed, and the interplay between automated systems and human monitoring was lacking.

Touchstone 3: Smart Dust

Another key idea in the development of sensors was the notion of "smart dust." Proposed in 2001, the concept was that smart dust could simply be blown into the air, and the tiny sensors would immediately start gathering information wherever they landed. The only problem was that the sensors of the day were incredibly clunky (think of the size of the chips in early-2000s cell phones). The idea was set aside.

Touchstone 4: Embedded Networked Sensing

Ultimately the notion of Embedded Networked Sensing overtook the practice of ad hoc deployment and its many failings: packets dropped, batteries died, sensors failed. The idea of scattering sensors widely remained the same, Hansen explained, but mobility, whether through human intervention or robots, became an interesting way to bridge the gaps in information between sensors placed at a distance from one another.

The Sense-able World, and Interpretation Skills

Hansen closed his keynote with a few final thoughts. First, as our conception of a sensor network has changed, so has our view of the "sense-able" world; the two co-evolve in interesting ways. Second, data handling and data quality are two huge issues that anyone working with sensors will have to contend with.

Larger systems, he also noted, call for better programming abstractions. Lastly, visibility into the system, whether participatory or "official science," is key to knowing what you're seeing. To help with these challenges, it is never too early to engage a statistician.

