San Diegans Shouldn’t Be Lab Rats for Innovation
In 2016, San Diego installed thousands of General Electric cameras, microphones and telecommunication devices on streetlights around the city. The City Council approved the project with little scrutiny, accepting at face value the city’s framing of the devices as environmental “sensors” and “nodes” that would analyze traffic and the atmosphere.
The city finally held town halls this year to explain the program to communities, but by then it was too late. Once installed, technologies of this type will outrun the uses for which they are designed and publicly justified. Over and over, researchers like me have seen data creep, a cousin of mission creep, take hold as companies try to add value to data and monetize them.
It’s not surprising that the San Diego Police Department began accessing videos captured by the “environmental sensors” shortly after their installation and wrote its own policies for doing so. City officials say the footage is deleted after five days, but what’s to stop the SDPD from holding any video it unilaterally deems essential in perpetuity?
Lessons in data creep, technology outrunning its intended uses, are all around us. Storing data costs companies money. If users aren’t paying those costs, they become the product. Companies commonly collect data for one purpose but make aspects of it available to other companies to pay the bills and grow their profits.
I saw this data creep firsthand during my four years as an engineer at Google — a tech company that started off as a search engine, exploded into personal data apps and now guides users around the city and researches self-driving cars. Private emails became fodder for ad targeting. Street View gave way to mapping apps pitched at intelligence agencies. Last year, Google workers protested the company’s contracts with the Pentagon, arguing against the mobilization of global consumer data to build applications for one nation’s military.
Facebook offers another example of data creep. People go to the platform to share stories, keep memories, make connections and, yes, waste time. But Facebook’s real customers are not its users but the countless companies that rely on the platform’s data to target advertisements or build profiles of us. Ads may seem innocuous enough, but Cambridge Analytica weaponized consumer targeting to manipulate voter behavior.
Even if we take profit out of it, there’s no such thing as a totally secure technology. It’s not uncommon to find an opaque notification in the mail saying that your passwords and credit card numbers, stored on a computer somewhere, have been exposed in a breach. Credit card companies and online retailers store records all over the internet. And any company’s software depends on more lines of code than anyone could ever check, written by many hands over decades.
That means that the safest data are data not collected in the first place. Data creep means there are economic and entrepreneurial pressures to use data in new ways. It means we can never guarantee who will get their hands on the data. We can’t guarantee what a company or bureaucrat might assume to be harmless or even “social good.” And chasing data creep with democratic oversight is like trying to control a black box that even system designers cannot completely understand.
So if data must be collected, it should be collected only with very strong justification. The promise that data will help address homelessness, traffic or safety is not enough. Data must be absolutely necessary, be collected with the full knowledge and consent of the public, come with an ongoing ability to opt out, and be safeguarded so that it doesn’t just benefit some at the expense of many.
If the people decide data collection is justified, there are ways to design hardware (sensors, geolocation devices and cameras, for example) that enforce more privacy. Location devices, for instance, can be built to report what city you’re in but not where within it, as the sketch below illustrates. San Diego’s “smart” streetlights, by contrast, have the potential to greedily grab sound and video just in case a future system could use it.
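To make this concrete, here is a minimal sketch of such a device, written in Python. The function names and the one-decimal-degree grid are illustrative assumptions for the example, not any vendor’s actual firmware.

```python
# Hypothetical sketch, not a real vendor's design: coarsen a GPS fix
# on the device itself, so only a city-scale location ever leaves
# the sensor.

def coarsen(value: float, decimals: int = 1) -> float:
    """Round a coordinate to one decimal degree (roughly 11 km)."""
    return round(value, decimals)

def report_location(lat: float, lon: float) -> dict:
    """Build the only payload the device transmits; the precise fix
    is discarded here and never stored or sent."""
    return {"lat": coarsen(lat), "lon": coarsen(lon)}

# A fix inside San Diego's Gaslamp Quarter...
print(report_location(32.7113, -117.1603))  # {'lat': 32.7, 'lon': -117.2}
# ...is indistinguishable from any other point in the same coarse cell.
```

The design choice is the point: because the precise location is discarded on the device, no later policy change, subpoena or breach can recover it. The safest data remain the data never collected.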
The community groups in San Diego’s anti-surveillance coalition are rightly demanding a seat at the table to design the city’s data infrastructure. They join citizens in New York City, San Francisco and Oakland in resisting the imposition of surveillance technologies like facial recognition and demanding democratic oversight processes for new “smart city” systems. The ACLU found that 50 percent of voters statewide strongly support public debate before the adoption of government and law enforcement surveillance technologies. But public opinion alone does not hold our governments accountable or keep data creep at bay. Organized citizen involvement can.
Big tech and innovation enthusiasts might prefer that the public let them unilaterally design the technological future. They often argue that democratic oversight slows down innovation and entrepreneurship, or that design by committee yields poorer results. That is a short-sighted view of design, one that prioritizes shiny technological potential over the early mitigation of vast social risks.
San Diego promised entrepreneurs and engineers that they could innovate on the data gathered from our everyday lives without us knowing about it. We cannot let tech erode civil liberties in the name of innovation. We are not innovation’s lab rats.
Lilly Irani is an associate professor of communication and science studies at UC San Diego who specializes in the cultural politics of technology, innovation and entrepreneurship. She is the author of “Chasing Innovation: Making Entrepreneurial Citizens in Modern India.”