Laws aren’t code.
Techies are generally pretty smart people. It’s difficult to make a living designing and writing software if you’re a dummy. They’re also often self-motivated learners, because that’s how you become a programmer.
That said, much of the tech community possesses a pretty huge blind spot when it comes to regulations. They have unrealistic expectations about the efficiency of government, the good intentions and expertise of regulators, and the clarity and precision with which regulations can be written and enforced. And they seem utterly uninterested in remedying that ignorance.
Case in point: Maciej Cegłowski has a really terrific essay called “The Internet With a Human Face.” In it, he argues that the incentives driving online businesses, specifically businesses dependent on advertising, lead to too much data collection, which brings terrible social costs. If you launch a new social network, for example, in order to get investors to give you large checks, you need to tell them a story about how you’ll eventually be able to use the data you’re gathering to sell super effective ads. Cegłowski calls this “investor storytime.” The really troubling thing is that it has “a vastly higher ROI than [actual] advertising. Startups are rational, and so that’s where they put their energy.” He goes on to outline lots of ways this is very bad. It’s a powerful argument and an important one.
But then we get to Cegłowski’s solution. It’s not education, nor is it greater reliance on technological opt-outs like Do Not Track. Nope, it’s—you guessed it—regulation. “It should be illegal to collect and permanently store most kinds of behavioral data,” he tells us. And he’s got a bunch of ideas about what those regulations should look like.
The trouble is, Cegłowski, like so many in the tech community, thinks regulations are magic spells. Write the incantation, have the state utter it, and—presto!—the world will conform to your desires.
@ARossP They think of laws like code, faithfully executed by a predictable CPU. Comp Sci has no public choice analysis.
— Adam Emmanuel Blackstone 🕊🥃🎲🚠🚀💹☦️ (@MeadBadger) June 3, 2014
Blackstone’s insight rings true. The tech community by and large thinks the world is easy, because the computers they work with every day are easy. Not easy in the sense of being simple to understand or possessing a gradual learning curve, because computer programming is hard. But “easy” in the sense of “working as expected, doing what you tell it to, and quickly upgraded or fixed if bugs appear.”
While that’s how computers work, it’s not how government works. When you invite government into problems, you bring interest groups, ignorance, and inertia. You create perverse incentives and turn important and ever-changing questions over to institutions structurally unable to handle them well. You bind tomorrow’s advances to today’s shortsightedness, hobbling the pace of progress and entrenching incumbents, who will use your well-meaning rules to bludgeon more agile competitors.
With that in mind, here are two of Cegłowski’s regulatory ideas.
1. Limit what kind of behavioral data websites can store. When I say behavioral data, I mean the kinds of things computers notice about you in passing—your search history, what you click on, what cell tower you’re using.
2. Limit how long they can keep it. Maybe three months, six months, three years. I don’t really care, as long as it’s not fifty years, or forever. Make the time scale for deleting behavioral data similar to the half-life of a typical Internet business.
The thing about technology is it changes faster than most of us can keep up with, and heads off in directions few of us can imagine. What mobile, connected computing will look like in ten years is a huge unknown. What kind of data it will need to do amazing things and what kind of data we’ll want it to have remains equally hazy. What are the chances that highly specific limits on data gathering written in 2014 will make any sense at all in 2024?
Do we want an arbitrary time limit baked into the law? Who knows what kind of really cool and possibly life-enhancing applications will emerge 20 or 30 years from now that will depend on deep troves of data?
Technological fixes to these concerns exist today. Don’t pass session data to websites. Then they can’t keep tabs on your clicks. Anonymize web traffic and location information via things like Tor. That not many people actually do this doesn’t mean “There Oughta Be a Law!” Instead it means either people don’t care about big data as much as Cegłowski does, or the technology isn’t easy enough to use. Addressing those issues directly makes a lot more sense than adding pages to the Federal Register and turning their enforcement over to men with guns and the politically connected interest groups and corporations who have their ear.
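To make the “don’t pass session data” point concrete, here’s a minimal sketch of what that fix looks like in practice, using only Python’s standard library (this is an illustration of the general technique, not anything from Cegłowski’s essay): a client whose cookie policy accepts cookies from no domain at all, so no session identifier ever persists between requests.

```python
import urllib.request
from http.cookiejar import CookieJar, DefaultCookiePolicy

# An empty allow-list means cookies are accepted from no domain,
# so any Set-Cookie header a server sends is silently dropped.
policy = DefaultCookiePolicy(allowed_domains=[])
jar = CookieJar(policy=policy)

# An opener wired to this jar browses without ever storing a cookie;
# each request looks like a first visit, with no session to track.
opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))

# opener.open(some_url) would fetch pages as usual, but the jar
# stays empty afterward: len(jar) == 0.
```

The same no-law, all-code spirit applies to the other fixes mentioned above: routing traffic through Tor hides your network location from the site, and neither requires a regulator’s sign-off.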
It’s not like techies are totally clueless when it comes to regulations in the real world, of course. We need only look at their disdain for the rules cities and powerful business interests use against Airbnb and Uber to see that.
But that makes it all the more puzzling that so many techies seem to forget these lessons when regulation hits even closer to home.