Privacy, Security, & Ethics - Computer Science's "Jüdische Physik"

I'm going to tell you an anecdote which is a gross oversimplification of a complex topic.

In the early half of the twentieth century, certain physicists made breakthroughs in relativity, quantum mechanics, and nuclear energy. Many of these scientists were Jewish. The Nazis called these heretical ideas "Jewish Science" and suppressed their teaching.

Jewish physicists based in Germany fled the oncoming war. Many ended up in the USA where they worked on the Manhattan Project to develop nuclear weapons. The Nazis had caused such a "brain-drain" of expertise that it critically hampered their ability to wage atomic warfare.

It has long fascinated me that a culture expelled the very people who could have saved it.

I'm going to tell you an anecdote which is a gross oversimplification, and is an unfair comparison.

In the early part of the twenty-first century, many people working in the fledgling Internet industry started making noise about privacy, security, and ethics. The mainstream technologists called them fearmongers, idealists, and anti-business. Their ideas were unwelcome, and they were thrown out of both the cathedral and the bazaar.

Many retreated to academia, some stayed and tried to cultivate a sense of responsibility in the industry, a few started lobbying governments around the world. By the time trust in the existing structures had begun to collapse, there were too few privacy-focused employees left to reverse the damage.

By expelling the boring and pessimistic doomsayers, the Internet behemoths had sown the seeds of their own destruction.

All analogies break down eventually, and all simplifications obscure the truth. But there is an undeniable fact - Internet companies could have prevented their current difficulties if they had baked in privacy from the start. If they cared about their users' security. If they acted in an ethical manner.

But programmers want to concentrate on fun and exciting things; they don't want to be depressed by "experts" telling them they are acting irresponsibly.

In 1907, seventy-five people died when the Quebec Bridge collapsed. Ever since, Canadian engineers have worn an iron ring on their finger. Legend holds that the first rings were forged from the remnants of that bridge; true or not, the ring serves as a constant reminder that an engineer holds lives in their hands. Mistakes can be deadly.

The computer industry has nothing like that. We have voluntary codes, which are mostly ignored. Programmers who commit blunders can shrug off responsibility. They face no professional sanctions and are sometimes lauded for their recklessness.

Indeed, one of our most sacred texts proudly disclaims the very notion of a programmer being responsible for anything their code does:

This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE

Everyone who loudly and publicly complained about the lack of privacy on the modern web was eventually proven right. Those who were initially dismissed as tinfoil-hat-wearing paranoid freaks now have the grim satisfaction of being able to say "I told you so!"

The security experts who screamed their heads off about the gaping holes in consumer devices are modern-day Cassandras.

I doubt that "Web 2.0" is facing irreversible collapse. But I also doubt that people who raise issues of ethics will be dismissed quite so casually in the future.

One thought on “Privacy, Security, & Ethics - Computer Science's "Jüdische Physik"”

  1. When I was in college, "Computer Ethics" was a core class for any of our computer science degrees, including the ones that were just about managing tech projects. Unlike many of the other classes, where there was wiggle room for transfers and life experience and similar courses on campus, this was 100% required. If you didn't take it, you didn't graduate. Period.

    It was an amazing class, and one that I wish everyone had to sit through. We talked about times when a lack of QA killed people. We talked about the ethics of coding something where you're not trained in the actual field. Most importantly, we internalized the fact that we are sometimes the last barrier between the public and a very dangerous idea.

    That class comes to mind, almost twenty years later, every time I have to stand up to management and force them to acknowledge that the thing we're about to do is dangerous and/or unethical. Yes, I love coding cool things and playing with new toys, but at the end of the day, "Do no harm" should be the motto of the ethical coder.
