Tim Chevalier ([personal profile] tim) wrote 2013-11-23 14:53

Hackers and Firefighters

A work of parody by Tim Chevalier, based on "Hackers and Painters" by Paul Graham.

The following is a work of fiction.

When I finished grad school in computer science, I decided I had just wasted eight years, and went to firefighter school to become a firefighter. A lot of people seemed surprised that someone interested in computers would also be interested in fighting fires. They seemed to think that hacking and firefighting were very different kinds of work: that hacking was an inner-directed pursuit of personal pleasure (a little like doing drugs, but slightly more socially acceptable), while firefighting involves self-sacrifice and taking risks for the benefit of others.

Both of these images are wrong. Hacking and firefighting have a lot in common. In fact, of all the different types of people I've known, hackers and firefighters are among the most alike.

What hackers and firefighters have in common is that they both like to jump into situations that most sensible people would steer clear of. Along with doctors, nurses, and traffic cops, what hackers and firefighters are trying to do, at least in part, is save other people from the consequences of their poor life decisions (without passing judgment on those decisions; or, at least, doing so quietly among friends after one gets the job done). They're not doing research per se, though if in the course of trying to mitigate disasters they discover some new technique, so much the better.

Hackers need to understand the theory of computation about as much as firefighters need to understand thermodynamics. You need to know how to calculate time and space complexity, and what Turing completeness means. You might also want to remember at least the concept of a state machine, in case you have to write a parser or a regular expression library. Firefighters in fact have to remember a good deal more about physics and chemistry than that.

I've found that the best sources of ideas are not the other fields that have the word "computer" in their names, but the other fields inhabited by public servants. Firefighting has been a much richer source of ideas than the theory of computation.

For example, I was taught in college that one ought to figure out a program completely on paper before even going near a computer. I found that I did not program this way. I found that I liked to program sitting in front of a computer, not a piece of paper. Worse still, instead of patiently writing out a complete program and assuring myself it was correct, I tended to just spew out code that was hopelessly broken, and gradually beat it into shape. Debugging, I was taught, was a kind of final pass where you caught typos and oversights. The way I worked, it seemed like programming consisted of debugging.

For a long time I felt bad about this, just as I once felt bad that I didn't extinguish a doobie the way my friends on the playground taught me to in elementary school. If I had only looked over at the other service workers, the nurses or the cops, I would have realized that there was a name for what I was doing: making shit up as I went along. As far as I can tell, the way they taught me to program in college was all wrong. Every situation is unique, and you should use your body of experience to deal with each crisis as it happens, just as firefighters and nurses and cops do.

Identifying with the public servants will save us from another problem that afflicts the sciences: math envy. Everyone in the sciences secretly believes that mathematicians are smarter than they are. I think mathematicians also believe this. At any rate, the result is that scientists tend to make their work look as mathematical as possible. In a field like physics this probably doesn't do much harm, but the further you get from the natural sciences, the more of a problem it becomes.

If hackers identified with other public servants, like firefighters and nurses, they wouldn't feel tempted to do this. Firefighters and nurses don't suffer from math envy. They feel as if they're doing something completely unrelated: that is, something that's important. So are hackers, I think.

The problem is that most hackers work in companies, who don't treat hackers the way public servants get treated. Many companies treat hackers as if they are a very scarce resource, which -- to the extent that it's true -- wouldn't be so true if the same companies would stop systematically creating hostile environments for women. Do nurses get twelve different kinds of free soda on the job, flexible hours, and unlimited PTO? I don't think so, and that's because we all realize the work they do is actually necessary. As a result, hackers lose sight of the fact that their job is to make society work, and spend endless hours creating and then solving pointless but amusing tasks for themselves, as well as arguing with their colleagues about minutiae. Many hackers, as a result, are completely entitled jerks, while the ones who aren't often find themselves unemployable. This phenomenon is sometimes called "culture fit".

I think the answer to the arrogance problem, in the case of software, is a concept known to nearly all public servants: working all the time. This phrase began with doctors, who work 36-hour shifts during their training. More generally, it means that you have one kind of work you do for money, and you do the same kind of work for love because you have no fucking free time in which to do anything else.

Nearly all public servants work long hours. Firefighters and nurses notoriously do. If you're lucky, you have low sleep requirements or access to stimulants. Firefighters often seem to sleep at work. So do hackers.

When I say that the answer is for hackers to work during the day, and work some more at night, I'm not proposing this as a new idea. This is what open-source hacking is all about. What I'm saying is that open source is probably the right model, because the habit of working all the time, since no one else is around to do it, has been independently confirmed by all the other public service fields.

It seems surprising to me that any employer would be reluctant to let hackers work on open-source projects. At my company, we would have been reluctant to hire anyone who didn't. When we interviewed programmers, the main thing we cared about was what kind of software they wrote in their spare time. You can't do anything really well unless there's a clear social need for it, and if there's a need, obviously you're going to do it all the time; anything else would be selfish.

Because hackers are public servants rather than scientists, the right place to look for metaphors is not in the sciences, but among other kinds of public service. What else can firefighting teach us about hacking?

An example we can take from firefighting is the way that fires are contained by gradual refinement. Fighting a fire usually begins by spraying a lot of water on it. Gradually, the fire gets contained so that there's a smaller area to attack. But it is not merely a process of containment. Sometimes, using water is a mistake. Countless fires turn out to have been caused by chemicals that react badly with water, or electricity, where the water would just act as a conductor.

Here's a case where we can learn from firefighting. I think hacking should work this way too. It's unrealistic to expect that every problem can be solved by spraying copious amounts of water on your data center. You're better off if you admit this up front, and write programs in a way that encapsulates failure using abstraction and modularity. That way, you need only push one server off a cliff when something goes wrong, rather than admitting total defeat.
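To make that concrete, here's a minimal sketch (in Python; the component names are made up) of what I mean by encapsulating failure: each subsystem runs behind a boundary that catches its errors, so one broken component gets pushed off the cliff while the rest of the program keeps going.

```python
# A minimal sketch of encapsulating failure behind a module boundary.
# The component names ("web", "db") are hypothetical.
from typing import Callable

def run_isolated(name: str, job: Callable[[], str]) -> str:
    """Run one job; contain its failure instead of letting it spread."""
    try:
        return job()
    except Exception as exc:
        # One component burns down; the others don't even smell smoke.
        return f"{name} failed ({exc}); rest of system keeps running"

def healthy() -> str:
    return "ok"

def broken() -> str:
    raise RuntimeError("disk on fire")

results = [run_isolated("web", healthy), run_isolated("db", broken)]
print(results)
```

The point of the design is that the `try`/`except` lives at the boundary, not scattered through the components: the caller decides what "contained" means.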

Everyone by now presumably knows about the danger of premature optimization. I think we should be just as worried about premature design: deciding too early what a program should do.

The right tools can help us avoid this danger. A good programming language should, like a fire hose, make it easy to isolate failures (like fires) before they spread to an entire system. Fine-grained concurrency is a win here because individual tasks can be killed without taking the entire system down. But the key to encapsulation, I think, is to design a type system for the language and prove that it is sound with respect to the language's semantics. The easiest program to change is one that is correct.
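Here's a small sketch of the fine-grained concurrency point, using Python's asyncio (the task names are invented): one runaway task gets cancelled, and the rest of the system doesn't even notice.

```python
# A sketch of fine-grained concurrency: cancelling one task is like
# containing a fire to one room. Task names are hypothetical.
import asyncio

async def worker(name: str, delay: float) -> str:
    await asyncio.sleep(delay)
    return f"{name} done"

async def main() -> list[str]:
    tasks = [asyncio.create_task(worker(f"task{i}", 0.01)) for i in range(4)]
    tasks[0].cancel()  # one task misbehaves; kill just that one
    results = []
    for t in tasks:
        try:
            results.append(await t)
        except asyncio.CancelledError:
            results.append("contained")
    return results

print(asyncio.run(main()))
```

The other three tasks run to completion; only the cancelled one reports "contained".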

This sounds ridiculous, but to be great, firefighters have to think about standardization and formal specification. For example, the Great Baltimore Fire of 1904 raged as long as it did because, among other reasons, American cities at the time had more than 600 variations and sizes of hose couplings. Before then, many firefighters might have thought, who cares? We don't all have to fight fires the same way. But that meant that fire crews arriving from Philadelphia and Washington, DC to help were not able to connect their hoses to Baltimore's fire hydrants, and as a result, most of downtown Baltimore was destroyed.

Standards and clear specifications win because in emergencies, edge cases turn into catastrophes. Now, we take it for granted that if our building catches on fire, when the firefighters arrive, they will be able to connect their hoses to the water supply. In programming, though, none of the most popular programming languages even has a formal semantics. Imagine if every fire hydrant worked differently and firefighters had to learn by experimentation how each one worked before actually putting out the fire.

Great software, likewise, requires fanatic devotion to formalism. If you look inside good software, you'll see that the meaning of every data type, function, and method is clearly specified -- whether in prose in a comment, or in the form of a type signature. I'm not claiming I write great software, but I know when it comes to code, I ask "what does that actually mean?" in a way that would result in me having no friends if I did that in conversations. There's little I hate more than seeing code that relies on undefined behavior for its meaning, or circumvents the type system for claims of "better performance" when there is no stated performance goal in the first place.
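Here's a tiny illustration of what I mean, sketched in Python; the types are hypothetical, but the point is that the signature itself pins down what the function means, instead of leaving you to guess which unit a bare number is in.

```python
# A sketch of "the meaning of every data type and function is clearly
# specified": the signature says Celsius in, Fahrenheit out, so there
# is nothing to ask "what does that actually mean?" about.
from dataclasses import dataclass

@dataclass(frozen=True)
class Celsius:
    degrees: float

@dataclass(frozen=True)
class Fahrenheit:
    degrees: float

def convert(t: Celsius) -> Fahrenheit:
    """Convert a temperature from Celsius to Fahrenheit."""
    return Fahrenheit(t.degrees * 9 / 5 + 32)

print(convert(Celsius(100)))
```

Compare that with `def convert(t): return t * 9 / 5 + 32`, where a reader has to reverse-engineer the units from the magic numbers.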

If a hacker were a mere artist, making things to amuse emself, then ey could just fiddle around all day with what looks "beautiful" to em, like someone painting a mural. But if the hacker's job is to prevent disasters, we have to take logic into account.

In hacking, like firefighting, work comes in cycles. Sometimes, a major earthquake happens and you have to fight fires for 72 hours straight. Other times, you almost hope some youngster decides to take up arson or SQL injection, because you're so bored.

To do good work you have to take these cycles into account, because they're affected by how you react to them. When you're driving a car with a manual transmission on a hill, you have to back off the clutch sometimes to avoid stalling. Backing off can likewise prevent ambition from stalling. In both firefighting and hacking there are some tasks that are terrifyingly ambitious, and others that are comfortingly routine. It's a good idea to put your personal issues aside and be ready for anything, because the world is depending on you and you don't get to choose what's on fire right now.

In hacking, sometimes you get lucky enough to be presented with a problem you know exactly how to solve. Often, these kinds of problems are simply a matter of a person misunderstanding how the software is supposed to work. That's the one time that hacking is as straightforward as people think it is. The only problem is for you to improve the documentation that users use to help themselves figure out how to use your software. Since you know what your software should do, documenting it is effortless. It's just a matter of expressing in words what you already know internally. It's as relaxing as going to a class of third graders to give them a lecture on how to prevent fires and then letting them climb around on the fire truck.

The example of firefighting can teach us not only how to manage our own work, but how to work together. A lot of the great firefighting of the past is the work of multiple hands, and what all of those hands have in common is that none of them got credit for what they did. Do you know who put out the Great Baltimore Fire? Or the Great Chicago Fire? Or the fires that followed the San Francisco earthquake of 1906? No, and that sort of thing is the rule, and not the exception. If you hear about somebody putting out a fire by emself and wanting credit for it, it's probably your neighbor who's really excited about eir new barbecue grill.

As far as I know, when firefighters worked together to fight a fire, they didn't spend hours arguing about what the best way was to control the burn while the house was reduced to ashes in front of them. They just got out the hose. It was common for the fire chief to lead the efforts to fight the biggest fires and for newer firefighters to help with putting out fires in small buildings like doghouses. But nobody's house has ever burned down because the firefighters were all standing out front arguing about what color they should paint the fire truck.

I think this is the right model for collaboration in software too. Don't be a douchebag. When a project is owned by three or four different people, none of whom is officially the leader and all of whom were once that kid in school who always had their hand up, it will end up completely unimplemented while everybody argues about whether lines should be 78 characters long or 90. After a while, anybody who works on this project and doesn't have a giant ego will feel as if raising any question at all is stepping into a burning building. The right way to collaborate, I think, is for people to set their egos aside and clearly state rules for conduct and for who will resolve conflicts when they arise.

Like firefighting, most software engineering should happen in a way that saves people from pain and trouble, in a way that isn't motivated by the praise and adulation of those very same people. And so hackers, like firefighters, must be humble to do really great work. You have to be motivated to do your job well even though people only notice you when you've done it badly.

When I was a kid, I was always being told that I should do things because they were right and not because I would be rewarded for it. What this always meant in practice was that I would do what I thought was right, and then I would be punished for it. After a while, I kept doing what I thought was right and I stopped caring about the consequences to me.

Boy, was I wrong. It turns out that when you can work together with other people and make compromises, you actually end up doing more of what's right -- together -- than when you singlemindedly insist that you and only you know the One True Way to do it. It doesn't mean having no backbone. Far from it. Listening to other people when they disagree with you doesn't imply that you'll always do what they want: in some situations, like the software industry, listening helps you figure out what problem they're really trying to solve instead of what they say they want changed, and address the root cause.

Most public servants serve humans. And to serve people, you have to understand what they need. Nearly all the greatest feats of firefighting involved fires in places where people lived or worked, for example, because people tend to care about saving other people's lives and property.

Willingness to listen is probably the single most important difference between a good hacker and a great one. Some hackers are quite smart, but are totally unwilling to either listen to others' points of view, or explain the reasoning behind their own. They may not be solipsists, but it's certainly hard to observe a functional difference. It's hard for such people to serve the public, which is what building secure and reliable software is all about, because they don't really care about the public.

One way to tell how good people are at listening is to watch them have a conversation with someone they don't know well about a topic on which they disagree. We probably all know people who, though otherwise smart, are just comically bad at this. If someone starts a conversation with them at a dinner party about, say, sexual harassment, and it turns out that this person's experiences with it differ from their own, you are likely to hear phrases like these come out of such a person's mouth: "You're just too sensitive." "I didn't say that." "I've never seen that happen." "It was just an isolated incident." "Stop being emotional." Too sensitive? Too emotional? Even if you're actually right, opening with your own conceptual model of somebody else's emotional state is a way to show that you are not listening to the content of their words.

Part of what software has to do is be reliable. So to write good software, you have to understand that users won't generally give a fuck about your bullshit excuses for why the server crashes every 15 minutes, and simply want your work to make their work easier to do instead of harder. So to do your job, you had better believe them when they say they're having a problem. The best system I've ever seen in this respect was an AT&T rotary dial telephone from the '60s. That phone did what phones today never do: if you dialed a number, the call would go through, and you wouldn't have to unplug the phone from the wall and plug it back in to get it to work.

Source code, too, should be reliable: in the sense that it means what it seems to mean. If I could get people to remember just one quote about programming, it would be this one from Edsger W. Dijkstra's essay "The Humble Programmer":
The competent programmer is fully aware of the strictly limited size of his own skull; therefore he approaches the programming task in full humility, and among other things he avoids clever tricks like the plague.
Sexism aside, it's true: you need to collaborate not just with your fellow hackers, but with the future version of yourself. A way to do that is to be as precise as you possibly can, so that there will be as little ambiguity as possible when you try to understand your own code later. It's terrible, but people lose mental acuity every day, so being as clever as you possibly can now guarantees you won't understand your own code in a year.

So-called "cowboy coding" is associated with intelligence, to the point that there is even a fashion for it in some places. But I don't think there's any correlation. You can do well in math while being an overgrown baby who can't work with others without starting a screaming match, and math people get assumed to be smart, so the two qualities have come to be associated. But there are plenty of not-so-bright people who insist on doing everything their own way, regardless of how difficult it makes everyone else's life, too. Just look at the anti-vaccination movement.

So, if hacking works like firefighting and nursing, is it cool? After all, you only get one life. You might as well spend it working on something that will get you a ton of money and attention.

Unfortunately, the question is easy to answer: "No". Public servants are underappreciated and underpaid. These days, you can retire at 30 by starting a company that allows your cat to have a social networking profile, and yet it's deeply politically unpopular for anyone who works for a local or state government to be paid a living wage. Part of that is a consequence of the current political climate. But, I think, part of it is inherent: people simply do not appreciate the work of those who prevent the disasters they're unaware could have happened.

So while I admit that hacking is currently considered much more cool than firefighting, we should realize that in the future, hacking will probably be considered equally cool, or even less cool. That's because in the future, people are bound to realize that building and maintaining software is just another service that's essential to social functioning, not an act of heroism requiring extraordinary talent. Of course, I'm sure there will be the occasional exception to that. For example, firefighters had their day in the sun on September 11, 2001, and for years afterward, nobody could stop talking about how sexy they are. But remember that the presence of lust directed at you in others' hearts today doesn't cure the cancer you get later, or pay your health care bills when your employer refuses to.

What we can say with some confidence is that these are the glory days of hacking. In most fields, the great work is done early on. In medicine, it was a big deal just when nurses figured out that you should wash your hands in between treating patients, and an even bigger deal decades later when doctors realized that bacteria don't respect patriarchal authority and they therefore had to actually do what the nurses were suggesting all along.

Over and over, we see the same pattern. Hacking seems like an exception right now: when else in history have so many people been paid so much, and treated with so much deference, for work that essentially involves sitting quietly at a desk all day?

Firefighting has never achieved a level of coolness that is commensurate with its importance. In the future, hacking will be the same way. And that's how it should be. If you want to do essential work that maintains social infrastructure, and you would rather get carpal tunnel syndrome than run into a burning building, becoming a hacker is probably a good idea. But if you want people to remember you after you're dead, maybe you should become a painter or something.

[personal profile] corvi 2013-11-23 23:42 (UTC)(link)
All my problems would have been solved by spraying large amounts of water on the data center. :)

[personal profile] talia_et_alia 2013-11-24 01:30 (UTC)(link)
I have no idea what you're responding to, and I'm not going to click through now, but I laughed so hard I cried while reading this.

[personal profile] megpie71 2013-11-24 22:00 (UTC)(link)
Oh gods. This is pretty much how I'm approaching programming anyway - I've spent far too long as first a user, and then as technical support, to ever be deluded that the core test of any program isn't whether or not the blasted thing works in context to do the job it's supposed to do. I don't care about how elegant the code is, or which fancy tricks it uses - if the blasted thing crashes three times out of every four I launch it, I'm going to call it a piece of crap, and call down curses on the head of the person who designed it.

However, I'd argue a good way to get the "firefighter" mindset in the coders would be to have them actually DOING the current fire-fighting equivalent in IT for a while: working in technical support, for a minimum of six months. Doing this would mean they got face-to-face with firstly, what happens when software malfunctions and how the users react to it; secondly, the most common ways people mess up in using a program; thirdly, why listening to users is crucial when it comes to software design; and finally, genuine users they actually have to listen to in order to be paid.

At the moment, the people who have all the answers the programmers need are in technical support. Wouldn't it be great if the programmers spoke with them rather than with marketing?

here via network

[personal profile] genusshrike 2013-11-25 03:50 (UTC)(link)
As a public servant (the government department kind, not the frontline kind) who works with developers a lot, I enjoyed this :D

[personal profile] callmesquinky 2013-11-25 06:40 (UTC)(link)
Ah, this was a fun birthday present. Thank you! :)