The other day I read What I Wish I Knew When I Started My Career as a Software Developer, an article in Lifehacker adapted from a Quora answer by Michael O. Church. I haven't stopped thinking about it since. I guess I find it so thought-provoking because, while I agree with practically all of his advice, it is so different from what I would distill from my twenty-odd years of experience as a professional software developer.
Most obviously, Michael urges young programmers deciding how much effort to put into their work to "ebb towards underperformance". He goes on to impart a wealth of useful tips for navigating corporate politics and coming out on top. I'm not naive enough to think that managing up and general schmoozing ability aren't important to achieving professional success. But we are blessed to live in the age of the Alpha Nerd, where software developers (at least good ones) are a scarce and highly marketable commodity. If you don't feel like your current job allows you to get ahead through hard work and sheer coding prowess, don't respond with Machiavellian scheming. Just change jobs.
More to the point, however, is that only one of Michael's fourteen points ("Recognize core technological trends apart from fluff") is specific to software development at all. The rest are good general career advice whatever your line of work. The implication of the article's title ("...as a software developer") is that it should be somehow specific to our industry, and that's how I would approach the question.
When I was a teenager, I thought I was a truly great programmer. I could sit down and bang out a few hundred lines of C or Pascal and have them compile and run as expected on the first try. I associated greatness in software development with an ability to solve hard technical problems and a natural, almost instinctive sense of how to translate my solutions into code.
Even more than that, I associated it with speed. In college I was proud that I could spend weeks drinking beer and playing foosball instead of working on a computer science assignment, then bang it out over a frenzied weekend of all-night hacking and copious coffee consumption. That proved I was a great programmer, right?
It wasn't until after university that cracks in my self-satisfaction started to show. I got a job at a small translation company in a Parisian suburb as the sole developer writing a rather too ambitious terminology management system. Although I created what was doubtless a fairly imposing prototype, the product never really made it into production. Looking back now, it is easy to see that it was too complex a project for the hack-and-slash programming style of my 23-year-old self. (The dreaded Windows 3.1 "General Protection Fault" error made such frequent appearances that my boss, the company's CEO, took to hailing me with a sardonic "Salut, mon Général !" when he passed me in the hallway.)
I'd like to say that the subsequent twenty years have been a voyage of discovery, as each successive project has taught me a bit more about the importance of getting the architecture right, writing lots of tests and documenting the hell out of the whole thing. The truth is that this realization has come to me fairly recently, accumulated over two decades of projects, some quite successful technically, that never entirely felt to me like great software engineering.
The most striking lesson I have learned is that timeframes that seem absurdly bloated beforehand tend to look reasonable, even respectable, at the end of a project. Countless times over the course of my career, my colleagues and I have agreed that a project couldn't possibly take more than, say, three months. Then it ends up taking nine months or twelve months or more (sometimes much more). But looking back, with all the details and pitfalls and tangents in plain sight, you can see that the initial estimate was hopelessly unrealistic. If anything, racing to meet it only served to slow the project down.
The corollary is that the market rarely moves as fast as you expect. With rare exceptions, if you create something with a solid foundation that is usable, maintainable and meets a real need, it will be as relevant when you finally bring it to market as it was when you came up with the idea, even if it took you much longer than you anticipated. In my experience, projects fail far more often because the software never really works reliably than because they miss a tight market window.
And yet the vast majority of developers still seem reluctant to spend time on unit tests, design documentation and code reviews (something I am particularly passionate about). There is a widespread feeling that our job is about writing code, and that anything else is a productivity-killing distraction.
This isn't to say that we should craft technical specs in minute detail before we write a line of code. I'm a great believer in agile, and bashing out something that works as fast as you can is a great way to start most projects. But we need to be willing to take the time to refactor frequently and extensively, taking into account the lessons picked up on the first pass through the code. We need to work on tests and documentation as we go, not leave them to an elusive future Shangri-La when the pressure to crank out features has subsided. We need to recognize that our job isn't about producing more code in less time; it's about creating software that is stable, performant, maintainable and understandable (to you or someone else, a few months or years down the road).
So anyway, that's what I wish I knew when I started out in this business: slow the fuck down.