Despite my best intentions, I've neglected this blog. I've been spending whatever spare time I've had on my other non-techie blogs and frankly haven't been taken with anything on the technology front that's inspired me much lately.
One thing I've noticed that certainly isn't inspiring has nonetheless been on my mind lately: thinking, or rather the lack of it. It has been an unfortunate constant, to a greater or lesser degree, over the quarter century I've been making a living writing software: as both developers and managers, we too often forget to include in our time estimates what might well be the most important activity of all: time to think.
Estimates might include the time it takes to "design" (usually, the time to type up the design or sketch it out), the time to type in the code and tests, time to debug, refactor, and adjust, and time for documentation (user docs, examples, etc.). Each of these activities is necessary, but each is, in the end, essentially documenting the results of thinking. The code documents your thinking so that the machine can understand and execute it; the debugger helps you figure out where your thinking has gone wrong (as, obviously, do your tests).

It seems to me, though, that the thinking itself is never raised as a formal activity. Ideally, it shouldn't need to be; it should be assumed as the primary driver for everything else. But in my experience that isn't always the case. All the rote things you have to do take time as well, and if something doesn't fit into that vein, it often isn't considered. Of course, you still have to do the thinking, but because no time has been allocated for it, it gets rushed and relegated to a supporting part when it really ought to be the star. Maybe it's just me, but I have a feeling that in this case, it's not.
Of course, this malady has been noted over the years by a number of folks. Check out Andy Hunt's Pragmatic Thinking and Learning for a great read that touches on this, among many other things.