Getting it Done
The mindset of corporate America has an unfortunate side effect on our source code. Americans like to jump right in and get going. They like to "just do it". This was made crystal clear to me by a German acquaintance who noted the difference between the way things work here and in Germany. He said Germans prefer to do a lot of up-front analysis before they do any actual work. Here, people just dive right in.
Getting things done certainly isn't a bad thing. This is a country of entrepreneurs. And the first step to accomplishing your goals is to start, right? Businesses need people who can get things done and done fast. But there comes a point at which you have to stop doing and start thinking. The Just-Get-It-Done mentality can have harsh consequences when applied to programming. It's harming our code, and our children's code, and possibly generations of code to come.
What do I mean by this? I don't mean programmers are racing through the development process and forgetting to check memory allocations for NULL pointers. Although that would be really bad, I suspect it happens more often out of carelessness than time pressure. What I do mean is that we don't get a chance to breathe, to think about our code, to think about our thinking, and to think about the process.
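To be concrete about the kind of check I mean, here is a minimal sketch in C. The helper function `dup_string` is hypothetical, but it shows the habit in question: test the result of malloc before touching the memory, instead of assuming the allocation succeeded.

```c
#include <stdlib.h>
#include <string.h>

/* Hypothetical helper: duplicate a string, returning NULL on
   allocation failure instead of crashing somewhere downstream. */
char *dup_string(const char *s) {
    char *copy = malloc(strlen(s) + 1);
    if (copy == NULL)        /* malloc can fail; check before use */
        return NULL;
    strcpy(copy, s);
    return copy;             /* caller is responsible for free() */
}
```

The point is not the three extra lines; it's that the check is a decision you make once you've stopped to think about what malloc can actually do.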
When we're in school, those nasty professors who hate us try to push us a little further, to understand advanced concepts. It isn't that they necessarily expect us to apply those concepts to the problems they gave us, or even to apply them at all. At the bachelor's level, you were given at least enough exposure to learn more on your own. Hopefully, you were seeded with at least enough curiosity to ask yourself in the future: what am I doing, and how can I do it better?
To some degree, we're all self-taught. Most of what I know about software and computer science, I learned on my own. If you've been working with computers for 20 years, then regardless of your schooling, you're self-taught as well, unless you've been doing the exact same thing over and over for all that time. But learning stops according to the needs of the company and the demands of the customer. If you are under the influence of the Just-Get-It-Done mob, you will stop learning at exactly the point at which you know the bare minimum required to complete a task. Modern software technologies often offer a myriad of ways to do a task. On top of that is the incredible level of abstraction we've reached, which affords us great power and hides as much danger. Thus, learning just enough to complete a given task usually requires learning very little.
But the ability to learn very little, yet accomplish much, is a false hope. If you understand very little, then you are probably either (1) consistently doing things wrong, or (2) doing them right but in a consistently bad way. An example of the first case would be not freeing memory because you didn't realize you needed to. In the second case, you might keep running 'make clean' before 'make all' because you completely misunderstood makefiles.
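For the makefile case, here is a minimal sketch of what understanding buys you. The file and target names are hypothetical; the point is that when each object lists the headers it actually depends on, plain `make` rebuilds exactly what changed, and `make clean` stops being a ritual.

```make
CC     = gcc
CFLAGS = -Wall -O2

app: main.o util.o
	$(CC) $(CFLAGS) -o app main.o util.o

# Each object names the headers it includes; touch util.h and
# make rebuilds both objects, touch main.c and it rebuilds one.
main.o: main.c util.h
	$(CC) $(CFLAGS) -c main.c

util.o: util.c util.h
	$(CC) $(CFLAGS) -c util.c

clean:
	rm -f app *.o
```

With the dependencies stated correctly, 'make clean' is only for getting a pristine tree, not something you run before every build.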
With source code, your code might be right and efficient but the process wrong. Maybe you wait too long to debug, or the builds aren't frequent enough, or you don't use version control or unit tests, or you haven't taken the time to learn a far superior tool or language. The code may also simply be structured in a less than ideal manner even though it works fine.
You don't have to fix any of these "problems". Perhaps you're happy to keep programming applications in assembly. But if so, I would be surprised if your company were still around today. Your company may decide there isn't time for you to learn about Subversion or makefiles or the smart way to code. So you'll just keep on writing code the not-so-good way, over and over and over.
You might think there are some ways around this. One solution, for a large corporation at least, would be to have one or two people be internal improvement experts. Their job would be to study everyone's code, study solutions to common problems, and train everyone to do things the right way. While this is probably a good idea, it might not work in practice. You can try to briefly train a programmer to Just-Get-It-Done right, but unless they have had the same experiences as you and have gone through the learning process, they might not be able to spot the difference between the good and bad way. That is, you might be able to quickly teach them how to fix the bad code, but that doesn't mean they'll know when to apply it. Each person needs to learn on his own. And this means taking the time to think about what you're doing and how it could be done better. It means "unproductive" time.
In my experience, it's better to "waste" the whole week and do the work correctly on Friday than to do it wrong all week. By waste I don't mean do nothing. You might have spent that time building prototypes, learning something new, or experimenting in some other way. But there is a learning process that goes on. It may slow down the current project. But future projects will benefit. In the future, you'll do things the right way, and you'll be able to quickly spot when to apply which methods. By taking the time to do a little analysis, to learn just a little, you could save yourself enormous amounts of time and headaches down the road.