This editorial was originally published on May 13, 2013. It is being re-run as Steve is on holiday.
I’m sure most of us would like to think that we write fairly efficient code. However, the reality for many of us might be that we don’t actually know. Many of us use the same patterns and practices that we’ve been using for a long time, rarely changing them. When we learn a new technique or find a different way of coding that works better, we tend to then use that method over, and over, and over again.
I would guess that if many of us profiled our code and examined the CPU time and network bandwidth we consume, we might be surprised at what we find. CPU and network usage isn’t something we are often concerned about. We assume that we’ve bought a machine and we should be able to use as much of it as we want at any point in time. That’s not the best approach, but since we often have more hardware than we need for many processes, it works. It also explains why so many applications struggle as the load increases. They’re not coded efficiently.
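If you’ve never profiled your code, it takes surprisingly little effort to get started. As a minimal sketch (using Python’s standard-library profiler; the `busy_work` function is a made-up example, not from any real application), something like this shows where the CPU time actually goes:

```python
import cProfile
import io
import pstats

def busy_work():
    # A deliberately inefficient routine: repeated string concatenation
    # allocates a new string on every iteration.
    s = ""
    for i in range(10_000):
        s += str(i)
    return s

# Profile the function and capture the top entries by cumulative time.
profiler = cProfile.Profile()
profiler.enable()
busy_work()
profiler.disable()

stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
print(stream.getvalue())
```

The report lists each function with its call count and time, which is often enough to reveal a hot spot you never suspected.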
If you’re going to work in the cloud, you’d better learn to code more efficiently, mostly because inefficiency costs money. If you think about your design, you can reduce the amount of resources you use. In the cloud, this translates to lower cost. In the on-premises world, it means better performance and higher scale. It also means fewer complaints and phone calls.
Scaling up an application can be hard, but much of the struggle comes from code that was written poorly in the first place. Most of us have heard the saying that it takes less time to do it right the first time. That’s true in many situations, and it’s true for your application development. Learn to write more efficient code and use patterns that conserve resources. You’ll find your applications will run better, no matter what type of environment hosts them. If you’re not sure what patterns and practices work well, read an article or ask a question and find out what efficient techniques others use.
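To make the idea of a resource-conserving pattern concrete, here is one small, well-known example (a generic Python illustration chosen by the editor, not something prescribed by the editorial): building a string piece by piece in a loop copies the accumulated result on every pass, while a single `join` allocates the final string once.

```python
# Inefficient pattern: each += builds a brand-new string,
# so total copying grows roughly with the square of the input size.
def concat_loop(items):
    result = ""
    for item in items:
        result += str(item)
    return result

# Efficient pattern: join computes the final size and
# allocates the result once.
def concat_join(items):
    return "".join(str(item) for item in items)

data = range(5)
assert concat_loop(data) == concat_join(data)  # same output, less work
```

Both functions produce identical results; the difference only shows up as CPU and memory pressure once the input gets large, which is exactly the kind of hidden cost the editorial is talking about.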