I was reading a bit about the design and planning stage for a new database that will support some random application. The piece isn’t important, but I did notice one thing. The guidelines are generic, encompassing most of the things we might want to think about: structure of data, constraints, volumes of data and rate of growth, and more. It’s a good list, and one many of you have probably considered at some point if you’ve designed a system.
However, I’ve rarely been able to do more than guess at many of the answers; even the figures I can write down are often little better than estimates. Even when I know the DRI constraints, I hesitate to document them separately from the code itself, mostly because I find designs change early and often, and documentation rarely keeps up with code. The same applies to the administrative decisions I make.
I’m also nervous about documenting guesses. Far too often, any answer I give takes on a level of truth with the developers, sometimes being treated as immutable fact. The assumption that an early guess will match reality later blinds everyone and limits how well we can adapt to changing situations.
As an example, if I tell a SAN admin that we think we need 5GB of space, but we find in short order that we actually need 20GB, I meet resistance. A debate, or even an argument, ensues; we search for evidence that we need to change, and the whole process becomes very inefficient. The same thing occurs when developers, security admins, and especially managers hold too tightly to early estimates and guesses.
Certainly some guidelines are needed, but when I’m not sure, I tend to give wide ranges, just to prevent firm expectations from taking hold in people’s minds. We should know that our plans will rarely survive production deployments and workloads, but somehow we forget that when we look back at the guesses we made early on.