I saw a script from a user recently that looked like this:
```sql
CREATE DATABASE x ON ( FILENAME = ... )
GO
ALTER DATABASE x SET COMPATIBILITY_LEVEL = 120
GO
ALTER DATABASE x SET CONTAINMENT = NONE
GO
...
ALTER DATABASE x SET DISABLE_BROKER
GO
```
That’s interesting, because most of the code I see looks more like this:
```sql
CREATE DATABASE x;
GO
```
That’s it. At first I thought the longer script was overkill, but then I wondered: if we are actually creating a new database for an application, shouldn’t we set the options the way our application needs them rather than relying on defaults? I know that many of us work with defaults on our instances and databases, but should we expect every configuration option to simply be at its default? I certainly have been burned in the past by default changes.
I’ve seen similar scripts for code, with SET ANSI_NULLS and other options included at the top. While I sometimes think those lines get in the way, I know the script captures the settings that work for the code I’ve written and tested. If I remove those settings, it’s entirely possible my code might not work. Do you know all the ways in which the various SET options affect your code? I suspect many of us don’t think about them when we’re writing code, assuming that the dev, test, and production servers will all match our database. In most cases they do, but not always.
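As a minimal sketch of what those scripted settings look like in practice (the procedure and table names here are hypothetical, used only for illustration):

```sql
-- Pin the settings this module was written and tested with.
-- ANSI_NULLS controls whether comparisons to NULL follow the standard;
-- QUOTED_IDENTIFIER controls how double quotes are parsed.
-- Both are captured and stored with the module at creation time.
SET ANSI_NULLS ON;
GO
SET QUOTED_IDENTIFIER ON;
GO
CREATE PROCEDURE dbo.GetUnshippedOrders  -- hypothetical procedure
AS
BEGIN
    SELECT OrderID, OrderDate
    FROM dbo.Orders                      -- hypothetical table
    WHERE ShippedDate IS NULL;           -- with ANSI_NULLS ON, = NULL would never match
END;
GO
```

Because these two options are stored with the procedure, stripping them from the script means the module gets whatever the session happens to have at deployment time, which may not be what was tested.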
There’s a trend towards explicitly declaring the settings for an environment and then letting the system ensure those settings are maintained when the environment is built, or perhaps on a recurring schedule. Even so, if someone makes a mistake, say with ANSI_PADDING, do we really want to assume none of our code was compiled under the wrong settings? I’m not sure I do, and I suspect we should be explicitly putting all our settings at the top of our scripts to ensure code behaves as we expect.
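ANSI_PADDING is a good illustration of why such a mistake can linger: the setting is captured per column when a table is created, so a script run under the wrong session setting bakes the behavior in. A hedged sketch (the table name is hypothetical):

```sql
-- ANSI_PADDING is sticky: each column records the setting in effect
-- at CREATE TABLE time, not the setting in effect when you query it.
SET ANSI_PADDING OFF;
GO
CREATE TABLE dbo.PadTest (v VARCHAR(10));  -- hypothetical table
INSERT INTO dbo.PadTest VALUES ('abc   ');
-- With padding OFF, trailing blanks are trimmed on insert,
-- so DATALENGTH(v) reports 3 rather than 6.
SELECT DATALENGTH(v) FROM dbo.PadTest;
```

Flipping the session setting back to ON afterwards does not fix the column; it keeps trimming trailing blanks until the table is rebuilt, which is exactly the kind of quiet drift an explicit settings header guards against.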