When I first worked in an environment where multiple developers needed to release software, I found too many rules and constraints for smooth development. The process had been developed by one person, or for one application, and had somehow expanded to encompass all work being done in the organization. It was as if we weren't trying to think about our creative work as developers and DBAs, and instead viewed our systems as composed of widgets built in a factory, each one a copy of the others, perhaps just a bit bigger or smaller. We used the same set of rules for the mission-critical finance program as we did for the department vacation scheduling application.
While working inside an overreaching command-and-control mentality, the idea of moving faster or releasing software more often was seen as dangerous. Two decades of Agile methodologies and the rise of DevOps have started to change this for many companies. Each year the State of DevOps report seems to show more and more organizations finding ways to build better software, often by having their operations people embrace the concepts and processes used by software developers as a way of building and managing their environments.
Databases still lag behind, and as Donovan Brown says in this piece, if you're not automating the back-end work of code deployments to the database, you're still "faking" some of your DevOps process. Databases must maintain the state of data, even as transactions take place around code changes. Ensuring that we properly handle those data changes is a challenge. However, by treating database changes as similarly as possible to application changes, we can minimize risk and learn database development techniques that help us push forward without being reckless.
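One common way to treat database changes like application changes is to keep versioned migration scripts in source control and apply them automatically on deployment. Below is a minimal sketch of that idea, using SQLite from the Python standard library purely for illustration; the table and column names are hypothetical, and real tools handle far more (rollbacks, drift detection, data motion) than this toy runner does.

```python
import sqlite3

# Each migration is (version, SQL). In practice these scripts live in version
# control alongside the application code, so database changes ship through the
# same automated pipeline as everything else.
MIGRATIONS = [
    (1, "CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT NOT NULL)"),
    (2, "ALTER TABLE customer ADD COLUMN email TEXT"),
]

def apply_migrations(conn: sqlite3.Connection) -> int:
    """Apply any migrations newer than the recorded schema version.

    Re-running is safe: already-applied versions are skipped, which is what
    lets the same deployment step run unchanged in every environment.
    """
    conn.execute("CREATE TABLE IF NOT EXISTS schema_version (version INTEGER)")
    row = conn.execute("SELECT MAX(version) FROM schema_version").fetchone()
    current = row[0] or 0
    for version, sql in MIGRATIONS:
        if version > current:
            with conn:  # each migration commits (or rolls back) atomically
                conn.execute(sql)
                conn.execute("INSERT INTO schema_version VALUES (?)", (version,))
            current = version
    return current
```

Running `apply_migrations` a second time applies nothing new, which is the property that makes automated deployments repeatable rather than reckless.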
Tools have gotten better for database development. Microsoft has tools for SQL Server, and other vendors such as Redgate offer their own. Good tools are essential, but each new one adds some complexity to the environment. That's one reason why keeping the number of tools and platforms minimal reduces the friction of getting things done. I see this as a big reason SQL Server on Linux might take off: plenty of companies don't want Windows in their infrastructure when most of their systems are on Linux. Not that Windows is hard, but consistency makes everything easier for a staff.
This is the same reason why I’d say that it’s worth sticking with a database platform or two and not experimenting with each new type of system that comes out. If your staff knows SQL Server, then adding in MongoDB or Cassandra means there’s a learning curve, or a “tax” as the piece notes, to getting things done. This same tax gets paid with each new vendor, platform, language, or technology you take on.
Database work is hard, and once your staff gets good at building and deploying changes, you want to take advantage of their knowledge. If your company has solid development and deployment practices, stick with the things that work well. However, if your staff turns over regularly and you don't have a mature process, you can reduce the "tax" you pay for database development. There are companies that have paid some of that integration tax, that learning effort, and they've got tooling to help you build code better and faster. Just as I wouldn't want to code my own build server, I don't want to code my own database deployment tools. I could, but I'd rather spend my time solving problems and use the tooling that someone else has built to make my job easier, and provide a consistent coding experience for my developers.