One of the things I’ve tried hard to do in database development situations is ensure I could easily refresh dev and test environments on demand. In a small startup, we wanted to be sure our weekly releases worked well, so if we found bugs in QA, we immediately filed a report with developers, and once they had confirmed a repro, we refreshed QA with a fresh copy from production.
Not the best approach, but for a small database in the early 2000s when we were less concerned about data breaches, this worked well.
At the time I remember discussing this challenge with Andy during one of our SQL Server Central catchups. He had a similar issue, though for him, they needed to clean the data. Their system included a bunch of email notifications, and they couldn’t take the chance of sending out test emails. They were also in a regulated industry, with clients who were concerned about developers getting names and addresses from production.
Cleaning up names and addresses seems like a simple task, but there were endless variations, with new edge cases appearing constantly. It was a regular task for Andy to maintain and adjust his scripts to ensure the data was masked well. This also resulted in no shortage of calls from others when things didn’t run smoothly.
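To make the shape of this work concrete, here’s a minimal sketch of the kind of masking script someone might maintain by hand. This is a hypothetical illustration, not Andy’s actual script or anything from a product: it deterministically replaces names and emails, so the same input always masks to the same output across tables, and routes emails to the reserved `.invalid` domain so no test notification can ever be delivered.

```python
import hashlib

def mask_email(email: str) -> str:
    """Replace a real address with a deterministic, undeliverable one.

    Hashing the original keeps the masked value consistent wherever it
    appears; the reserved example.invalid domain can never receive mail.
    """
    digest = hashlib.sha256(email.strip().lower().encode()).hexdigest()[:10]
    return f"user_{digest}@example.invalid"

def mask_name(name: str, pool: list[str]) -> str:
    """Deterministically pick a replacement name from a fixed pool."""
    idx = int(hashlib.sha256(name.encode()).hexdigest(), 16) % len(pool)
    return pool[idx]

# A small replacement pool -- real scripts would use much larger lists.
FIRST_NAMES = ["Alex", "Sam", "Jordan", "Casey", "Riley"]

row = {"name": "Andy Warren", "email": "andy@example.com"}
masked = {
    "name": mask_name(row["name"], FIRST_NAMES),
    "email": mask_email(row["email"]),
}
```

Even this toy version hints at why the maintenance never ends: every new column, format variation (middle initials, suffixes, international characters), or cross-table dependency means another rule to write and test.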
A Time Sink
For smaller organizations (50-200 people), it wasn’t, and likely still isn’t, an option to purchase some of the more established tools in this area. They are too expensive and require a lot of resources (time and hardware) to get working.
At the same time, the DIY approach is essentially a commitment to a software development project, one that never ends and distracts people from their regular jobs. If you have staff on salary, this can seem like a good approach, but it’s often a waste of their efforts.
Even in the age of AI, I can see how this would be something that eats up sizable amounts of resources. While AI makes coding easier, directing that coding isn’t easy. And since models only keep limited context, I can see someone spending just as much time directing a model with prompts and correcting its mistakes as they might spend writing the code. I might be wrong, but since this isn’t always an easily defined task, I bet I’d spend a decent amount of time, even with Claude Code, constantly reshaping masking scripts.
Not to mention, I’d still be hoarding the knowledge in my head about how to direct the AI.
A Better Approach
I work for Redgate Software, and certainly I’m a bit biased here. We sell a solution in this area, but I’ve also helped shape (a little) how we approach this space, based on the challenges I see from customers and their need to get a system working quickly, with an affordable solution that reduces the risk of accidental data loss or regulatory fines.
We’ve developed Test Data Manager to work within the constraints of small to medium-sized organizations, both in functionality and price. It has a lot of what I want in a solution, though not everything. I still push the product and engineering teams to add more features as well as reduce complexity wherever possible.
I want this to be ingeniously simple to use.
I’ve had the opportunity to work with a few customers that have become audit ready in hours by using the smart defaults and adding a bit of their knowledge about the system. The time to value keeps getting lower, and I’m impressed by how the team responds to customer requests and demands. Like many of our products, we’re releasing regularly and adding features constantly.
The approach of having a tool that codifies what you want, is easily version controllable, and gets updated regularly is what most people want from software. We try to be good partners, and we’re working to ensure that customers not only get the value for the price they pay, but that value continues to increase throughout the year as we mature the software.
We’re releasing in a DevOps manner, to ensure you can do so for your organization.
If you’d like to see how Test Data Manager can keep you in control of your databases, reduce your risk of data loss, and help ensure compliance, give us a try.
We also have a webinar (Compliance Without Compromise: Test Data Management That Finally Fits) coming up on Mar 18 that you might check out for a quick look at some of the benefits of TDM.

