A Note and a Message: T-SQL Tuesday #193

This month Mike Walsh hosts T-SQL Tuesday. It’s been quite some time since he hosted (back at #4), but he answered my call for hosts and I appreciate that. He has a really good end-of-year invitation that asks us to look to the past and the future.

I’m trying to keep T-SQL Tuesday alive, but I’ve been struggling for hosts. I need more of you to host, and more people to blog.

Blogging is incredible for your career. Take some time to get started, and if you do nothing but respond to these monthly blog parties, you’ll look better than most other candidates for jobs.

If you want to host, ping me on X or at sjones at sqlservercentral dot ***.

A Message to Steve in 2015

The world seems to be changing quite quickly in our lives. You bought a new Pebble watch this year, database automation is becoming a bigger part of your job, and there’s lots of data analysis you can do.

Power BI is a tool from Microsoft that is going to grow in popularity, and it’s a place you ought to invest. New places to store data, new features in SQL Server, and new platforms for building analytics will appear, but Power BI is where you should spend more time and improve your DAX skills.

Learning to build better visualizations, especially for your volleyball coaching (which is going to continue for some time), is an investment that will really pay off in the future.

A Word from the Future in 2035

Steve,

We spent a lot of time early in our career working with networks, helping manage accounts, users, security, privileges, and more. That knowledge and experience helped us understand SQL Server and other database platforms from a security standpoint as well.

Starting in 2026, you need to revisit some of those skills and ensure you better understand how authorization and authentication work in the context of AI and agents. AI technology is changing the world, and it’s going to be heavily used by agents, but security, and ensuring there are limits placed on those agents, is going to define who has more success with AI and who has more trouble.

Take the time to ensure that as you engage with MCP servers, task agents with work, and evolve the way you use AI, you include governance in your efforts, to protect yourself from runaway agents that might go beyond what you expect or spend more money than you want them to spend.

Enjoy the next decade; it’s going to be amazing.


Your Security Checkup

Recently I saw an article on Simple Talk, 15 Practical Tips for Securing SQL Server, and I thought that many of these are fairly simple things. Turn off unused features, disable sa, etc. These are things that a lot of people probably ensure are in their SQL Server builds.

Though, I’m sure a lot of people don’t bother.

Often, I’ve found that different people might be responsible for setting up servers, or they might have rights to change things on existing servers. Over time, what we thought of as a standard often isn’t standard on all instances. Exceptions creep in, perhaps because developers change things when they don’t know better or aren’t thinking of security. Vendor software might have unexpected requirements that deviate from our standard for similar reasons. We also might change our own standards over time and forget to revisit existing servers.

I wonder how many of you have a security audit procedure in place to re-examine your existing servers. It’s something that ought to be done periodically, like storage management. It isn’t needed every day or week, but a few times a year you might want to ensure things are set appropriately and ready for the next few months.

I’ve been surprised at the number of people that really like the Redgate Monitor Configuration page to keep track of their servers and the deviations from their own standard config. It’s also been interesting how many people upgrade to the Enterprise Edition to get the security features. Tracking these over time can be a pain, and DBAs want an easy way to do this. In fact, there are so many feature requests for enhancements to security tracking that the devs on RGMEE are very busy.

It’s getting to be the end of the year, and that’s a slower time for many of us. Unless your business is related to the holidays, a lot of people take vacation, we have code freezes, and there’s a little more time for housekeeping. This might be a good time to conduct a little security audit and ensure that your servers aren’t open to attack and aren’t making it easy for malicious actors, or naïve but well-intentioned coworkers, to get into systems.

DBATools is a great way to change or enforce standards across lots of servers. Even if you can’t change all the settings, you might ensure you have documentation on why that one instance has an sa account enabled. You might also ensure that your security people have signed off on any exceptions.
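To make that concrete, here’s a minimal sketch of the kind of check you could script, assuming dbatools is installed. The instance names are placeholders, and you’d extend the checks to match whatever your own standard covers.

# Minimal audit sketch using dbatools; instance names are placeholders.
Import-Module dbatools

$instances = 'SQL01', 'SQL02', 'SQL03'

foreach ($instance in $instances) {
    # Flag any instance where the sa login is still enabled
    $sa = Get-DbaLogin -SqlInstance $instance -Login sa
    if ($sa -and -not $sa.IsDisabled) {
        Write-Warning "${instance}: sa login is enabled"
        # Uncomment to enforce the standard instead of just reporting
        # Set-DbaLogin -SqlInstance $instance -Login sa -Disable
    }

    # Flag any instance where xp_cmdshell is turned on
    $xp = Get-DbaSpConfigure -SqlInstance $instance -Name XPCmdShellEnabled
    if ($xp.RunningValue -ne 0) {
        Write-Warning "${instance}: xp_cmdshell is enabled"
        # Set-DbaSpConfigure -SqlInstance $instance -Name XPCmdShellEnabled -Value 0
    }
}

Reporting first and enforcing second is deliberate; you want the documentation conversation before you flip a setting on a vendor instance.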

If nothing else, a good security checkup should include checking your versions and getting up to date on patches.
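dbatools can help here as well. A quick sketch, again with placeholder instance names: Test-DbaBuild compares each instance’s build against the public build reference and reports whether it’s current.

# Check patch levels across instances (names are placeholders)
$instances = 'SQL01', 'SQL02', 'SQL03'

Test-DbaBuild -SqlInstance $instances -Latest |
    Select-Object SqlInstance, Build, Compliant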

Steve Jones

Listen to the podcast at Libsyn, Spotify, or iTunes.

Note, podcasts are only available for a limited time online.


Flyway Tips: Automation Assistance in Flyway Desktop

I was chatting with the product managers at Flyway and one asked me whether I’d seen the new tab for Automation in Flyway Desktop. I hadn’t and decided to take a quick look at how this works and what’s useful. This post looks at the new feature.

TL;DR: this is a good way to start learning how to move to a more automated, DevOps style of deploying changes.

I’ve been working with Flyway and Flyway Desktop for work more and more as we transition from older SSMS plugins to the standalone tool. This series looks at some tips I’ve gotten along the way.

Working with Flyway Desktop

For a lot of customers, it’s not too hard to set up a project and start to capture code in a Git repo. However, adding in automation gets challenging for many, especially as the docs are hard to understand if you don’t already have some knowledge. I find that CI/CD is a bit of a chicken-and-egg challenge: we try to learn to get better, but knowing what to learn and do requires knowledge.

Which we don’t have.

In any case, I’ve taken an existing Flyway project where I am capturing some code. You can see the project below, with objects on the right. The database and the repo have the same object code and are in sync.

[Screenshot: Flyway Desktop project with the database and repo in sync, objects listed on the right]

Let’s add a few new objects to this database in SSMS. You can see below I’m adding a new table and altering a proc. I’m also refactoring slightly to remove the old-style join convention from my proc as I add a new join.

[Screenshot: SSMS showing a new table and an altered stored procedure]

Once I do this, in Flyway Desktop (FWD), I see my changes. I’ll save these to the repo and then generate a migration script. I won’t show that as it’s not important.

[Screenshot: Flyway Desktop showing the detected changes]

Once I’ve done this, I know I want to deploy code. If I go to Migrations, I can see I have these two scripts ready to go to QA, and I can deploy them with FWD manually. However, I don’t want to do that.

[Screenshot: the Migrations tab with two scripts ready to deploy to QA]

I want some automation. How does FWD make that easy? Let’s see.

The Automation Tab

There’s a new tab on the left: the Automation tab. I’ve expanded the left menu so you can see it below, but it’s just a lightning bolt icon, which I might never have noticed (hence this post).

[Screenshot: the Automation tab’s lightning bolt icon in the expanded left menu]

If I click this, I get a little explanation at the top, a few links, and then some CLI-based code. That last part is what I need to move to CI/CD.

[Screenshot: the Automation tab with its explanation, links, and CLI code]

If you look at the code closely, you’ll see some placeholders in angle brackets for the environments I need. The code looks like this:

[Screenshot: the generated CLI code with angle-bracket placeholders for the environments]
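I won’t paste the generated code verbatim, but the rough shape is something like this. This is a hand-written approximation of the Flyway CLI commands, not a copy of the tab’s output; the angle-bracket tokens stand in for the placeholders the drop-downs fill in.

# Approximate shape only; not copied from the Automation tab.
# <build env> and <target env> are the angle-bracket placeholders.

# CI: validate migrations and run the change, code analysis, drift,
# and dry-run checks, using a separate build database
flyway check -changes -code -drift -dryrun `
    -environment="<target env>" `
    -check.buildEnvironment="<build env>"

# CD: deploy the pending migrations to the target
flyway migrate -environment="<target env>"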

Above this, there are two drop-downs where I can select my build and target environments. Build is the CI portion, and it’s a good idea to have a place to build code that is separate from QA. This lets me validate things and, more importantly, run some code analysis checks, summarize changes, and detect drift.

I’ve expanded the Build drop-down below, and you can see I have my environments listed; I can also manage them from here. There is also an Environments tab on the left menu, just above Automate Deployments, that looks like a database icon.

[Screenshot: the Build drop-down expanded, listing the available environments]

Once I select an environment, the code changes to reflect this. That makes things easier to automate, as I can take these commands and drop them into a task in my CI/CD tool. Note in the image below that I’ve selected NWInt and the code shows target2. The code needs the ID (or PK) of the environment, but the display name I entered is shown in the drop-down. Trust me, these match.

[Screenshot: the generated code after selecting NWInt, referencing the target2 environment ID]
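As a hypothetical example of the filled-in version, following the same approximated shape as above (target2 is my environment’s ID; the build ID here is made up):

# Hypothetical filled-in version: "target2" is the environment ID,
# not the NWInt display name; "build1" is a made-up build environment ID.
flyway check -changes -code -drift -dryrun `
    -environment="target2" `
    -check.buildEnvironment="build1"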

I’ll also select a target and then open a CLI to run the first item on line 5. I did have to authenticate, but this runs successfully. No digging through docs, no experimenting; the command just ran.

[Screenshot: CLI output from running the first command]

I’ve never saved any snapshots, so the drift check doesn’t work right away, but I can still get the dry-run script. When I run this, I get a summary from the CLI.

[Screenshot: the CLI summary from the dry-run check]
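If you want the drift check working later, saving a baseline snapshot of the target should give it something to compare against. A sketch, with the command shape approximated as before and a filename of my own choosing:

# Save a baseline snapshot of the target for future drift checks
# (the filename is my own choice; command shape approximated)
flyway snapshot -environment="target2" -snapshot.filename="NWInt-baseline.json"

# The dry-run check works without a snapshot
flyway check -dryrun -environment="target2" -check.buildEnvironment="build1"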

Then I can see the report (see the output above for the path at the end). When this opens, on the Dry Run tab, I see my scripts.

[Screenshot: the report’s Dry Run tab showing the scripts]

Note, I did use the drop-downs in FWD to select the environments I need for this.

Very cool.

Summary

One of the challenges with using Flyway is that there are a lot of settings, options, and more that one must learn to take advantage of the solution. Our docs continue to improve, but even as someone who has been using Flyway for a few years, I constantly have to check how things work. Plus, the teams are adding features on a regular basis, so it can get confusing to keep up with the new things.

This automation tab is really helpful to shortcut some of the things I need for the various environments in my project. I’m going to start using some of this in a new project, and so should you.

Flyway can do much more, and for smoother automation, check out Flyway Enterprise.

Flyway is an incredible way of deploying changes from one database to another, and now includes both migration-based and state-based deployments. You get the flexibility you need to control database changes in your environment. If you’ve never used it, give it a try today. It works for SQL Server, Oracle, PostgreSQL and nearly 50 other platforms.

Video Walkthrough

[Embedded video]


How Important Are Real Time Decisions?

Imagine a perfect world. I have an AI agent that knows my business well. It’s getting real-time input from sales and from customers, and it makes amazing decisions. We get a large order? We need to ramp up production of our widgets. We have an order pipeline of xx widgets, and we know over time that yy% will close. Let’s place a larger order with a supplier overseas.

The next day, we have an election and tariffs are announced on imported parts. We react immediately, cancel the order, start the process to expand a local factory. We place ads to hire workers and order equipment. Things are looking good for our business and our factory will be up and running in a few months.

The next week we find out the tariffs weren’t really being enforced, so they’re paused. Our AI agent re-places our large order for imported parts and tries to cancel the factory expansion. Of course, it calculates the costs of both sides before deciding, and perhaps consults with me on other uses of our local factory.

How many times can we do this? Or rather, how many times would we let an AI agent keep adjusting our business?

To be fair, humans might do the same thing and overreact, but mostly we become hesitant with unexpected news. That slowness can be an asset. We often need time to think and come to a decision. Many of our decisions aren’t based on hard facts, and a lot of business isn’t necessarily fact-driven either. We often put our thumb on the scales when making decisions because there isn’t a clear path based on data alone.

Things can get worse when we collaborate. I used to run real-time reports for an importing company, and we found that executives would print a report, get busy, and then, minutes (or hours) later, discuss the report with someone in a department. However, their numbers rarely matched, because the reports were printed at different times. At first they lost trust in the system because the same report on the same day had different numbers. Even when we added a “printed” or an “as of” time, the reports annoyed users too much to be helpful, because the numbers didn’t match.

Real time isn’t what most of us want. Except in the Olympics. There we want the photo finish right away.

But not in all sports. A review is good. In the NFL, I’ve come to like instant replay. It’s gotten better and faster, and it often gives us the right answer. Not always, but often. It’s arguably better than just real-time humans.

Real-time decisions and reactions can be good in some cases. Adjusting machinery, vehicles, electricity, etc., where we need too-quick-for-humans decisions based on data, is a good place for real-time data. But most of the business decisions we make aren’t places where we really need real-time insights. Our human brains just don’t work that fast.

Steve Jones

Listen to the podcast at Libsyn, Spotify, or iTunes.

Note, podcasts are only available for a limited time online.
