Expensive CPUs

There have been a lot of features added to the SQL Server platform over the years. Several of these features let us perform functions beyond what a database has traditionally been designed to handle. SQL Server has had the ability to send emails and execute Python/R/etc. code, and in SQL Server 2025, we can call REST endpoints.

Quite a few of these features (arguably) are more application-oriented than database-oriented. There’s nothing inherently wrong with having a server perform some of these functions, and there have been some very creative implementations using these features. I recently ran into one of these examples from Amy Abel, where she shows how to use the new REST endpoint feature to call an AI LLM to generate and send emails from your database server. That’s creative, and it’s reminiscent of the numerous examples from various experts over the years who demonstrate how these features can be used to accomplish a task.

However, these are examples. They work amazingly well with one user running a limited workload. This reminds me of many of the examples I’ve seen using the AI vector enhancement in SQL Server 2025 to create embeddings from string data using an LLM. That is interesting, but most of the examples show a trigger being used to update the embeddings. Imagine users updating data and those triggers firing. Imagine a real workload and how often your users might update string data you want to use in an AI application, especially a RAG application. Think about how complex or long-running triggers like that could overload your system.

CPUs in database servers are expensive. The hardware isn’t more expensive, but the software is pricey. Standard Edition is limited to 24 or 32 cores (depending on version), and while Enterprise isn’t limited to any number of cores, the cost of each core is $$$$. Is it worth having those $$$$ cores sending emails or calling external services? Or would you be better off offloading those calls to another server, like an app server, where the cost of a core is just the hardware and a little .NET code running separately?
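To make that concrete, here’s a minimal sketch of the offloading pattern in Python (a .NET version would look much the same). The dbo.EmailOutbox table, connection string, and SMTP host are all hypothetical; the point is that the polling and sending burn a cheap app-server core, not a licensed database core.

    # Minimal sketch of offloading email work to an app server.
    # The table, connection string, and SMTP host are hypothetical.
    import smtplib
    from email.message import EmailMessage

    import pyodbc

    CONN_STR = ("Driver={ODBC Driver 18 for SQL Server};"
                "Server=appdb;Database=Sales;Trusted_Connection=yes;")

    def drain_outbox():
        """Pull queued emails from the database and send them here,
        keeping the expensive SQL Server cores out of the SMTP business."""
        with pyodbc.connect(CONN_STR) as conn:
            rows = conn.execute(
                "SELECT EmailId, Recipient, Subject, Body "
                "FROM dbo.EmailOutbox WHERE SentAt IS NULL"
            ).fetchall()
            with smtplib.SMTP("smtp.internal.example") as smtp:
                for email_id, recipient, subject, body in rows:
                    msg = EmailMessage()
                    msg["From"] = "noreply@example.com"  # hypothetical sender
                    msg["To"] = recipient
                    msg["Subject"] = subject
                    msg.set_content(body)
                    smtp.send_message(msg)
                    # Mark as sent so the next poll skips this row
                    conn.execute(
                        "UPDATE dbo.EmailOutbox SET SentAt = SYSUTCDATETIME() "
                        "WHERE EmailId = ?", email_id)
            conn.commit()

The database does what it’s good at, storing the queue; a scheduled task on the app server does the rest.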

Many of us already struggle with the database server as a bottleneck for our applications and workloads. Scaling up our database systems is expensive and cumbersome. We struggle to get approval for larger VMs, and if we scale up in the cloud, it gets very expensive very fast. I’m not surprised that database vendors are happy to add these features, as they increase the licensing cost for the applications using them.

I know the majority of the cost of building applications is labor and software developers’ time. However, that’s changing with LLMs that can produce code cheaply. I keep seeing claims that the cost of writing code is approaching zero.

That’s not going to be true if you use LLMs trained on example code that increases your database licensing costs. Then the total cost might be higher than you expect.

Steve Jones

Listen to the podcast at Libsyn, Spotify, or iTunes.

Note, podcasts are only available for a limited time online.


Claude Code Helps Analyze Test Data Manager Log Files

I had a customer ask about analyzing their Test Data Manager (TDM) usage to determine how many people were protecting data in dev databases and how often. TDM creates log files each time it runs, but analyzing those isn’t easy.

Or is it? Claude Code thinks it is.

This is part of a series of experiments with AI systems.

Consolidating Log Files

I didn’t want to grant extra rights or permissions, and I like the loosely coupled idea of tools, so I manually copied log files from a couple of locations into a new repo spot. This was a test to see if Claude Code could help me with something I’ve done in the past with other types of files.

TDM puts log files in ProgramData, which can be locked down on many machines, so I copied things over to a location on another drive. In this case, I created a repo with a folder called TDM. Into that folder, I copied the Anonymization and Subsetter folders from ProgramData. You can see the Anonymization folder contents below.

[Screenshot: the Anonymization folder contents]

I opened a few random files to see what was inside and got an idea of what I’d look for if I were doing the analysis by hand. I used this to help me craft a prompt.

With that in mind, I have a good place to start, so let’s go see what my assistant, Claude, can do.

A Quick Prompt

I ran Claude from the TDM folder with this prompt: I have a number of log files in the subfolders here. What I am looking for is a summary of how much activity is in those log files, by month and year. The anonymization log files have “started” records of an “inf” type for the classifying, the mapping, and the anonymizing. The subsetter has “subsetter started”. Build a script that can analyze this data and produce summaries for the activities.

Claude agreed to help me with this response. It confirms the plan to summarize the data and then starts to read the files in the subfolders, looking at a sample of each.

[Screenshot: Claude’s response as it reads sample files from the subfolders]

Claude then notes it understands the structure and starts to write code.

[Screenshot: Claude noting the log structure and starting to write code]

I have no idea how long this took, as I left it alone and went to work on other things. I was finishing another piece, so a few times I approved actions to re-read files and execute the script. I know that it was less than 10 minutes, but it could have been much quicker.

When I saw the script had completed, I also had a summary. I’ll paste in the text below to make it easier to read.

Claude’s Summary

Perfect! I’ve created a Python script that analyzes your log files and produces a summary of activities by month and year. Here’s what the script does:

  Script Features

  The script (analyze_logs.py) analyzes:

  Anonymization logs:
   – Classify – Tracks “Classifying database” activities (task=”classify”)
   – Map – Tracks “Creating masking file” activities (task=”map”)
   – Mask – Tracks “Masking database” activities (task=”mask”)

  Subsetter logs:
   – Subsetter – Tracks “Subsetter started” activities

  Summary Results

  Your current logs show:
   – 163 total activities across both tools
   – Activity from February 2024 to February 2025
   – Peak activity in January 2025 with 69 activities (10 classify, 4 map, 8 mask, 47 subsetter runs)
   – 2025 is on track to exceed 2024 activity (88 vs 75 in the first 2 months)

  Usage

  Simply run:
   python analyze_logs.py

  The script automatically scans the Anonymization/ and Subsetter/ subdirectories and provides:
   1. Monthly breakdown of all activity types
   2. Yearly summary totals
   3. Grand totals across all time periods

  The script handles both structured JSON logs and readable text logs, making it robust against different log formats.
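Claude’s actual script is in the repo linked below, but as a rough illustration of the approach, a minimal version might look something like this. The activity strings come from the summary above; the timestamp format and the .log extension are my assumptions.

    # A rough sketch of the approach, not Claude's actual script (that's
    # in the repo). The date format and *.log extension are assumptions.
    import re
    from collections import Counter
    from pathlib import Path

    DATE_RE = re.compile(r"(\d{4})-(\d{2})-\d{2}")  # assumed timestamp shape

    PATTERNS = {
        "classify": re.compile(r"Classifying database", re.IGNORECASE),
        "map": re.compile(r"Creating masking file", re.IGNORECASE),
        "mask": re.compile(r"Masking database", re.IGNORECASE),
        "subsetter": re.compile(r"Subsetter started", re.IGNORECASE),
    }

    def summarize(root: Path) -> Counter:
        counts = Counter()  # keyed by (year, month, activity)
        for log in root.rglob("*.log"):
            for line in log.read_text(errors="ignore").splitlines():
                for activity, pattern in PATTERNS.items():
                    if pattern.search(line) and (m := DATE_RE.search(line)):
                        counts[(m.group(1), m.group(2), activity)] += 1
        return counts

    if __name__ == "__main__":
        for (year, month, activity), n in sorted(summarize(Path(".")).items()):
            print(f"{year}-{month}  {activity:<10} {n}")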

Leading Me On

At this point, I was ready to commit and save this code, but I noticed that there was a grayed-out prompt already added for me at the CLI. It said: Export the summary to CSV file

I hit Enter, and Claude proceeded from there. I’ll show the prompt and then how Claude started to edit its own work.

[Screenshot: the CSV export prompt and Claude editing its own script]

Once this completed, I approved the action to run the script again and saw these results. I got a summary of the changes and then two CSV files with the results. I hadn’t committed code yet, so you can see the files yourself in the repo. These are in my AIExperiments repo on GitHub: TDM folder in GitHub
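I won’t reproduce Claude’s diff here, but the export step is conceptually simple. A sketch of what such a function might look like, with column names that are my guesses rather than the actual output:

    # Sketch of a CSV export step like the one Claude added; the column
    # names and file name are my assumptions, not the actual output.
    import csv

    def export_csv(counts, path="activity_summary.csv"):
        with open(path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["year", "month", "activity", "count"])
            for (year, month, activity), n in sorted(counts.items()):
                writer.writerow([year, month, activity, n])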

I looked at the results, and they looked pretty good to me. I don’t run TDM a lot on this machine, but the results seemed to match up with the dates of the log files. For example, there are log files for the anonymization (classify, map, mask) for Nov 2024, but they’re empty, so those are excluded from the CSV summary.

[Screenshot: the CSV summary results]

What’s also neat is that the next pre-populated prompt asks if I want charts. I hadn’t thought about this, but I decided to just hit Enter and see. Claude wrote a new file and then ran it (I approved this), and I had charts in a subfolder.
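I’ll leave Claude’s actual chart code aside; a hypothetical version of the idea, using matplotlib against the same summary counts, might look like this:

    # Hypothetical version of the charting step, just to show the idea;
    # Claude's actual chart script is its own work.
    from pathlib import Path

    import matplotlib.pyplot as plt

    def plot_monthly(counts, out_dir="charts"):
        Path(out_dir).mkdir(exist_ok=True)
        months = sorted({(y, m) for (y, m, _activity) in counts})
        labels = [f"{y}-{m}" for y, m in months]
        totals = [sum(n for (y, m, _a), n in counts.items() if (y, m) == ym)
                  for ym in months]
        plt.figure(figsize=(10, 4))
        plt.bar(labels, totals)
        plt.title("TDM activity by month")
        plt.xticks(rotation=45)
        plt.tight_layout()
        plt.savefig(f"{out_dir}/activity_by_month.png")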

I re-ran the TDM process today (Jan 14) to see if things worked. I got a new log file in the Subsetter folder, and when I ran the Python script manually, I saw the new executions.

[Screenshot: the updated summary showing the new run]

Very cool.

Summary

This is a quick app that is useful, and it helps not only me but also the customer. In fact, lots of TDM customers and our SEs might find this useful. This is the type of data analysis I’ve done in the past, trying to look through log files and parse them, which can be a tedious task. Claude Code (or another AI) makes this easy to generate and get working.

Of course, this needs a bit more testing to ensure it’s accurate, and I’m guessing there might be some edge cases or rough edges that need smoothing out, but is that any different from an app I or someone else wrote? The app is only as good as my prompting and testing, but this was quick enough that I could either alter it myself or let Claude do it.

After my first Claude Code session, I had USD $1.09 left of my $5. After this, I have $0.17. I’ll be buying more credits or subscribing, as I am finding Claude to be very useful.


PowerPoint to HTML with Claude AI

I had an idea for an animated view of a sales tool, and started to build this in PowerPoint. I decided to switch to Claude and ended up with an interactive one-page web app. This post looks at how I approached the process.

This is part of a series of experiments with AI systems.

Editing in PowerPoint

Someone sent me a slide in PPT and asked me for feedback. I wanted to adjust the look, and started to make changes in PPT. I was moving boxes, changing a few colors, and adding something that I thought would have more impact.

I was about to start building some animations to show how I envisioned things working when it occurred to me that this would be better as an interactive, more self-service item.

I opened Claude, uploaded an image of my slide, and gave it this prompt.

[Screenshot: the initial prompt with the slide image]

In 3 minutes, I had a working one-page web app that I could click around in and see changes. I had one more prompt as my gauges weren’t working correctly.

[Screenshot: the prompt to fix the gauges]

I took another moment to fix the UI, as it had three of my five items on one line and I had to scroll to see the others. My last prompt fixed this.

[Screenshot: the prompt to fix the layout]

After getting a slide from a colleague asking for feedback, and spending 10 minutes futzing in PPT, I switched to Claude and had a working, interactive showcase of what I had in mind, thanks to Claude (free Claude).

I sent it back, and my colleague loves it. I don’t quite want to show it here, as we may use this at Redgate, and someone will need to make it waaaaayyyy prettier than I can, or even prettier than I can describe to the LLM.

How Useful is This?

You might say that going from an image or slide to an interactive picture isn’t useful. You might be right, and to be fair, I’ve never had the need to do this.

However, if I think a little wider about the idea of making something interactive or expressive, I have had the need for this type of thing before.

For example, I’ve needed a list of things that I want people to do and check off. I’ve built this in Word, with the little checkboxes, and given it to people. However, they have to remember to do a Save As, or where the file is, etc. Here, I could just give out a URL to this in docs (or in code), and people could open it and check live boxes as they walk through a procedure. I could even have it change colors or notify someone when everything in a step is done.

Or display a warning if a step is skipped. How useful would that be?

I have also had the need to ask a group of people a question and get their answers or ratings. How easy would it be to have a link I can send that detects the user (I’m sure AI can add this) and then records their vote, letting them override their decision at any time?

What about a page people can use to plan a vacation, or anything else, and send me back their decisions, with everything saved in a file?

What would your imagination let you build?


25 Years of SQL Server Central

The oldest article we have on the site is Tame Those Strings! Part 4 – Numeric Conversions, by me. It’s dated 2001-04-18, though I think that’s a date we picked when we converted all the content from one database to another. The founders agreed sometime during Feb 2001 to jointly run SQL Server Central. Since we each owned the copyright of our articles from another site, we migrated several articles to build up our content library. This was back when Andy, Brian, and I all had full-time jobs and managed the site during breaks, nights, and weekends.

That was 25 years ago.

Twenty. Five. Years.

It’s incredible to think that almost half my life has been spent working with this community. That joint effort morphed into a full-time job for me sometime in late 2003 or 2004. I took a pay cut to run the site, though as we grew from one to two to five to six newsletters a week, we started to make enough money to make up the difference. I was a horrible salesman, but fortunately, we had a great site that kept growing week after week, and we didn’t need to rely on my salesmanship. The site grew from dozens of users when we started to thousands in a few months to tens of thousands in a year. Eventually, we reached a million registered users, which was quite a milestone for us.

Apart from the site, we published books and gave out copies at our annual party during the PASS Summit. That party was one of the highlights of my year. We also used to publish a magazine in partnership with the PASS organization. That was a stressful time, with me trying to manage an every-other-month schedule for the magazine, which had to be laid out, printed, and shipped to subscribers. While that was going on, I had to keep a couple of yearly book projects going and still get daily articles published.

I started writing these editorials because I was a little bored with the job. I never imagined how popular these pieces would become and how many people would read them. I suppose I should have, as I was the one who negotiated and paid for our emailing software. We used to pay for Lyris ListManager, which cost a few thousand dollars when we started. As we grew, we needed to send more emails overnight. One year I received a quote from Lyris for a few hundred thousand dollars to add the additional sending capacity. When I called the sales rep, he told me the only small companies sending more emails than us were the porn people. Needless to say, Andy took that as a challenge. Not wanting to pay hundreds of thousands of dollars for email software, we designed a system using an SMTP component that would let us send a lot more emails. At our peak, we were sending over 8 million emails a week.

I had to learn a lot about running this site, from SMTP tricks and how the CAN-SPAM Act applies, to negotiating advertising contracts with customers. I had to manage hosting locations in the early 2000s. We first rented a VM, but it was too small after about six months. We moved to the house of a friend of mine, who had a 3Mbps broadband connection (this was 2002). At the time, I only had an ISDN connection, which wouldn’t cut it. We migrated through a few different co-location facilities in the Denver area that I had worked with as a corporate employee. Those moves entailed me physically moving servers into cages (or partial cages) in cold rooms, re-configuring our switch and firewall, and ensuring everything connected to the Internet. I even had an account at Dell as we regularly upgraded hardware.

When we sold the site to Redgate, some of those hassles went away, and I could focus on just being the editor of the site. I no longer had hosting responsibilities or even coding ones. Things were good and bad with that change: good, as I had developers to whom I could send bugs, but bad in that they had other, higher priorities. In the last few years, I’ve struggled to get things enhanced or fixed on the site, though I’ve been promised that is changing this year.

Despite all the changes over the years, I’m still thrilled to be the editor of SQL Server Central and glad that Redgate continues to run and support the community. Most of my time is spent doing other work with Redgate, but managing this site continues to be a significant portion of my work week.

And I still enjoy it.

I want to thank everyone who has read an article, asked or answered a question, syndicated their blog, tried the Question of the Day, written an article, or just left a comment on a piece. This has been an amazing community where many of you learned to be better data professionals. Lots of you asked, debated, and shared your knowledge with others in an extremely neighborly way. It’s been a joy to see this community grow into one where we appreciate, value, and love each other. I’ve made many friends here, met many of you in person, and seen you get value from this community that cannot be measured. The success of this community is because of all of you.

I’m blessed to have joined you here for 25 years, and I look forward to many more.

Steve Jones

Listen to the podcast at Libsyn, Spotify, or iTunes.

Note, podcasts are only available for a limited time online.
