Aging Code for T-SQL Tuesday #195

It’s that time of the month again, with T-SQL Tuesday coming along. I managed not to forget about it this time and checked in with the host. He had an issue, but fortunately a friend stepped up.

This month Pat Wright has an interesting question, asking how your code has aged. He and I have had a few conversations lately about getting older and when I asked him to host, this was a perfect choice.

I’m definitely getting older, but what about my code?

25 Years Later

Actually a little more, but I wrote a series called “Tame Those Strings” for Swynk a long time ago. That became Database Journal, but during the switch, they stopped paying us authors. A few of us started SQL Server Central and we went live 25 years ago.

In that piece, I referenced the oldest article, Tame Those Strings Part 4 – Numeric Conversions. There’s also a Part 3. In those pieces, is the code still useful?

A bit.

These are basic articles looking at string functions that are still heavily in use today. The idea of cleaning phone numbers using REPLACE is still something we might do today. If we are using SQL Server 2025, there are additional functions, but I still see a lot of code that uses multiple CHARINDEX+SUBSTRING or REPLACE calls.

I prompted an AI to see if it could do better.

2026-02_0091

It gave me two pieces of code. The first uses nested REPLACE() calls. This works, but I find it hard to read. I’d rather have separate statements, for ease of maintenance.

2026-02_0095
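The nested-REPLACE pattern looks something like this. This is a minimal sketch of the general technique, not Claude’s exact code, and the @phone value is just an illustration:

```sql
-- Strip common punctuation from a phone number with nested REPLACE calls
DECLARE @phone varchar(20) = '(303) 555-1234';

SELECT REPLACE(REPLACE(REPLACE(REPLACE(
           @phone, '(', ''), ')', ''), '-', ''), ' ', '') AS CleanPhone;
-- returns 3035551234
```

Each REPLACE removes one character, which is why the nesting gets hard to read as the list of characters grows.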

The second is a single statement, using CASE, STUFF, and XML to clean things. I like this, though it’s a semi-complex way of doing things. However, it works.

2026-02_0096
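One common shape of this style of solution, again a sketch rather than Claude’s exact code, walks the string with a numbers list and keeps only the digits via FOR XML PATH:

```sql
-- Keep only the digits from a string by position (pre-TRANSLATE era technique)
DECLARE @phone varchar(30) = '(303) 555-1234';

SELECT (
    SELECT SUBSTRING(@phone, v.n, 1)
    FROM (SELECT TOP (LEN(@phone))
                 ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS n
          FROM sys.all_objects) AS v      -- any table with enough rows works
    WHERE SUBSTRING(@phone, v.n, 1) LIKE '[0-9]'
    ORDER BY v.n
    FOR XML PATH('')
) AS DigitsOnly;
-- returns 3035551234
```

It handles any non-digit character without listing each one, which is its advantage over nested REPLACE, at the cost of readability.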

Has the code aged well? I think it’s OK. There are better ways, as shown with the STUFF/XML version, which wouldn’t have worked in SQL Server 2000. Still, REPLACE remains a common technique today.

For Part 4, with the use of LTRIM and STR(), today we have FORMAT, which is cleaner. However, it’s likely less performant. In a simple test of 500,000 values, FORMAT takes over 600ms to return results, while my LTRIM/STR combination consistently runs in 460ms or less.
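Side by side, the two approaches look like this on a sample value (the variable name and value are mine, for illustration):

```sql
-- Convert a numeric to a trimmed string two ways
DECLARE @val numeric(10, 2) = 1234.50;

SELECT LTRIM(STR(@val, 10, 2)) AS OldStyle,   -- works back to SQL Server 2000
       FORMAT(@val, '0.00')    AS NewStyle;   -- FORMAT arrived in SQL Server 2012
-- both return 1234.50
```

STR() right-pads to the specified width, which is why the LTRIM is needed; FORMAT skips that step but goes through .NET formatting, which is the usual explanation for its slower performance.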

I think my code aged well.


The DBA is Dead; Long Live the DBA

I remember getting a job at a startup in the Denver Tech Center. This was shortly after SQL Server 7 was released, with a marketing campaign that the platform was auto-tuning and wouldn’t require a DBA. My colleague asked me if I wanted to learn ColdFusion and have a longer career. I declined and stuck with this SQL Server thing, which has seemed to work out pretty well over the years.

I was reminded of this when I saw a “Death of the DBA Again” post, this time from an Oracle DBA. There are plenty of links in there from Larry Ellison and Oracle about how some version of Oracle won’t require a DBA. I’ve seen questions on Reddit (and elsewhere) about this topic where people seem to think DBAs can be replaced.

Or maybe they want them replaced.

There is no shortage of posts on why this isn’t the case (Grant, Kellyn, Brent, William, Boris). These all look slightly different, but the main thrust is that there is still data management-type work and people are needed to do it. Or maybe to direct the AIs to do it.

An interesting post from Kendra last year argued that we will see fewer DBA jobs because good DBAs can leverage AI to replace a few less-good DBAs. I like her approach, and the key reason AI agent usage will grow is that agents can potentially make fewer mistakes than a human.

If that human making mistakes is you, then you might not have a job.

I do think that the DBA as a gatekeeper, or a single point of managing systems and ensuring backups, security, and patches are handled, is dwindling. However, there are still lots of places for database-related work. High Availability setups are needed; someone has to work with InfoSec and auditors and implement their requirements. That might be especially important as those requirements might not be clear and clean enough for all your systems. While ETL might be a thing of the past with the various “links”, without a doubt, people will link too many tables to analytics systems, leading to overloaded systems, wasted resources, and runaway costs.

That might be a reason we will still need some type of DBA. They need to field the complaints from budget holders and work on resolutions to reduce costs.

The DBA will continue to exist in many organizations, but the job will change, and you need to evolve. There might be other organizations that don’t want a “DBA” as a title, but they will need a data engineer, a full-stack developer spending more of their time on the database stack, an InfoSec person that mostly works on database security, or some other job that absorbs all the data-related chores.

There is a lot of opportunity still out there, but the bar is being raised, and one end of that bar rests on AI. Improve your skills, show your value, and become someone who delivers results and doesn’t just say “No.”

Steve Jones

Listen to the podcast at Libsyn, Spotify, or iTunes.

Note, podcasts are only available for a limited time online.


Claude AI Convinced Me Not to Build an iPad App

I coach volleyball and I do a lot of stat stuff on paper. I decided recently to see if I could find a way to more easily automate things. I’ve tried a few apps on the iPad, but they all have too many restrictions and they are hard to use in the heat of the moment. Paper and pencil have been simple, reliable, and they let me fix something easily.

However.

Paper takes some focus, and it’s hard to quickly summarize things. I wanted to make my own app, and Claude convinced me not to do this.

Read on.

This is part of a series of experiments with AI systems.

A Simple Prompt

I’ve been kicking this around, but I never seem to have time. Listening to podcasts, reading articles, and seeing other experiments is slowly getting me to just try things. I started with this prompt, intending to give this 10 minutes out of my day.

how hard is it to build an app for an ipad

I know some of the answers, but you can look at the image below for what Claude gave me.

2026-01_0335

I like the conversational nature of this, so I answered with this:

a simple app to track some data entry. Can you build an app for an iPad?

I was just looking for help in my thought process here and trying to scope the work. I’m sure Claude Code could knock this out, but I’m experimenting. This was Claude’s response.

2026-01_0337

That’s great news. I don’t need a native app, and in fact, that’s likely a lot of overhead registering on a store or playing the sideload game, sending updates slowly, etc. With cloud file sync, I could easily have this data available elsewhere anyway.

This is the type of advice I’d get from some friends. Others would relish the chance to build an app and make this more complex. I’ve certainly seen some other volleyball coaches on FB create complex systems.

My answer:

volleyball stats. Display a list of players in a 4×3 grid, like the image. Each player’s name is customizable. A date, goal, and time can be entered. For each player, display 4 buttons: 0,1,2,3 which are used to rate a pass. There should be a real time calculation of the average pass score for a player (sum of items/ attempts). Save this to a text file with the datetime data entry was completed.

I meant to upload the image below, but forgot. For context, I print this out and then as kids pass balls, I write 0, 1, 2, 3 for each pass. After we’re done (usually xx passes or time), I quickly compute an average. You can see below, Eliza gets a 3 and Ella gets a 2.

2026-01_0334

That takes time, so I wondered what Claude would say. This is the response, which took about 5 minutes.

2026-01_0338

I downloaded the file and opened it and saw this, or most of this. I updated the player names and they’re saved in the HTML.

2026-01_0339

I can click around and it logs attempts and calculates an average in real time. If I download the data, I get a text report of what happened.
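The real-time average is just the sum of the ratings divided by the number of attempts. A minimal sketch of that calculation in JavaScript, since the app is an HTML page (the function and variable names are mine, not from Claude’s generated code):

```javascript
// Compute a player's average pass score from their recorded 0-3 ratings.
// Returns 0 when there are no attempts yet, rounded to two decimal places.
function passAverage(passRatings) {
  if (passRatings.length === 0) return 0;
  const sum = passRatings.reduce((total, rating) => total + rating, 0);
  return Math.round((sum / passRatings.length) * 100) / 100;
}
```

For example, passes rated 3, 2, 2, and 1 give an average of 2, matching the sum-of-items/attempts formula from my prompt.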

If you want to try it yourself, look in my repo: https://github.com/way0utwest/AIExperiments/tree/main/VolleyballPassingStats

I asked for a few changes, and then I asked for a log of the session. That’s the readme in the repo.

To me, this was something I’ve put off for a few years, not wanting to get caught up in a project like this when I could easily just use paper.

Now I have a new toy that I’ll use at the next practice. Plus, I’ve amazed myself at how easy this was.

Be curious, try things, ask for help on a task or project you’ve put off. GenAI is amazing.

Other AIs

I tried this on a local Gemma3 model; it was very slow and produced a single entry box. To me, this wasn’t worth the time or effort.

I tried this in Copilot (in VS Code) with the Sonnet model, and it wanted to make me an Xcode app with SwiftUI right away. I could get it to produce something like what Claude gave me, but it wasn’t as helpful.

Video Walkthrough

You can see some of this live in this video.


When SQL Server Central Went Down

This is part of a few memories from the founders of SQL Server Central, celebrating 25 years of operation this month.

“The site is down.”

I got a phone call from one of the other founders around 9 am in Denver one day. At the time, I was working at a small startup, in a semi-private office with one other person. Most of the company knew I had a site on the side, and alternately cheered me on or celebrated hiccups, depending on the person.

I checked, and things were down. I had a side channel to telnet into the server, but couldn’t access things. At this time, the site was hosted on a single server, running IIS and SQL Server, in a friend’s basement. This was in 2002 or 2003, and broadband was a lot different then. No cable internet, and most other solutions were in the kbps range. I had ISDN at my house, but a friend had gotten into a trial from Sprint, giving him 3Mbps over a microwave link.

My office-mate was listening and watching. I said I was taking an early, and long, lunch. She chuckled and went back to her own tasks. I let my boss know and started driving, stressed out. After all, the site was growing, popular, and downtime could be a killer.

We’ve had a few outages over the years, but not many. This was one of the more stressful ones as I fought traffic on I-25, trying to get from the Denver Tech Center up to Westminster. I can’t remember the weather, but I was sweating when I got to my friend’s house. This was the before-times, when remote work was a rarity. Luckily, my friend had left me a key to get into his house, where I ran into the basement. Unable to get the server to respond, I rebooted it and crossed my fingers.

I’m sure a few of you have had the stressed-out feeling of waiting for a database server to restart, hoping that it comes back up cleanly. It did this time, and I don’t remember what went wrong, but things were back up and running with no major issues. I was probably in the basement for less than 30 minutes, and I left, dreading the long drive back to the office. Almost two hours driving for a 30-minute fix.

I do remember thinking if this was going to be a regular occurrence, I might need a new job. Or a new place to host the server.

We weren’t in the basement for too long. Revenues were increasing enough that I started to look at co-location facilities. My current employer had investigated quite a few when we set up our systems, and I reached out to a few contacts. I negotiated a half-rack at some point, moving our servers to a real facility. By that time, my employer had failed, and I got an F5 firewall as part of my severance, which fronted SQL Server Central for quite a few years.

Those were the days, with lots of CLI access to remote systems and a power strip I could use to cycle them off and on. Those were skills I had to learn to avoid more drives at inconvenient times.

Steve Jones

