PowerPoint to HTML with Claude AI

I had an idea for an animated view of a sales tool, and started to build this in PowerPoint. I decided to switch to Claude and ended up with an interactive one-page web app. This post looks at how I approached the process.

This is part of a series of experiments with AI systems.

Editing in PowerPoint

Someone sent me a slide in PPT and asked me for feedback. I wanted to adjust the look, and started to make changes in PPT. I was moving boxes, changing a few colors, and adding something that I thought would have more impact.

I was about to start building some animations to show how I envisioned things working when it occurred to me that this would be better as an interactive, more self-service item.

I opened Claude, uploaded an image of my slide, and entered this prompt.

[Image: 2026-01_0265]

In 3 minutes, I had a working one-page web app that I could click around in and see changes. I needed one more prompt, as my gauges weren’t working correctly.

[Image: 2026-01_0271]

I took another moment to fix the UI: it had three of my five items on one line, and I had to scroll to see the others. My last prompt fixed this.

[Image: 2026-01_0272]

After getting a slide asking for feedback from a colleague, and spending 10 minutes futzing in PPT, I switched to Claude and had a working, interactive showcase of what I had in mind, all on free Claude.

I sent it back, and my colleague loves it. I don’t want to show it here quite yet, as we may use this in Redgate, and someone will need to make it waaaaayyyy prettier than I can, or even prettier than I can describe to the LLM.

How Useful is This?

You might say that going from an image or slide to an interactive picture isn’t useful. You might be right, and to be fair, I’ve never had the need to do this.

However, if I think a little wider about the idea of making something interactive or expressive, I have had the need for this type of thing before.

For example, I’ve needed a list of things that I want people to do and check off. I’ve built this in Word, with the little checkboxes, and given it to people. However, they have to remember to do a save-as, remember where the file is, and so on. Here, I could just give people a URL to this in docs (or in code), and they could open it and check off live boxes as they walk through a procedure. I could even have it change colors or notify when everything in a step is done.

Or display a warning if a step is skipped. How useful would that be?
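As a rough sketch of the idea, here is how the checklist logic might look. This is my own illustration, with made-up step names, not code from any real tool:

```javascript
// Minimal in-memory model of a live checklist: steps get checked off,
// skipped steps can be flagged, and completion is easy to detect.
class Checklist {
  constructor(stepNames) {
    this.steps = stepNames.map((name) => ({ name, done: false }));
  }

  // Mark a step done; return the names of any earlier steps still
  // unchecked, so the UI can display a "you skipped a step" warning.
  check(name) {
    const idx = this.steps.findIndex((s) => s.name === name);
    const skipped = this.steps
      .slice(0, idx)
      .filter((s) => !s.done)
      .map((s) => s.name);
    this.steps[idx].done = true;
    return skipped;
  }

  // True once every step is done; the page could change color or notify.
  allDone() {
    return this.steps.every((s) => s.done);
  }
}
```

Checking "deploy" before "backup" would return `["backup"]`, which the page could surface as a warning, and `allDone()` is the hook for changing colors or notifying when a whole step list is finished.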

I have also had the need to ask a group of people a question and get their answers or ratings. How easy would it be to have a link I can send, detect the user (I’m sure AI can add this), and then record their vote, letting them override their decision at any time?
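A sketch of that vote-recording idea, assuming the user detection already yields an id (all names here are hypothetical, not from any real app):

```javascript
// Votes keyed by user id: recording again for the same user overwrites
// the earlier vote, so anyone can change their decision at any time.
class VoteBox {
  constructor() {
    this.votes = new Map(); // userId -> choice
  }

  record(userId, choice) {
    this.votes.set(userId, choice); // silently overrides a prior vote
  }

  // Count how many users picked each choice.
  tally() {
    const counts = {};
    for (const choice of this.votes.values()) {
      counts[choice] = (counts[choice] || 0) + 1;
    }
    return counts;
  }
}
```

Keying the store by user is what makes the override trivial: a second `record` call for the same person simply replaces the first.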

What about a page people can use to plan a vacation, or anything else, and send me back their decisions? It could save everything in a file.

What would your imagination let you build?


25 Years of SQL Server Central

The oldest article we have on the site is Tame Those Strings! Part 4 – Numeric Conversions, by me. It’s dated 2001-04-18, though I think that’s a date we picked when we converted all the content from one database to another. The founders agreed sometime during Feb 2001 to jointly run SQL Server Central. Since we each owned the copyright of our articles from another site, we migrated several articles to build up our content library. This was back when Andy, Brian, and I all had full-time jobs and managed the site during breaks, nights, and weekends.

That was 25 years ago.

Twenty. Five. Years.

It’s incredible to think that almost half my life has been spent working with this community. That joint effort morphed into a full-time job for me sometime in late 2003 or 2004. I took a pay cut to run the site, though as we grew from one to two to five to six newsletters a week, we started to make enough money to make up the difference. I was a horrible salesman, but fortunately, we had a great site that kept growing week after week, and we didn’t need to rely on my salesmanship. The site grew from dozens of users when we started to thousands in a few months to tens of thousands in a year. Eventually, we reached a million registered users, which was quite a milestone for us.

Apart from the site, we published books and gave out copies at our annual party during the PASS Summit. That party was one of the highlights of my year. We also used to publish a magazine in partnership with the PASS organization. That was a stressful time, with me trying to manage an every-other-month schedule for the magazine, which had to be laid out, printed, and shipped to subscribers. While that was going on, I had to keep a couple of yearly book projects going and still get daily articles published.

I started writing these editorials because I was a little bored with the job. I never imagined how popular these pieces would become or how many people would read them. I suppose I should have, as I was the one who negotiated and paid for our emailing software. We used to pay for Lyris ListManager, which cost a few thousand dollars when we started. As we grew, we needed to send more emails overnight. One year I received a quote from Lyris for a few hundred thousand dollars to add the additional sending capacity. When I called the sales rep, he told me the only small companies sending more emails than us were the porn people. Needless to say, Andy took that as a challenge. Not wanting to pay hundreds of thousands of dollars for email software, we designed a system using an SMTP component that let us send a lot more emails. At our peak, we were sending over 8 million emails a week.

I had to learn a lot about running this site, from SMTP tricks and how the CAN-SPAM Act applies, to negotiating advertising contracts with customers. I had to manage hosting locations in the early 2000s. We first rented a VM, but it was too small after about six months. We moved to the house of a friend of mine, who had a 3Mbps broadband connection (this was 2002). At the time, I only had an ISDN connection, which wouldn’t cut it. We migrated through a few different co-location facilities in the Denver area that I had worked with as a corporate employee. Those moves entailed me physically moving servers into cages (or partial cages) in cold rooms, re-configuring our switch and firewall, and ensuring everything connected to the Internet. I even had an account at Dell, as we regularly upgraded hardware.

When we sold the site to Redgate, some of those hassles went away, and I could focus on just being the editor of the site. I no longer had hosting responsibilities or even coding ones. Things were good and bad with that change. Good, as I had developers to whom I could send bugs, but bad in that they had other, higher priorities. In the last few years, I’ve struggled to get things enhanced or fixed on the site, though I’ve been promised that is changing this year.

Despite all the changes over the years, I’m still thrilled to be the editor of SQL Server Central and glad that Redgate continues to run and support the community. Most of my time is spent doing other work with Redgate, but managing this site continues to be a significant portion of my work week.

And I still enjoy it.

I want to thank everyone who has read an article, asked or answered a question, syndicated their blog, tried the Question of the Day, written an article, or just left a comment on a piece. This has been an amazing community where many of you have learned to be better data professionals. Lots of you have asked, debated, and shared your knowledge with others in an extremely neighborly way. It’s been a joy to see this community grow into one where we appreciate, value, and love each other. I’ve made many friends here, met many of you in person, and seen you get value from this community that cannot be measured. The success of this community is because of all of you.

I’m blessed to have joined you here for 25 years, and I look forward to many more.

Steve Jones

Listen to the podcast at Libsyn, Spotify, or iTunes.

Note, podcasts are only available for a limited time online.


Monday Monitor Tips: SQL Auditing Preview

One of the features we advocates have been pushing for is a better way to track security changes in your SQL Server instances. The first slice of this work is in preview (as of 12 Jan 2026), and this post looks at what’s available.

This is part of a series of posts on Redgate Monitor. Click to see the other posts.

Tracking Security Changes

The first iteration of tracking security changes queried instances and databases for information, stored it, and then compared it with later queries to determine what had changed. This ran hourly and worked well, but it could not determine exactly when a change was made.
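That query-store-compare approach boils down to a set comparison between snapshots. Here is a sketch of the idea; this is my own illustration, not Redgate Monitor’s actual code:

```javascript
// Compare the logins seen on the previous poll with the logins seen now,
// reporting what was added and what was removed in between.
function diffSnapshots(previous, current) {
  const before = new Set(previous);
  const after = new Set(current);
  return {
    added: [...after].filter((name) => !before.has(name)),
    removed: [...before].filter((name) => !after.has(name)),
  };
}
```

The limitation falls out naturally: a diff only tells you a change happened somewhere inside the polling interval, not when, which is exactly what SQL Audit improves on.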

SQL Audit is made to capture this information in a lightweight way. It works well, although the tooling (IMHO) is poor and hard to work with. Redgate Monitor is going to overlay this and make it easy for DBAs, InfoSec, and auditors to better understand what is happening in a SQL Server environment.

There is a new tab on the Redgate Monitor Enterprise Permissions page that contains this data. It is labeled “SQL Audit”, and you can see it below.

[Image: 2026-01_0105]

Each row here gives the time of the change, as detected by SQL Audit. If I expand the first column, I can see the details. In this case, we have a regular workload running to change these so that the demo site has data, hence you are likely to see the same data every day on monitor.red-gate.com.

[Image: 2026-01_0233]

The last column on the right has the command captured, with PII redacted, as you can see here. The reason you may see only the CREATE LOGIN items and not DROP LOGIN is that this first slice of work captures just the additions, so you can catch those hackers trying to add accounts.

[Image: 2026-01_0234]
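To picture what that redaction might look like, here is a hypothetical sketch; the product’s real redaction rules are certainly more thorough than this one regex:

```javascript
// Hypothetical sketch: blank out the password in a captured CREATE LOGIN
// statement before displaying it. Real redaction would cover more cases.
function redactPassword(command) {
  return command.replace(
    /PASSWORD\s*=\s*N?'[^']*'/gi,
    "PASSWORD = '[REDACTED]'"
  );
}
```

The point is simply that the captured DDL is useful for auditing even with the sensitive literal stripped out.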

The SQL Audit documentation page explains how this works. Keep checking it, as a team is enhancing these features on a regular basis and adding more events.

As with most of the pages in Redgate Monitor, you can filter and customize what data is displayed. You can also export your data as a CSV file you can give to others. You have the option to export all data or just the filtered data.

[Image: 2026-01_0235]

Summary

This post shows how a new preview feature in Redgate Monitor Enterprise uses SQL Audit to gather data on specific actions being taken on your SQL Server instances. This is a useful feature many customers have requested, and it is being actively enhanced, so feedback is appreciated.

If you have feedback, please let us know, as we value your opinions and comments on how we shape the future of Redgate Monitor.

Redgate Monitor is a world-class monitoring solution for your database estate. Download a trial today and see how it can help you manage your estate more efficiently.


There Are a Lot of Databases

I was reading Andy Pavlo’s end-of-year review of the database world. He’s done this for a number of years, and there are links to previous recaps in the piece. He is an associate computer science professor at Carnegie Mellon University, working on quite a few database-related projects. In the review, he tends to track the database world from the perspective of business success and money. There are certainly parts of it that discuss technical changes, but my overall impression is more about the business and usage success than it is about the way database systems work.

The main thing that struck me after reading the review was how many database systems there are in the world. I hadn’t heard of any of these: RaptorDB, TigerData, Tembo, StormDB, Translattice, FerretDB, DocDB, SpiralDB, Tantivy, SkySQL, HeavyDB, and more. I’m sure I missed listing some I didn’t recognize, and quite a few of these are PostgreSQL-based systems, but still, that’s a lot of database systems that exist and are having success.

Last year, I ran into someone who worked at a company that had implemented ArangoDB for the software their company sold. This system had something to do with tracking parts and managing schematics for machines, which is a great place to use a graph database. I asked them why they didn’t pick a more well-known and widely used graph database like Neo4j. He answered that cost was a big reason, but also that if Arango failed to work, they felt they could port their data over to another platform. He did mention that training new people was a challenge, which I believe is a good reason to stick with more mainstream systems. However, I understand that people placing bets on less well-known technologies is how the popularity of those platforms grows.

As a side note, I keep confusing ArangoDB with AvocadoDB. Maybe because I like guacamole.

If I look at DB-Engines, I see lots of platforms I recognize and a few I don’t, but overall this is a long list. Some you could argue aren’t really database platforms, but these are platforms people report they are using. There are 429 ranked, which is quite a few. I’m not sure there are that many different models of cars being produced in the US each year.

Many of these are specialized platforms and might be suitable or even preferred in certain situations. I wonder if any of you reading this are running Hazelcast or Presto. Or anything else unusual. If you are, why? What’s better about one of these systems than the top 5-10 in any category?

As I look around, I realize there are so many databases available to choose from. Perhaps it’s just me, but I prefer choosing from a small list rather than a huge one. Do you feel the same way?

Steve Jones

Listen to the podcast at Libsyn, Spotify, or iTunes.

Note, podcasts are only available for a limited time online.
