Checking Tempdb with dbatools

I really like the dbatools project. This is a collection of PowerShell cmdlets, built by the community, that is incredibly useful for migrations between SQL Servers, but also for various administrative actions. I have a short series of posts on these cmdlets.

In SQL Server 2016, the setup program was altered to better configure tempdb at installation time. This was in response to the observation that few people actually make any changes to the default configuration, which was suboptimal in SQL Server 2014 and earlier.

Going through and checking all of the configurations you have isn’t easy, and isn’t necessarily the type of work that anyone wants to do. dbatools makes this really easy and quick with Test-DbaTempDbConfiguration.

Using this cmdlet is easy. I’ll call it with an instance name and get back the results of a number of checks that are useful for your tempdb configuration:

[Screenshot: Test-DbaTempDbConfiguration results in PowerShell]

This isn’t necessarily easy to read, so let’s add a Format-Table.

[Screenshot: the results piped to Format-Table]

That’s not great, as I’m missing the CurrentSetting field. I’ll add a Select-Object and include the fields I want. I can even add multiple instances in here:

[Screenshot: selected fields from multiple instances]

Now I can scan through here, looking to see if my settings have deviated from the recommendations and best practices. This could easily be used to filter the results for items that don’t match, save the results as a CSV, and you now have a picklist of items to work on as you find time.
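Putting the steps above together, the calls look something like this. This is a sketch only: the instance names are placeholders, and apart from CurrentSetting the property names are from my memory of the cmdlet’s output, so verify them by piping to Get-Member first.

```powershell
# One instance: run the tempdb checks
Test-DbaTempDbConfiguration -SqlInstance MySQLServer

# Easier to read as a table
Test-DbaTempDbConfiguration -SqlInstance MySQLServer | Format-Table

# Pick the fields I care about, across multiple instances
Test-DbaTempDbConfiguration -SqlInstance MySQLServer, MySQLServer2 |
    Select-Object SqlInstance, Rule, Recommended, CurrentSetting, IsBestPractice |
    Format-Table

# Keep only the deviations and save a worklist as a CSV
Test-DbaTempDbConfiguration -SqlInstance MySQLServer, MySQLServer2 |
    Where-Object { -not $_.IsBestPractice } |
    Export-Csv -Path tempdb_worklist.csv -NoTypeInformation
```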

dbatools is an essential tool for me. I’d urge you to download the module and experiment with the cmdlets.


Keep It All

I love this quote, though I’m not sure it’s accurate. From The Future of Data Storage, the piece states: “What’s the most expensive thing you can do with a piece of data? Throw it out.”

That’s from a storage vendor, and obviously they’d prefer that you keep all your data, which means more storage and backup space needed. Certainly I do think that losing valuable data can be expensive, but I also think that we often keep around older data that we don’t use, or won’t use, and that is expensive. Not for individual pieces, but in aggregate, it becomes expensive. This is especially true if you move to a service model where you pay for what you use, as opposed to investing in a large chunk of storage that has a fixed cost.

I didn’t think much of the piece, though it did get me thinking about backups. I’ve run backups for my entire career, and in 99-point-some-nines percent of cases, I haven’t ever used the backup file again. These were insurance against potential problems. Even in places where I restored a backup to verify the process worked, I often just discarded the backup file at some point.

Early in my career, we had tape rotation systems to reuse the media a certain number of times, while also ensuring that we had off-site copies and specific points in time saved. Today there are plenty of backup systems that perform deduplication and complex disassembly and re-assembly of files from blocks to use space more efficiently. That doesn’t always work well for database restores, especially when time is of the essence.

As vendors look to add more intelligent, or at least more efficient, processing to backup systems, I wonder if they really think about databases and how we use files. I hope so, and I’d like something that is optimized for database restores. I don’t mind combining the duplicate parts of files into some index, but I need to have the latest files available for quick restores. What about backing up a database to a file and keeping this file online and ready? Then, after the next backup, move the previous one to an area that dedupes it, maybe takes it offline, etc. That way I have the best of both worlds. I rarely go back further than the latest full backup for a restore, so keep this ready.

Of course, we need to consider log backups, which really need to be kept online and intact if they have been made since the last full backup. Keeping track of that is a pain, but it’s something software could easily handle. Once a new full backup is made, older log backups can be marked for deduplication. Though, if you’re building this into a system, perhaps an automatic restore of the full backup files should be included as well.
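That scheme is simple enough to sketch in code. Here’s a toy model, with structure and names of my own invention: keep the latest full backup online, along with every log backup taken after it, and mark everything older as eligible for deduplication.

```python
from datetime import datetime

def plan_retention(backups):
    """Split backups into 'keep online' and 'ok to dedupe'.

    backups: list of (timestamp, kind) tuples, where kind is 'full' or 'log'.
    The latest full backup, and every log backup taken after it, stay online.
    """
    fulls = [b for b in backups if b[1] == "full"]
    if not fulls:
        return backups, []          # nothing to anchor on; keep it all
    latest_full = max(fulls)        # tuples sort by timestamp first
    keep = [b for b in backups
            if b == latest_full
            or (b[1] == "log" and b[0] > latest_full[0])]
    dedupe = [b for b in backups if b not in keep]
    return keep, dedupe

backups = [
    (datetime(2018, 4, 16, 2, 0),  "full"),
    (datetime(2018, 4, 16, 12, 0), "log"),
    (datetime(2018, 4, 17, 2, 0),  "full"),
    (datetime(2018, 4, 17, 12, 0), "log"),
    (datetime(2018, 4, 17, 18, 0), "log"),
]
keep, dedupe = plan_retention(backups)
print(len(keep), len(dedupe))  # 3 stay online, 2 go to the dedupe store
```

Real software would also track restore chains, differentials, and copies, but the bookkeeping itself is not hard; that’s the point.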

Steve Jones

The Voice of the DBA Podcast

Listen to the MP3 Audio (4.0MB) podcast or subscribe to the feed at iTunes and Libsyn.


The Prompt Actions Menu

A quick post on SQL Prompt here. Someone is asking about the Prompt Actions menu. This is the menu that appears when you highlight code. I’ve got a quick animation of this working.


In the gif, I highlight code, click the Actions icon, and then I can add a BEGIN..END around my code. I also have a snippet that I use to surround code with comments.

Any snippet that has the $SELECTEDTEXT$ token in it is eligible for the Actions list.
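For example, the body of a hypothetical surround-with-comments snippet might look like this. The comment text is mine, purely illustrative; $SELECTEDTEXT$ is the part that matters, as that’s where the highlighted code lands:

```sql
/* Start: review this section */
$SELECTEDTEXT$
/* End: review this section */
```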

Carly also has her own video on this on the Redgate Videos channel.

SQL Prompt is amazing, so give it a try today.


The Right Connection

Travis-CI had some staffers connect to the wrong database and truncate production tables. Needless to say this caused an outage and disrupted their business. Hopefully they didn’t lose too many customers, but they certainly did not help their reputation. I’m sure there are more than a few customers trying to decide if they continue to trust the company or move their Continuous Integration (CI) processes to another platform.

I’ve done this before. Not shut down a company, but I have actually truncated a production table by mistake. Well, not TRUNCATE, I mean, who runs that? But I have run a DELETE without a WHERE clause and killed a lookup table in a production database. Fortunately I had a copy of the table elsewhere and could rebuild it in minutes. Only a few customers had their work interrupted, and only for a portion of our system. The point is that I’ve been a very good DBA, with a lot of success and experience, and I still make mistakes.

Often this type of mistake comes about because we get busy, and we keep connections open to different systems. When we’re developing code against a schema that is close to production, it’s easy to forget which database we’re working on. Someone calls with a problem or we fight a fire, and we run some code. We fix the issue, the stress bleeds away, and we go back to work, but forget to switch connections or tabs. Then we run some code that would be fine in development, but causes issues in production.

SSMS has colors for a connection (SQL Prompt has tab coloring by system and database, as do some other products), which can help, but it isn’t perfect. One thing I’ve found with colors is that if I use them constantly, my mind starts to filter out the color. I don’t always realize the outline of the tab is a different color. This is especially true if I need to switch back and forth between production and non-production systems. I’ve tried running two instances of SSMS, which helps, but at times I’ll forget which one I’m working with and make a connection to a production server from a non-production instance of SSMS.

Ultimately, we need to be careful. I know one friend who has no direct access to production and must hop through an RDP session to connect to a production database. However, if you run your RDP session in full screen, how often would you forget that you’re in SSMS on the hop system and not in SSMS on your local machine?

I don’t know if there’s a good solution. Many of the convenience features that make life easier, like reconnecting tabs when I restart SSMS, are great; however, they can compromise security and safety. I’d certainly like more checks against ad hoc issues occurring in production systems. Maybe some sort of lock that prevents destructive execution on certain instances or databases without some confirmation. I love SQL Prompt preventing me from running code without WHERE clauses, but that isn’t always enough. At least not for me.
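One habit that helps, short of better tooling: wrap ad hoc changes in an explicit transaction and sanity-check the row count before committing. A sketch in T-SQL, with a made-up table name and threshold:

```sql
BEGIN TRANSACTION;

DELETE FROM dbo.StatusLookup
 WHERE StatusCode = 'OBSOLETE';

-- I expected to remove a handful of rows; anything more is suspicious
IF @@ROWCOUNT > 10
    ROLLBACK TRANSACTION
ELSE
    COMMIT TRANSACTION;
```

It won’t save you if you stop paying attention to the row counts, but it turns the DELETE-without-a-WHERE mistake into a rollback instead of a restore.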

Steve Jones

The Voice of the DBA Podcast

Listen to the MP3 Audio (4.4MB) podcast or subscribe to the feed at iTunes and Libsyn.


Python Jupyter Notebooks in Azure

There’s a new feature in Azure, and I stumbled on it when someone posted a link on Twitter. Apologies, I can’t remember who, but I did click on the Azure Notebooks link and was intrigued. I’ve gotten Jupyter notebooks running on my local laptop, but these are often just on one machine. Having a place to share a notebook in the cloud is cool.

Once I clicked on the link, I found these support both R and Python notebooks, as well as F#. These allow you to essentially build a page of code and share it. It’s kind of like a REPL, kind of like a story. It’s a neat way of working through a problem. I clicked the Get Started link to get going and was prompted for a User ID.

[Screenshot: choosing a User ID]

Once I had my moniker set up, the next step was to edit my profile. That’s more important than a library, right?

[Screenshot: the edit profile prompt]

Of course, I needed to fill out the profile with my avatar and some information.

[Screenshot: filling out my profile]

Next I needed to create a library, which I’m guessing is a collection of notebooks. I clicked the link and had to enter a name. I decided on the classic HelloWorld name. I kept this public, as I might want to share it with others.

[Screenshot: creating the HelloWorld library]

I’ve got a library, now let’s add something. I clicked the file, but it didn’t load. There was nothing there, as this is a blank file.

[Screenshot: the empty library]

I discovered I could right click the file in the list. This lets me edit it. Strange UX, but whatever. The file is edited as markdown, which is fairly simple, but consists of a few characters to designate titles, lists, etc.
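For instance, a short readme might look like this in markdown (the content is mine, just to illustrate the characters involved):

```markdown
# HelloWorld

Notebooks I use to practice some basics.

- Python experiments
- Notes to myself
```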

[Screenshot: editing the readme as markdown]

I entered some text, and then my readme appeared below my notebook list, much as it does on GitHub. My screenshot was taken after I’d experimented a bit, so you see a couple of Python notebooks as well.

[Screenshot: the readme below the notebook list]

From there, I could add a notebook. I have choices. I started with Python, since that’s one of my learning goals.

[Screenshot: choosing a notebook type]

I give the notebook a name and create it.

[Screenshot: naming the notebook]

Once this is created, it appears as a Jupyter notebook. Essentially I have a REPL-like command area, and once I enter code, I can click “Run” to execute it. You can see that my Hello, World program ran.

[Screenshot: running Hello, World]

I can enter other code, and I’ve done a few things, just to practice some basics in Python. I’m working through some courses, and I’ll enter code in here to practice concepts.
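The code I enter is nothing fancy, just cells along these lines (my own practice examples):

```python
# A greeting, of course
greeting = "Hello, World"
print(greeting)

# A list comprehension and some basic output
squares = [n * n for n in range(1, 6)]
print(squares)

# A simple function
def fahrenheit_to_celsius(f):
    return (f - 32) * 5 / 9

print(round(fahrenheit_to_celsius(68), 1))
```

Each cell runs independently, which makes this style of environment handy for trying one small idea at a time.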

[Screenshot: practicing Python basics]

Jupyter notebooks are a good way of working through a problem and showing flow. They’re especially useful for sharing information with others and letting them follow your thought process.


Good Security Needs Layers

How many of you have wanted to know who started or stopped a SQL instance? Probably a few of you, as disruption to the service can affect customers. Most of us are concerned with the changes made inside SQL Server to objects and data, and that’s what the auditing features inside SQL Server are watching. The problem is that the database platform is dependent on the host OS, and as such, some actions take place at that level. Auditing inside SQL Server isn’t set up to capture this information.

Should you care? Well, restarts, or the stopping of a service, are one way that a malicious actor could alter files, change the error log without you realizing it, or even copy files to other systems. All these actions might be outside of any auditing or event tracing you’ve set up. Good security needs multiple layers because the system you need to protect is often dependent on some other part of our infrastructure.

Databases depend on the host OS and perhaps directory services. Your OS may depend on a hypervisor, and certainly needs patching, so it depends on human administrators. Many of our systems depend on networking and firewall configurations. There are other layers, but the more that we can ensure each layer is secure, the better off we are. Certainly our systems always depend on humans not giving away credentials or installing malware, but that is often something many of us can’t control.

I ran across an article that explains how to use auditing at the Windows level to track this down, and ensure that there aren’t more unexplained restarts. You can implement this, but if you don’t have Windows administrative privileges, you’ll need to get help from someone that does. Likely a couple of you have been glad that there isn’t a great way to audit this from the OS, as you were the one performing a restart without permission. If that’s your MO, I expect you might not want to pass this piece along to your security staff or auditors. If that’s the way you work, though, I would advise you to change your habits.
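Even without the full auditing setup the article describes, you can get a quick look from PowerShell, since the Service Control Manager records service state changes in the System event log as event ID 7036. A sketch, assuming you have rights to read that log; note this shows when a service stopped or started, not who did it, which is what the article’s auditing adds:

```powershell
# Recent service start/stop events, filtered to SQL Server services
Get-WinEvent -FilterHashtable @{ LogName = 'System'; Id = 7036 } -MaxEvents 200 |
    Where-Object { $_.Message -like '*SQL Server*' } |
    Select-Object TimeCreated, Message |
    Format-Table -AutoSize
```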

Steve Jones

The Voice of the DBA Podcast

Listen to the MP3 Audio (3.2MB) podcast or subscribe to the feed at iTunes and Libsyn.


An Open Letter to PASS and John Q Martin

Dear Mr. Martin and the PASS Organization,

In the spirit of advocating for change in a public way, I’m writing this letter to ask you to revisit your policies and processes and embrace the transparency that we, members of the PASS organization, deserve with regards to the SQL Saturday events.

I am directing this to you, Mr. Martin, after seeing your tweet, with the hope that you will follow through and not only provide feedback, but conduct a root cause analysis that updates future decisions. I also am hoping these are not merely words, but also the beginning of some action that you take and about which you publicly disclose the results.

A change was recently made to the SQL Saturday site that requires speakers to register for an event before their submissions can be approved. I realized this when a fellow speaker posted a note on Twitter. While there are potential implications here that are disruptive for event organizers and speakers, those are worthy of a separate debate.

Today, I examined the PASS blog to look for an announcement. This is what I saw.

[Screenshot: the PASS blog, with no announcement of the change]

The last update is from Mr. Fritchey, noting how much PASS wishes to express their affection for SQL Saturday. I appreciate the Board of Directors, most of whom are fellow speakers, may feel that way, but I’m not sure the organization believes this. Again, a separate discussion.

Early on in the life of SQL Saturday, a few people had to make decisions, but those decisions were publicized, with conversations with organizers and speakers. When fundamental changes were made, we announced them publicly to ensure that others were informed. We set up a board of advisors to help ensure that change was made in a more open way, and to solicit feedback. Andy continued to provide updates later, even as PASS declined to do so.

That no longer seems to be the case. As I scan through the blog, I see no announcements of changes to SQL Saturday. The last posts were from 2016, when the 600 mile radius was announced, with a subsequent post explaining the reasoning.

Once again, almost two years later, PASS lacks transparency and vision. There are no discussions on Slack that I can find, NDA or otherwise, no emails I’ve received, no list of changes or updates.

What I would request is that you use this as an opportunity to improve governance at PASS. Examine the timeline of when this decision was made, and look for opportunities to engage, or at least inform, the membership of PASS about changes. There should be milestones that require

  • an announcement of potential changes, NDA or public
  • a roadmap – is there a public one? Even one without dates.
  • a roadmap update
  • an announcement of an upcoming change
  • an announcement of the change, publicly
  • an update to the FAQ

I appreciate that there may not be funds for future changes, though there are obviously some still in progress. However, this is separate from funding. This is a process and governance issue that should be addressed.

It’s also simple common courtesy.


Steve Jones

PASS member since 1999


Machine Learning Challenges

If you know someone well, you may think that you can predict what they’ll do in a situation. Those that know me might think that if I go to New York, I’ll make time to run in Central Park. That’s a good bet, and the last 7 or 8 times I’ve gone there to speak, that’s what I’ve done. However, that’s a very small data set, and you might not want to attach a high level of confidence to that prediction. There are certainly times that I’ve gone to New York and not run in Central Park.

Machine Learning (ML) is a field that tries to make predictions about some data, based on previous known actions or results. By inferring relationships between lots of data and known actions, the ML model can guess what might happen with future data inputs. Lots of companies and fields are experimenting with ML techniques, with some of them working quite well. Others aren’t faring as well, and there is a lot of research underway to better understand the entire field. One possible explanation is emergence, which is discussed in this piece.

If it’s raining, I likely won’t run. If it’s 20F, but dry, I will. Those that know me would inherently consider the type of weather as a new factor that has a high level of predictability for me. Time in the city, location, whether my family is with me, these are all other factors that would come into play. With enough data, and enough factors, you could likely come up with a model that predicted the likelihood of me running in Central Park when I visit New York City. It might not be perfect, but it would be good if you had enough data.
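As a toy sketch of that idea (the trip data here is entirely made up), a model can be as simple as counting past outcomes for similar conditions:

```python
# Toy model: estimate P(run) from past trips, grouped by weather conditions
trips = [
    # (raining, below_20F, ran)
    (False, False, True),
    (False, False, True),
    (True,  False, False),
    (False, True,  True),
    (False, False, True),
    (True,  False, False),
    (False, True,  True),
]

def prob_of_run(raining, below_20f):
    """Fraction of past trips with matching conditions where a run happened."""
    similar = [ran for (r, c, ran) in trips if r == raining and c == below_20f]
    if not similar:
        return None  # no history for these conditions
    return sum(similar) / len(similar)

print(prob_of_run(raining=False, below_20f=False))  # dry and mild: always ran
print(prob_of_run(raining=True, below_20f=False))   # raining: never ran
```

Real ML models do something far more sophisticated with many more factors, but the core idea is the same: infer a likelihood from previously observed outcomes, and accept that unseen conditions leave you guessing.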

However, that’s a very simple item, with few decision points. Would that work for a larger data set? I bet it would. If we were able to collect data from lots of 5k events, we could perhaps predict how many people would register and show up. There are certainly more factors at play, and since we can’t control things like the weather, the prediction might not be that valuable. Though if we could change the amount of resources the day of the event, maybe that would help reduce costs or make the event more successful.

I think ML is a valuable tool that can help with quite a few problems, but in many cases, we just don’t know enough about the complex factors, such as emergence, that might impact us. I know our SQL Saturday organizers would love a model that helped them decide how much food or drinks to order for an event. My view is that there are likely too many unusual factors, but I wonder if we could start a data gathering project to come up with a model. Perhaps we should gather marketing metrics, registration metrics, and more, with the intention to predict the number of no-shows. Even if it wasn’t useful, it might be a fun project for people in this industry.

Steve Jones

The Voice of the DBA Podcast

Listen to the MP3 Audio (3.6MB) podcast or subscribe to the feed at iTunes and Libsyn.


Webinar: 6 Principles of the GDPR and SQL Provision

On April 24, I’ll be hosting a webinar that talks about the GDPR and how you can help ensure compliance in your development environments. I’ll talk about some of the issues and show how SQL Provision can help.


The GDPR is complex, but it certainly does ask us to protect data and prevent issues arising from development environments. There are many ways you can try to ensure you don’t have any issues, and I’ll show one with Redgate tools.

Register today and I look forward to talking with you next week.


In Depth Learning at the PASS Summit

This past week the pre-conference sessions for the 2018 PASS Summit were released. There’s a good list of topics and speakers, returning and new presenters, that you can choose from. If you’re planning on attending the Summit, I would review the options and if any of these are relevant to your position, I’d bundle the cost of a session into your request for Summit funding. I’ve seen plenty of people ask for the registration fee without a pre-con and then try to negotiate additional funds later. Do yourself a favor and ask for everything at once.

Are these sessions worth the time and money? Most are, and a single day packed with information is often easier to digest than a full week. However, you’ll still have days of random learning slots during the regular conference, so plan on spending a day learning and then shifting into a different mode for the rest of the conference. There is a lot of information presented in an all day session, and you’re likely to miss some of it. Most presenters provide a good set of materials to go with the class, but you should plan on taking lots of notes and then spending a few hours here and there across the next month reviewing and practicing what you’ve learned.

I’ve often found that a pre-conference session is the best way for me to dive deep into a topic that I haven’t had the time to investigate on my own. Time is a precious resource, and finding it in my busy work and family life is hard, especially for a full week class. I’d still like to get to a week-long SQLskills class, but I can’t seem to find the time. Maybe after next year, when I have no more children at home and fewer commitments.

I used to take a computer to classes, but the last few years I’ve moved back to paper and pen. I find that sketching out notes by hand is quicker and easier, I ignore typos, and somehow the act of writing cements things in my mind more. I’d recommend you try both, though certainly having your phone around to take a picture of something is incredibly handy at times. I’ve tried OneNote and a digital pen on a laptop, but again, it’s not as quick or smooth as pen on paper. The one thing I did like is that I could add pictures to my notes.

A conference is a large investment, whether you make it or your employer does. Adding in a day of training on a topic is often a relatively small incremental increase. It’s worth the time and cost, if you’re already going to be gone for multiple days. I’m not sure if I’ll get to a pre-con at the Summit this year, but I’m hoping to get a day of watching and learning on a single topic in at one of the events I attend this year.

Steve Jones

The Voice of the DBA Podcast

Listen to the MP3 Audio (4.0MB) podcast or subscribe to the feed at iTunes and Libsyn.
