Easy Database Space with dbatools

One of the things I've seen DBAs query for most often is free space in a database. In fact, it's one of the most common questions posted on SQLServerCentral by newbies who are trying to manage their instances and make sure they don't use too much space or run out of it.

The most common way to do this is with sp_spaceused. This system stored procedure has been around for a long time and is widely used. However, I find it cumbersome: it returns two result sets, the data comes formatted in a mix of MB and KB, it's hard to run against multiple databases, and incorporating it into any automated tracking gets annoying. Not hard, just annoying.

Get-DbaDatabaseFreespace

Enter dbatools. One of the cmdlets included is Get-DbaDatabaseFreespace, and it does exactly what you'd expect. Pass it the name of an instance, and when you run it, you get back a set of information about your databases and their space usage.
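If you want to follow along, the basic call looks something like this (sql2016 is a placeholder instance name, and depending on your dbatools version the parameter may be -SqlServer or -SqlInstance):

# Query space usage for every database file on an instance
Import-Module dbatools
Get-DbaDatabaseFreespace -SqlServer sql2016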

[Screenshot: Get-DbaDatabaseFreespace output for all databases on an instance]

These are good numbers, but with PoSh I can easily get just the data I want. For example, I can add a -Databases parameter and see specific databases.
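Roughly like this (the database names are just examples):

# Restrict the results to specific databases
Get-DbaDatabaseFreespace -SqlServer sql2016 -Databases AdventureWorks2014, msdb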

[Screenshot: Get-DbaDatabaseFreespace output limited to specific databases]

I can also reformat the data as a table and limit what's returned.
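For example, something along these lines (the property names are from memory and may differ slightly between dbatools versions):

# Pick a handful of properties and render them as a table
Get-DbaDatabaseFreespace -SqlServer sql2016 -Databases AdventureWorks2014 |
    Select-Object Database, FileName, FileSizeMB, FreeSpaceMB, PercentUsed |
    Format-Table -AutoSize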

[Screenshot: selected properties formatted as a table]

This flexibility to easily shape the data is one of the things I like about PoSh. I could pipe this into a filter and look only for files over a certain size, or above a certain percentage used.
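A quick filter might look like this (again, PercentUsed is an assumed property name):

# Flag any file that is more than 90 percent used
Get-DbaDatabaseFreespace -SqlServer sql2016 |
    Where-Object { $_.PercentUsed -gt 90 }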

Putting this into a report, or even an alert sent to DBAs, is a great way to keep an eye on space usage and ensure you don't run out. Of course, tracking this over time is easier with a tool like SQL Monitor, but for quick checks, this dbatool is very handy.

If you haven't played with these cmdlets, I'd recommend you do. This collection makes working with SQL Server and PoSh much easier.

About way0utwest

Editor, SQLServerCentral

7 Responses to Easy Database Space with dbatools

  1. londondba says:

    Thanks for the article. Is there an option to see the percent used as a percentage of the max file size? That is, a calculation that takes into account the fact that the file may not have autogrown to its max size yet. If the percent used doesn't account for the max file size, you may panic when you see a data file approaching, say, 95% used based on the current file size.

    • sqlvariant says:

      @LondonDBA There is a “MaxSize” property on the SqlServer…DataFile object that you could read to calculate the percentage that way if you wish. I like to know what the current size is so I can proactively grow my data files ahead of time, usually in off hours; but that’s just how I like to do it.
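      A rough SMO sketch of that calculation (a sketch only: it assumes the SMO assemblies are already loaded, e.g. by the sqlps module, that SMO reports file sizes in KB, and that a MaxSize of -1 means unlimited growth):

      # Percent used relative to MaxSize rather than the current file size
      $server = New-Object Microsoft.SqlServer.Management.Smo.Server 'sql2016'
      foreach ($db in $server.Databases) {
          # Member enumeration flattens Files across all filegroups (PS 3.0+)
          foreach ($file in $db.FileGroups.Files) {
              if ($file.MaxSize -gt 0) {   # skip files that can grow without limit
                  [PSCustomObject]@{
                      Database         = $db.Name
                      FileName         = $file.Name
                      PercentOfMaxSize = [math]::Round(100 * $file.UsedSpace / $file.MaxSize, 1)
                  }
              }
          }
      }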

  2. Jeff Moden says:

    Good article on the simplicity of using PowerShell for things like this. Thanks for writing it.

    But it's a bit deceiving when it comes to such incredible simplicity. For example, one of the things that's typically missing from such articles is how to store this data in a table so that the DBA can develop a history, forecast growth, and pre-order the extra disk space before it becomes a critical problem. It would also be nice to schedule this as a SQL Server Agent job to run at the same time each day to automate the collection effort. How would you do that? Is there a way to use a PowerShell task in SQL Agent that you could pass parameters to?

    I'm also missing something here. While I certainly enjoy the simplicity that the trip to PowerShell affords, and wish that Microsoft would actually incorporate it into the SQL Server product, this information isn't that difficult to build into a stored procedure. I'm reminded by tons of people that "Just because you can do something in SQL Server, doesn't mean you should." Why doesn't the same apply to PowerShell when it comes to things that are related to SQL Server? Why not use what's available in SQL Server?

    And, to be sure, none of this is an attack on your good post. I’d really like to know these things because it would make them more useful not only to me but to the community that we both serve.

    • sqlvariant says:

      Jeff,
      I agree with you that storing the data in tables is important; that's why I wrote this article for SQL Server Central 5+ years ago on how to store this kind of data in tables on your SQL Server: http://www.sqlservercentral.com/articles/PowerShell/73289/

      The great news is that things keep getting easier with PowerShell, and it's now even easier to store data back into a SQL Server table thanks to improvements in SQL PowerShell available via SSMS 2016 (download the latest version here: http://sqlps.io/dl). I have more articles in the pipeline covering this topic, so until they're out I only have this blog post to offer, but it should give you a good idea of what is now possible: http://sqlvariant.com/2016/10/quick-blog-searching-sql-servers-saving-results-with-write-sqltabledata/

      I totally agree with you, Jeff: "Just because you can do something in SQL Server, doesn't mean you should." And likewise, Data Professionals are applying the same rigor to what they do in PowerShell. Why wouldn't they?

      You could set up a ton of linked servers and gather this information that way via T-SQL (would you need to cursor over the linked servers?); or you could use a simple PowerShell function like this one, which can iterate over your Registered Servers/CMS list and collect all the same data for you, using the exact same T-SQL calls (spoiler alert: at the end of the day, PowerShell calls T-SQL), without having to set up the linked servers. Heck, you don't even need to create a PowerShell function in this case; just use the Invoke-Sqlcmd cmdlet to call your stored procedure and write the data back, like so:
      # Loop over every instance registered under the Database Engine Server Group
      # (skipping the group folders themselves, whose Mode is 'd'), run the proc
      # on each one, and write the combined results back to a central table.
      foreach ($RegisteredSQLs in dir -Recurse 'SQLSERVER:\SQLRegistration\Database Engine Server Group\' | where {$_.Mode -ne 'd'})
      {
          Invoke-Sqlcmd -ServerInstance $RegisteredSQLs.Name -Query 'exec my_AwesomeProc' -OutputAs DataTables |
              Write-SqlTableData -ServerInstance MyCentralInstance -DatabaseName DBAToolbox -SchemaName inventory -TableName DataFileSpace
      }

      Thanks to SSMS and its backbone, the SMO, this is available to you, so why not use it?

      As for SQL Agent, I and many other community members have been quite clear with the SQL Tools team about how things are today and how we would like them to be in the future. They have taken our feedback and are looking into making things easier there as well.

    • way0utwest says:

      Jeff,

      I'm not trying to produce an entire solution. I appreciate that sometimes people want to see more, but there are also plenty of people building some solution who just want a short bit of information to solve a particular problem. That's what this is.
      I wouldn’t necessarily use this to track space over time. I have moved to the idea of using a monitoring solution and adding to it where I can. It’s better to buy, rather than build, such a system. I don’t say that because I work for a vendor but because it’s not a good use of my time.

      Putting the data into a table or storing it isn’t hard, as Aaron shows in his comment.

    • Jeff Moden says:

      Aaron and Steve,

      Thanks for the great feedback and links, Aaron. That really helps.

      You've just made my point, though. While I very much appreciate the nature of Steve's great "spackle" article, he obviously has prior knowledge of you and your work. It wouldn't have been difficult to simply include a small mention like "To easily and quickly capture and store such data in a table, and to express the code over multiple servers/instances, please see the following links to Aaron Nelson's great articles on the subject" and then include the links.

      Shifting gears to the subject of PowerShell, I am definitely NOT an anti-PowerShell zealot. I absolutely agree that expressing such code over possibly hundreds of linked servers is just plain silly. I'm not a PowerShell ninja, either, but I've been able to glean enough information from the internet to create a decent enterprise-wide "Disk Free Space" morning report, driven by a PowerShell loop that calls WMI cmdlets for each server in a list of servers. The problem was that, at the time, there was no easy way to import the results from all of that. Sure, there were PoSh functions (like Chad Miller's, which you cited in your 5-year-old article) that you could download to do it, but that added a level of complexity I wasn't prepared to deal with. I didn't (and still don't; I usually put in 60 hours a week) have the time to study how to use PoSh to the level required (my problem, for sure), and no one explained how to do it from SQL Server as a scheduled job (which I really needed to do). I did eventually eke out a simplified, standalone method. My final product consists of two fairly simple stored procedures (one to collect the data, one to create and email a morning report) that make a call to PoSh.
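      For what it's worth, the heart of that loop was something along these lines (simplified, with made-up server names):

      # Collect free space on all local fixed disks for each server in the list
      $servers = 'SQL01', 'SQL02', 'SQL03'
      foreach ($server in $servers) {
          Get-WmiObject -Class Win32_LogicalDisk -ComputerName $server -Filter 'DriveType = 3' |
              Select-Object @{n='Server';e={$server}}, DeviceID,
                            @{n='SizeGB';e={[math]::Round($_.Size / 1GB, 1)}},
                            @{n='FreeGB';e={[math]::Round($_.FreeSpace / 1GB, 1)}}
      }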

      I strongly suspect that a whole lot of DBAs out there are in the same position, which prompted my response to you, Steve. It's a great "spackle" article on a wonderful subject, but a link or two on how to easily get that info into a table would have really made it aces, even for "just" a "spackle" article.

      Shifting gears again and on the unrelated subject of "sucks to be me" 😉 , I'm also in the unenviable position of NOT being able to take advantage of great tools such as dbatools, for multiple reasons but mostly because our "corporate policies" don't allow it. The basis for that policy, and I actually agree with it for a large part, is that support for such tools could vanish in an instant. Even supposedly "MS supported" items do that. Look what happened with the wonderful "Jet" drivers. When MS finally released 64-bit SQL Server, there was nothing to replace "Jet," and its widespread use throughout the company actually prevented us from upgrading to 64 bit. There was a race on to replace all that code with other methods because MS hadn't yet released the "ACE" drivers. Even after they did, ACE wasn't actually supported (MS even advertised that it was NOT to be used as a "Jet" replacement) until recently (we'll see how well they actually do at that!).

      Last but not least, and contrary to what Aaron has suggested (and just a bit of a rant), I've found that "data professionals" are actually NOT applying the "same rigor" mentioned to PowerShell any more than they have with SQL Server or managed code. In fact, I've found the exact opposite to be true. A good example can be found in the subject of backups. Many very well known and highly "trusted" supposed "data professionals" with huge readerships seemingly raced to show how clever they were in creating PoSh scripts that would solve everyone's need for backups in a "wonderful" centralized manner, with NO consideration, never mind an explanation, of what happens to those remote machines and their transaction log files if such centralized backups fail for any reason. Make no mistake about it… I DO understand that the subject of Steve's great article isn't of that critical nature, though.

      Finally, thanks for the tip on the latest release of SSMS and the improvements to SQLPS. I'll also tip my hat to you, Aaron, for making it so that one can select and copy your good examples for experimentation, instead of just posting graphical screenshots of code as so many do. Ummm… and yeah… sorry, Steve… that turns out to be a bit of a shot at you. Fortunately, your examples are short enough that it's not much of a problem for this blog post, but it's something to consider for future posts.

      To be sure, though, thank you both for what you do and thank you both for the great articles. I just wanted to explain some of my frustrations to an ol' friend and whoever else was listening.

  3. sqlvariant says:

    • I need to publicize the Invoke-SQLCmd / Write-SqlTableData combo harder. Working on that.
    • I’m working to explain SQL PowerShell the easy way for people and have 27 article ideas in my queue that I’m working my way through.
    • This is not only good spackle; I suspect Steve was going for a 'building blocks' approach in showing folks this particular command.
    • For scheduling PowerShell commands in a SQL Agent Job, just use CmdExec and call PowerShell.exe (see the quick sketch after this list). That's all I have to say today.
    • dbatools relies on SMO, which I guess could go away, but it’s all text, nothing fancy. If nothing else, I encourage you to use it as a roadmap to building your own commands.
    • I am so thankful that I am not encountering many/any data professionals who are careless with where they try to use PowerShell. If you have examples of that, please email them to me (fname}at{MyBlog.com) and maybe I can help better frame the reason that person thought that was important to share.
    ○ I’m here to help build this community, not everyone has been blessed with the editors I have the luxury of calling on. (They fight me on my ideas. a lot. It’s great!)
    ○ When it comes to SQL Server, PowerShell is mostly for your elbow (I will have to write that up as an article someday soon).
    • Funny you mention those SMO examples.
    ○ If they hadn't been so complicated, and if I hadn't wanted to show the easy, obvious way they should have been done, I would probably never have started talking about PowerShell.
    ○ It's super easy to explain what's going on there; it's the same as if you had used SSMS from the machine that is executing the code.
    ○ Currently, I think backing up databases with PowerShell is great for ad hoc scenarios: http://www.sqlservercentral.com/articles/PowerShell/151510/
    • This isn't my blog, but I will still take the heat for Steve's screenshots. As soon as you mentioned that, I realized that, as the leader of the PowerShell VG of PASS, *I* have not done enough to spread the word about Get-History & Start-Transcript. Guess what we will talk about at the start of the next two PowerShell VG meetings?
    • Heck I should also do a post about tips for writing about PowerShell.
    • As a second-generation DBA (my dad, an Oracle DBA, taught me), I think Steve is spot on with this example and is doing a great service to Data Professionals by showing these in the 'nugget' form that he is. Jeffrey Snover says something to the effect of 'PowerShell (like many technologies) doesn't make sense to people until their hair is on fire and PowerShell ends up being the bucket of water for their problem.' So I think Steve is going to reach some people and show them a bucket of water that may sit around unused until one day their SAN is almost full and they need to start moving things around to make everything fit, or some similar scenario.
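    To illustrate the CmdExec point above: the job step's command line just invokes PowerShell.exe, passing whatever parameters your script declares. A quick sketch (the script path, parameter names, and instance names are all made up):

    powershell.exe -NoProfile -ExecutionPolicy Bypass -File "C:\Scripts\Get-SpaceReport.ps1" -SqlInstance sql2016 -CentralInstance MyCentralInstance

    # Inside Get-SpaceReport.ps1, a param() block receives those values
    param(
        [string]$SqlInstance,
        [string]$CentralInstance
    )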

    I think I understand your frustrations and I’m working hard to get more info and great examples out to everyone in the coming weeks!
