Cloud Snake Oil

I’m sure that those of you reading this have a variety of opinions about the “cloud”. Actually, I’d guess that many of us have different definitions of what the cloud actually means. That’s fine, since it’s really an amorphous marketing term that encompasses quite a few different technologies, services, and products from many different companies. Some of you might use the cloud, and if you do, then perhaps this will ring true for you.

I was reading Dr. Greg Low’s blog, where he asked in a post what a managed service really is. In this case, Dr. Low was looking to host his blog with some service, and apparently the definition of a managed service varied from provider to provider. His first provider didn’t tell him that backups were being run, with each backup file counted against the space he’d contracted for. When he asked for them to be deleted, he was told it would take a day or two, as there wasn’t anyone available to provide the service.

He continues looking at how other providers define service, which does vary, but the interesting thing to me is that many of these companies aren’t really providing management of systems. They’re selling you a product, which has some capabilities, but they aren’t really managing anything. At least, that is my impression. I know if someone asked me to manage a system, I’d expect to deliver some level of service that would be useful for the client.

The cloud is really a Wild West version of computing, where companies want to sell you some service, often touting various management aspects, but they may not provide the level of service you expect. Cloud vendors, even more than other computing vendors I’ve dealt with, want to work at scale, and they want to standardize how things work as much as possible. They don’t want to engage in person-to-person communication if they can avoid it. I learned this lesson with Google and their products, few of which had any way for a user to contact a help desk.

Apart from that, what I’ve seen too often in the cloud is that a company wants to offer some service or capability, but they don’t often have the tooling available for end users. This is especially true for new services, where it seems the purchase process works flawlessly, but the configuration or cancellation process doesn’t work, or might not even exist.

The one piece of advice I’d pass on from my cloud experiences is that anyone using these services needs to reconcile their bills regularly. We can add resources easily, but removing them is hard, and often a customer service person who promises a removal doesn’t follow through. I’ve had people in the support centers not even be sure which resource I was referring to when I requested a removal or credit. It’s a frustrating experience that has led me to adopt another habit: I grow resources very slowly, ensuring that I know what the billing is and that I really need the service. That seems like the opposite of what the “cloud” is supposed to offer.

Steve Jones

The Voice of the DBA Podcast

Listen to the MP3 Audio (3.7MB) podcast or subscribe to the feed at iTunes and Libsyn.


Watching the Sands of Suggestion in #SQLPrompt

I enjoy themes, and when I ran across the SQL Prompt Treasure Island, I had to take a few minutes and go through it. I wrote about code snippets recently, and this post continues to move across the map.

Incredible Suggestions

The first thing most people notice after they install SQL Prompt is the suggestions that pop up as you type SQL code. This was one of the first things that captivated me after I started using the product. Seeing lists of tables, columns, and valid syntax pop up as I type is so useful that I struggle when I don’t get the suggestion box. By default, this appears quickly, and one of my customizations is to slow it slightly so that quick typos don’t pop up the box if I correct them immediately.

However, as the Treasure Map shows, it’s not just suggestions, but also the fact that I can hover over a suggestion and get more data, like the code that defines a table, view, or procedure. This is especially handy when using views, as I’ll look to see if I’m starting to nest views in queries.

I do find that CTRL+Shift+D is one of the shortcuts that I often need. I may create a table or run some code in another tab, and I don’t get refreshed suggestions automatically. There is an experimental feature to auto refresh suggestions, but this means more polling of the database, and I try to avoid adding more load to processes that are running. The shortcut works well for me.

The Dependencies Tip

SQL Prompt has lots of features, and plenty that I don’t use. A few I don’t use because I don’t know about them. The dependencies tip is one of these. I didn’t know about it until I read the Treasure Map post, but now I think it’s a really cool feature. If I’m looking to alter my schema, one of the important things to know is what dependencies I have. Certainly I could use SQL Search, but being able to quickly decide which objects I need to consider, or where I need to make other changes, is great.

The Treasure Map describes this, but I had to experiment a bit to understand how this works. I’ve added a short animation to show this.

(animation: the dependencies tip in action)

Now that I know this, I’ll get the list, copy it, and use it as a TODO list of things to alter in this same commit. I’ll also know where to test changes before I actually commit this code to a shared repo where others will see it.

There are plenty of other small features like these, and you ought to experiment with them if you write a lot of T-SQL. You’ll find them helpful and handy.


New IT Departments

I had a friend who used to run an Exchange system. Actually, he was part of a team of four that managed a 50,000+ mailbox system for a very large company. In 2000 or 2001, he told me that his job would be done by computers in a few years and that he had decided to leave the industry. Over the last 15 years, he’s worked in another field.

I have no idea if those Exchange systems are still around, and I would concede that mail is better purchased as a service for most organizations than managed in-house. However, I think my friend made a mistake. There are still plenty of people working in technology infrastructure in companies, making a good salary in good working conditions. I’ve spent my career in IT in one way or another, as a developer, Operations staff, or a manager. I see no sign of this going away quickly, though certainly many menial, simple tasks, like checking logs and backups, are increasingly being handled by automated systems.

When I see articles like this one (Why IT as you know it is dead), I’m not sure what to think. On one hand, I do think IT is changing, especially in larger organizations, where there is pressure to reduce costs (often labor) as well as increase the speed of output. DevOps is one way that we try to improve our systems, though the cultural change is very hard. Often developers end up producing work in smaller chunks and releasing more often, but not getting more work done, because most of us don’t want to change our habits.

On the other hand, I also think that in many ways IT is the same. We can’t respond as quickly as business analysts or customers come up with ideas. I know most of those ideas probably aren’t great, and IT doesn’t want to waste resources on something that won’t prove to be valuable. Just as happened 30 years ago, departments will create their own POC applications. This used to happen in Lotus 1-2-3, then Access and Excel; now it may happen with low-code development tools, whatever those are.

I don’t really worry about this, as I’ll find ways to make things better. If someone wants an Access or Power BI application, let them build it. If it’s really useful, and others need access, we can upgrade and invest in a better system. I’ll go along and get along, working to build the things that the organization finds useful. I just realize that my time is limited, and if someone else can prove a concept is valid, perhaps that means I should spend time ensuring it works or gets rebuilt in a better way. I also know many of those ideas and concepts won’t prove themselves, so it’s fine if there’s some sort of citizen development (or shadow IT) in an organization.

To me, the key is that we enforce security for our data. If anyone wants to build software, that’s fine. They just need to ensure that they use the same security and authentication mechanisms that other systems use. We need to protect the data, no matter what application is going to be used to view, analyze, or manipulate it.

Steve Jones

The Voice of the DBA Podcast

Listen to the MP3 Audio (4.0MB) podcast or subscribe to the feed at iTunes and Libsyn.


Checking Tempdb with dbatools

I really like the dbatools project. This is a collection of PowerShell cmdlets built by the community that is incredibly useful for migrations between SQL Server instances, as well as for various administrative actions. I have a short series on these items.

In SQL Server 2016, the setup program was altered to better configure tempdb at installation time. This was in response to the observation that few people actually change the default configuration, which was suboptimal in SQL Server 2014 and earlier.

Going through and checking all of the configurations you have isn’t easy, and isn’t necessarily the type of work that anyone wants to do. dbatools makes this really easy and quick with Test-DbaTempDbConfiguration.

Using this cmdlet is easy. I’ll call it with an instance name and get back the results of a number of checks that are useful for your tempdb configuration:

(screenshot: Test-DbaTempDbConfiguration output in PowerShell)
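For reference, the command behind that screenshot is roughly the following (a minimal sketch; the instance name is a placeholder, and cmdlet names can vary slightly between dbatools versions):

```powershell
# Run the tempdb best-practice checks against a single instance
Test-DbaTempDbConfiguration -SqlInstance 'MyServer\SQL2016'
```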

This isn’t necessarily easy to read, so let’s add a Format-Table.

(screenshot: the same output piped to Format-Table)
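That’s just one extra step in the pipeline (again a sketch, with the same placeholder instance name):

```powershell
# Same checks, but shown in a columnar layout instead of the default list output
Test-DbaTempDbConfiguration -SqlInstance 'MyServer\SQL2016' | Format-Table
```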

That’s not great, as I’m missing the CurrentSetting field. I’ll add a Select-Object and include the fields I want. I can even check multiple instances here:

(screenshot: selected fields from multiple instances)
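Something along these lines does it (a sketch; the property names are from memory, so verify them with Get-Member against your dbatools version):

```powershell
# Check two instances at once and keep only the columns of interest
Test-DbaTempDbConfiguration -SqlInstance 'MyServer\SQL2016', 'MyServer\SQL2017' |
    Select-Object SqlInstance, Rule, Recommended, CurrentSetting, IsBestPractice |
    Format-Table -AutoSize
```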

Now I can scan through here, looking to see if my settings have deviated from the recommendations and best practices. I could easily filter the results for items that don’t match, save them as a CSV, and have a picklist of items to work on as I find time.
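One way to build that picklist might look like this (a sketch, assuming the IsBestPractice property and a path of your choosing):

```powershell
# Keep only the rules that deviate from best practice and save them as a worklist
Test-DbaTempDbConfiguration -SqlInstance 'MyServer\SQL2016', 'MyServer\SQL2017' |
    Where-Object { -not $_.IsBestPractice } |
    Export-Csv -Path 'C:\temp\tempdb_worklist.csv' -NoTypeInformation
```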

dbatools is an essential tool for me. I’d urge you to download the module and experiment with the cmdlets.


Keep It All

I love this quote, though I’m not sure it’s accurate. From The Future of Data Storage, the piece states: “What’s the most expensive thing you can do with a piece of data? Throw it out.”

That’s from a storage vendor, and obviously they’d prefer that you keep all your data, which means more storage and backup space is needed. Certainly I do think that losing valuable data can be expensive, but I also think that we often keep around older data that we don’t use, or won’t use, and that is expensive too. Not for individual pieces, but in aggregate. This is especially true if you move to a service model where you pay for what you use, as opposed to investing in a large chunk of storage that has a fixed cost.

I didn’t think much of the piece, though it did get me thinking about backups. I’ve run backups for my entire career, and in 99-point-some-number-of-nines percent of cases, I never used the backup file again. They were insurance against potential problems. Even in places where I restored a backup to verify the process worked, I often just discarded the backup file at some point.

Early in my career, we had tape rotation systems to reuse the media a certain number of times, while also ensuring that we had off-site copies and specific points in time saved. Today there are plenty of backup systems that perform deduplication and complex disassembly and re-assembly of files from blocks to use space more efficiently. That doesn’t always work well for database restores, especially when time is of the essence.

As vendors look to add more intelligent, or at least more efficient, processing to backup systems, I wonder if they really think about databases and how we use these files. I hope so, and I’d like something that is optimized for database restores. I don’t mind combining the duplicate parts of files into some index, but I need to have the latest files available for quick restores. What about backing up a database to a file and keeping that file online and ready? Then, after the next backup, move the previous one to an area that dedups it, maybe takes it offline, etc. That way I have the best of both worlds. I rarely go back further than the latest full backup for a restore, so keep that one ready.

Of course, we need to consider log backups, which really need to be kept online and intact if they have been made since the last full backup. Keeping track of that is a pain, but it’s something software could easily handle. Once we’ve made a new full backup, you can mark older log backups for deduplication. Though, if you’re building this into a system, perhaps performing a restore of the full backup files automatically should be included as well.

Steve Jones

The Voice of the DBA Podcast

Listen to the MP3 Audio (4.0MB) podcast or subscribe to the feed at iTunes and Libsyn.


The Prompt Actions Menu

A quick post on SQL Prompt here. Someone was asking about the Prompt Actions menu, which is the menu that appears when you highlight code. I’ve got a quick animation of this working.

(animation: the Prompt Actions menu)

In the gif, I highlight code, click the Actions icon, and then I can add a BEGIN..END around my code. I also have a snippet that I use to surround code with comments.

Any snippet that has the $SELECTEDTEXT$ token in it is eligible for the Actions list.
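As an illustration (this isn’t my exact snippet), a surround-with-comments snippet body can be as simple as this, with $SELECTEDTEXT$ standing in for whatever code you’ve highlighted:

```sql
/* ---- begin reviewed block ---- */
$SELECTEDTEXT$
/* ---- end reviewed block ---- */
```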

Carly also has her own video on this on the Redgate Videos channel.

SQL Prompt is amazing, so give it a try today.


The Right Connection

Travis-CI had some staffers connect to the wrong database and truncate production tables. Needless to say this caused an outage and disrupted their business. Hopefully they didn’t lose too many customers, but they certainly did not help their reputation. I’m sure there are more than a few customers trying to decide if they continue to trust the company or move their Continuous Integration (CI) processes to another platform.

I’ve done this before. Not shut down a company, but I have actually truncated a production table by mistake. Well, not TRUNCATE; I mean, who runs that? But I have run a DELETE without a WHERE clause and killed a lookup table in a production database. Fortunately I had a copy of the table elsewhere and could rebuild it in minutes. Only a few customers had their work interrupted, and only for a portion of our system. The point is that I’ve been a very good DBA, with a lot of success and experience, and I still make mistakes.

Often this type of mistake comes about because we get busy and keep connections open to different systems. We might be developing code against a schema that is close to production, and it’s easy to forget which database we’re working in. Someone calls with a problem or we fight a fire, and we run some code. We fix the issue, the stress bleeds away, and we go back to work, but we forget to switch connections or tabs. Then we run some code that would be fine in development, but causes issues in production.

SSMS has colors for a connection, and SQL Prompt has tab coloring by system and database (as do some other products), which can help, but it isn’t perfect. One thing I’ve found with colors is that if I use them constantly, my mind starts to filter out the color. I don’t always realize the outline of the tab is a different color. This is especially true if I need to switch back and forth between production and non-production systems. I’ve tried running two instances of SSMS, which helps, but at times I’ll forget which one I’m working with and make a connection to a production server from the non-production instance of SSMS.

Ultimately, we need to be careful. I know one friend who has no direct access to production and must hop through an RDP session to connect to a production database. However, if you run your RDP session in full screen, how often would you forget that you’re in SSMS on the hop system and not in SSMS on your local machine?

I don’t know if there’s a good solution. Many of the convenience features that make life easier, like reconnecting tabs when I restart SSMS, are great; however, they can compromise security and safety. I’d certainly like more checks against ad hoc issues occurring in production systems, maybe some sort of lock that prevents destructive execution on certain instances or databases without a confirmation. I love SQL Prompt preventing me from running code without WHERE clauses, but that isn’t always enough. At least not for me.

Steve Jones

The Voice of the DBA Podcast

Listen to the MP3 Audio (4.4MB) podcast or subscribe to the feed at iTunes and Libsyn.


Python Jupyter Notebooks in Azure

There’s a new feature in Azure, and I stumbled on it when someone posted a link on Twitter. Apologies, I can’t remember who, but I did click on the Azure Notebooks link and was intrigued. I’ve gotten Jupyter notebooks running on my local laptop, but these are often just on one machine. Having a place to share a notebook in the cloud is cool.

Once I clicked on the link, I found these support R and Python notebooks, as well as F#. They allow you to essentially build a page of code and share it. It’s kind of like a REPL, kind of like a story. It’s a neat way of working through a problem. I clicked the Get Started link to get going and was prompted for a User ID.

(screenshot: the User ID prompt)

Once I had my moniker set up, the next step was to edit my profile. That’s more important than a library, right?

(screenshot: the edit profile prompt)

Of course, I needed to fill out the profile with my avatar and some information.

(screenshot: the Azure Notebooks profile page)

Next I need to create a library, which I’m guessing is a collection of notebooks. I clicked the link and had to enter a name. I decided on the classic HelloWorld name. I decided to keep this public, as I might want to share this with others.

(screenshot: creating the HelloWorld library)

I’ve got a library, now let’s add something. I clicked the Readme.md, but it didn’t load. There was nothing there, as this is a blank file.

(screenshot: the empty README.md in the HelloWorld library)

I discovered I could right-click the file in the list, which lets me edit it. Strange UX, but whatever. The file is edited as markdown, which is fairly simple: a few characters designate titles, lists, etc.
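For example, something as small as this (not my exact text) is enough to see the rendering work:

```markdown
# HelloWorld

Notebooks I'm using while learning Python.

- experiments and practice code
```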

(screenshot: editing README.md in the HelloWorld library)

I entered some text, and then my readme appeared below my notebook list, much like it does on GitHub. My screenshot was taken after I’d experimented a bit, so you can see a couple of Python notebooks as well.

(screenshot: the HelloWorld library with the rendered readme below the notebook list)

From there, I could add a notebook. I have choices. I started with Python, since that’s one of my learning goals.

(screenshot: choosing a notebook type)

I give the notebook a name and create it.

(screenshot: naming the new notebook)

Once this is created, it appears as a Jupyter notebook. Essentially I have a REPL-like command area, and once I enter code, I can click “Run” to execute it. You can see that my Hello, World program ran.

(screenshot: the Python Experiments notebook running Hello, World)
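For anyone following along, the cell contained nothing more than the classic one-liner:

```python
# The entire notebook cell: the classic first program
print("Hello, World")
```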

I can enter other code, and I’ve done a few things, just to practice some basics in Python. I’m working through some courses, and I’ll enter code in here to practice concepts.

(screenshot: practicing Python basics in the notebook)

Jupyter notebooks are a good way of working through a problem and showing flow. They’re especially useful for sharing information with others and letting them follow your thought process.


Good Security Needs Layers

How many of you have wanted to know who started or stopped a SQL instance? Probably a few of you, as disruption to the service can affect customers. Most of us are concerned with the changes made inside SQL Server to objects and data, and that’s what the auditing features inside SQL Server watch. The problem is that the database platform is dependent on the host OS, and as such, some actions take place at that level. Auditing inside SQL Server isn’t set up to capture this information.

Should you care? Well, a restart, or the stopping of a service, is one way that a malicious actor could alter files, change the error log without you realizing it, or even copy files to other systems. All these actions might be outside of any auditing or event tracing you’ve set up. Good security needs multiple layers because the system you need to protect is often dependent on some other part of our infrastructure.

Databases depend on the host OS and perhaps directory services. Your OS may depend on a hypervisor, and certainly needs patching, so it depends on human administrators. Many of our systems depend on networking and firewall configurations. There are other layers, but the more that we can ensure each layer is secure, the better off we are. Certainly our systems always depend on humans not giving away credentials or installing malware, but that is often something many of us can’t control.

I ran across an article that explains how to use auditing at the Windows level to track this down and ensure there aren’t more unexplained restarts. You can implement this, but if you don’t have Windows administrative privileges, you’ll need to get help from someone who does. Likely a couple of you have been glad that there isn’t a great way to audit this from the OS, as you were the one performing a restart without permission. If that’s your MO, I expect you might not want to pass this piece along to your security staff or auditors. If that’s the way you work, though, I would advise you to change your habits.
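If you just want a quick look at when the service was stopped or started, before setting up the auditing the article describes, a sketch like this against the System event log is one starting point. It shows when, not who; Service Control Manager event 7036 records service state changes, and you’ll need to adjust the service name for named instances:

```powershell
# List recent start/stop state changes recorded by the Service Control Manager
Get-WinEvent -FilterHashtable @{
    LogName      = 'System'
    ProviderName = 'Service Control Manager'
    Id           = 7036
} -MaxEvents 500 |
    Where-Object { $_.Message -like '*SQL Server (MSSQLSERVER)*' } |
    Select-Object TimeCreated, Message |
    Format-Table -Wrap
```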

Steve Jones

The Voice of the DBA Podcast

Listen to the MP3 Audio (3.2MB) podcast or subscribe to the feed at iTunes and Libsyn.


An Open Letter to PASS and John Q Martin

Dear Mr. Martin and the PASS Organization,

In the spirit of advocating for change in a public way, I’m writing this letter to ask you to revisit your policies and processes and embrace the transparency that we, members of the PASS organization, deserve with regards to the SQL Saturday events.

I am directing this to you, Mr. Martin, after seeing your tweet, with the hope that you will follow through and not only provide feedback, but conduct a root cause analysis that updates future decisions. I also am hoping these are not merely words, but also the beginning of some action that you take and about which you publicly disclose the results.

A change was recently made to the SQL Saturday site that requires speakers to register for an event before their submissions can be approved. I realized this when a fellow speaker posted a note on Twitter. While there are implications here that could be disruptive for event organizers and speakers, those are worthy of a separate debate. In fact, they deserve one.

Today, I examined the PASS blog to look for an announcement. This is what I saw.

(screenshot: the PASS blog)

The last update is from Mr. Fritchey, noting how much PASS wishes to express its affection for SQL Saturday. I appreciate that the Board of Directors, most of whom are fellow speakers, may feel that way, but I’m not sure the organization believes this. Again, a separate discussion.

Early on in the life of SQL Saturday, a few people had to make decisions, but those decisions were publicized, with conversations with organizers and speakers. When fundamental changes were made, we announced them publicly to ensure that others were informed. We set up a board of advisors to help ensure that change was made in a more open way, and to solicit feedback. Andy continued to provide updates later, even as PASS declined to do so.

That no longer seems to be the case. As I scan through the blog, I see no announcements of changes to SQL Saturday. The last posts were from 2016, when the 600-mile radius was announced, followed by a subsequent post explaining the reasoning.

Once again, almost two years later, PASS lacks transparency and vision. There are no discussions on Slack that I can find, NDA or otherwise, no emails I’ve received, no list of changes or updates.

What I would request is that you use this as an opportunity to improve governance at PASS. Examine the timeline of when this decision was made, and look for opportunities to engage, or at least inform, the membership of PASS about changes. There should be milestones that require:

  • an announcement of potential changes, NDA or public
  • a roadmap – is there a public one? Even one without dates.
  • a roadmap update
  • an announcement of an upcoming change
  • an announcement of the change, publicly
  • an update to the FAQ

I appreciate that there may not be funds for future changes, though there are obviously some still in progress. However, this is separate from funding. This is a process and governance issue that should be addressed.

It’s also simple common courtesy.

Sincerely,

Steve Jones

PASS member since 1999
