This week I saw a post from the Microsoft Tiger Team on the issues with backup compression and TDE databases. Apparently when they added compression, they didn’t test a few edge cases. Or a few regular cases, like WITH INIT. There is a possibility that your data could be corrupted, which is a major problem. I have found the SQL Server backup and restore capabilities to be very solid over the years, so this is disturbing. If you use TDE, apply SQL 2016 CU7 (for RTM) or CU4 (for SP1) ASAP. Test your restores, and be sure you know what you can recover and what you can’t. If Mr. Murphy has anything to say about it, you’ll have an issue soon, so test your restores.
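If you want to automate that restore testing, the dbatools project (which I cover later in this piece) has a cmdlet built for exactly this. A minimal sketch, with hypothetical instance names, and assuming the parameter names match your dbatools version:

```powershell
# Restore the most recent backup of each database to a scratch instance,
# run DBCC CHECKDB against the restored copy, then drop it.
# The instance names here are hypothetical -- substitute your own.
Import-Module dbatools

Test-DbaLastBackup -SqlInstance ".\SQL2016" -Destination ".\SQLTEST" |
    Out-GridView    # review which backups restored and verified cleanly
```

This proves the backup chain is actually restorable, which is the only test that really matters.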
Overall, SQL Server security is very good, but there are certainly issues with applications and devices that connect to SQL Server. You never know when some item that queries or alters data in SQL Server will cause issues. This week there were a couple of notes from Bruce Schneier on FaceID and Bluetooth security, the latter of which has a vulnerability issue. Be aware of issues like these so you can actually protect yourself and your devices. I was also amazed to see this piece on infrared camera hacking. A proof of concept that uses light to jump air gaps is truly frightening. It seems that anywhere we process data, we need to be thinking about security.
The last few weeks have been especially scary for many people, particularly with the Equifax breach. I know Equifax has released numbers on how many people’s data was exposed, but I’d assume that everyone who has ever had credit in the US has a potential issue. I am being careful with credit and watching for problems myself. You should, too, and demand better security from the companies you do business with. We can improve systems, but it will take more pressure to get companies to put more emphasis on better software and security.
I remember seeing the Carnegie Mellon Software Capability Maturity Model (CMM) when I was in university. It was fascinating, and I was sure this was the way to write software. Across many jobs and many years, I realized that few organizations even try to become more efficient and capable in how they write software.
That’s changed a bit in the last 4-5 years as more organizations try to move to DevOps and become better at building software. Some do well, some just want to build software faster and not change the way they work.
In any case, Redgate has built a maturity model for database DevOps. You can take the assessment now in a few areas and get an idea of how you stack up against other companies.
I’ve been reading about and watching some of the experiments that people are conducting with Cortana and other types of devices. I’m not quite sure what I think, though I would like to experiment with Cortana at some point and see how useful a computer assistant might be. I have tried to make notes of ideas that I think could be helpful, but for some reason I haven’t really gotten the speech bug. I don’t talk to my computer or phone very often, with the exception being texting with my family while driving. In that case, speech is handy and much safer than anything else. My wife is the opposite; she really likes speech.
It seems many companies have been working to implement more assistant-like interactions with users. Amazon has the Echo and Dot devices with Alexa, Google has the Home speaker, and I’m sure more and more devices are coming over time. I don’t know how useful these are in multi-person situations where people are speaking, but it does seem that some people like the help from devices. Perhaps I’m too much of a DIY person, since I prefer to push buttons and make my own selections. However, it seems some people prefer to talk, even in busy, loud, crowded situations.
Many of us are data professionals, and even if we aren’t planning on using speech, there is the possibility that bots or other technologies might help us improve the way we work with databases or help our customers use the information we manage. I’ve seen some interesting interactions with Slack, including one company that allows limited developer access to systems through a Slack bot that handles some provisioning work, including raising and closing a ticket, without requiring human interactions.
I could even see some sort of alerting or “check this soon” interactions that might alert developers or administrators to take actions in a more casual way, perhaps doing so as part of a conversational queue of things to check rather than a list of items in a tool or a stack of email messages. An interactive assistant can be useful not only in providing information, but in managing the flow to prevent overload. A personalized assistant might even do a better job of reminding you of items over time, perhaps anticipating future needs, or realizing some task hasn’t been completed in some time. Certainly the same type of bot or agent might perform work on your behalf.
This week, I’m wondering how many of you might find ways that a narrow, niche assistant might help you. Is there something in your work that an assistant might help you do? Or maybe there’s some way that a device might help you in your life, separately from a simple notification on your mobile or through email. Perhaps in conversation or in a semi-structured fashion. I’m sure many of you have a creative way that you might like to see technology assist you.
The Voice of the DBA Podcast
Cloud computing is growing fast. The leaders, Amazon and Microsoft, continue to show more and more customers using their platforms every quarter. I do truly think that an increasing number of organizations are thinking cloud first for application development. That doesn’t mean all systems, nor does it mean that they won’t opt to bring their code back on premises, but they are considering the cloud.
Will this trend continue? For most of my life, I’ve seen trends come and go, but the popular ones continue to evolve and grow. I suspect that for now, cloud computing is attractive in many ways, not the least of which is a shift from a large up-front investment to a small pay as you go charge. Lots of finance people would rather see this as a way of investing in computing for their organization, knowing they can cancel and switch their investment elsewhere.
I think clearly the cloud will continue to grow for now, but will companies ever reverse the trend and move things back on premises? I wonder. It does seem that many people try to move to the cloud and struggle with the application changes, the unexpectedly high monthly costs, and the wandering performance levels of their systems. For executives used to knowing what the predictable level of performance was, even if poor, having systems (appear to) run randomly slower or faster can be maddening. I think some of this is the lack of understanding of cloud paradigms and programming practices for many developers as well as poor application architectures. But that’s the state of the world, so if systems don’t run well, why would you move away from your current data center?
The major cloud providers are providing lots of services and incentives, and certainly they are working hard to sell you computing resources. For some technologies, this makes perfect sense. I’d never set up an email server or system again, relying on a service like GMail or O365. I’m not sure I’d even want to host a web server again, expecting that something like App Services is just out there, and I can drop my code in a location and I’ll have a running website. And, of course, I love VSTS. If I were a developer starting a company, or a project, I’d consider using something like VSTS or a TeamCity suite with a cloud VCS. I mean, why invest in and manage servers? I just want services.
There are issues with cloud computing, and certainly your flexibility can be limited. Security offerings are both better and worse, and pricing can be steep as you scale up. Of course, if you scale down you haven’t spent a bunch of money on extra hardware. Overall, I think there are some sweet spots for cloud computing, and even though some workloads don’t make sense, I’d expect that it’s worth considering the cloud first.
I do expect that some services, such as email, may never come back on premises. I also think some of the skills and tools we use will change, evolve, or disappear. For everyone, I’d suggest you experiment and learn some things about the cloud. Get a free account to experiment with, download emulators, or just watch a lot of presentations and content on the cloud. Whether you have plans to move or feel pressure now, I’m sure you’ll have to deal with the cloud at some point in the next ten years, so be prepared.
I saw a short video from Grant Fritchey on the Redgate channel that shows how to create a snapshot in SQL Compare. I think snapshots are great, and there are lots of uses (more on that later), but I do find sometimes I use the term and people are confused.
This is one of those overloaded terms. We have SAN snapshots, backup snapshots, database snapshots, and more that many of us deal with. All of these really mean a point in time view of that particular data set (image, machine, disk, database, etc.).
In SQL Server, we’ve had DB snapshots, which create a second database that is a copy, as of a point in time, of the original. There are lots of restrictions on these, but they are useful for quick rollbacks during deployments, point in time reporting, etc. Useful little creatures.
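As a refresher, creating one is a single T-SQL statement. Here’s a sketch run from PowerShell with the SqlServer module’s Invoke-Sqlcmd; the database name, logical file name, and paths are all hypothetical:

```powershell
# Create a database snapshot, then note the one-line revert.
# NAME must be the logical name of the source database's data file.
$sql = @"
CREATE DATABASE SalesDB_Snap
ON ( NAME = SalesDB_Data,
     FILENAME = 'C:\Snaps\SalesDB_Snap.ss' )
AS SNAPSHOT OF SalesDB;
"@
Invoke-Sqlcmd -ServerInstance ".\SQL2016" -Query $sql

# To roll back a bad deployment, revert the source from the snapshot:
# RESTORE DATABASE SalesDB FROM DATABASE_SNAPSHOT = 'SalesDB_Snap';
```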
SQL Compare Snapshots
Many of you know SQL Compare as a tool that looks at two databases, gives you the differences in all objects, allows filtering, and also builds a deployment script to make the target database, DatabaseB, look like the source database, DatabaseA. In other words, if I add a table and view to the Source, and compare this with a destination, I’d see the new table and view as differences that I need to deploy.
That’s great when working with databases; however, sometimes I don’t have control over a database and can’t be sure that it has a stable codebase (no changes). SQL Compare snapshots let me copy the state of the database into a folder.
If I grab a random database and make a snapshot, the two don’t look quite the same: the database is a live object on the instance, while the snapshot is a single binary file on disk. However, all the code is in the snapshot. If I use the snapshot as a target in SQL Compare and I haven’t changed the database, the comparison shows no differences.
Why Use Snapshots?
One of the main reasons I like to use snapshots is that they are a stable view of code, like a tag or branch in a VCS, but they aren’t modified after they’re created. They are a great way for me to capture the state of my code after a deployment (or before).
Later I can compare my snapshot with the database and detect changes. This is great for detecting production drift where someone might change production and I’m unaware.
I’m sure you could come up with other uses for a point in time view of your code.
I really like the dbatools project. This is a set of PowerShell cmdlets that are built by the community and incredibly useful for migrations between SQL Servers, but also for various administrative actions. I have a short series on these items.
One of the items that few DBAs think about often is the compatibility level. We tend to create databases at the default level, based on the instance. If we upgrade an instance in place, I’ve seen many people assume the level changes with the upgrade script (it doesn’t; existing databases keep their level unless it falls below the new version’s minimum supported level). And if you restore a database, or attach one, the compatibility level might not match the instance, and that could cause issues.
The dbatools project has thought of this, and they have a great cmdlet that can help you build checks of all your databases. The Test-DbaDatabaseCompatibility cmdlet will return the databases and whether they match the server level. You can also get more details if you like. Here’s a quick view. I can run:
Test-DbaDatabaseCompatibility -SqlServer .\SQL2016
This gives me a lot of databases on this instance. In fact, they scroll off the results quickly.
I don’t really worry about those databases that match. In fact, I expect that most databases do. My rule of DBA information is often that I want to only see those items that might cause me to take some action. Here, I only want those items that don’t match.
I can easily use filtering to do this:
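Something like this works, though the exact property names on the output objects are my assumption; check the real names with Get-Member:

```powershell
# Keep only the databases whose compatibility level differs from the
# server's level. Property names here are assumed for illustration.
Test-DbaDatabaseCompatibility -SqlServer ".\SQL2016" |
    Where-Object { $_.ServerLevel -ne $_.DatabaseCompatibility }
```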
Of course, I could simply wrap this in a function, add it to a pipeline that takes a list of instances, or more. For me, I’d want each instance to check itself with this cmdlet and record the results. That way I could periodically review the list and decide whether I can actually change any of these levels.
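A sketch of that multi-instance idea; the instance names and output path are hypothetical:

```powershell
# Run the compatibility check across several instances and save the
# results somewhere a periodic review can pick them up.
$instances = ".\SQL2016", ".\SQL2014"
$results = foreach ($i in $instances) {
    Test-DbaDatabaseCompatibility -SqlServer $i
}
$results | Export-Csv -Path "C:\Checks\compatibility.csv" -NoTypeInformation
```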
I once had a job interview with a large group of employees at a company. They were arranged around a table, with me at one end. The interview was a round robin affair, with someone asking me a question and me answering. They would take turns, going around the table for a couple hours. Not the easiest or hardest interview I’ve had, but one of the more interesting.
The first question from the gentleman to my left was, “What does ACID mean?”. My answer: “Isolated, Consistent, …”
I don’t have a problem taking a few seconds to think, but I have an internal clock, much like an NFL quarterback. Mine is not the five-second one many QBs have, but I know that after about 20 seconds, people wonder if I’ll answer at all, or if I’m still focused on the interview. If I go 30 seconds, I might have blown the interview. What would you do here? Do you know the other terms? More importantly, can you answer at a level beyond the words of the acronym?
My response after 10 seconds or so was to admit I couldn’t remember the word. I could google for the meaning, but I did know this applies to relational database transactions, intended to ensure that we always have a known state for our data. We know that each transaction must fully complete or be completely undone. We can’t have the classic issue of depositing money in one account and failing to remove it from another. We also cannot have transactions interfere with each other’s work, meaning a transaction on a piece of data must complete before the next transaction can modify it. We must also ensure that if the system were to crash, the database could not restart with data in an unknown state. Therefore, in SQL Server, we write to the transaction log first, ensuring the log records are hardened before the data changes are considered durable. There are more details you could add to my answer, but this is a core, foundational part of relational databases.
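The classic bank-transfer illustration of the atomicity piece, as T-SQL; the table, database, and values are hypothetical, and I’ve wrapped it in a PowerShell here-string for consistency with the rest of this piece:

```powershell
# Two updates that must succeed or fail together -- all or nothing.
$transfer = @"
SET XACT_ABORT ON;   -- any runtime error rolls back the whole transaction
BEGIN TRANSACTION;
    UPDATE dbo.Accounts SET Balance = Balance - 100 WHERE AccountId = 1;
    UPDATE dbo.Accounts SET Balance = Balance + 100 WHERE AccountId = 2;
COMMIT TRANSACTION;  -- both updates harden together, or neither does
"@
Invoke-Sqlcmd -ServerInstance ".\SQL2016" -Database "Bank" -Query $transfer
```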
I have never been asked this question in another interview, but I do think this is one of those core concepts that helps me understand and explain other parts of SQL Server, and even of how to build software. I think understanding this will help you answer other interview questions with more depth and knowledge. Hopefully, you’ll have a better answer than I did, with the actual words that make up the acronym.
I was reminded of the ways I’ve looked for jobs by a post from the Job Headhunter about library time. Rather than looking at what jobs are available, the recommendation is to take a targeted approach, using the library as a place to focus your energies and find the job that you really want, or at least the type of job you want.
Far too many of us look for jobs when we need them, often feeling financial pressures to accept the first offer we get. The pace of hiring usually prevents us from considering multiple offers, as the timing of interviews may not align. We make decisions for expediency, more than any other reason.
We also usually fall from job to job, perhaps accepting a position that doesn’t quite suit us, or keeps us in the same position when we’d rather move. Again, time and financial pressures can force bad decisions. There’s also the fact that despite the training and schooling many of us have gone through, we don’t usually think about other options in the hectic, day-to-day chaos of our lives.
The Job Headhunter recommends spending time in the library researching and thinking about different jobs. Certainly you could do this from the comfort of your home with technology, but I agree with the premise. The library can help you focus without the distractions at home. You can spend time thinking about your life, your goals, your interests, your skills, and more. Come up with a plan that drives your career forward. Certainly your research might give you some unexpected answer, but likely this will help you focus on some intersection of your skills and interests.
Life is short, and there are opportunities out there for you. Choose the ones you want, hone your skills, and actively pursue the employment that fits in your life. The work, the company, the location, whatever is most important to you.
We’re just a month away from Visual Studio Live in Anaheim, where I’ll be speaking along with many others. Join me: register today with code ANSPK14 and save $400. This gives you the 4-day, all-access conference package at a reduced rate.
California coding, with experts in DevOps, Cloud Computing, Databases, Visual Studio, Mobile, and more. This is a multi-technology conference, and you can learn about the different technologies you need in your work.
The Agenda is packed, and I’ll be covering security and testing for SQL Server, but I’m looking forward to checking out some Azure, VS and VSTS tricks, Lambdas and more.
If you’re looking to learn some SQL Server and more this fall, think about joining me in California for a warm break. Register today and enjoy some fall learning in LA.
I’ve been playing with PowerShell here and there, learning a few things as I use the dbatools project.
I saw a tweet from Rob Sewell on using VS Code for PoSh, which I’ve tried, but it felt cumbersome and hard. I didn’t spend much time on it before dropping back to the ISE, which I was comfortable in.
However, Rob’s tweet mentioned a post from Mike Robbins on configuring VSCode, so I decided to give it another try. It’s not that the ISE doesn’t work well, but I keep hearing people talking about VSCode, which has git support, and so I thought this might be worth trying.
I won’t repeat anything from Mike’s post; you can try that out yourself. I already had the PoSh extension, but it has been upgraded since I first used it, and it now pops up the integrated console when I open a .ps1 file, which makes it feel like the ISE.
I don’t remember that from before, and I certainly like that. I also skipped the VSCode-icons in place of VSCode Great Icons. I mean, I want to be great at this stuff.
The one thing I did do was change the settings from Mike to set a new zoom level. Zero is a little low for my old eyes, and 2 (where I’ve been working) is too big here. I chose 1.
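For reference, that’s a one-line change in the VS Code user settings file (settings.json, which allows comments):

```json
{
    // 0 is the default; positive values zoom the whole UI in.
    "window.zoomLevel": 1
}
```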
The main thing you want to do with an editor is write code. So, I decided to write some simple code. I have a few things to do, but I decided to start with something simple. I wrote a few lines of PoSh. This isn’t terribly useful, but I wanted to play with the editor and the debugger.
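The sort of thing I mean — a few disposable lines with a variable to inspect (a hypothetical stand-in, not the actual script):

```powershell
# A few throwaway lines: enough to exercise IntelliSense and to set a
# breakpoint on the loop so I can poke at variables in the debugger.
$services = Get-Service | Where-Object { $_.Status -eq 'Running' }
foreach ($s in $services) {
    Write-Output ("{0,-30} {1}" -f $s.Name, $s.DisplayName)
}
```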
Now, in the ISE, I do get some IntelliSense and parameter hints.
In VSCode, this actually looks better. First, I enter a cmdlet name and see some basics.
After I’ve entered it, I see more details as I work with it.
Running code works OK. I can hit F5 and the code runs. Not ideal, as the function keys make my hands move off the keyboard. That isn’t different from the ISE. I definitely wish we had shortcuts, and I know there are extensions to help with this. I need to play with some.
The debugging, however, is much easier. More in line with VS and other debuggers. Set a breakpoint, F5, and I have real debugger controls if I want them (as well as F10, F11).
I can see variables in the debug window, or in the main window if I hover. This feels a bit cleaner than the ISE.
There are still some issues. F10 seems to disappear sometimes, and when I try to remove breakpoints, they don’t always go away. However, those are pretty minor items in the scheme of things.
Overall, I’ve changed a bit of my view of VS Code here. It does seem to have improved since the last time I tried PoSh in here. Maybe I’ll give it another chance with other languages as well.