Assigning a Default Schema to a New User – #SQLNewBlogger

I had to test something for a customer, and as a part of this there was a need to have a different default schema for a user. Since this isn’t something that I (or many people) do often, I wanted to make a note about how to do this.

Another post for me that is simple and hopefully serves as an example for people trying to get started blogging as #SQLNewBloggers.

The Scenario

A user in a database needed to access certain objects, which were going to be located in a separate schema. There was a possibility that there would be objects in the new schema and in the dbo schema with the same name, so the concern was with developers writing code that might access the wrong object.

The Solution

When you add a user, this is a simple parameter as part of the CREATE USER DDL. In this case, you use the DEFAULT_SCHEMA parameter. I didn’t even need to look this up, as SQL Prompt hinted me toward the WITH keyword and the parameter.

In my case, we wanted to add a new user, which we will call APIUser, and assign them the WebAPI schema as their default. We use this code:

CREATE USER APIUser FOR LOGIN APIUser WITH DEFAULT_SCHEMA=WebAPI
GO

Note a couple of things. First, the schema name isn’t quoted; it’s treated as an identifier. Second, the WITH clause is used to add this parameter to the statement. Once we do this, if the user does not include a schema in an object reference, like the one below, they will still get data from the object in the WebAPI schema.

SELECT * FROM location
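If you later need to verify or change a user’s default schema, a quick sketch (using the same APIUser name from above) might look like this:

-- Check the current default schema for the user
SELECT name, default_schema_name
FROM sys.database_principals
WHERE name = 'APIUser';
GO
-- Change the default schema for an existing user
ALTER USER APIUser WITH DEFAULT_SCHEMA = WebAPI;
GO

The first query reads the catalog view, so it’s a handy way to confirm the CREATE USER statement did what you expected.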

SQL New Blogger

This was a minor part of something else I was doing, setting up a different scenario, but I captured this slice of code, edited the names slightly, and then pasted it in here.

Outside of the work I was doing, the sketch of these notes took about 2 minutes, and then the entire post was < 10 minutes.

You can do this.

Posted in Blog | Tagged , , | 1 Comment

T-SQL Tuesday #164: Code that makes you feel

The invitation this month is from Erik Darling, and it’s a neat one. I like this thought, asking us to find code that impressed us or made us feel something. I tend to look at this as positive, but it could be negative.

In any case, I was on vacation from 1 Jul to 9 Jul, out of touch with the world and unwired. So I’m doing this as a quick post by the seat of my pants. I might have to come back and make a second post in the future.

Changing the way I think about T-SQL

I’m a decent T-SQL developer. Not amazing, or even great, but I am effective. I’ve learned a lot over the years and I’ve been able to get things done for my employers. I often look at others’ code and try to improve how I approach problem solving. I’ve learned a lot from Itzik and others over the years, though I have to admit that many of his solutions go over my head. I’m just not in those spaces where I need complex coded solutions very often.

That being said, years ago I read an article from Jeff Moden on the tally table. I hadn’t used one, and I was fascinated. I know Itzik had written about numbers tables early on, but it hadn’t caught my attention. However, in a follow-up, Jeff wrote about a splitter function, which uses the tally table to split strings efficiently. This is the function (credit to Jeff in his article):

CREATE FUNCTION [dbo].[DelimitedSplit8K]
--===== Define I/O parameters
        (@pString VARCHAR(8000), @pDelimiter CHAR(1))
--WARNING!!! DO NOT USE MAX DATA-TYPES HERE! IT WILL KILL PERFORMANCE!
RETURNS TABLE WITH SCHEMABINDING AS
 RETURN
--===== "Inline" CTE Driven "Tally Table" produces values from 1 up to 10,000...
     -- enough to cover VARCHAR(8000)
  WITH E1(N) AS (
                 SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1 UNION ALL
                 SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1 UNION ALL
                 SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1
                ),                          --10E+1 or 10 rows
       E2(N) AS (SELECT 1 FROM E1 a, E1 b), --10E+2 or 100 rows
       E4(N) AS (SELECT 1 FROM E2 a, E2 b), --10E+4 or 10,000 rows max
 cteTally(N) AS (--==== This provides the "base" CTE and limits the number of rows right up front
                     -- for both a performance gain and prevention of accidental "overruns"
                 SELECT TOP (ISNULL(DATALENGTH(@pString),0))
                        ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) FROM E4
                ),
cteStart(N1) AS (--==== This returns N+1 (starting position of each "element" just once for each delimiter)
                 SELECT 1 UNION ALL
                 SELECT t.N+1 FROM cteTally t WHERE SUBSTRING(@pString,t.N,1) = @pDelimiter
                ),
cteLen(N1,L1) AS (--==== Return start and length (for use in substring)
                 SELECT s.N1,
                        ISNULL(NULLIF(CHARINDEX(@pDelimiter,@pString,s.N1),0)-s.N1,8000)
                   FROM cteStart s
                )
--===== Do the actual split. The ISNULL/NULLIF combo handles the length for the final element when no delimiter is found.
 SELECT ItemNumber = ROW_NUMBER() OVER(ORDER BY l.N1),
        Item       = SUBSTRING(@pString, l.N1, l.L1)
   FROM cteLen l
;

Now, note this is not the original code, but updated and improved code. I love that Jeff has maintained this.
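As a quick illustration of how the function is called (my own sketch, with made-up sample data, not from Jeff’s article):

SELECT ItemNumber, Item
FROM dbo.DelimitedSplit8K('apples,oranges,pears', ',');
GO

This returns three rows, with ItemNumber values 1 through 3 and each fruit name in the Item column.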

What I found great is how Jeff approached the problem. This is limited to 8K strings, but that’s for performance reasons. One could easily enhance this to handle larger strings if needed. I also like that the tally table is quickly generated from simple code that gives us 10,000 numbers.
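The tally pattern itself is easy to sketch on its own. A stripped-down version (my own, not Jeff’s exact code) that generates the first 10,000 integers on the fly looks like this:

WITH E1(N) AS (SELECT 1 FROM (VALUES (1),(1),(1),(1),(1),(1),(1),(1),(1),(1)) v(N)), -- 10 rows
     E2(N) AS (SELECT 1 FROM E1 a CROSS JOIN E1 b),  -- 100 rows
     E4(N) AS (SELECT 1 FROM E2 a CROSS JOIN E2 b)   -- 10,000 rows
SELECT ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS N
FROM E4;

No physical table is needed; cross joining a tiny set of rows against itself multiplies the row count, and ROW_NUMBER turns those rows into a sequence.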

I also found the simplicity of the SUBSTRING and CHARINDEX logic to be something I think I not only understand, but could have written.

Could have. I certainly didn’t, and might not have. This is great code that’s been helpful to me over the years in places where I wanted to break up strings. I’ve used this in a number of demos for clients and I’ve referred people to it over the years as well.


Bare Metal

At the first job I had as a DBA, I had to build a new server. This was in the days of SQL Server 4.2, and I was a combination DBA, sysadmin, and general help desk at a small company. With a software developer consultant, we ran some tests on various machines and then ordered a collection of parts from Compaq. Back then, they would only customize parts of the server. We unpacked our boxes with the server, extra drive bay, various SCSI drives, and extra RAM. We assembled and tested the machine and eventually put it into production.

In the years since then, I got out of the hardware business and left that to others. For a while I worked in organizations with IT staff dedicated to building machines, but at some point we stopped doing that. The growth of VMware and other hypervisors changed the paradigm for most organizations. For more than a decade, all the servers I’ve connected to have been virtual machines running on hardware that my employer or a cloud provider owns and manages as a node in a cluster.

Early on there were concerns about the overhead of using hypervisors and virtual machines. When many early workloads moved to VMs, lots of organizations left database servers on bare metal to squeeze every bit of performance out of the system that was possible. Over the years, improvements in hypervisors as well as the software used to connect storage and networks together seem to have rendered those conversations obsolete.

Or have they?

Today I’m wondering if any of you still have servers on bare metal. Are there systems where you still install the OS and database server software directly on the hardware? Or are there systems where you ensure that there is only one VM, the database server VM, on a hardware node?

The cloud has changed a lot of these conversations, since everything is a VM. Even many local data centers will rent you a VM, something that wasn’t possible when we first moved SQL Server Central out of a friend’s basement into a co-location facility. Back then we owned our servers and I installed them in data center racks. I would like to think we’d have moved our site to the cloud in some way over time, though maybe not. Maybe I’d still be managing a couple servers in a room in Denver.

Are any of your organizations still running legacy systems in the same way you would have in 1999? Let us know today.

Steve Jones

Listen to the podcast at Libsyn, Stitcher, Spotify, or iTunes.


Volunteering at Work: Teaching Others

I give a few talks on career topics, and one of these is Branding Yourself for a Dream Job. In the talk, I sometimes tell a story in the volunteering section. I wanted to summarize that here, as a way to get you to think how you can do more at work and grow your brand substantially.

A Young Worker

I was a newly hired worker on a network team at a power plant. We supported everything from the computer network to servers to desktops to sensor devices. There were hundreds of people, over 1,000 nodes, and just 5 of us. Needless to say, we were busy, we had to think on our feet, and we built a lot of procedures to help us.

We were the help desk and fielded a lot of calls from people, handling everything from password resets to reinstalling Windows to helping people work with various applications. I created a bunch of simple utilities (exes and batch scripts) that we kept on a share to help us solve common problems quickly. That was very helpful because we had a lot of repeated work.

There was one person, I’ll call her Sarah, who called us regularly. She was an assistant in another department dealing with supplies of some sort. She had a lot of basic questions on using WordPerfect (remember this?) and Lotus 1-2-3, often moving information to and from places and producing reports for her boss.

We got the same questions over and over again, at least weekly if not daily. It got to the point where my entire team was so annoyed that we’d dodge her calls. This was 1991 and we had caller ID on the internal system. When she called, we’d pick up the call and immediately hand the phone to a coworker to deal with.

This wasn’t the only person that created stress for our team, but this was the worst one. We were annoyed, and she was annoyed with us. Our feelings showed through and we gave her the minimum level of service, as slow as we could.

Making a Difference

I was as annoyed as everyone, but like with my utilities, I was looking for a way to make our team work more smoothly.

At the time Windows was new and many people were intimidated by it. We had plenty of software that actually ran in a command window, like WordPerfect, while other applications ran as real windows (Lotus 1-2-3).

I decided to try and fix things. I walked over to a remote building where Sarah worked. I’d been there before to help with something, and likely my annoyance had showed, as she wasn’t thrilled to have me at her desk unannounced.

I told her that I wanted to try and help her with some of the challenges with her position. I emphasized that a lot of people were struggling with these applications and I was trying to find ways to help them. I offered to bring her lunch and work through some things. At the time we were a remote station, and there was a Subway franchise that came in each day to do lunch. I said I’d buy sandwiches for a week.

She agreed and I took her order. That simple empathy helped disarm her anger at the computer and my group.

I bought lunch the next day and went over to Sarah’s desk. I watched her work, saw struggles, and then worked out some things to help her, created some notes she could keep around, and helped educate her.

It took a week or two, but at the end of the time, she could self-service a lot of her issues without calling us.

The Outcome

First, I reduced the workload on my group. Everyone appreciated less work, and more importantly, less annoying work.

Second, I empowered Sarah. She was able to get more work done, more quickly and felt more capable in her position. Her boss was happier with her, and I felt great.

Lastly, because she controlled supplies, I was able to get floppy disks and other small things easily when I called her. My boss couldn’t because she was annoyed with the way he and others had treated her.

The Investment

This was an investment for both of us. We both invested time, giving up our lunch hours. I think this was a week or two, but not longer. I also invested a bit of money, perhaps $5-6 a day back then, to provide lunch. Not nothing, but not a lot.

However, the investment paid off. I kept a note of this and reminded my boss at my review. I got a raise that year, not solely because of this, but since I was competing with others for money, this was something that helped me stand out.

This also got me a good review from Sarah’s boss, which the overall manager of the station heard about. That opened up opportunities for me when a developer position in the corporate headquarters opened up, and I got recommendations from all three managers.

Look for Opportunities

These days many of us work remotely part of the time. We may not see our coworkers, and perhaps lunch isn’t easy, but we could invest time to solve something or train someone, either online, in writing, or in person. A little investment to make your job (or others’) easier can pay off for the company.

It can also pay off for you personally if you document this and bring it up with your boss. It’s also a great story for interviews.

This is extra work. This is unpaid work. However, many people in many fields invest in their learning and advancement. From mechanics to doctors, people give up some of their time to grow their careers. You can do this as well.
