SQL Rally: Performance Tuning Abstract

October 28, 2010 at 8:00 am (PASS)

I get the call, you get the call, everyone gets the call. “Hey, my app/procedure/query/report is running slow.” Now what do you do? You go to my full-day session at SQL Rally, that’s what. Assuming you vote for it.

I didn’t post the abstract I submitted for SQL Rally before because I thought it would be redundant. However, since it’s not available right off the voting page (unless they’ve updated it since I voted), here’s what I thought I would do for a day, if you’re interested. If it sounds good to you, please go here and vote for it.

One of the most common problems encountered in SQL Server is the slow-running query. Once a query is identified as performing poorly, DBAs and developers frequently don’t understand how to diagnose the problem and often struggle to fix it. This one-day seminar focuses exclusively on these two topics. Attendees will learn how to identify the queries that are performing badly and learn how to fix them. We will start by learning how to gather performance metrics, both server and query metrics, using tools available directly from Microsoft such as Performance Monitor, DMVs and Profiler. From there we’ll move into learning how the optimizer works and how it uses statistics to determine which indexes and other database objects can assist the performance of a query. The session takes considerable time to show exactly how to generate and read execution plans, the single best mechanism for observing how the optimizer works. We’ll then look at other DMVs that can assist you when performance tuning queries. With all this knowledge gathered, we’ll move into looking at common performance problems, how they evidence themselves in the metrics and execution plans, and how to address them. Finally, we’ll explore advanced methods for solving some of the more difficult query performance problems, introducing such concepts as query hints, plan guides and plan forcing. Through all of this, best practices and common techniques will be reviewed. Attendees will go home with a working knowledge of query performance tuning, a set of methods for identifying poorly performing queries, scripts to assist in these processes and the knowledge of how to fix performance problems in their own systems.
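Just to give a flavor of how little it takes to get started, capturing an actual execution plan and then trying out a query hint can be as simple as this (a trivial sketch against AdventureWorks; the query itself is only a stand-in for illustration):

SET STATISTICS XML ON;

SELECT soh.SalesOrderID
    ,soh.OrderDate
    ,soh.TotalDue
FROM Sales.SalesOrderHeader AS soh
WHERE soh.TotalDue > 10000;

SET STATISTICS XML OFF;

-- the same query again, nudged with a query hint (for illustration only)
SELECT soh.SalesOrderID
    ,soh.OrderDate
    ,soh.TotalDue
FROM Sales.SalesOrderHeader AS soh
WHERE soh.TotalDue > 10000
OPTION (MAXDOP 1);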

To see the other sessions go here:
BI
DBA
Developer
Misc

Although I would prefer that you vote for me, it’s more important that you vote at all (same as in real life). Please go here and select the sessions that you want to see.


24 Hours of PASS: Summit Preview

August 11, 2010 at 8:58 am (PASS, SQL Server 2008, SQLServerPedia Syndication, TSQL)

Registration is open for the second 24 Hours of PASS this year. This one is going to be a preview of the Summit itself, so all the sessions are tied, in some manner, to sessions being given at the Summit. Here’s a link to go and register.

I’m very excited to be able to say that I’ll be presenting in this 24HOP. One of my presentations at the Summit this year is Identifying and Fixing Performance Problems Using Execution Plans. It covers pretty much what it says, methods for fixing performance problems by exploring the information available within execution plans. But, how do you know you have a performance problem? That’s where my preview session comes in. Identifying Costly Queries will show you several ways to gather metrics on your system so that you can understand which queries are causing you the most pain. Once you know which queries need tuning, you can use execution plans to tune them. Whether you’ll be attending the PASS Summit or not, and whether or not you’ll go to my session once you’re there, I think this 24HOP session will be useful to help you understand where the pain points are within your own systems. I hope you’ll attend.
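If you want to poke around before the session, one very rough first pass at finding the costly queries is simply to read the aggregated numbers out of the plan cache (a minimal sketch, nothing more):

SELECT TOP 10
    qs.total_elapsed_time
    ,qs.total_worker_time
    ,qs.execution_count
    ,st.text AS query_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_elapsed_time DESC;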

More importantly though, check out all the other great sessions. This is an excellent collection of presenters and presentations. For anyone who has ever said “PASS doesn’t do anything for me,” I want you especially to take a look at the amazing training opportunities being offered by PASS, for free. The volunteers that run PASS do amazing things and this is just one of them. Take advantage of this opportunity and, hopefully, recognize that PASS is doing things for you. This just barely scratches the surface of all that PASS offers.


When did this statement start?

July 28, 2009 at 10:19 am (SQL Server 2005, SQL Server 2008, SQLServerPedia Syndication, TSQL)

UPDATE: This post is incorrect. Adam nailed it in the comments. I explain my mistake here.

A question came up over at SQL Server Central where someone was wondering if it was possible to know when a given statement within a batch started. Immediately I thought, oh yeah, that’s easy, use the sys.dm_exec_requests dynamic management view (DMV). Done.

Wrong. The original poster pointed out that I had assumed the values present in the DMV were statement-level values, when they actually reflect the batch. While the DMV shows a start_time, that time is the start of the current batch, not of the statement within the batch. So the question became: where else might I get this data? I next tried sys.dm_exec_sessions because it has the last_request_start_time value. Sure enough, this worked.
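In its simplest form, comparing the two values is just a matter of querying both DMVs side by side (a minimal sketch):

SELECT s.session_id
    ,r.start_time -- start of the current batch, from sys.dm_exec_requests
    ,s.last_request_start_time
    ,s.last_request_end_time
FROM sys.dm_exec_sessions AS s
LEFT JOIN sys.dm_exec_requests AS r
    ON r.session_id = s.session_id
WHERE s.is_user_process = 1;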

Don’t believe me? This time I tested it. First, I set up a batch that ran a few longish queries:

BACKUP DATABASE AdventureWorks
TO DISK = 'c:\bu\ad.bak'
WITH INIT;
DBCC CHECKALLOC(AdventureWorks);
DBCC CHECKDB(AdventureWorks);
DBCC CHECKCATALOG(AdventureWorks);
BACKUP DATABASE AdventureWorks
TO DISK = 'c:\bu\ad.bak'
WITH INIT; 

Then I wrote a little loop that would insert data from both DMVs, once every second for a minute:

SET NOCOUNT ON;
TRUNCATE TABLE dbo.dmR;
TRUNCATE TABLE dbo.dmS;
DECLARE @CheckTime DATETIME = DATEADD(ss, 60, GETDATE())
    ,@CurrTime DATETIME = GETDATE();
WHILE @CurrTime < @CheckTime
BEGIN
    INSERT INTO ... SELECT ... FROM sys.dm_exec_sessions
    INSERT INTO ... SELECT ... FROM sys.dm_exec_requests
    WAITFOR DELAY '00:00:01';
    SET @CurrTime = GETDATE();
END
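The INSERT statements are deliberately elided above. If you want something to run as-is, one way to fill them in is to let the capture tables simply mirror every column of the two DMVs (the dbo.dmR and dbo.dmS names come from the TRUNCATE statements; the SELECT INTO table shapes are an assumption):

-- one-time setup; assumption: the tables mirror the DMVs column-for-column
SELECT * INTO dbo.dmR FROM sys.dm_exec_requests WHERE 1 = 0;
SELECT * INTO dbo.dmS FROM sys.dm_exec_sessions WHERE 1 = 0;

-- the loop itself, filled in
SET NOCOUNT ON;
DECLARE @CheckTime DATETIME = DATEADD(ss, 60, GETDATE())
    ,@CurrTime DATETIME = GETDATE();
WHILE @CurrTime < @CheckTime
BEGIN
    INSERT INTO dbo.dmS SELECT * FROM sys.dm_exec_sessions;
    INSERT INTO dbo.dmR SELECT * FROM sys.dm_exec_requests;
    WAITFOR DELAY '00:00:01';
    SET @CurrTime = GETDATE();
END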

Then it was a question of hitting the go button. The results are pretty clear. If you look at the sys.dm_exec_requests data, you’ll see that the start_time doesn’t change, but the work being performed, visible in the command and plan_handle columns, does change. You can also see the statement_start_offset and statement_end_offset changing. These values will allow you to pull data in combination with the sys.dm_exec_sql_text() dynamic management function.

 start_time             command         statement_start_offset statement_end_offset plan_handle
2009-07-28 10:42:09.447 BACKUP DATABASE 0                      138                  0x060001007D931719B8C0BF0D000000000000000000000000
2009-07-28 10:42:09.447 DBCC            140                    206                  0x060001007D931719B8C0BF0D000000000000000000000000
2009-07-28 10:42:09.447 DBCC            140                    206                  0x06000F00AF16F10FB8C04D0D000000000000000000000000
2009-07-28 10:42:09.447 DBCC            208                    268                  0x060001007D931719B8C0BF0D000000000000000000000000

But, as you can see, the start_time stays the same. Looking at the last_request_start_time and last_request_end_time in the sys.dm_exec_sessions DMV, you can see these values changing in conjunction with the changing values in sys.dm_exec_requests.

last_request_start_time last_request_end_time
2009-07-28 10:35:21.500 2009-07-28 10:35:51.170
2009-07-28 10:42:09.447 2009-07-28 10:35:51.170 

I’ll leave it as an exercise for the reader (or for another post for myself) to fully put together the DMVs and DMFs that would allow you to combine this data and show exactly which statement was being executed at which moment, though a rough starting point is sketched below. The point is, it is possible to determine exactly which statement is currently being executed within a batch process, and when that statement started.
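A minimal sketch of how that combination might look, assuming the usual statement_start_offset/statement_end_offset byte arithmetic against sys.dm_exec_sql_text (treat this as a starting point, not the finished version):

SELECT r.session_id
    ,r.start_time
    ,s.last_request_start_time
    ,SUBSTRING(t.text, (r.statement_start_offset / 2) + 1,
        ((CASE r.statement_end_offset
            WHEN -1 THEN DATALENGTH(t.text)
            ELSE r.statement_end_offset
          END - r.statement_start_offset) / 2) + 1) AS current_statement
FROM sys.dm_exec_requests AS r
JOIN sys.dm_exec_sessions AS s
    ON s.session_id = r.session_id
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.session_id <> @@SPID; -- skip the session running this query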


More Refinements on the Missing Indexes Query

March 2, 2009 at 8:03 am (SQL Server 2005, SQL Server 2008, TSQL)

Greg Larson posted a really nice query to find the worst-performing stored procedures in your system. He put in all kinds of attributes to make it customizable, changing the definition of “worst” to different measures, etc. Great query in general. In it he linked to sys.dm_exec_plan_attributes and got the dbid attribute. Duh! So instead of spelunking through the XML to retrieve the database name, I can pull the dbid and use the DB_NAME function. Cleans things up considerably. Thanks Greg.
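Stripped down to just that trick, it looks something like this (a minimal sketch):

SELECT DB_NAME(CAST(pa.value AS INT)) AS DatabaseName
    ,s.plan_handle
FROM sys.dm_exec_query_stats AS s
CROSS APPLY sys.dm_exec_plan_attributes(s.plan_handle) AS pa
WHERE pa.attribute = 'dbid';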

Here’s the cleaned up code:
WITH XMLNAMESPACES ('http://schemas.microsoft.com/sqlserver/2004/07/showplan'
AS sp)
SELECT DB_NAME(CAST(pa.value AS INT)) AS DatabaseName
,s.sql_handle
,s.total_elapsed_time
,s.last_execution_time
,s.execution_count
,s.total_logical_writes
,s.total_logical_reads
,s.min_elapsed_time
,s.max_elapsed_time
--,s.query_hash
,p.query_plan
,p.query_plan.value(N'(sp:ShowPlanXML/sp:BatchSequence/sp:Batch/sp:Statements/sp:StmtSimple/sp:QueryPlan/sp:MissingIndexes/sp:MissingIndexGroup/sp:MissingIndex/@Table)[1]',
'NVARCHAR(256)') AS TableName
,p.query_plan.value(N'(/sp:ShowPlanXML/sp:BatchSequence/sp:Batch/sp:Statements/sp:StmtSimple/sp:QueryPlan/sp:MissingIndexes/sp:MissingIndexGroup/sp:MissingIndex/@Schema)[1]',
'NVARCHAR(256)') AS SchemaName
,p.query_plan.value(N'(/sp:ShowPlanXML/sp:BatchSequence/sp:Batch/sp:Statements/sp:StmtSimple/sp:QueryPlan/sp:MissingIndexes/sp:MissingIndexGroup/@Impact)[1]',
'DECIMAL(6,4)') AS ProjectedImpact
,ColumnGroup.value('./@Usage', 'NVARCHAR(256)') AS ColumnGroupUsage
,ColumnGroupColumn.value('./@Name', 'NVARCHAR(256)') AS ColumnName
FROM (SELECT TOP 20
s.sql_handle
,s.plan_handle
,s.total_elapsed_time
,s.last_execution_time
,s.execution_count
,s.total_logical_writes
,s.total_logical_reads
,s.min_elapsed_time
,s.max_elapsed_time
--,s.query_hash
FROM sys.dm_exec_query_stats s
ORDER BY s.total_elapsed_time DESC
) AS s
CROSS APPLY sys.dm_exec_query_plan(s.plan_handle) AS p
CROSS APPLY p.query_plan.nodes('/sp:ShowPlanXML/sp:BatchSequence/sp:Batch/sp:Statements/sp:StmtSimple/sp:QueryPlan/sp:MissingIndexes/sp:MissingIndexGroup/sp:MissingIndex/sp:ColumnGroup')
AS t1 (ColumnGroup)
CROSS APPLY t1.ColumnGroup.nodes('./sp:Column') AS t2 (ColumnGroupColumn)
CROSS APPLY sys.dm_exec_plan_attributes(s.plan_handle) pa
WHERE p.query_plan.exist(N'/sp:ShowPlanXML/sp:BatchSequence/sp:Batch/sp:Statements/sp:StmtSimple/sp:QueryPlan//sp:MissingIndexes') = 1
AND pa.attribute = 'dbid'


Refining the Missing Index Data

February 18, 2009 at 11:30 am (SQL Server 2005, SQL Server 2008, SQLServerPedia Syndication, TSQL)

In my previous post I showed how you could determine if a query had missing index information, but it was difficult to get meaningful data out because I didn’t know XQuery well enough. I spent some more time refining the query and here are the results. This new query will show some basic performance information gathered from the aggregate data in sys.dm_exec_query_stats. It combines this with the full data pulled from the Missing Indexes element in the XML of the execution plan. I’ve got it listing all the recommended columns and grouping. So this means that the performance data is repeated in order to allow for the full listing of groups & columns. It seems to work well.

A couple of interesting points. My purpose is to provide a short-cut method of identifying possible indexing needs for systems that run lots of dynamic or ad hoc queries. In my own environment I’m mainly talking about either Microsoft CRM or NHibernate, but the rules would apply to almost any other client-side query engine. I wanted to list the database name as part of the query. DBID is included with sys.dm_exec_query_plan, but unfortunately, when a query is prepared or ad hoc, this value is null. Luckily enough, the data is available in the execution plan itself. That’s going to be something to watch for in the future when using these execution plans.
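If you want to see that NULL DBID behavior for yourself, a quick check is enough (just a sketch):

SELECT p.dbid -- NULL for ad hoc and prepared plans
    ,DB_NAME(p.dbid) AS DatabaseName
    ,p.query_plan
FROM sys.dm_exec_query_stats AS s
CROSS APPLY sys.dm_exec_query_plan(s.plan_handle) AS p;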

Here’s the query:
WITH XMLNAMESPACES ('http://schemas.microsoft.com/sqlserver/2004/07/showplan' AS sp)
SELECT p.query_plan.value(N'(sp:ShowPlanXML/sp:BatchSequence/sp:Batch/sp:Statements/sp:StmtSimple/sp:QueryPlan/sp:MissingIndexes/sp:MissingIndexGroup/sp:MissingIndex/@Database)[1]', 'NVARCHAR(256)') AS DatabaseName
,s.sql_handle
,s.total_elapsed_time
,s.last_execution_time
,s.execution_count
,s.total_logical_writes
,s.total_logical_reads
,s.min_elapsed_time
,s.max_elapsed_time
,p.query_plan
,p.query_plan.value(N'(sp:ShowPlanXML/sp:BatchSequence/sp:Batch/sp:Statements/sp:StmtSimple/sp:QueryPlan/sp:MissingIndexes/sp:MissingIndexGroup/sp:MissingIndex/@Table)[1]', 'NVARCHAR(256)') AS TableName
,p.query_plan.value(N'(/sp:ShowPlanXML/sp:BatchSequence/sp:Batch/sp:Statements/sp:StmtSimple/sp:QueryPlan/sp:MissingIndexes/sp:MissingIndexGroup/sp:MissingIndex/@Schema)[1]', 'NVARCHAR(256)') AS SchemaName
,p.query_plan.value(N'(/sp:ShowPlanXML/sp:BatchSequence/sp:Batch/sp:Statements/sp:StmtSimple/sp:QueryPlan/sp:MissingIndexes/sp:MissingIndexGroup/@Impact)[1]', 'DECIMAL(6,4)') AS ProjectedImpact
,ColumnGroup.value('./@Usage', 'NVARCHAR(256)') AS ColumnGroupUsage
,ColumnGroupColumn.value('./@Name', 'NVARCHAR(256)') AS ColumnName
FROM sys.dm_exec_query_stats s
CROSS APPLY sys.dm_exec_query_plan(s.plan_handle) AS p
CROSS APPLY p.query_plan.nodes('/sp:ShowPlanXML/sp:BatchSequence/sp:Batch/sp:Statements/sp:StmtSimple/sp:QueryPlan/sp:MissingIndexes/sp:MissingIndexGroup/sp:MissingIndex/sp:ColumnGroup') AS t1 (ColumnGroup)
CROSS APPLY t1.ColumnGroup.nodes('./sp:Column') AS t2 (ColumnGroupColumn)
WHERE p.query_plan.exist(N'/sp:ShowPlanXML/sp:BatchSequence/sp:Batch/sp:Statements/sp:StmtSimple/sp:QueryPlan//sp:MissingIndexes') = 1
ORDER BY s.total_elapsed_time DESC
