Friday, March 23, 2012

optimize queries with unexpected results

Hi,
This is one for the MVPs. I am trying to optimize queries but I am finding unexpected results. I am using SQL 2000.
For example, I have read on several sites that referencing objects with owner-qualified names is faster. The truth is that I am experiencing quite the contrary. Am I doing something wrong in my test, or is this just a rumour and objects should not be qualified?
I am running both of the queries below on my own desktop machine, clearing the cache every time I run a query as per the scripts below. Since the workload on my machine might vary (it is not really doing anything else during the test), I am taking at least 10 runs of each script and averaging the results.
I would expect better results for Script 2 than for Script 1 (the tests show the contrary); can anybody explain why? I am examining several counters, but would be happy enough if just the logical reads showed better results. With these results, how can I face the developers and tell them about this and other supposed coding "advantages", or shall I just forget about reality and tamper with the results in a Mendel fashion :-) ?
Script1:
SET STATISTICS IO on
SET STATISTICS TIME on
--above two lines just run once in each window
DBCC DROPCLEANBUFFERS
dbcc freeproccache
SELECT TOP 1000 *
FROM Table1 , Table2
where Table1.Idfield= Table2.Idfield
Script2:
SET STATISTICS IO on
SET STATISTICS TIME on
--above two lines just run once in each window
DBCC DROPCLEANBUFFERS
dbcc freeproccache
SELECT TOP 1000 *
FROM dbo.Table1 , dbo.Table2
where dbo.Table1.Idfield= dbo.Table2.Idfield
Thanks in advance, your help is much appreciated,
Tristan|||In addition to the [procedure cache] which contains execution plans,
there is also the [buffer cache] which contains previously read data and
index pages. This can have an even greater impact on query duration,
especially if you are executing the same query back to back. Since this is a
local instance of SQL Server, you may want to stop / re-start the service
prior to each test.
http://msdn.microsoft.com/library/d...a_1zu4.asp
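As an alternative to restarting the service, here is a minimal sketch of a cold-cache reset. DBCC DROPCLEANBUFFERS only removes clean pages, so issuing CHECKPOINT first writes any dirty pages to disk and leaves the whole buffer cache empty (do this on a test box only):
CHECKPOINT -- flush dirty pages so DROPCLEANBUFFERS can empty the entire buffer cache
DBCC DROPCLEANBUFFERS -- clear the data (buffer) cache
DBCC FREEPROCCACHE -- clear the plan (procedure) cache
GO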
Also, in Query Analyzer, choose the menu option Query > Show Execution Plan. After the query completes, hovering the mouse over each step will reveal information such as whether an index seek, index scan, table scan, etc. was performed, and the actual I/O and CPU cost. If (after completely clearing all caches) one run has basically the same I/O cost as another run, then there are probably other environmental factors that influenced the outcome. Performance on a desktop PC will be more variable than on a production server, because there are so many other background services and processes, such as anti-virus, running.
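If hovering over the graphical plan is awkward for comparing two runs, a small sketch using SET SHOWPLAN_TEXT lets you diff the two plans as plain text (the statements are not executed while it is ON; the tables are just the ones from the scripts above):
SET SHOWPLAN_TEXT ON
GO
SELECT TOP 1000 * FROM Table1, Table2 WHERE Table1.Idfield = Table2.Idfield
GO
SELECT TOP 1000 * FROM dbo.Table1, dbo.Table2 WHERE dbo.Table1.Idfield = dbo.Table2.Idfield
GO
SET SHOWPLAN_TEXT OFF
GO
If the two plan texts are identical, any remaining difference in duration comes from the environment rather than from the owner qualification.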
"Tristan" <Tristan@.discussions.microsoft.com> wrote in message
news:1D180C57-9AFF-458D-9529-9560FB154B73@.microsoft.com...
> [quoted text clipped]
>|||The first thing you should do is to look at the query plan for each of these
scenarios and see what the difference is..
Wayne Snyder MCDBA, SQL Server MVP
Mariner, Charlotte, NC
I support the Professional Association for SQL Server (PASS) and its
community of SQL Professionals.
"Tristan" wrote:

> [quoted text clipped]
>|||Hi JT / Wayne,
I was already looking at the execution plans and they were exactly the same; the statistics showed brutal differences, and the worst was the logical reads, where a difference of 1000 between one method and the other just didn't make sense. As sometimes happens in this job, I closed the window, opened another one, and the error disappeared.
Anyway, thanks a lot guys.
Hi JT, FYI from BOL: "Use DBCC DROPCLEANBUFFERS to test queries with a cold buffer cache without shutting down and restarting the server." I don't know if there are any other methods that clear more of the buffers than the ones I used, though you are right to advise against testing on a PC; sorry, but prod and UAT are full of users :-)
"JT" wrote:

> [quoted text clipped]
>|||There can also be a "wrong" index use problem. Look in Query Analyzer at which index is really being used by the query; you may need to force a particular index with an INDEX hint, as in the sketch below.
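For example, an index hint in SQL 2000 looks like this (just a sketch; IX_Table1_Idfield is a made-up name, substitute an index that actually exists on the join column):
SELECT TOP 1000 *
FROM dbo.Table1 WITH (INDEX(IX_Table1_Idfield)), dbo.Table2
WHERE dbo.Table1.Idfield = dbo.Table2.Idfield
That said, it is usually better to work out why the optimizer is not picking the expected index than to hard-code hints.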
Wayne Snyder wrote:
>The first thing you should do is to look at the query plan for each of these
>scenarios and see what the difference is..
>
>[quoted text clipped - 45 lines]
Message posted via http://www.webservertalk.com|||Tristan,
The execution plans of the two queries should be the same. If they are
not, then that is a different issue.
If you use TOP 1000 and the join between Table1 and Table2 produces more
than 1000 rows, then the results can differ between runs. That is not a
good test situation.
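To make the test repeatable, you can pin down which 1000 rows come back with an ORDER BY on a unique key, for example (a sketch, assuming Idfield is unique in Table1):
SELECT TOP 1000 *
FROM dbo.Table1, dbo.Table2
WHERE dbo.Table1.Idfield = dbo.Table2.Idfield
ORDER BY dbo.Table1.Idfield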
The difference between a qualified and unqualified object is only
relevant in the query compilation phase. The potential performance
difference will not show up in the logical reads.
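If you want to see where qualification does matter, look at the "parse and compile time" lines that SET STATISTICS TIME prints, or at plan reuse in the cache. A sketch against syscacheobjects (the LIKE pattern is only an illustration):
SELECT cacheobjtype, objtype, usecounts, sql
FROM master..syscacheobjects
WHERE sql LIKE '%Table1%'
Owner-qualifying objects mainly helps an ad hoc plan be reused by different users; it does not change the plan that is chosen, so the logical reads stay the same.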
Gert-Jan
Tristan wrote:
> [quoted text clipped]
