I have an application that processes lots of items from a SQL Server database. It runs along fine, and then after about 4 million items processed it starts hanging up.

Using SQL Server Profiler, I traced the problem to a query that jumps from a duration of around 1,000 to a duration of 2,000,000.

A simplified version of the query looks like this:

select t.field1, t.field2 
from table1 t 
where t.id in (select top 1000 t2.id from table2 t2 where t2.id > XXXX order by t2.id)

Any idea why it might be suddenly taking so long to execute?


2 Answers


This is a textbook example of stale statistics: a query runs well and then, for no obvious reason, becomes very slow.

SQL Server statistics become inaccurate as the database is used and many transactions modify the data.

Troubleshooting starts with analyzing the slow query. If the difference between the estimated and actual number of rows in the query execution plan is higher than 10%, the statistics are stale. How much performance is affected depends on the query and its execution plan; the same stale statistics can affect two different queries very differently.
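One way to compare estimated and actual row counts is to request the actual execution plan when running the query. A minimal sketch, using the question's query (with the `XXXX` placeholder replaced by an illustrative value):

```sql
-- Return the actual execution plan as XML alongside the results;
-- each operator in the plan carries both estimated and actual row counts.
SET STATISTICS XML ON;

select t.field1, t.field2
from table1 t
where t.id in (select top 1000 t2.id from table2 t2 where t2.id > 4000000 order by t2.id);

SET STATISTICS XML OFF;
```

In SQL Server Management Studio the same information is available via "Include Actual Execution Plan".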

There is no dynamic management view that directly flags inaccurate statistics, and it is sometimes easier to update statistics and see whether that helps than to analyze the situation step by step.
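If updating statistics is the route taken, a sketch like the following can help (table names are taken from the question; `sys.dm_db_stats_properties`, available in SQL Server 2008 R2 SP2 and later, at least hints at staleness via `last_updated` and `modification_counter`):

```sql
-- Check when the statistics on table2 were last updated and how many
-- rows have been modified since then.
SELECT s.name,
       sp.last_updated,
       sp.rows,
       sp.modification_counter
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE s.object_id = OBJECT_ID('dbo.table2');

-- Rebuild statistics on the tables involved in the slow query.
UPDATE STATISTICS dbo.table1 WITH FULLSCAN;
UPDATE STATISTICS dbo.table2 WITH FULLSCAN;

-- Or update all out-of-date statistics in the database in one pass.
EXEC sp_updatestats;
```

`FULLSCAN` scans every row and is slower than the default sampling, but gives the most accurate histograms.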

You can find more info about statistics here: Statistics


All that was needed was to update statistics.
