Tuesday, February 1, 2011

Using session variables to reduce the number of database connections

My .NET website, which accesses a SQL Server 2005 database, has been performing poorly when usage is heavy. When processing more than about 5 - 10 orders per minute, CPU utilization on the database server approaches 100%, the website's response time becomes slow, and database timeout errors become frequent. I would have expected the website to handle hundreds of orders per minute.

The main problem seems to be that the website makes so many database calls per order that a large number of database connections are open at any one time, straining CPU and RAM resources on the database server. This is a consequence of how the previous programmers built the website. For example, a restaurant record can be looked up in the database by calling a stored procedure that accepts the restaurant ID as a parameter. The website was calling this stored procedure multiple times per page, even though the restaurant information never changes. (Okay, it does change sometimes, but typically only once every few months, and in ways that don't need to be taken into account during an in-progress order.)

Even caching the restaurant record in a local variable just for the duration of each page would help, but I went one step further and cached it in a .NET session variable, so that other pages can access the record without going back to the database.

In simplest terms, the pseudocode is like this:
if (session var exists) {
    read restaurant record from session var
}
else {
    read restaurant record from database
    store restaurant record in session var
}

In practice, it was a little trickier, for two reasons. First, it's possible for a customer to place an order from one restaurant, and then without closing the browser, place an order from another restaurant. So the pseudocode really needs to be:
doLookup = true
if (session var exists) {
    read restaurant record from session var
    if (requested restaurant ID = restaurant ID from session var) {
        doLookup = false
    }
}
if (doLookup) {
    read restaurant record from database
    store restaurant record, including restaurant ID, in session var
}
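
In C#, that pattern might look roughly like the sketch below. The Restaurant class, the GetRestaurantFromDatabase method, and the session key name are illustrative placeholders, not the site's actual identifiers:

```csharp
// Sketch only: returns the restaurant record, hitting the database
// only when the session cache is empty or holds a different restaurant.
private Restaurant GetRestaurant(int restaurantId)
{
    var session = HttpContext.Current.Session;
    var cached = session["Restaurant"] as Restaurant;

    // Reuse the cached record only if it belongs to the requested restaurant.
    if (cached != null && cached.RestaurantId == restaurantId)
    {
        return cached;
    }

    // Cache miss, or the customer switched restaurants: read from the
    // database once and store the result for later pages in this session.
    Restaurant restaurant = GetRestaurantFromDatabase(restaurantId);
    session["Restaurant"] = restaurant;
    return restaurant;
}
```

Storing the restaurant ID inside the cached record means a single session variable is enough to detect when the customer has moved on to a different restaurant.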

Second, in .NET, session variables aren't accessible during very early parts of the page lifecycle. For example, in my website, a restaurant lookup can occur in the Application_PreRequestHandlerExecute handler. When writing code that uses session, and which may be called at any point during the page lifecycle, it's necessary to check whether the .NET session object is available before attempting to access it; otherwise an exception occurs. This is as simple as: if (HttpContext.Current.Session != null).
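
As a sketch, the guard in Global.asax might look something like this (the handler body is illustrative, not the site's actual code):

```csharp
protected void Application_PreRequestHandlerExecute(object sender, EventArgs e)
{
    // At this stage of the lifecycle, not every request has session state
    // available, so check before touching it.
    if (HttpContext.Current.Session != null)
    {
        // Safe to read or write session variables here, e.g. the
        // cached restaurant lookup described above.
        object cached = HttpContext.Current.Session["Restaurant"];
    }
    // If Session is null, skip the session cache for this request.
}
```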

Now, the above isn't the whole story of improving my website's performance, for two reasons. First, it's not just restaurants that were being looked up in the database repetitively. The same problem applied to lots of other entities. I applied the same solution to as many of those entities as I could in the time available.

Second, by using SQL Server Profiler and the Database Engine Tuning Advisor, I determined that creating some indexes and statistics could likely improve performance -- according to the Database Engine Tuning Advisor, by as much as 85%. (Thanks to my friend Max Fomitchev for his excellent assistance with these SQL Server performance tools.)

So I've implemented the session variable stuff, plus the indexes and statistics. It will all go live later this week. I'll post a comment once I know the results.

1 comment:

  1. The programming changes that reduced the number of database calls helped somewhat. After I implemented them, CPU and RAM utilization improved, but not dramatically. Not really surprising; the time available to make programming changes was limited, so we completed some, but not as much as I would have liked. The number of database calls per order processed, while considerably reduced, is still high.

    The next day, I implemented the indexes and statistics recommended by SQL Server Database Engine Tuning Advisor. This, in combination with the programming changes, did have a dramatic and positive effect. Prior to these changes, when processing 10 orders per minute, the website would respond slowly, CPU utilization on the web server would be between 80 and 100%, and the number of open connections shown in the SQL Server Activity Monitor sometimes approached 200. With the changes, we got through one of our busiest days ever, handling up to 17 orders per minute, with no problems. Response time remained fast, CPU utilization was mostly under 50 - 60%, and the number of connections was typically 70 or fewer.
