SQL Server 2008 varbinary(max) performance bug
Apparently, the mere presence of varbinary(max) columns in your table can drastically hurt performance, even if you are NOT referencing those columns at all in your query!
Table A contains Id (int), Name (varchar(100)), plus one varbinary(max) NOT NULL column and one varbinary(max) NULL column.
Table B contains Id (int) and Name (varchar(100)) but not the two varbinary(max) columns.
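A minimal repro schema might look like the following; the post does not give the exact DDL, so the blob column names here are assumptions:

```sql
-- Hypothetical repro schema; column names Payload1/Payload2 are illustrative.
CREATE TABLE A (
    Id       int            NOT NULL PRIMARY KEY,
    Name     varchar(100)   NOT NULL,
    Payload1 varbinary(max) NOT NULL,  -- large blob column
    Payload2 varbinary(max) NULL       -- nullable blob column
);

CREATE TABLE B (
    Id   int          NOT NULL PRIMARY KEY,
    Name varchar(100) NOT NULL
);
```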
Select Id, Name From A joined to several other tables -> takes 40 seconds (100,000 rows).
Select Id, Name From B joined to several other tables -> takes 1 second (100,000 rows).
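In concrete terms, the two queries differ only in the driving table; the join targets below are placeholders since the post does not name the other tables:

```sql
-- Slow: ~40 seconds, even though no varbinary column is referenced.
SELECT a.Id, a.Name
FROM A AS a
JOIN SomeTable1 AS t1 ON t1.AId = a.Id   -- hypothetical join tables
JOIN SomeTable2 AS t2 ON t2.AId = a.Id;

-- Fast: ~1 second with the identical join shape.
SELECT b.Id, b.Name
FROM B AS b
JOIN SomeTable1 AS t1 ON t1.BId = b.Id
JOIN SomeTable2 AS t2 ON t2.BId = b.Id;
```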
So the only workaround I have found so far is to move the varbinary columns into a separate table joined on Id. This adds overhead and complication to the system, though. The other workaround appears to be the table option that stores large values outside the row, but this is sufficiently complex that I have been unable to find any simple tutorials online on best practices.
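Both workarounds can be sketched as follows (table and column names are assumptions). The first moves the blobs into a side table joined on Id; the second uses sp_tableoption to push large values off the data pages, so scans of the non-blob columns touch fewer pages:

```sql
-- Workaround 1: move the blobs to a side table, keyed by the same Id.
CREATE TABLE A_Blobs (
    Id       int NOT NULL PRIMARY KEY
        REFERENCES A (Id),
    Payload1 varbinary(max) NOT NULL,
    Payload2 varbinary(max) NULL
);

-- Workaround 2: store varbinary(max) values off-row so the base row
-- stays small (1 = on). Note: this only affects values written after
-- the option is set; existing rows stay in-row until they are updated.
EXEC sp_tableoption 'A', 'large value types out of row', 1;
```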
Note that if you do not join to several other tables, the performance is somewhat better (but still not great); it seems the presence of several joins causes the query to start reading the varbinary columns even though they are not in the query. Probably it is picking up the entire row as some sort of 'optimization' as it performs the other joins, even though they only use the Id and Name fields.
I even tried creating an index on the table that included every column except the varbinary ones, but that didn't help.
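The covering-index attempt presumably looked something like this (the index name is illustrative); in principle a scan of such an index should never touch the blob pages, yet it did not help here:

```sql
-- Hypothetical covering index: every non-varbinary column, no blobs.
CREATE NONCLUSTERED INDEX IX_A_Id_Name
    ON A (Id)
    INCLUDE (Name);
```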
If you don't do ANY joins, I still found that a query of other (non-binary) columns in a table that happens to contain binary columns is about 12 times slower than a query of a similarly sized table that does NOT contain binary columns:
Select top 100000 Id, Name From A -> 12 seconds
Select top 100000 Id, Name From B -> 1 second
Created By: amos 1/8/2015 12:41:56 PM
Updated: 1/12/2015 12:10:37 PM