I attended a presentation recently from Steven Wright of SQL Sentry on Analysis Services (SSAS) memory management and it was really interesting. I haven’t done much work at all with SSAS, mostly goofing around with basic cube setups, so I haven’t really had to administer a production instance. And I haven’t run into memory issues, which look like they could be a regular part of your day.
SSAS manages memory much differently from SQL Server, and there's a lot to learn, but one fact about the settings amazed me. There are two limit settings, the Low Memory Limit and the Total Memory Limit, both of which determine how aggressively SSAS starts trying to clear objects out of memory to free it up. These are settings you make on the server, either in the properties or an .ini file (what is this, SSAS 3.1?), and the defaults are 75 for the low limit and 80 for the total.
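For reference, here's roughly what those settings look like in the server's msmdsrv.ini file. This is a hand-written sketch of the relevant fragment, not a copy of a real file, so the surrounding elements may differ slightly on your instance:

```xml
<!-- Fragment of msmdsrv.ini (illustrative; element nesting may vary by version) -->
<ConfigurationSettings>
  <Memory>
    <!-- Values of 100 or less are a percentage of total physical memory -->
    <LowMemoryLimit>75</LowMemoryLimit>
    <TotalMemoryLimit>80</TotalMemoryLimit>
  </Memory>
</ConfigurationSettings>
```

The same two properties are exposed in the server properties dialog in Management Studio, which is the safer place to change them.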
These are documented in Books Online, which says the value is a percentage of total physical memory. What it doesn't say, however, is this: if you enter a number greater than 100, the value is interpreted as the number of bytes of memory to be used. That's bytes with a little "b", not KB, not MB, but b. It's documented here by Greg Gonzalez.
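The rule is easy to state but easy to get wrong, so here's a small sketch of it in Python. This is my own illustration of the behavior described above, not SSAS's actual implementation, and the function name is made up:

```python
def effective_memory_limit(setting_value, total_physical_bytes):
    """Sketch of how an SSAS memory limit setting is interpreted.

    Values of 100 or less are a percentage of total physical memory;
    values above 100 are taken as a raw byte count -- bytes, not KB or MB.
    """
    if setting_value <= 100:
        # Percentage of total physical memory
        return total_physical_bytes * setting_value // 100
    # Anything above 100 is a literal number of bytes
    return setting_value


ram = 16 * 1024**3  # a server with 16 GB of physical memory

# Default low limit of 75 -> 75% of RAM, i.e. 12 GB
print(effective_memory_limit(75, ram))

# Someone who thought the units were KB and entered 8388608
# (hoping for 8 GB) actually gets a limit of ~8 MB
print(effective_memory_limit(8_388_608, ram))
```

Run the second example and you can see how a value meant as "8 GB in KB" collapses to a few megabytes, which is exactly the trap described below.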
You'd think that would be something they'd want to call out in the documentation. A few people have been bitten by this, assuming the value was KB or MB, since most counters are expressed in one of those scales.