Will a new index improve the overall performance of your database server? Will adding another index slow down your insert or update performance? Can adding another file to tempdb make things run quicker?
The answer to all of those questions is definitively “it depends.” That sounds evasive, but it’s the answer that applies to every workload. Your workload, on your version of SQL Server, on your hardware, with your data sets, will perform differently than my workload on the same version of SQL Server and the same hardware.
What does that mean for DBAs and developers? It means you can’t make hard and fast decisions about many of the settings, configurations, and designs you use; recommendations for performance are just guidelines. The best DBAs I know have lots of guidelines and rules of thumb that they use to begin their design or analysis of a database instance. However, they are willing to make exceptions or changes to their “rules” if the situation warrants it.
How do they know when to change? They test.
It’s that simple. Running tests, comparing the changes in performance, and measuring the metrics that result from implementing a particular feature allow a DBA to make an educated guess about whether any alteration of your code, your indexes, or your design will bring about improvement. However, testing before you make the change isn’t enough. Even after you have made a change, you need to continue to monitor, essentially “re-testing” in a live system, to determine if the system is still performing well.
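The test-and-measure workflow described above can be sketched in a few lines. This is a hypothetical illustration, not a SQL Server benchmark: it uses Python’s built-in sqlite3 module and an in-memory database as a stand-in, and the table, index, and row counts are invented for the example. The pattern is the point: time the same query before and after a change (here, adding an index), and compare.

```python
import sqlite3
import time

# Hypothetical example schema and data volume, chosen only for illustration.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 1000, i * 0.5) for i in range(200_000)],
)
conn.commit()

def time_query(connection, sql, params, runs=5):
    """Run the query several times and return the best wall-clock time."""
    best = float("inf")
    for _ in range(runs):
        start = time.perf_counter()
        connection.execute(sql, params).fetchall()
        best = min(best, time.perf_counter() - start)
    return best

query = "SELECT total FROM orders WHERE customer_id = ?"

# Measure a baseline, make one change, then measure again under the same
# conditions -- the same comparison you would run against a real server.
before = time_query(conn, query, (42,))
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = time_query(conn, query, (42,))

print(f"before index: {before:.4f}s, after index: {after:.4f}s")
```

On a real SQL Server instance the mechanics differ (you would capture metrics such as duration, reads, and waits), but the discipline is identical: change one thing, measure under comparable conditions, and only then decide whether the change stays.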
Just trusting that you are “sure” of what to do or not to do is a mistake. Set up tests, gather metrics, and make informed decisions to ensure your servers are operating at peak performance.