In many other places, speed may not be so much of an issue.
Not only that, but what "speed" means can vary wildly. Performance is not a simple, single variable that can be tweaked at will; there are so many variables at play that any broadly general rule is going to be wrong sooner or later.
Take this specific thread as an example: it's a classic case of micro-optimization. Certainly, using constant may be way, way faster than using Readonly. But what if the majority of an application's time is spent on network IO? Does the time it takes to resolve a constant really matter? It takes careful measurement in the proper context to know for sure.
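To put a rough number on that intuition, here is a back-of-the-envelope sketch using Amdahl's law (not something the original post invokes, just a standard way to frame it). The 1% runtime share and the 10x speedup below are hypothetical figures chosen for illustration, not measurements from any real application.

```python
# Amdahl's law: the overall speedup from accelerating only a
# fraction of total runtime is bounded by that fraction's size.
# All numbers below are hypothetical, for illustration only.

def overall_speedup(fraction: float, local_speedup: float) -> float:
    """Overall speedup when `fraction` of runtime is made
    `local_speedup` times faster and the rest is unchanged."""
    return 1.0 / ((1.0 - fraction) + fraction / local_speedup)

# Suppose constant lookups account for 1% of runtime and the
# other 99% is network IO. Even a 10x faster lookup barely moves
# the needle overall:
s = overall_speedup(0.01, 10.0)
print(f"overall speedup: {s:.4f}x")  # roughly a 0.9% improvement
```

In other words, even an enormous win on the constant-resolution step is invisible next to the IO cost, which is exactly why measuring in context matters.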
To be clear, I'm not pooh-poohing optimization by any means. I'm just trying to point out that it's important to optimize the right thing.