The default semantics are better known than any non-default set of semantics. That is the nature of being the default.
I'm not arguing that the reader of code should blindly assume the default semantics apply in any given case. Rather, I'm arguing that there is no cause to request non-default semantics where the default semantics are more than adequate, and, further, that most¹ code out there seems to use the default semantics in most¹ places. By consistently depending on non-default semantics, you may be more self-consistent, but you are less consistent with the rest of the world.
¹ I suspect that "the vast majority" would be more accurate, but I have no proof that such is the case.