But then, gcc should also be able to optimize the eight unused functions away. Does it actually optimize that way? And will it still do so when we update to a gcc release three major versions from now?
Functions declared static have "internal linkage". When such a function is unused (no direct calls and no address taken), the compiler can eliminate it as dead code. Depending on this optimization is quite reasonable.
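This is easy to verify with nm. A minimal sketch (the file name demo.c and the function names are made up for the demo):

```shell
# Build a translation unit with one used and one unused static function.
cat > demo.c <<'EOF'
static int used(int x)   { return x * 2; }
static int unused(int x) { return x * 3; }  /* no calls, no address taken */
int entry(int x) { return used(x); }
EOF

gcc -O1 -c demo.c -o demo.o

# 'unused' is dead code, so it leaves no symbol in the object file;
# at -O1 'used' typically gets inlined into 'entry' and vanishes too.
if nm demo.o | grep -q unused; then echo "kept"; else echo "eliminated"; fi
```

Note that at -O0 the dead function usually survives, so this check only means something with optimization enabled.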
I'd wholeheartedly suggest exploring the effect of different options on the generated code, and looking at the assembly in general. gcc with -O1 or -Os is usually fine, with one caveat: at -Os, gcc does not turn division by a constant into multiplication by a reciprocal (e.g. x = x / 10 keeps an actual division instruction).
Keeping a basic tally of object sizes is also advisable. I've used makefile rules that run objdump -t foo.o > foo.syms, plus a small symcmp.pl that reports a sorted diff between object directories, so that any big regression with a new compiler stands out from the noise.