That said, that time for your cut seems almost too good to be true. Are you sure cut can't somehow detect that it is writing to the null device and simply skip the write, the way Perl's sort detects void context and skips the sorting?
As it happens, in the very first run of cut I tried, I sent the output to a file, and yes, I was pleasantly surprised to see how fast cut was. But what use would the optimization you describe be? If there is one, I certainly can't think of it. And why would a no-op take 0.9s?
% time cut -d, -f"1-15" numbers.csv > out.csv
0.80s user 0.11s system 100% cpu 0.906 total
% wc out.csv
500000 500000 29174488 out.csv
% head -1 out.csv
% tail -1 out.csv
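For anyone wanting to test the /dev/null theory directly, here is a self-contained sketch. The generated numbers.csv is a stand-in for the original data (which isn't shown); if cut special-cased the null device, the second timing would be near zero, but in practice both runs do the same parsing work:

```shell
# Build a stand-in CSV: 1000 rows, 20 comma-separated fields each
# (assumption: any comma-separated data exercises cut the same way).
seq 1000 | awk '{ s=$1; for (i=2; i<=20; i++) s = s "," $1*i; print s }' > numbers.csv

# Time cut writing to a regular file vs. the null device.
time cut -d, -f1-15 numbers.csv > out.csv
time cut -d, -f1-15 numbers.csv > /dev/null

# Confirm the file run actually produced output.
wc -l out.csv
```

On the systems I'm aware of, cut makes no such check: it writes to stdout regardless of where stdout points, so any speedup from redirecting to /dev/null comes only from the kernel discarding the writes cheaply, not from cut skipping work.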