Calculus courses need to distinguish between just watching the numbers fly by and trying to add them up. So they have settled on "sequence" for the first and "series" for the second.
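A quick sketch of the distinction (the geometric example is mine, not from the original): the sequence 1, 1/2, 1/4, ... just marches toward 0, while the series 1 + 1/2 + 1/4 + ... adds those same terms up and converges to 2.

```python
# The sequence: the numbers flying by (1, 1/2, 1/4, ...).
terms = [1 / 2**n for n in range(20)]

# The series: the running sums of those same numbers.
partial_sums = []
total = 0.0
for t in terms:
    total += t
    partial_sums.append(total)

print(terms[-1])         # the sequence's terms shrink toward 0
print(partial_sums[-1])  # the partial sums approach 2
```

Same numbers in both cases; the only difference is whether you are watching them or adding them.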
But in plain English usage, the distinction is not sharply drawn, and it doesn't matter much. You didn't even notice the first time, and nobody was confused. That is my criterion for saying, "It is fine as it is."
This might be a good time to point out that, contrary to what many pedantic people believe based on high-school textbooks, mathematicians don't generally care whether you include 0 as a natural number. More precisely, they generally have an opinion: about half of them think it belongs while the other half do not. But regardless, the phrase "whole number" is something you can throw in the mental trash; it isn't used by mathematicians.
IMHO words only matter as they relate to communication. I am not a person who cares much about "official definitions". Here is an example of why not. Were I to say that array access in Perl was O(n*n), I would have used the words correctly, but how many people would that be a miscommunication for? Alternatively, if I say that hash access is not O(1), I am perfectly right, but I will confuse more than I clarify. So even though I "know better", I abuse the terminology just like everyone else does...
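The point behind the Perl quip, sketched out (the constants here are my own illustration): big-O is an upper bound, so a constant-time operation like an array index lookup is, technically, also O(n*n), because a constant is eventually dominated by C*n*n for some constant C.

```python
# A constant-time cost function, standing in for an array index lookup.
def f(n):
    return 5

# f is in O(n^2) if there exist constants C and n0 such that
# f(n) <= C * n**2 for all n >= n0. Here C = 5 and n0 = 1 work.
C, n0 = 5, 1
print(all(f(n) <= C * n**2 for n in range(n0, 10_000)))
```

Technically correct, and yet anyone reading "array access is O(n*n)" would come away believing the wrong thing. That is the gap between the official definition and what the notation communicates.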