|P is for Practical|
When (if?) that happens, a slower and more complex solution would be required.
Even then, the simple expedient of splitting the task into two passes would double the life of the solution and would still work out faster and simpler than moving the processing into an RDBMS.
But given the spread of the OP's stated range--10k to 10m--it seems likely that 10 million is already an extreme, future-proofing upper bound.
The OP (presumably) knows his problem space. For us to assume otherwise and offer more complex solutions, on the basis that they might allow for some unpredicted future growth, would be somewhat patronising.
And given the simplicity of the solution, it's not as if he would have to throw away a huge amount of development effort if he did reach its limits at some point in the future.
Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority".
In the absence of evidence, opinion is indistinguishable from prejudice.