You claim that 1 is not prime by definition, and that *this is a useful definition since it allows integers to have a unique prime factorization.* But note:
`6 = 2 * 3 = 3 * 2 = (-2) * (-3) = (-3) * (-2)`
Therefore the "uniqueness" of prime factorization is really only uniqueness up to some defined equivalence relation on factorizations (typically: up to the order of the factors, and up to unit factors of ±1). There is no reason we can't define this equivalence so that stray factors of ±1 don't matter.
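To make that equivalence concrete, here's a small sketch (in Python, my choice): all four factorizations of 6 above multiply out to 6, and normalizing away order and sign collapses them to a single equivalence class.

```python
from math import prod

# The four factorizations of 6 from above.
factorizations = [[2, 3], [3, 2], [-2, -3], [-3, -2]]

# All of them multiply out to 6.
assert all(prod(f) == 6 for f in factorizations)

# Up to units (±1) and order, they are all the same factorization:
# take absolute values and sort, then collect the distinct results.
normalized = {tuple(sorted(abs(x) for x in f)) for f in factorizations}
print(normalized)  # {(2, 3)} -- a single equivalence class
```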
Indeed, the standard definition of a prime that most people know is something like: *A positive integer is prime if it is divisible only by 1 and itself.* Arguing about whether 1 is prime amounts to arguing about whether "1" and "itself" need to be two distinct divisors, which is a pretty minor difference.
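The difference between the two readings can be sketched in code (the function names here are my own, hypothetical): read literally, the "divisible only by 1 and itself" definition admits 1, since 1's only divisor is 1, which is also itself; the modern convention adds a separate clause excluding it.

```python
def is_prime_naive(n: int) -> bool:
    """Prime iff its only positive divisors are 1 and itself.
    Read literally, this admits n = 1."""
    return n >= 1 and all(n % d != 0 for d in range(2, n))

def is_prime_standard(n: int) -> bool:
    """Modern convention: additionally require n > 1."""
    return n > 1 and is_prime_naive(n)

print(is_prime_naive(1))     # True under the literal reading
print(is_prime_standard(1))  # False
```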
As I said, I point this out because when I was taking upper-level number theory I would run across articles which made comments like, "X's counts are off by one from Y's because X counted 1 as prime." When I first saw this I was flabbergasted. Then I was told that everyone knew it didn't really matter one way or the other, so some people who were trying to produce long lists of primes counted 1 just as an easy way of improving their count. (This isn't something people have done much of recently, because it is simpler to regenerate such lists than to find somewhere to store them.)
So there isn't really a definite answer within the positive integers. The definition that 1 is not prime does generalize more nicely into abstract algebra (primes generate prime ideals which leave you with integral domains when you mod out by them). And most people use it most of the time. But that usage isn't universally set in stone in my experience.
You also point out that, *Less controversially, 0 is not prime...*, which is absolutely true. I have never seen anyone claim that 0 is a prime. The closest I have seen is things like people who study the p-adic number fields referring to the reals as the 0-adics, and to 0 as "the infinite prime". (Try factoring 0 to see where the "infinite" bit comes from.)
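The parenthetical hint can be spelled out with a quick check: 0 is divisible by p^k for every prime p and every exponent k, so in valuation terms 0 "contains" infinitely many copies of every prime.

```python
# 0 is divisible by p**k for every prime p and every exponent k,
# so the p-adic valuation of 0 is conventionally taken to be infinite.
assert all(0 % (p ** k) == 0 for p in (2, 3, 5) for k in range(1, 50))
print("0 is divisible by every prime power tested")
```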