If, on the other hand, you define a prime number as "a positive integer which has no divisors other than 1 and itself", then 1 must be regarded as a prime number. There is nothing in the logical meaning of "and" which excludes that "1" and "itself" can be equal!
Which is the "correct" definition? Both have been used over the years, and it really depends on how you want to develop the discussion of prime numbers. Mathematicians are nowadays in general agreement that it is more convenient to use a definition which excludes 1 from the set of prime numbers.
The prime reason (if you will pardon the pun) is that one of the first results in number theory is The Fundamental Theorem of Arithmetic, which states:
Every positive integer greater than 1 is expressible as the product of prime numbers, and, except for the order of the factors in the product, such an expression is unique.

Thus 1001 = 7 x 11 x 13, and there is no other way of expressing 1001 as a product of prime numbers, except by changing the order of the factors. However, if you allow your definition of prime to include the number 1, you could write 1001 in many different ways:

1001 = 7 x 11 x 13 = 1 x 7 x 11 x 13 = 1 x 1 x 7 x 11 x 13 = ...

so the factorisation would no longer be unique, and the theorem would have to be restated:
Every positive integer greater than 1 can be expressed as a product of prime numbers greater than 1, and, except for the order of the factors in the product, such an expression is unique.

Thus, if you allow 1 to be a prime number, you must immediately exclude it from your first major theorem about prime numbers. That is enough justification for agreeing not to consider 1 to be a prime. That is, mathematicians define a prime number in such a way as to ensure that 1 cannot be taken as prime! It is a human decision to exclude 1 from the primes!

Source: Mathematical Digest, University of Cape Town
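The argument above can be checked with a few lines of Python (not part of the original article): a simple trial-division factoriser, a hypothetical helper introduced here for illustration, recovers the unique factorisation 1001 = 7 x 11 x 13, while padding that factorisation with extra factors of 1 shows why admitting 1 as a "prime" would destroy uniqueness.

```python
from math import prod

def prime_factors(n):
    """Factorise n > 1 by trial division, returning its prime factors in
    non-decreasing order. Primes here are taken to be greater than 1."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:          # whatever remains is itself prime
        factors.append(n)
    return factors

# The unique factorisation guaranteed by the theorem:
print(prime_factors(1001))                     # [7, 11, 13]

# If 1 were allowed as a prime, all of these would be distinct
# "prime factorisations" of 1001, so uniqueness would fail:
for extra_ones in range(3):
    f = [1] * extra_ones + [7, 11, 13]
    print(f, "->", prod(f))                    # every product is 1001
```

Trial division is the most elementary factoring method; it is enough here because the point is the statement of the theorem, not efficiency.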