
Prove that for all natural numbers $a$ and $b$ with $a \neq b$, there are infinitely many integers $s$ such that $\gcd(a+s, b+s) = 1$.

I tried to use Bézout's identity, but I can't get to the result.

  • No, it isn't a fixed number; we can change it. It depends on the values of $a$ and $b$. –  Sep 11 '18 at 12:28
  • @AbderrahmaneDriouch That is not what your question says. In your question, $s$ cannot depend on $a$ or $b$. – Batominovski Sep 11 '18 at 12:29
  • This question does not make sense, please edit for clarity. – lulu Sep 11 '18 at 12:30
  • But I said "there are infinitely many numbers $s$". –  Sep 11 '18 at 12:30
  • @AbderrahmaneDriouch As stated, Saucy O'Path is correct. There is no such number $s$. – Batominovski Sep 11 '18 at 12:30
  • So prove it.... –  Sep 11 '18 at 12:32
  • If $d \mid a+s$ and $d \mid b+s$, then $d$ must divide $a+s-(b+s)$. Now if $(a,b)=D$, then $d \mid D$. – lab bhattacharjee Sep 11 '18 at 12:32
  • Voting to close the question as it is unclear what you are asking. If you can, please edit your post for clarity. – lulu Sep 11 '18 at 12:33
  • I edited the question - is this what you wanted to ask? And don't demand solutions from people, no one here is obligated to solve YOUR problems for YOU. – asdf Sep 11 '18 at 12:33
  • Ok, alright.... –  Sep 11 '18 at 12:34
  • The Question has been edited into a form where it can be answered (and has been) by reasoned mathematical argument, namely that as stated the claim is false but with the additional assumption that $a\neq b$, it can be proven. I'm therefore inclined to leave open. – hardmath Sep 11 '18 at 15:02

2 Answers


As rightfully pointed out in the other answer, the statement trivially does not hold when $a = b$. Thus we must make the assumption that $a \neq b$. Then, my original answer holds:

Without loss of generality, $a > b$. A property of the greatest common divisor tells us that $$ \gcd(a + s, b + s) = \gcd((a + s) - (b + s), b + s) = \gcd(a - b, b + s). $$ Hence, if $p$ is any prime that does not divide $a - b$, we can choose $s = p - b$, and then $$ \gcd(a + s, b + s) = \gcd(a - b, p) = 1. $$ Since there are infinitely many primes, this shows that there are infinitely many such $s$.
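The construction above can be checked numerically. The sketch below (with hypothetical example values $a = 10$, $b = 4$) collects primes $p$ that do not divide $a - b$ and verifies that $s = p - b$ always yields $\gcd(a+s, b+s) = 1$; the helper names are mine, not from the answer.

```python
from math import gcd

def is_prime(n):
    """Trial-division primality check (adequate for small n)."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def coprime_shifts(a, b, count):
    """Return `count` values of s with gcd(a+s, b+s) == 1, built as in
    the answer: s = p - b for primes p not dividing a - b (assumes a > b)."""
    assert a > b
    found = []
    p = 2
    while len(found) < count:
        if is_prime(p) and (a - b) % p != 0:
            found.append(p - b)
        p += 1
    return found

# Hypothetical example: a = 10, b = 4, so a - b = 6.
for s in coprime_shifts(10, 4, 5):
    assert gcd(10 + s, 4 + s) == 1
```

Since there are infinitely many primes and only finitely many of them divide $a - b$, the loop never runs out of candidates, mirroring the infinitude claim in the proof.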

Mees de Vries

As stated the claim is obviously false: $a$ and $b$ may be equal, in which case the GCD is $|a+s|$ (assuming you meant integer $s$, otherwise the problem doesn't make sense), and that is equal to $1$ for exactly two values of $s$.

So you must also have the assumption that $a \ne b$. Let's assume $a < b$.

There are infinitely many primes greater than $b$. For any such prime $p$, let $s = p - b$. Then $\gcd(a+s,b+s) = \gcd(a+p-b, p) = 1$, because $p$ is prime and $1 < a+p-b < p$ (indeed $a+p-b \ge 1 + (p-b) \ge 2$ since $a \ge 1$, and $a+p-b < p$ since $a < b$).
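This second construction can be spot-checked the same way. The sketch below uses hypothetical values $a = 3$, $b = 8$ and, for each of the first few primes $p > b$, confirms that $s = p - b$ makes $a+s$ and $b+s = p$ coprime; the variable names are illustrative, not from the answer.

```python
from math import gcd

def is_prime(n):
    """Trial-division primality check (adequate for small n)."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

# Hypothetical example with a < b: for each prime p > b, s = p - b
# gives gcd(a+s, b+s) = gcd(a+p-b, p) = 1, since 1 < a+p-b < p.
a, b = 3, 8
checked = 0
p = b + 1
while checked < 5:
    if is_prime(p):
        s = p - b
        assert gcd(a + s, b + s) == 1
        checked += 1
    p += 1
```

Here $b + s = p$ is itself prime, so the gcd is $1$ unless $p$ divides $a+s$, which the bound $1 < a+s < p$ rules out.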

  • thanks......... –  Sep 11 '18 at 13:28
  • Thank you for the correct addition; I hope you don't mind that I've edited it into my answer. – Mees de Vries Sep 11 '18 at 13:29
  • 1
    @MeesdeVries - Of course, especially since you also specifically referenced the other answer, you didn't simply change yours. That is the proper way to deal with such situations IMO. I wrote my answer and posted it without seeing yours. Then I saw yours and noticed it is very similar; but I left mine up, since I don't see the need to go through $a-b$. My thought was to go directly to $b+s$ being a prime, so the GCD should automatically be $1$, with essentially no other justification needed. –  Sep 11 '18 at 13:34