The problem reads like this:
Prove that if $\gcd(a, b) = 1$, then $\gcd(a + b, a - b)$ equals $1$ or $2$.
Solution: From the linear combinations
$$(a + b) \cdot 1 + (a - b) \cdot 1 = 2a,$$
$$(a + b) \cdot 1 + (a - b) \cdot (-1) = 2b,$$
we know that $\gcd(a + b, a - b)$ divides both $2a$ and $2b$, and hence divides $\gcd(2a, 2b) = 2\gcd(a, b)$. Since $\gcd(a, b) = 1$, we conclude that $\gcd(a + b, a - b)$ divides $2$. Consequently $\gcd(a + b, a - b)$ is either $1$ or $2$.
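To see what these identities actually say, I plugged in a concrete pair myself, say $a = 5$ and $b = 3$ (my own check, not part of the quoted solution):
$$(5 + 3) \cdot 1 + (5 - 3) \cdot 1 = 8 + 2 = 10 = 2a,$$
$$(5 + 3) \cdot 1 + (5 - 3) \cdot (-1) = 8 - 2 = 6 = 2b,$$
and indeed $\gcd(8, 2) = 2$, which divides both $10$ and $6$. So the arithmetic checks out, but the step itself still puzzles me.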
I tried to follow this proof, but I got stuck at the "From the linear combinations ... we know that ..." part. Where do these linear combinations come from? In particular, the $(a + b) \cdot 1 + (a - b) \cdot (-1) = 2b$ step is hard to understand: where does the $-1$ come from?
I'm familiar with Euclid's algorithm, but I don't understand how the linear combinations above fit into the picture.
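For what it's worth, the statement itself seems to hold numerically. Here is a quick brute-force check I put together (a small Python sketch of my own, not from the book):

```python
from math import gcd

# For every coprime pair (a, b) with 1 <= b < a <= 200,
# verify that gcd(a + b, a - b) is either 1 or 2.
for a in range(1, 201):
    for b in range(1, a):
        if gcd(a, b) == 1:
            g = gcd(a + b, a - b)
            assert g in (1, 2), (a, b, g)

print("gcd(a + b, a - b) is 1 or 2 for every coprime pair tested")
```

So I believe the result; what I'm missing is where the two linear combinations come from and how they relate to Euclid's algorithm.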