Pick two random integers between 1 and 100 (inclusive) and take the lower of the two. What is the expected value of this minimum?

I believed the answer should be (1 - 1/sqrt(2)) * 100, reasoning as follows. The chance that the minimum is 100 is 1/100 * 1/100 = 1/10,000. The chance that it is 99 is 3/10,000, since there are three ways this can happen: [99,99], [99,100], [100,99]. For 98 there are five ways ([98,98], [98,99], [98,100], [99,98], [100,98]), giving 5/10,000, and so on. Thus the probability that the minimum lies in the range [100-n, 100] should be (n+1)^2/10,000. To find the average, I set (n+1)^2/10,000 = 1/2, which gives n = sqrt(5,000) - 1 ≈ 69.71, so the expected value should be about 100 - 69.71 ≈ 30.

However, when I try this with a simple program I get something around 33.8.
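For reference, here is a minimal sketch of the kind of simulation I mean (the variable names and trial count are just for illustration):

```python
import random

# Estimate E[min(a, b)] for two independent uniform draws from 1..100.
trials = 1_000_000
total = 0
for _ in range(trials):
    a = random.randint(1, 100)  # randint includes both endpoints
    b = random.randint(1, 100)
    total += min(a, b)

print(total / trials)  # consistently prints values near 33.8
```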
What is wrong here?