I set up a basic scene with a fisheye camera aimed at a sphere, with a second sphere directly above the camera and a third to its right (all three at the same distance):

(Note: the render-engine pulldown at the top needs to be set to "Cycles Render".)
The result is:

Now, I'm curious why the fisheye doesn't draw all three spheres the same size. I was hoping it would cast rays with an equal angle gap between neighbouring pixels.
Imagine we have a 1000x1000 image, and we are just looking at the 1000 pixels along the centre line, from (0, 500) to (1000, 500).
So I was hoping:
AngleBetween(ray through pixel[1], ray through pixel[2]) == AngleBetween(ray through pixel[501], ray through pixel[502])
However, clearly this is not the case.
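For reference, here is a minimal sketch of the behaviour I was hoping for: an "equidistant" mapping, where a pixel's distance from the image centre is proportional to its ray's angle from the optical axis. The image width, field of view, and the ray_angle helper are my own assumptions for illustration, not anything taken from Blender:

    import math

    WIDTH = 1000   # my hypothetical 1000x1000 render
    FOV = math.pi  # assume a 180-degree field of view across the frame

    def ray_angle(px):
        """Angle from the optical axis for a pixel on the centre row."""
        r = (px - WIDTH / 2) / (WIDTH / 2)  # normalised offset in [-1, 1]
        return r * FOV / 2

    # Under this mapping the angle gap between neighbouring pixels is
    # constant across the whole row:
    print(ray_angle(2) - ray_angle(1))      # near the edge of the frame
    print(ray_angle(502) - ray_angle(501))  # near the centre

Both prints give the same value, i.e. an equal angle gap everywhere — which is clearly not what the render shows.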
Why are objects at the edge of the frame getting magnified and warped?
Is it possible to find out what Blender is doing mathematically without digging through the source code? (As far as I know, "fisheye" is a generic term for this type of warping rather than a single exactly-defined projection...)
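Cycles' panoramic camera appears to offer both "Fisheye Equisolid" and "Fisheye Equidistant" lens types, so my (unverified) guess is that I'm on an equisolid mapping, r = 2f·sin(θ/2), rather than the equidistant one, r = f·θ, that I was expecting. A quick sketch of how the two standard mappings differ (the focal length f = 1.0 and the test angles are arbitrary):

    import math

    def r_equidistant(theta, f=1.0):
        """Distance from image centre under the mapping I expected: r = f * theta."""
        return f * theta

    def r_equisolid(theta, f=1.0):
        """Distance from image centre under an equisolid mapping: r = 2 * f * sin(theta / 2)."""
        return 2 * f * math.sin(theta / 2)

    # Compare where rays at various off-axis angles land on the sensor:
    for deg in (10, 45, 80):
        theta = math.radians(deg)
        print(deg, r_equidistant(theta), r_equisolid(theta))

The two mappings diverge towards the edge of the frame, which would change the apparent size of off-axis objects. If my guess is right, switching the lens type to "Fisheye Equidistant" might give the equal-angle behaviour I want, but I'd like to confirm what is actually going on.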
Also, is there any way to prevent this?