
I set up a basic scene with a fisheye camera aimed at a sphere, with another sphere vertically above the camera, and another to the right (all three at the same distance):

[Screenshot: scene layout with the fisheye camera aimed at one sphere, another sphere above the camera, and another to the right]

(Note: the render-engine pulldown at the top needs to be set to "Cycles Render".)

Result is:

[Render result: the spheres at the edges of the frame appear enlarged and warped compared to the central one]

Now, I'm curious why the fisheye doesn't draw all three spheres the same size. I was hoping it would fire rays out with an equal angular gap between them.

Imagine we have a 1000x1000 image, and we are just looking at the 1000 pixels along the centre line, from (0, 500) to (1000, 500).

So I was hoping:

AngleBetween(pixel[1]-ray, pixel[2]-ray) == AngleBetween(pixel[501]-ray, pixel[502]-ray)

However, clearly this is not the case.
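
To make that expectation concrete, here is a small Python sketch (not Blender code, just the geometry I had in mind): under an ideal equal-angle mapping, where the ray angle grows linearly with distance from the image centre, the angular gap between any two adjacent pixels on the centre line is constant.

```python
import math

WIDTH = 1000        # pixels across the centre line
FOV = math.pi       # assume a 180-degree fisheye

def ray_angle(x):
    """Angle from the optical axis for pixel column x, assuming an ideal
    equal-angle (equidistant) mapping: angle is linear in the distance
    from the image centre."""
    return (x - WIDTH / 2) / (WIDTH / 2) * (FOV / 2)

def angle_between(x0, x1):
    return abs(ray_angle(x1) - ray_angle(x0))

# Under this mapping the angular gap per pixel is the same everywhere:
print(angle_between(1, 2))      # near the left edge
print(angle_between(501, 502))  # near the centre
```

Both prints give the same value, which is the behaviour I was expecting from the fisheye camera.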

Why are the objects at the edge of the screen getting magnified/warped?

Is it possible to know what Blender is doing mathematically without digging through the source code? (As far as I know, "fisheye" is a generic term for this type of warping, not an exactly defined projection.)

Also, is there any way to prevent this?

LINK: Spherical / Custom projection at code level

– P i

1 Answer


The problem is that you're trying to map a hemisphere (the view) onto a plane (the rendered image). There is always going to be distortion.

But you're right, you might be able to choose which distortion you get. Have a look at the various projections used in geospatial software. If you want equal angles, perhaps one of the equal-area projections would suit you, although I suspect Blender's fisheye lens is already some kind of equal-area projection.
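
To illustrate what "choosing the distortion" means, here is a rough Python sketch comparing two common fisheye mappings; the exact formulas Cycles uses are an assumption on my part (its Fisheye Equidistant option is usually described as r proportional to the ray angle, and its Fisheye Equisolid option as r = 2f·sin(θ/2)). The numbers only show how differently the two mappings spread angle across pixels.

```python
import math

FOV = math.pi  # 180-degree field of view

def theta_equidistant(r):
    """Ray angle for a normalised image radius r in [0, 1]:
    angle grows linearly with distance from the centre."""
    return r * FOV / 2

def theta_equisolid(r, f=1.0):
    """Ray angle for an equisolid (equal-area) mapping r = 2f*sin(theta/2),
    inverted to give theta(r); r is scaled so that r = 1 at theta = FOV/2."""
    r_max = 2 * f * math.sin(FOV / 4)
    return 2 * math.asin(r * r_max / (2 * f))

# Angular gap covered by a one-pixel step near the centre vs near the edge
# of a 1000-pixel-wide image (one pixel = 1/500 in normalised radius):
step = 1 / 500
for name, theta in (("equidistant", theta_equidistant),
                    ("equisolid", theta_equisolid)):
    centre_gap = theta(step) - theta(0.0)
    edge_gap = theta(1.0) - theta(1.0 - step)
    print(f"{name}: centre {centre_gap:.5f} rad/px, edge {edge_gap:.5f} rad/px")
```

The equidistant mapping gives the same angular gap everywhere; the equisolid one does not, so equally sized objects at different angles from the optical axis end up covering different numbers of pixels.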

As for implementing a different projection, you might be able to do it with an OSL shader, by placing a lens object in front of the camera.
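
A minimal sketch of what such a lens object's material could look like, assuming the standard OSL refraction closure is available in Cycles' OSL backend; the parameters and the normal-warping idea are purely illustrative, not a calibrated projection:

```osl
// custom_lens.osl -- illustrative sketch only, not a tested projection.
shader custom_lens(
    float ior = 1.1,        // index of refraction; 1.0 passes rays straight through
    float strength = 0.5,   // how strongly to tilt the normal towards the rim
    output closure color BSDF = 0)
{
    // Tilt the shading normal progressively more towards the rim of the
    // lens object, so camera rays passing near the edge are bent more
    // than rays passing through the middle.
    point centre = point("object", 0, 0, 0);   // the lens object's origin
    vector towards_rim = normalize(P - centre);
    normal Nwarp = normalize(N + strength * towards_rim);
    BSDF = refraction(Nwarp, ior);
}
```

Whichever way the bending is done (a warped normal on a flat plane, or a genuinely curved lens mesh), the projection you end up with is determined by how much each ray is deviated as a function of its distance from the optical axis.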

– z0r
  • +1 for a solution path that doesn't involve recompiling the entire Blender source tree :) – P i Apr 08 '14 at 23:15