I'm trying to understand a piece of code that uses a camera projection matrix. The camera projection matrix $P = \left[\begin{array}{c}p_1^T \\ p_2^T \\ p_3^T\end{array}\right]$ is a $3 \times 4$ matrix that projects a point $X$ in homogeneous 3D world coordinates onto pixel coordinates $x$ in the image. As I understand it, the first three elements of the 3rd row $p_3^T$ give the direction of the camera's optical (principal) axis, and the 1st and 2nd rows similarly correspond to the camera's x and y axes once normalized.
In the piece of code I am working with, the quantity $\|p_1\| + \|p_2\|$ (the sum of the norms of the first two rows) is computed and referred to as a "scale" factor. Any idea what this quantity could represent?
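To make this concrete, here is a small numpy sketch I put together. All the numbers and the $P = K[R\,|\,t]$ decomposition are my own assumptions, not taken from the code in question; it just evaluates the row norms for a finite projective camera with zero skew. Since the rows of $R$ are orthonormal, one gets $\|p_1\| = \sqrt{f_x^2 + c_x^2}$, $\|p_2\| = \sqrt{f_y^2 + c_y^2}$, and $\|p_3\| = 1$ (here the norms are taken over the first three elements of each row):

```python
import numpy as np

# Hypothetical intrinsics: fx, fy = focal lengths in pixels, (cx, cy) = principal point.
fx, fy, cx, cy = 800.0, 820.0, 320.0, 240.0
K = np.array([[fx, 0.0, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])

# An arbitrary pose: rotation about z then about x, plus a translation.
cz, sz = np.cos(0.5), np.sin(0.5)
ca, sa = np.cos(0.3), np.sin(0.3)
Rz = np.array([[cz, -sz, 0.0], [sz, cz, 0.0], [0.0, 0.0, 1.0]])
Rx = np.array([[1.0, 0.0, 0.0], [0.0, ca, -sa], [0.0, sa, ca]])
R = Rx @ Rz
t = np.array([[0.1], [-0.2], [2.0]])

P = K @ np.hstack([R, t])  # the 3x4 projection matrix

# Norms of the first three elements of each row.
p1, p2, p3 = P[0, :3], P[1, :3], P[2, :3]
print(np.linalg.norm(p1), np.hypot(fx, cx))  # these two agree
print(np.linalg.norm(p2), np.hypot(fy, cy))  # these two agree
print(np.linalg.norm(p3))                    # 1.0

scale = np.linalg.norm(p1) + np.linalg.norm(p2)
```

So under these assumptions the quantity is $\sqrt{f_x^2 + c_x^2} + \sqrt{f_y^2 + c_y^2}$, which grows with the focal lengths (i.e. with pixels per world unit). My guess is that this is why the code treats it as a "scale", but note it is only meaningful if $P$ has been normalized so that $\|p_3\| = 1$; otherwise the overall projective scale of $P$ leaks into it.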