I don't fully understand how data types are converted when their sockets are connected. Can someone help me fill in the blanks, or better yet, provide a proper diagram of how this happens?
Data Type Definitions
Boolean. True or false value.
Boolean In-depth: A value of either 1 or 0 with 1 = True and 0 = False.
Integer. 32-bit integer.
Integer In-depth: -
Float. Floating-point value.
Float In-depth: -
Vector. 3D vector with floating-point values.
Vector In-depth: -
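To make the definitions above concrete, here is a minimal Python sketch of how the four socket types could be modelled. The signed 32-bit range and the three-component tuple are assumptions based on the definitions above, not taken from the actual node implementation.

```python
# Minimal sketch of the four socket data types (assumptions, not the real implementation):
#  - Boolean stored as 0 or 1
#  - Integer assumed to be a signed 32-bit value
#  - Float is a single scalar
#  - Vector assumed to be three floats (x, y, z)
from typing import Tuple

Boolean = int                        # 0 = False, 1 = True
Integer = int                        # assumed range: -2_147_483_648 .. 2_147_483_647
Float = float
Vector = Tuple[float, float, float]  # (x, y, z)
```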
Data Type Conversions
Integer $\rightarrow$ Boolean. Given integer n, return 0 = False if n $\leq$ 0; otherwise (n $\geq$ 1), return 1 = True.
Float $\rightarrow$ Boolean. Given float n, return 0 = False if n $\leq$ 0; otherwise (n $>$ 0), return 1 = True.
Vector $\rightarrow$ Boolean. If the vector is (0,0,0), return 0 = False. For every other vector, return 1 = True. (Why can't I get a 0 = False value from a vector other than (0,0,0)?)
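The three to-Boolean rules above translate into a short Python sketch like the one below. It only mirrors the rules as I have written them, so treat the exact thresholds as assumptions rather than confirmed behaviour; under these rules, only the exact zero vector can ever produce 0 = False, which is why every other vector gives 1 = True.

```python
def int_to_bool(n: int) -> int:
    """Integer -> Boolean: 0 = False for n <= 0, 1 = True for n >= 1."""
    return 1 if n >= 1 else 0

def float_to_bool(n: float) -> int:
    """Float -> Boolean: 0 = False for n <= 0, 1 = True for n > 0."""
    return 1 if n > 0 else 0

def vector_to_bool(v: tuple) -> int:
    """Vector -> Boolean: only the exact zero vector maps to 0 = False."""
    return 0 if v == (0.0, 0.0, 0.0) else 1

print(int_to_bool(-3), int_to_bool(2))         # 0 1
print(float_to_bool(0.0), float_to_bool(0.1))  # 0 1
print(vector_to_bool((0.0, 0.0, 0.0)), vector_to_bool((0.0, -0.5, 0.0)))  # 0 1
```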
Float $\rightarrow$ Integer. Truncate (drop the fractional part, rounding toward zero).
Vector $\rightarrow$ Float. Mean. The average of the vector's components, returned as a float.
Vector $\rightarrow$ Integer. Mean, then truncate. The average of the vector's components is taken as a float, then truncated to an integer. (I'm not sure whether the components are truncated before the average is calculated or after; the sketch below shows why the order matters.)
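Here is the same kind of Python sketch for the numeric conversions, including both possible orderings for Vector $\rightarrow$ Integer. It only demonstrates that the two orderings can give different results for the same vector; it does not settle which order the node system actually uses.

```python
import math

def float_to_int(n: float) -> int:
    """Float -> Integer: truncate (round toward zero)."""
    return math.trunc(n)

def vector_to_float(v) -> float:
    """Vector -> Float: mean of the three components."""
    return sum(v) / 3.0

def vector_to_int_mean_first(v) -> int:
    """Average the components first, then truncate the result."""
    return math.trunc(sum(v) / 3.0)

def vector_to_int_trunc_first(v) -> int:
    """Truncate each component first, then average, then truncate the result."""
    return math.trunc(sum(math.trunc(c) for c in v) / 3.0)

# The order matters: for (0.5, 0.5, 2.5) the mean is about 1.17,
# but truncating the components first gives (0, 0, 2) with a mean of about 0.67.
v = (0.5, 0.5, 2.5)
print(vector_to_int_mean_first(v))   # 1
print(vector_to_int_trunc_first(v))  # 0
```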
Boolean $\rightarrow$ Vector. If the boolean is 0 = False, return vector (0,0,0). If it is 1 = True, return vector (1,1,1).
Integer $\rightarrow$ Vector. Given integer n, return vector (n,n,n).
Float $\rightarrow$ Vector. Given float n, return vector (n,n,n).
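Finally, the to-Vector rules all broadcast a single value to three identical components; the sketch below again just restates the rules listed above rather than the software's actual source.

```python
def bool_to_vector(b: int) -> tuple:
    """Boolean -> Vector: 0 = False -> (0,0,0), 1 = True -> (1,1,1)."""
    return (1.0, 1.0, 1.0) if b else (0.0, 0.0, 0.0)

def int_to_vector(n: int) -> tuple:
    """Integer -> Vector: broadcast n to all three components."""
    return (float(n), float(n), float(n))

def float_to_vector(n: float) -> tuple:
    """Float -> Vector: broadcast n to all three components."""
    return (n, n, n)

print(bool_to_vector(1))     # (1.0, 1.0, 1.0)
print(int_to_vector(-2))     # (-2.0, -2.0, -2.0)
print(float_to_vector(0.5))  # (0.5, 0.5, 0.5)
```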
