Mysterious behavior of Precision:
{{1.0+I*0.0},{0.0+I*0.0}} // SetPrecision[#,30]& // Precision // Print;
0.
{{1.0},{0.0}} // SetPrecision[#,30]& // Precision // Print;
30.
Why is the precision zero in the first instance, but not the second?
This led to some tough-to-diagnose program behaviors!
0.0+I*0.0 is the culprit. Try

I // SetPrecision[#, 30] & // Precision
1 // SetPrecision[#, 30] & // Precision
0 // SetPrecision[#, 30] & // Precision
0.0 // SetPrecision[#, 30] & // Precision
0.0 I // SetPrecision[#, 30] & // Precision

and pay especially close attention to the last two. I assume that Precision, when applied to an array, takes the minimum of the precisions of the elements of the array; the first element of {{1.0+I*0.0},{0.0+I*0.0}} has precision 30, whereas the second has precision 0, so the result is 0, but I'm not sure why 0.0 I has precision 0. – DumpsterDoofus Feb 11 '14 at 00:35
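A quick way to see the comment's point is to ask for the precision of each element separately. The following is a minimal sketch; the values noted in its comments are the ones reported in the question and the comment above, not re-verified on any particular Mathematica version.

(* Inspect the elements of the problematic array one by one. *)
elems = SetPrecision[{{1.0 + I*0.0}, {0.0 + I*0.0}}, 30];
Precision /@ Flatten[elems]  (* per the comment: {30., 0.}, the complex zero is the odd one out *)
Precision[elems]             (* 0., consistent with Precision of a list being the minimum over its elements *)

(* The two cases the comment singles out: a real zero versus a complex zero. *)
Precision[SetPrecision[0.0, 30]]    (* not 0 per the comment, hence the second example gives 30. *)
Precision[SetPrecision[0.0 I, 30]]  (* reported as 0, which drags the whole array's precision down to 0 *)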