I have a rather silly question, so please excuse me =)
Let's say I have some general Mathematica code in which I numerically evaluate and combine several quantities. I would like to know how to determine the number of significant digits of my final result. Does it depend on the significant digits of my input numbers, or is it limited by the code itself?
Cheers, ikarus
Precision. I believe the number of significant digits is controlled by the environment variable $WorkingPrecision. Maybe some expert will confirm/infirm. But certainly not by the digits of the input numbers. – anderstood Sep 20 '20 at 12:59

Block[{f1, f2, x2, y}, {f1, f2, x2, y} = SetPrecision[{1/2, 1/3, 1, Pi - 1*^-8}, 100]; 100 - Precision[int]] and the following code block: the expression int has a Precision of Infinity, and the precision needs to be at least 100; the code then returns the number of digits of precision lost. You can set the precision of your input numbers (including constant parameters in your code) to the number of sig. figs. and do the calculation. Precision[result] will tell you the number of sig. figs. – Michael E2 Sep 20 '20 at 13:47
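For completeness, here is a minimal sketch of the workflow Michael E2 describes: give the inputs a definite precision with SetPrecision, do the calculation, and ask for the Precision of the result. The particular inputs and the formula for result below are illustrative placeholders, not taken from the question.

    (* Assign the inputs (and any constant parameters) a precision equal to
       the number of significant figures you trust, here 50 digits. *)
    {a, b, c} = SetPrecision[{1/2, 1/3, Pi - 10^-8}, 50];

    (* Any combination of the inputs; arbitrary-precision arithmetic
       tracks how much precision is lost along the way. *)
    result = (a + b) Sin[c] - Exp[-a b];

    Precision[result]        (* significant digits remaining in the result *)
    50 - Precision[result]   (* digits of precision lost in the calculation *)

Note that this precision tracking only happens for arbitrary-precision numbers; if the inputs were left as machine reals (e.g. 0.5), Precision[result] would simply return MachinePrecision.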