
$$(A \pm a) \times (B \pm b)=AB\left(1 \pm \frac aA \pm \frac bB\right)$$

Here is the formula I'm struggling with. I'm trying to apply it to division (i.e. for speed, $\frac dt$, instead of momentum $mv$). How would I apply this formula when $B$ is replaced by $\frac 1B$? I have discovered that if you change $AB$ to $\frac AB$, and keep $a$ and $b$ as if you were using the formula to multiply $AB$, you arrive at the correct solution. But wouldn't you have to use the uncertainty of $B$ in the form of $\frac 1B$ – which I don't know how to find?

Klg
  • 155

4 Answers

6

In both multiplication and division, it is the fractional uncertainties which add in quadrature. That is,

\begin{align} (A\pm a)\times(B\pm b) &= AB\left( 1\pm\frac aA \right) \left( 1\pm\frac bB \right) \end{align}

From the binomial theorem you can say

$$ \left( B+b \right)^{-n} ≈ B^{-n}\left( 1-n\frac bB \right) $$

So if for instance you have fractional uncertainty $b/B≈1\%$, the inverse $C=B^{-1}$ will also have fractional uncertainty $1\%$. This is also the origin of the rule for propagating uncertainties through other exponents.
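A quick numerical check of this claim (a Monte Carlo sketch, not part of the answer; the mean and spread chosen below are assumptions picked to give a 1% fractional uncertainty):

```python
import random
import statistics

# Draw B-values with a 1% fractional uncertainty and compare the
# fractional spread of B with that of its inverse C = 1/B.
random.seed(0)
B_mean, B_sigma = 50.0, 0.5          # b/B = 1%
samples = [random.gauss(B_mean, B_sigma) for _ in range(100_000)]

frac_B = statistics.stdev(samples) / statistics.mean(samples)
inverses = [1.0 / x for x in samples]
frac_C = statistics.stdev(inverses) / statistics.mean(inverses)

print(f"fractional uncertainty of B:   {frac_B:.4f}")
print(f"fractional uncertainty of 1/B: {frac_C:.4f}")
```

Both fractional uncertainties come out near 0.01, as the binomial argument predicts.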

rob
  • 89,569
  • 4
    +1. This is so important that it bears repeating: addition and subtraction propagate absolute errors, while multiplication and division propagate relative errors. – Greg Martin Sep 06 '22 at 03:02
  • Where did you get the binomial expression from? – Klg Sep 07 '22 at 05:46
  • @Yao From a professor in my sophomore year, who said, “this trick isn’t on our syllabus, but it’s impossibly useful.” See e.g. https://en.wikipedia.org/wiki/Binomial_approximation – rob Sep 07 '22 at 11:50
6

The binomial theorem, as rob explains, gives a powerful method for propagating uncertainties through any power of a quantity. However, if the power is just $-1$, here's a simple trick... $$(B+b)(B-b) = B^2-b^2$$ But if $b\ll B$ then $b^2 \lll B^2$, so we neglect $b^2$ and divide through by $B^2$ giving $$\left(1+\tfrac bB\right)\left(1-\tfrac bB\right)=1$$ That is $$\left(1+\tfrac bB\right)^{-1}=\left(1-\tfrac bB\right)$$ So the fractional uncertainty in $B^{-1}$ is the same as that in $B$.
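The size of the neglected $b^2$ term is easy to see numerically (a minimal sketch; the values of $B$ and $b$ are assumptions chosen to give a 2% fractional uncertainty):

```python
B, b = 100.0, 2.0                  # b/B = 2% (assumed example values)
exact = (1 + b / B) ** -1          # exact inverse factor
approx = 1 - b / B                 # the approximation from the trick above

# The two agree to order (b/B)^2, i.e. to about 0.04% here.
print(f"exact:  {exact:.5f}")      # 0.98039
print(f"approx: {approx:.5f}")     # 0.98000
```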

Philip Wood
  • 35,641
1

Just use the product rule for differentiation and assume that the errors are small to find the fractional change.

$\Delta (A\cdot B) \approx \Delta A \cdot B + A\cdot \Delta B \,\Rightarrow \,\dfrac{\Delta (A\cdot B)}{A\cdot B} \approx \dfrac{\Delta A}{A} + \dfrac{\Delta B}{B}$

You can apply the same method to the quotient $A/B$ to get the result $\dfrac{\Delta (A/B)}{A/B} \approx \dfrac{\Delta A}{A} + \dfrac{\Delta B}{B}$, and then extend it to power relationships, e.g. $\dfrac {\Delta A^n}{A^n}\approx n \dfrac {\Delta A}{A}$

Note that these are overestimates of the error and a better estimator of the error in both cases is $\sqrt{\left(\dfrac{\Delta A}{A}\right)^2 + \left(\dfrac{\Delta B}{B}\right )^2}$
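A Monte Carlo comparison of the two estimators for a quotient (a sketch with assumed example values, assuming independent Gaussian errors):

```python
import random
import statistics

# Fractional error of A/B: linear sum vs. sum in quadrature,
# checked against the spread of simulated ratios. Values assumed.
random.seed(1)
A, dA = 10.0, 0.3
B, dB = 4.0, 0.2

linear = dA / A + dB / B                          # worst-case overestimate
quadrature = ((dA / A) ** 2 + (dB / B) ** 2) ** 0.5  # better estimator

ratios = [random.gauss(A, dA) / random.gauss(B, dB) for _ in range(100_000)]
mc = statistics.stdev(ratios) / statistics.mean(ratios)

print(f"linear sum: {linear:.4f}")
print(f"quadrature: {quadrature:.4f}")
print(f"Monte Carlo: {mc:.4f}")
```

The simulated spread lands close to the quadrature value, while the linear sum overshoots it.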

Farcher
  • 95,680
  • A nice variation ... If $Q=\frac AB$ then $\ln Q =\ln A-\ln B$, so for small changes $$\frac{\Delta Q}Q=\frac{\Delta A}A-\frac{\Delta B}B$$ So, assuming the worst case of uncs in $A$ and $B$ being in opposite directions, we add the moduli of the fractional uncs. – Philip Wood Sep 07 '22 at 16:33
  • @PhilipWood Nice, but what about the minus sign? – Farcher Sep 07 '22 at 16:37
  • I posted prematurely (pressed 'return' to start a new line). 'Durrr', you might say. – Philip Wood Sep 07 '22 at 16:40
0

Let $f(X,Y)$ be a function of two random variables $X$ and $Y$. Let $X_m$ be the mean of $X$ and $S_X$ the standard deviation of $X$. Let $Y_m$ be the mean of $Y$ and $S_Y$ the standard deviation of $Y$. The mean of $f(X, Y)$ is $ f_m = f(X_m, Y_m)$. Assuming $X$ and $Y$ are independent, retaining the lower order terms in a series expansion of $f(X,Y)$ about the point $P = (X_m, Y_m)$, the variance of $f(X, Y)$, denoted as $S_f^2$ where $S_f$ is the standard deviation, is

$(1) \enspace S_f^2 = \left(\left.{\partial f \over \partial X}\right|_P\right)^2 S_X^2 + \left(\left.{\partial f \over \partial Y}\right|_P\right)^2 S_Y^2 + \left(\left.{\partial^2 f \over \partial X\,\partial Y}\right|_P\right)^2 S_X^2\, S_Y^2$

If both $X_m$ and $Y_m$ are nonzero, the variance can be approximated as

$(2) \enspace S_f^2 = \left(\left.{\partial f \over \partial X}\right|_P\right)^2 S_X^2 + \left(\left.{\partial f \over \partial Y}\right|_P\right)^2 S_Y^2$

It appears you want to estimate the uncertainty in $A/B$ for random variables $A$ and $B$ given uncertainties in $A$ and $B$. You have $A_m \pm a$, where $A_m$ is the mean of $A$ and $a$ is the standard deviation of $A$, and $B_m \pm b$, where $B_m$ is the mean of $B$ and $b$ is the standard deviation of $B$. For this case $B_m$ cannot be zero.

The mean for $A/B$ is $A_m/ B_m$. Assuming $A$ and $B$ are independent, you can use the above relationships to estimate the standard deviation for $f(A, B) = A/B$.

For example, suppose $A_m \pm a = 5.2 \pm 0.9$ and $B_m \pm b = 3.4 \pm 0.2$. The mean of $A/B$ is $5.2/3.4 = 1.53$. Using relationship (2), the standard deviation $S_{A/B}$ of $A/B$ is $$(3) \enspace S_{A/B} =\sqrt{{1 \over B_m^{\,2}}\, a^2 + {A_m^{\,2} \over B_m^{\,4}}\, b^2} = 0.28$$ Relationship (3) can also be expressed as $$(4) \enspace{S_{A/B}^2 \over (A/B)_m^2} = {1 \over A_m^{\,2}}\, a^2 + {1 \over B_m^{\,2}}\, b^2$$ where $(A/B)_m$ is the mean of $A/B$. The result for $A/B$ is $1.53 \pm 0.28$.
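The worked numbers above can be reproduced in a few lines (a sketch applying relationship (2) to $f(A,B) = A/B$):

```python
import math

# Worked example from the answer: A = 5.2 +/- 0.9, B = 3.4 +/- 0.2.
Am, a = 5.2, 0.9
Bm, b = 3.4, 0.2

mean = Am / Bm
# Relationship (2) with f(A, B) = A/B:
#   S^2 = (1/B_m)^2 a^2 + (A_m / B_m^2)^2 b^2
S = math.sqrt((a / Bm) ** 2 + (Am * b / Bm**2) ** 2)

print(f"{mean:.2f} +/- {S:.2f}")   # 1.53 +/- 0.28
```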

If $A$ and $B$ are not independent, you must retain covariant terms in the series expansion. For details see Meyer, Data Analysis for Scientists and Engineers, or a similar textbook dealing with the propagation of uncertainty.

You need to determine whether the means and standard deviations for $A$ and $B$ that you are using in $A_m \pm a$ and $B_m \pm b$ are estimates for the population, or estimates of the means of the population generated from a series of random samples. In either case relationships (1) and (2) are correct, but the data can represent different concepts. For details see my response to Uncertainty in repetitive measurements on this exchange.

John Darby
  • 9,351