
This is surely a simple question for many. Suppose a sum S of linked variables must be simulated:

  • x: an independent variable sampled from an empirical pdf.
  • a = f1(x); f1 is a function with uncertainty bounds, so a is also sampled.
  • b = f2(a); f2 is a function with uncertainty bounds, so b is also sampled.

S = a + b + c + ...

Suppose that a, b, c are each sampled with 1000 realizations (a 1 x 1000 list).

    for i <- 1 to 1000
        sample a
        for j <- 1 to 1000
            sample b
            for k <- 1 to 1000
                sample c
                S(i, j, k) <- a + b + c

If the sum is organized this way, as a "nested" sampling scheme, the number of realizations of S grows quickly, as 1000^n for n linked variables. This is the least efficient way to do it. What better sampling workflow could be implemented?
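For concreteness, here is a minimal sketch of the nested scheme in Python (the question itself is language-agnostic). The samplers sample_x, sample_f1, and sample_f2 are hypothetical stand-ins for the empirical pdf and the two uncertain functions, and N is kept small because the output already grows as N^3:

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical stand-ins: real versions would resample from the
    # empirical pdf of x and from the uncertainty clouds of f1 and f2.
    def sample_x():
        return rng.normal()

    def sample_f1(x):
        return 2.0 * x + rng.normal(0.0, 0.1)

    def sample_f2(a):
        return a**2 + rng.normal(0.0, 0.1)

    N = 10  # kept tiny on purpose: the nested scheme yields N**3 values
    S = []
    for i in range(N):
        a = sample_f1(sample_x())
        for j in range(N):
            b = sample_f2(a)
            for k in range(N):
                c = sample_f2(b)  # stand-in for a third linked variable
                S.append(a + b + c)

    print(len(S))  # N**3 = 1000 here; already 10**9 for N = 1000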

This is a general question.

  • What is a "function with uncertainty bounds"? – user619894 Dec 05 '23 at 14:59
  • It's an empirical function, a cloud of points relating two variables. – Oliver Amundsen Dec 05 '23 at 15:07
  • Why not do a single loop? for i <- 1 to 1000; a, b, c = sampler(); S(i) = a + b + c. I don't see a reason why one must compute the value of $S$ for every possible combination of samples of $a$, $b$, and $c$ instead of just computing $N$ samples of $S$ directly. – whpowell96 Dec 05 '23 at 15:37
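A minimal sketch of the single-loop scheme from the last comment, with the same hypothetical stand-in samplers as above: each pass through the chain x -> a -> b -> c yields one realization of S, so N samples of S cost N evaluations instead of N^3, and the dependence between the variables is preserved within each draw.

    import numpy as np

    rng = np.random.default_rng(1)

    # Same hypothetical stand-ins as in the nested sketch above.
    def sample_x():
        return rng.normal()

    def sample_f1(x):
        return 2.0 * x + rng.normal(0.0, 0.1)

    def sample_f2(a):
        return a**2 + rng.normal(0.0, 0.1)

    # Single loop: one pass through the whole chain per realization of S.
    N = 1000
    S = np.empty(N)
    for i in range(N):
        x = sample_x()
        a = sample_f1(x)  # uncertainty in f1 sampled once per realization
        b = sample_f2(a)  # uncertainty in f2 sampled once per realization
        c = sample_f2(b)  # stand-in for the third linked variable
        S[i] = a + b + c

    print(S.mean(), S.std())  # N samples of S at cost N, not N**3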
