
Outer slows down considerably when the first argument is Abs[#1 - #2]/Max[#1, #2] &:

Outer[List, Range[5000], Range[5000]]; // AbsoluteTiming
(* {0.120091, Null} *)

Outer[Abs[#1 - #2]/Max[#1, #2]&, Range[5000], Range[5000]]; // AbsoluteTiming
(* ~ 59 seconds *)

Is there a faster way to perform the above computation? In the real problem I have two different lists, not Range[5000]; in particular, the lists can be of different lengths.

Example with two small lists:

list1 = {6576.13, 7504.5, 6964., 7645.63, 5297.5, 6897.75, 4944.13, 8184.13, 
3426.75, 8722.75, 3683.63, 15344.4, 7026.25, 5677.63, 6872.88,
6050.5, 7948.63, 5095.13, 6335.25, 6024.25, 6508.88, 6961.63, 
8262.13, 4560.38, 7113.75, 7011., 9070.13, 5625.88, 7801., 6855.38, 
5973.25, 6164.75, 6115.75, 3886.13, 11967.4, 6606.13, 6223.5,
5576.38, 7855.88, 5616.38, 5946.88, 4750.25, 6162.25, 6539.88, 
5563.75, 7723.63, 6241.5, 3794.13, 6854.88, 8154., 4241.};

(* length is 50 *)

list2 = {3762.13, 7272.75, 7923.25, 7882.38, 4407., 7110., 6468.5, 7565.88, 
3117.25, 15918.8, 3753.5, 8801.25, 7120.13, 6643.63, 6565.5, 7537., 
6081.5, 6948.88, 7468.63, 6736., 7091.75, 7980., 5143.38, 7540.88, 
6754., 5746.13, 9075.63, 4536.5, 7873.75, 7106.5, 5127., 3809.25, 
5274.5, 6760.25, 7031.25, 7158.38, 7484.25, 5753.25, 6105.38, 
7084.63, 4866.88, 5690., 7179., 5572.88, 6209.13, 7820.25, 5432.5, 
2281.63, 3917.13, 4050.75};
(* length is 51 *)

Outer[Abs[#1 - #2]/Max[#1, #2] &, list1, list2];

This question is not a duplicate since it involves lists of different lengths.

J. M.'s missing motivation
Ali Hashmi

5 Answers


This should be equivalent:

l = RandomReal[{-2, 2}, 5000];
r = RandomReal[{-2, 2}, 5000];

Function[left, Abs[left - r]/Clip[r, {left, ∞}]] /@ l; // AbsoluteTiming

0.32

(Note that this is much faster for packed arrays of machine-precision numbers than for integers and fractions.)
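
To see the packed-array point with the original Range[5000] input, the integer ranges can first be converted to packed machine reals with N (a quick sketch adapted from the code above; timings will vary by machine):

l = N @ Range[5000];
r = N @ Range[5000];
Function[left, Abs[left - r]/Clip[r, {left, ∞}]] /@ l; // AbsoluteTiming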

Niki Estner

You could try:

d[l1_, l2_] := With[{d = Outer[Plus, -l1, l2]}, Abs[d]/(Ramp[d] + l1)]

With your sample data:

r1 = d[list1, list2];
r2 = Outer[Abs[#1 - #2]/Max[#1, #2] &, list1, list2];

r1===r2

True
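
My reading of why this works (not spelled out in the answer): each entry of d is l2[[j]] - l1[[i]], and Ramp[x - y] + y gives Max[x, y], so Ramp[d] + l1 rebuilds the elementwise Max. A quick scalar check of that identity:

(Max[x, y] == Ramp[x - y] + y) /. {x -> 3., y -> 7.}
(* True *)

(Max[x, y] == Ramp[x - y] + y) /. {x -> 7., y -> 3.}
(* True *)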

For some large lists:

l1 = RandomReal[1, 5000];
l2 = RandomReal[1, 5001];

d[l1, l2]; // AbsoluteTiming
Outer[List, l1, l2]; // AbsoluteTiming

{0.483663, Null}

{0.141301, Null}

So, not too much slower than your reference Outer example.

Carl Woll

This seems fairly snappy:

fn = With[{tup = Tuples[{#1, #2}]}, 
          Partition[Abs[Subtract @@ Transpose@tup]/(Max /@ tup), Length@#2]] &;

Use:

result = fn[list1, list2]
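
As a sanity check against the direct Outer computation (using list1 and list2 from the question; for machine-precision input this should return True):

fn[list1, list2] === Outer[Abs[#1 - #2]/Max[#1, #2] &, list1, list2]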
ciao

So far I have found that Compile does a pretty neat job of speeding up the computation:

func = Compile[{{list1, _Integer, 1}, {list2, _Integer, 1}},
   Outer[Abs[#1 - #2]/Max[#1, #2] &, list1, list2],
   CompilationTarget -> "C"];

func[Range@5000, Range@5000]; // AbsoluteTiming
(* {0.61639, Null} *)
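
The signature above is for integer lists; for real-valued data such as list1 and list2 in the question, a _Real variant would be needed (a sketch, not from the original answer; the funcReal name is mine, and CompilationTarget -> "C" requires a C compiler to be installed):

funcReal = Compile[{{list1, _Real, 1}, {list2, _Real, 1}},
   Outer[Abs[#1 - #2]/Max[#1, #2] &, list1, list2],
   CompilationTarget -> "C"];

funcReal[list1, list2]; // AbsoluteTiming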
Ali Hashmi

On my computer this is 6 times faster:

L1 = RandomReal[{-2, 2}, 5000];
L2 = RandomReal[{-2, 2}, 5000];
Map[Abs[L2 - #] &, L1]/Outer[Max, L1, L2]

The following is 3.6 times faster than the code above. (For each element x of L1 it copies L2 into u, overwrites every entry of u smaller than x with x, so that u becomes the elementwise Max of L2 and x, and then computes Abs[L2 - x]/u.)

sz = Length[L2] + 1;
Map[(u[[Most[#[[;; First[FirstPosition[#, sz]]]]] &[
   Ordering[Append[u = L2, #]]]]] = #; Abs[L2 - #]/u) &, L1]

In both cases, the length of L1 should be less than that of L2.

Coolwater
  • Actually, during computation we do not know whether L1 is smaller than L2. – Ali Hashmi Jul 20 '17 at 14:06
  • On the other hand, you can generalize this using an If statement that assigns the bigger list to one variable and the smaller to another to maintain consistency (see the sketch after these comments). – Ali Hashmi Jul 20 '17 at 14:10
  • You might want to check the efficiency on, say, L1 = L2 = Range[5000]; it gives 30 seconds and 21 seconds respectively on my PC. – Ali Hashmi Jul 20 '17 at 15:06
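
Following the suggestion in the comments, a hypothetical wrapper (maxRatio is an illustrative name, not from the answer) could map over whichever list is shorter and transpose when the arguments are swapped; this gives the same matrix because the entrywise function is symmetric:

maxRatio[a_, b_] := If[Length[a] <= Length[b],
   Map[Abs[b - #] &, a]/Outer[Max, a, b],
   Transpose[Map[Abs[a - #] &, b]/Outer[Max, b, a]]]

maxRatio[list1, list2]; // AbsoluteTiming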