
A colleague recently brought up this argument when we were talking about big-O runtime analysis and I've been unable to find why it is incorrect:

Informally, the argument goes like this:

"If the runtime of all logarithms of any base are supposed to be equivalent, there is nothing that keeps us from saying $O(\log x) \equiv O(\log_x x) \equiv O(1)$, so either the upper bound of logarithmic growth is constant (which it evidently is not) or all logarithms are, after all, not equal in runtime."

We recalled that the accepted equivalence $O(\log_a x) \equiv O(\log_b x)$ follows from the change-of-base formula,

$$ \log_a x = \frac{\log_b x}{\log_b a}, $$

where, since $a$ and $b$ are constants, $\frac{1}{\log_b a}$ is also a constant, and hence $O(\log_a x) \equiv O(\log_b x)$ holds.
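
For a concrete sanity check (my numbers, approximate), converting between bases $2$ and $10$ gives

$$ \log_2 x = \frac{\log_{10} x}{\log_{10} 2} \approx 3.32\,\log_{10} x, $$

so the two differ only by the constant factor $\frac{1}{\log_{10} 2} \approx 3.32$, which the big-O absorbs.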

I argued that since $x$ is decidedly not a constant, his conjecture would also be false as no constant factor could be extracted from the lemma.

The counterargument quickly followed: since $O(\log_a x) \equiv O(\log_b x)$ holds $\forall a,b \in \mathbb{R^+}$, it must hold not only for $a = x$ or $b = x$, but also for $a = 1$ and $b = 1$, and consequently $O(\log x) \equiv O(\log_1 x) = O(0) \equiv O(1)$, breaking the original lemma not only for base $x$ but also for base $1$, where $1$ is about as constant as it gets. <- beer-induced math fail

At that point I conceded that I could not find a flaw in the argument, and so I am turning to you: what is going on here, and where are we wrong?

nitowa
  • Was your friend invented for the sake of a homework exercise? – Yuval Filmus Oct 08 '21 at 16:48
  • $\log_1 x$ is not defined: if you think it is, then $\log_1 x = \frac{\log_b x}{\log_b 1} = \frac{\log_b x}{0}$. – Nathaniel Oct 08 '21 at 16:49
  • No, we're both already past the stage of our lives where we do homework. Also, thanks Nathaniel, you are indeed correct there; I guess we both confused $\log_1 x$ with $\log 1$. For reference, this argument took place after some drinks! – nitowa Oct 08 '21 at 16:55

2 Answers


Let me point out some flaws in the argumentation:

1. $O(\log x) \equiv O(\log_x x)$ is false, because on the left-hand side the base of the logarithm is a constant, while on the right-hand side it is not.

2. That $O(\log_a x) \equiv O(\log_b x)$ holds $\forall a,b \in \mathbb{R^+}$ is false, because in addition $a, b > 1$ is required (spelled out below).
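
To spell both points out with the change-of-base formula: for $a = x$ the conversion factor

$$ \frac{1}{\log_b a} = \frac{1}{\log_b x} $$

is not a constant (it depends on $x$), and for base $1$ the formula would give

$$ \log_1 x = \frac{\log_b x}{\log_b 1} = \frac{\log_b x}{0}, $$

which is undefined, so $\log_1 x$ does not exist at all. (Bases in $(0,1)$ are excluded as well, since there $\log_b a$ is negative.)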

zkutch

Let's generalize the argument. It is well-known that $O(f(n)/C) = O(f(n))$. Now let $f(n)$ and $g(n)$ be arbitrary functions. Then $$ O(f(n)) = O(f(n)/f(n)) = O(1) = O(g(n)/g(n)) = O(g(n)). $$ What goes wrong here? We only have $O(f(n)/C) = O(f(n))$ for constant $C > 0$.

Similarly, we only have $O(\log_a n) = O(\log_b n)$ for constant $a,b>1$.
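
Concretely, applying this with $f(n) = \log_b n$ for a fixed base $b > 1$:

$$ \log_n n = \frac{\log_b n}{\log_b n} = 1, $$

so the step $O(\log_n n) = O(\log_b n)$ is exactly the invalid step $O(f(n)/f(n)) = O(f(n))$: the quantity being divided out, $\log_b n$, is not a constant.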

Yuval Filmus