A colleague recently brought up this argument when we were talking about big-O runtime analysis and I've been unable to find why it is incorrect:
Informally, the argument goes like this:
"If the runtime of all logarithms of any base are supposed to be equivalent, there is nothing that keeps us from saying $O(\log x) \equiv O(\log_x x) \equiv O(1)$, so either the upper bound of logarithmic growth is constant (which it evidently is not) or all logarithms are, after all, not equal in runtime."
We recalled the reasoning behind the accepted $ O(\log_a x) \equiv O(\log_b x)$ from the lemma that allows us to calculate logarithms of arbitrary base, i.e.
$$ \log_a x = \frac{\log_b x}{\log_b a} $$ where, since $a,b$ are constant also $\frac{1}{\log_b a}$ is constant, and hence $ O(\log_a x) \equiv O(\log_b x)$ holds.
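As a quick numeric sanity check of the lemma (my own addition, not part of the original discussion), a few lines of Python show that for fixed bases $a, b$ the ratio $\log_a x / \log_b x$ is the same constant for every $x$, which is exactly why the bases can be swapped inside big-O:

```python
import math

# For fixed bases a and b, log_a(x) / log_b(x) should equal the
# constant log_a(b) = ln(b) / ln(a), independent of x.
a, b = 2, 10
expected = math.log(b, a)  # the constant factor, ~3.3219 here

for x in [10, 1_000, 10**6, 10**12]:
    ratio = math.log(x, a) / math.log(x, b)
    # The ratio never changes as x grows:
    assert abs(ratio - expected) < 1e-9
```

Note that the check silently relies on $a, b > 1$ being fixed; substituting $a = x$ (a value that grows with the input) or $a = 1$ (where $\ln a = 0$) is precisely where the argument below goes wrong.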
I argued that since $x$ is decidedly not a constant, his conjecture must be false, as no constant factor can be extracted from the lemma when the base is $x$ itself.
The counterargument quickly followed: since $O(\log_a x) \equiv O(\log_b x)$ holds $\forall a, b \in \mathbb{R}^+$, it must hold not only for $a = x$ or $b = x$, but also for $a = 1$ and $b = 1$, and consequently $O(\log x) \equiv O(\log_1 x) = O(0) \equiv O(1)$, breaking the original lemma not only for base $x$ but also for base $1$, where $1$ is about as constant as it gets. <- beer-induced math fail
At that point I conceded that I could not find a flaw in the argument, and so I am turning to you: what is going on here, and where are we wrong?