
From Wikipedia: https://en.wikipedia.org/wiki/Kolmogorov%E2%80%93Arnold_representation_theorem

In real analysis and approximation theory, the Kolmogorov–Arnold representation theorem (or superposition theorem) states that every multivariate continuous function can be represented as a superposition of continuous functions of one variable. It solved a more constrained, yet more general form of Hilbert's thirteenth problem.

The works of Andrey Kolmogorov and Vladimir Arnold established that if $f$ is a multivariate continuous function, then $f$ can be written as a finite composition of continuous functions of a single variable and the binary operation of addition. More specifically,

$ f(\mathbf{x}) = f(x_1, \ldots, x_n) = \sum_{q=0}^{2n} \Phi_q\left( \sum_{p=1}^{n} \phi_{q,p}(x_p) \right) $

There are proofs with specific constructions.

In a sense, they showed that the only true multivariate function is the sum, since every other function can be written using univariate functions and summing.
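
To make that slogan concrete, here is a toy numerical check (my own illustration, not the Kolmogorov construction, and only valid for positive inputs): multiplication, the textbook "truly multivariate" operation, can be written as univariate functions wrapped around a single addition.

```python
import math

# Toy illustration of "the only true multivariate function is the sum":
# for positive x and y, multiplication is a composition of univariate
# maps (log and exp) with one addition.  This is NOT the Kolmogorov
# construction, just the simplest instance of the same structural idea.

def product_via_sum(x: float, y: float) -> float:
    assert x > 0 and y > 0, "this trick only works for positive inputs"
    return math.exp(math.log(x) + math.log(y))

for x, y in [(2.0, 3.0), (0.5, 8.0), (1.7, 4.2)]:
    assert abs(product_via_sum(x, y) - x * y) < 1e-9
print("x*y == exp(log x + log y) on the test points")
```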

There is a specialization of this theorem that says that every symmetric function can be expressed in the form:

$ f(\mathbf{x}) = f(x_1, \ldots, x_n) = \rho\left( \sum_{m=1}^{n} \phi(x_m) \right) $

which is like the Kolmogorov-Arnold theorem with the constants $\lambda_m$ (present in some formulations of that theorem) dropped. I encountered the latter theorem in the machine-learning literature.
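
For $n = 2$ this symmetric form can be checked numerically with a hand-rolled $\phi$ and $\rho$ (my own toy construction using power sums, not the one from the literature): take $\phi(x) = (x, x^2)$, so the inner sum is the pair of power sums, from which the unordered pair $\{x_1, x_2\}$, and hence any symmetric function of it, can be recovered.

```python
import numpy as np

# Toy instance of f(x1, x2) = rho(phi(x1) + phi(x2)) for a symmetric f.
# phi(x) = (x, x^2), so the inner sum is the pair of power sums (p1, p2).
# Newton's identities give the elementary symmetric polynomials
# e1 = p1 and e2 = (p1^2 - p2) / 2, and {x1, x2} are then the roots of
# t^2 - e1*t + e2 = 0.  rho applies the symmetric function we want
# (here: max) to the recovered multiset.

def phi(x: float) -> np.ndarray:
    return np.array([x, x**2])

def rho(s: np.ndarray) -> float:
    p1, p2 = s
    e1, e2 = p1, (p1**2 - p2) / 2.0
    roots = np.roots([1.0, -e1, e2])   # recovers {x1, x2} up to order
    return float(np.max(roots.real))   # the symmetric function: max

for x1, x2 in [(1.0, 3.0), (0.5, 2.0), (-1.0, 4.0)]:
    assert abs(rho(phi(x1) + phi(x2)) - max(x1, x2)) < 1e-6
print("max(x1, x2) == rho(phi(x1) + phi(x2)) on the test points")
```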

I'm just beginning to learn the Yoneda lemma, which says:

$ [ \mathcal{A}^{\mathrm{op}} , \mathbf{Set} ] (H_A, X) \cong X(A) $

It looks like $X$ plays the role of $\rho$, $A$ plays the role of the "sum", and the above theorem is an application of the Yoneda lemma. But I have trouble figuring out the details. Or am I completely off? Thanks :)

------------------

PS: Perhaps it is easier to see a connection (if any) if we look at the specialized version for symmetric functions:

Cayley's theorem: Every group can be represented as a subgroup of a special kind of group, namely a permutation group.

Symmetric Kolmogorov-Arnold: Every symmetric function can be represented as a "sub-function" of a special symmetric function, namely the sum.

The similarity is rather tantalizing to me... though it may just be superficial...

PPS: There is a nice proof of Cayley's theorem using the Yoneda lemma in Emily Riehl's book "Category Theory in Context", Corollary 2.2.10.
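
For what it's worth, here is the shape of that argument as I understand it (my paraphrase, so the details may be off; see the corollary itself for the precise statement): a group $G$ is a one-object category, the Yoneda embedding is faithful, and its effect on the single hom-set is exactly the left regular representation.

```latex
% Sketch of Cayley via Yoneda, as I read it (a paraphrase of the idea
% behind Riehl, Corollary 2.2.10; not a verbatim account of her proof).
% View the group G as a one-object category BG with hom-set BG(*,*) = G.
% The Yoneda embedding y : BG -> [BG^op, Set] is faithful, so the map it
% induces on this hom-set,
\[
  G \;=\; \mathrm{B}G(*,*) \;\longrightarrow\; \mathrm{Sym}(G),
  \qquad g \;\longmapsto\; (h \mapsto g h),
\]
% is an injective group homomorphism (each g acts by a bijection because
% g is invertible).  This exhibits G as a subgroup of the permutation
% group Sym(G), which is Cayley's theorem.
```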

There is a nice proof of the symmetric Kolmogorov-Arnold theorem in the paper "Deep Sets" [2017, Zaheer, Kottur, Ravanbakhsh, Poczos, Salakhutdinov, Smola], in which the original space is embedded in a higher-dimensional space via an "exponential map", which is shown to be a homeomorphism and hence invertible.

In another paper that proves the same theorem, "PointNet: Deep Learning on Point Sets for 3D Classification and Segmentation" [2017, Qi, Su, Mo, Guibas], the representing function is not the sum but the max; an exponential map is involved there as well.
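
As a small aside on how "max" and "sum after an exponential map" are related: the standard log-sum-exp limit, shown numerically below, turns a sum of exponentials into a max as the temperature parameter grows. This is only an illustration of mine, not the argument used in the PointNet paper.

```python
import numpy as np

# The smooth log-sum-exp aggregator approaches the max as beta grows:
#   (1/beta) * log( sum_m exp(beta * x_m) )  -->  max_m x_m.
# This illustrates how a "max" aggregator relates to a "sum after an
# exponential map" aggregator; it is not the PointNet proof.

def log_sum_exp(x: np.ndarray, beta: float) -> float:
    z = beta * x
    z_max = z.max()                    # subtract the max for stability
    return float((z_max + np.log(np.exp(z - z_max).sum())) / beta)

x = np.array([0.3, -1.2, 2.7, 2.5])
for beta in [1.0, 10.0, 100.0, 1000.0]:
    print(f"beta={beta:7.1f}  lse={log_sum_exp(x, beta):.4f}  max={x.max():.4f}")
```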

asked by YKY

Comments:

  • I see no connection between the Kolmogorov-Arnold result and the Yoneda lemma, but maybe someone else can show one. – Andreas Blass Feb 01 '21 at 16:54
  • The Yoneda lemma is an essentially trivial general nonsense, while the Kolmogorov-Arnold result is a hard theorem. The latter is certainly not an application of the former. – abx Feb 01 '21 at 17:58
  • @abx: Cayley's theorem: Every group can be represented as a subgroup of a permutation group. Kolmogorov theorem (symmetric version): Every symmetric function can be represented as a function composed with a sum of functions. Look pretty similar, don't they? – YKY Feb 02 '21 at 04:16
  • https://mathoverflow.net/a/53228/25028 – Sam Hopkins Feb 05 '21 at 05:15

0 Answers