15

If one looks at the "summation proofs" of divergent series such as Grandi's series, one notices a pattern: most of the computations rely only on linearity and on compatibility of the summation with the shift operator. These, of course, are not real proofs, since the series do not converge, but one might try to generalize the concept of summation to such series. $\DeclareMathOperator{\shft}{sh}$
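
For instance, the usual heuristic for Grandi's series uses exactly these two ingredients: writing $S$ for the purported sum,
\begin{equation*}
S = 1 - 1 + 1 - 1 + \cdots = 1 - (1 - 1 + 1 - \cdots) = 1 - S, \qquad\text{so}\qquad S = \tfrac12.
\end{equation*}
Pulling out the leading term is the shift, and cancelling $S$ on both sides is linearity.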

Thus, inspired by Lebesgue measure on the reals, one might define a summability space as a pair $(\mathcal{S}, \sigma)$ such that:

  1. $\mathcal{S} \subset \mathbb{R}^\mathbb{N}$ is a vector subspace which contains the space $\mathcal{C}$ of real sequences whose sum converges, and which is closed under $\shft$.
  2. $\sigma \colon \mathcal{S} \to \mathbb{R}$ is a linear operator.
  3. Regular: For every $(a_n)_n \in \mathcal{C}$ we have that $\sigma((a_n)_n)=\sum_n a_n=a_1+a_2+\cdots$.
  4. Translative: For every $(a_n)_n \in \mathcal{S}$ we have that $\sigma((a_n)_n)=a_1 +\sigma(\shft((a_n)_n))$.

Here $\shft$ is the shift operator, i.e., $\shft(a_1, a_2, \dots)=(a_2,a_3,\dots)$.
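
Note that these axioms already have some rigidity: they force the value of Grandi's sequence, and they exclude the constant sequence $(1,1,1,\dots)$ from any summability space:
\begin{equation*}
g=(1,-1,1,-1,\dots)\in\mathcal S \implies \sigma(g)=1+\sigma(\shft(g))=1+\sigma(-g)=1-\sigma(g) \implies \sigma(g)=\tfrac12,
\end{equation*}
\begin{equation*}
e=(1,1,1,\dots)\in\mathcal S \implies \sigma(e)=1+\sigma(\shft(e))=1+\sigma(e), \quad\text{which is impossible.}
\end{equation*}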

What is the largest possible summability space (which is nice in some way)? Has this idea already been studied? Can we strengthen the definition of a summability space to get a canonical largest summability space (in the same way that Lebesgue measure is the "largest nice" measure on $\mathbb{R}$)?

I know that there are many ways to sum divergent series, such as Cesàro summation, Abel summation, etc. But I am asking about the "best" summation method, which "unifies" all other summation methods. (Note that we can't simply apply the Hahn–Banach theorem or a similar result, since we would like to preserve translativity as well.)

  • 2
    Why not just Zorn's lemma? I see no problem with preserving the "translativity". So, there is no one largest summability space. – Iosif Pinelis Oct 13 '22 at 13:11
  • 1
    Also, I think you should refer to summable sequences rather than converging ones. – Iosif Pinelis Oct 13 '22 at 13:13
  • @IosifPinelis That is true, we can use Zorn's lemma. But what can we say about such a maximal summability space? What would be the $\sigma$ values of elements in such a maximal summability space? Of course, it cannot contain a constant sequence, since we would have a problem with translativity. Are there any other restrictions? This seems to me like a natural direction to study, so I would be surprised if nobody studied it. – Serge the Toaster Oct 13 '22 at 16:28
  • 1
    Compare: "Banach limit" ... https://en.wikipedia.org/wiki/Banach_limit – Gerald Edgar Oct 16 '22 at 06:24
  • Maybe relevant: https://mathoverflow.net/questions/391387/did-anyone-ever-propose-the-distinction-between-divergent-to-infinity-as-oppos – Anixx Oct 17 '22 at 15:59
  • Also relevant: https://mathoverflow.net/questions/369378/more-or-less-universal-formula-for-regularization-of-divergent-integrals – Anixx Oct 17 '22 at 16:08
  • Also relevant: https://mathoverflow.net/questions/360074/is-regularization-of-infinite-sums-by-analytic-continuation-unique – Anixx Oct 17 '22 at 16:11
  • @GeraldEdgar the Banach limit has the disadvantage of being applicable only to bounded sequences. I am also suspicious of sequences that have several different Banach limits (this indicates the concept is somewhat loose). – Anixx Oct 17 '22 at 16:20
  • The Banach limit is unique only if the sequence is first-order Cesàro summable. Not quite powerful, I would say. – Anixx Oct 17 '22 at 16:25
  • @Anixx Thank you for your comments. In the links you attached, and in general, it looks like summation methods are mainly studied by examining specific summation methods and how they interact with classical results. I am interested in a more general point of view. For example, note that by the assumptions I wrote, the value of $(1,-1,1,-1,...)$ would always be one half, but $(1,1,1,...)$ can't be contained in any summability space. So there is some rigidity to the definition of a summability space. – Serge the Toaster Oct 17 '22 at 17:25
  • @Anixx It looks like Banach limits are similar, but weaker, as you mentioned. Another thing is that they don't quite behave as you would expect when applied to series, i.e., as the Banach limit of partial sums. – Serge the Toaster Oct 17 '22 at 17:29
  • This also may be relevant: https://mathoverflow.net/questions/115743/an-algebra-of-integrals/342651#342651 – Anixx Oct 17 '22 at 17:32

2 Answers

10

$\newcommand{\R}{\mathbb R}\newcommand{\N}{\mathbb N}\newcommand{\si}{\sigma}\newcommand{\SSS}{\mathcal S}\newcommand{\CC}{\mathcal C}\newcommand{\sh}{\operatorname{sh}}$First of all, as was noted in the previous comment, $\CC$ should be defined, not as the set of all convergent sequences in $\R^\N$, but as the set of all sequences in $\R^\N$ summable (say) in the sense that the corresponding sequence of partial sums is convergent.

Let now $T:=\sh$ and $\N_0:=\{0\}\cup\N$.

Partially ordering the summability spaces by inclusion and using Zorn's lemma, we see that there is a maximal summability space.

Actually, there are infinitely many maximal summability spaces.

Indeed, take any $t\in\R$. Let \begin{equation*} \SSS_t:=\text{span}(\CC\cup\{T^k b\colon k\in\N_0\}), \tag{1}\label{1} \end{equation*} where \begin{equation*} b:=(b_1,b_2,\dots)\quad\text{with}\quad b_n:=n!. \end{equation*}

Note that, for each nonzero sequence $a\in\CC$, the sequences $a,b,Tb,T^2b,\dots$ are linearly independent. To see why this is true, suppose that, to the contrary, $c_{-1} a+c_0 b+c_1 Tb+\dots+c_k T^kb=0$ for some $k\in\N_0$ and some real $c_{-1},\dots,c_k$ such that $c_k\ne0$. Then for all $n\in\N$ we have \begin{equation} c_{-1} a_n+c_0 n!+c_1(n+1)!+\dots+c_{k-1}(n+k-1)!+c_k(n+k)!=0. \end{equation} Letting $n\to\infty$, we get \begin{equation} c_{-1} a_n+c_0 n!+c_1(n+1)!+\dots+c_{k-1}(n+k-1)!=o((n+k)!) \end{equation} and hence $c_k=0$, a contradiction.

So, we have the summability space $(\SSS_t,\si_t)$ with $\SSS_t$ as in \eqref{1} and the linear functional $\si_t\colon\SSS_t\to\R$ defined by the conditions \begin{equation*} \si_t(a):=\sum_{n=1}^\infty a_n \end{equation*} for $a=(a_1,a_2,\dots)\in\CC$ and
\begin{equation*} \si_t(T^k b):=t-\sum_{n=1}^k b_n \end{equation*} for $k\in\N_0$ (so that $\si_t(b)=t$). (Added detail: Indeed, in view of \eqref{1} and because the sequences $a,b,Tb,T^2b,\dots$ are linearly independent for any nonzero $a\in\CC$, each sequence $s\in\SSS_t$ can be uniquely represented by the formula $s=a+c_0 b+c_1 Tb+\dots+c_k T^kb$ for some $a\in\CC$, some $k\in\N_0$, and some real $c_0,\dots,c_k$; and then $\si_t(s)=\si_t(a)+c_0 \si_t(b)+c_1 \si_t(Tb)+\dots+c_k \si_t(T^kb)$ by the linearity of $\si_t$.)
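
As a quick sanity check, the prescribed values are consistent with translativity on the generators: for each $k\in\N_0$,
\begin{equation*}
\si_t(T^k b)=t-\sum_{n=1}^k b_n=b_{k+1}+\Big(t-\sum_{n=1}^{k+1} b_n\Big)=(T^k b)_1+\si_t(T^{k+1} b).
\end{equation*}
Together with the translativity of the ordinary sum on $\CC$ and the linearity of $T$ and $\si_t$, this gives translativity on all of $\SSS_t$.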

Clearly then, no summability space can contain two summability spaces of the form $(\SSS_t,\si_t)$ with two different values of $t\in\R$ (indeed, the functional of such a space would have to assign both values of $t$ to the sequence $b$). On the other hand, by Zorn's lemma, each summability space of the form $(\SSS_t,\si_t)$ is contained in a maximal summability space $(\SSS^*_t,\si^*_t)$, and all these maximal summability spaces $(\SSS^*_t,\si^*_t)$ are distinct from one another.

Thus, as claimed, there are infinitely many maximal summability spaces and, therefore, as stated in the previous comment, there is no one largest summability space.

(This is probably why apparently "nobody studied" such a nonexistent entity.)

Iosif Pinelis
  • Thank you for your detailed answer! A few questions: 1) Why can you define $\sigma_t$ by defining it on $\mathcal{C}$ and on $\operatorname{Span}\{T^k b\}_k$? Why don't they have a nonzero intersection? (I am sure this is a silly question, but still I am not sure why.) 2) People still study maximal ideals although not every ring is local, so I am still not sure why it wouldn't be interesting to study such spaces. I can look at a maximal summability space and ask how it behaves and what its elements are. In particular, elements like $(1,-1,1,-1,...)$ will always have the same sum value – Serge the Toaster Oct 17 '22 at 14:58
  • @SergetheToaster : 1) I have added a detail on this. 2) You were "asking about the "best" summation method, which "unifies" all other summation methods", about "a canonical largest summability space". The answer shows that there is no such all-unifying method and there is no one canonical largest summability space. One could still study, I guess, the set (say $\mathfrak M$) of all maximal summability spaces; $\mathfrak M$ is shown in the answer to be infinite. However, it does not seem that your post was about the set $\mathfrak M$. – Iosif Pinelis Oct 17 '22 at 15:36
  • Thank you for your comment. Yet I am still not sure about 1). Take, for example, the sequence $a=(1,1/2,1/3,\dots)$. Then $c=a-T(a)$ is a telescoping sequence, and since $1/n \to 0$, the sum of $c$ converges! Thus I am not sure why you can write every element of $\mathcal{S}_t$ uniquely as such a sum. – Serge the Toaster Oct 17 '22 at 16:02
  • @SergetheToaster : Why would you take this $a$? This $a$ has nothing to do with $\mathcal S_t$. As for the uniqueness, by the linearity, it suffices to check the uniqueness for $s=0$, but then it just follows by the linear independence of $a,b,Tb,T^2b,\dots$ (for nonzero $a\in\mathcal C$). – Iosif Pinelis Oct 17 '22 at 17:06
  • I see. So the key part here is that the choice of $b$ was such that $a, b, Tb, T^2b,....$ are linearly independent for every $a \in \mathcal{C}$. – Serge the Toaster Oct 17 '22 at 17:20
  • @SergetheToaster : That is right, for every nonzero $a\in\mathcal C$. – Iosif Pinelis Oct 17 '22 at 17:43
3

$\newcommand{\R}{\mathbb R}\newcommand{\N}{\mathbb N}\newcommand{\si}{\sigma}\newcommand{\SSS}{\mathcal S}\newcommand{\CC}{\mathcal C}\newcommand{\sh}{\operatorname{sh}}$In the previous answer, it was shown that there are infinitely many maximal summability spaces, in the sense of the OP. That was done using a very fast-growing sequence $b\notin\CC$.

Here it will be shown that we can instead use any sequence $b=(b_1,b_2,\dots)\in\R^\N\setminus\CC$ with \begin{equation*} b_n\to0. \tag{0}\label{0} \end{equation*} Let $b$ be such a sequence indeed; for instance, $b_n=1/n$ will do.

Here we will keep the following notations from the previous answer: $\CC$ will denote the set of all sequences in $\R^\N$ summable in the sense that the corresponding sequence of partial sums is convergent, $T:=\sh$, and $\N_0:=\{0\}\cup\N$.

Take any $t\in\R$. Let \begin{equation*} \SSS_t:=\text{span}(\CC\cup\{T^k b\colon k\in\N_0\}). \end{equation*} Take any $s\in\SSS_t$. Then \begin{equation*} s=a+\sum_{j=0}^k c_j T^jb \end{equation*} for some $a=(a_1,a_2,\dots)\in\CC$, some $k\in\N_0$, and some real $c_0,\dots,c_k$. Let then \begin{equation*} \si_t(s):=\si_t(a)+\sum_{j=0}^k c_j \si_t(T^jb), \tag{2}\label{2} \end{equation*} where \begin{equation*} \si_t(a):=\sum_{n=1}^\infty a_n \tag{3}\label{3} \end{equation*} and \begin{equation*} \si_t(T^j b):=t-\sum_{n=1}^j b_n. \tag{5}\label{5} \end{equation*}

Then $\si_t$ is a well-defined linear functional on $\SSS_t$. To check this, it suffices to verify the implication (A)$\implies$(B), where (A) and (B) are the following conditions: \begin{equation*} 0=a+\sum_{j=0}^k c_j T^jb \tag{A}\label{A} \end{equation*} and \begin{equation*} 0=\si_t(a)+\sum_{j=0}^k c_j \si_t(T^jb), \tag{B}\label{B} \end{equation*} for any given $k\in\N_0$ and any given real $c_0,\dots,c_k$. Assume indeed that (A) holds. Then \begin{equation*} 0=a_n+\sum_{j=0}^k c_j b_{n+j} \end{equation*} for all $n\in\N$. Hence, for any $N\in\N$, letting \begin{equation*} s_N:=\sum_{n=1}^N b_n, \end{equation*} we have
\begin{equation*} \begin{aligned} 0&=\sum_{n=1}^N a_n+\sum_{j=0}^k c_j \sum_{n=1}^N b_{n+j} \\ &=\sum_{n=1}^N a_n+\sum_{j=0}^k c_j (s_{N+j}-s_j) \\ &=\sum_{n=1}^N a_n+\Big(\sum_{j=0}^k c_j\Big)s_N+\sum_{j=0}^k c_j(s_{N+j}-s_N-s_j). \end{aligned} \tag{10}\label{10} \end{equation*} To obtain a contradiction, suppose that $\sum_{j=0}^k c_j\ne0$. Letting now $N\to\infty$ and noting that, by \eqref{0}, $s_{N+j}-s_N\to0$ for each $j$, we see that \eqref{10} implies
\begin{equation*} s_N\to-\frac1{\sum_{j=0}^k c_j}\,\Big(\sum_{n=1}^\infty a_n-\sum_{j=0}^k c_j s_j\Big)\in\R \end{equation*} which contradicts the assumption $b\notin\CC$. So, $\sum_{j=0}^k c_j=0$ and hence, by \eqref{10}, \eqref{3}, and \eqref{5}, \begin{equation*} \begin{aligned} 0&=\sum_{n=1}^\infty a_n+\sum_{j=0}^k c_j(-s_j) \\ &=\sum_{n=1}^\infty a_n+\sum_{j=0}^k c_j t+\sum_{j=0}^k c_j(-s_j) \\ &=\sum_{n=1}^\infty a_n+\sum_{j=0}^k c_j(t-s_j) \\ &=\si_t(a)+\sum_{j=0}^k c_j \si_t(T^jb). \end{aligned} \end{equation*} So, the implication (A)$\implies$(B) is verified, which shows that $\si_t$ is indeed a well-defined linear functional on $\SSS_t$.

Now it is straightforward to complete the checking that $(\SSS_t,\si_t)$ is a summability space.
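
For instance, the key point, translativity, can be verified directly: with $s=a+\sum_{j=0}^k c_j T^jb$ as above, we have $Ts=Ta+\sum_{j=0}^k c_j T^{j+1}b\in\SSS_t$, and, since $Ta\in\CC$, \eqref{3} and \eqref{5} give $\si_t(Ta)=\si_t(a)-a_1$ and $\si_t(T^{j+1}b)=\si_t(T^jb)-b_{j+1}$, whence
\begin{equation*}
\si_t(Ts)=\si_t(a)-a_1+\sum_{j=0}^k c_j\big(\si_t(T^jb)-b_{j+1}\big)
=\si_t(s)-\Big(a_1+\sum_{j=0}^k c_j b_{j+1}\Big)=\si_t(s)-s_1,
\end{equation*}
i.e., $\si_t(s)=s_1+\si_t(Ts)$.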

Clearly, no summability space can contain two summability spaces of the form $(\SSS_t,\si_t)$ with two different values of $t\in\R$. On the other hand, by Zorn's lemma, each summability space of the form $(\SSS_t,\si_t)$ is contained in a maximal summability space $(\SSS^*_t,\si^*_t)$, and all these maximal summability spaces $(\SSS^*_t,\si^*_t)$ are distinct from one another.

Thus, as claimed, there are infinitely many maximal summability spaces and, therefore, as stated in the previous comment, there is no one largest summability space.

Iosif Pinelis
  • Interesting! Unless I am mistaken, this proof can be generalized to prove the same result for the case where $b_n$ converges to some number (not necessarily zero), right? – Serge the Toaster Oct 19 '22 at 03:45
  • @SergetheToaster : Thank you for your appreciation. However, I don't think that such a generalization will work -- look at the latter multiline display. – Iosif Pinelis Oct 19 '22 at 11:06
  • I thought about it today, and in fact I think the opposite is true in the case when $b_n$ converges to a nonzero number. That is, such a $b$ can't be contained in any summability space, since if $b_n \to c$ then $b_n -c \to 0$, and you can look at the sequence $b-a$, where $a=(c,c,c,...)$. Constant sequences are, in some sense, unsummable, because they are stable under the shift operator. – Serge the Toaster Oct 19 '22 at 14:55
  • @SergetheToaster : The case $b_n\to c\ne0$ has now been considered, at https://mathoverflow.net/a/432870/36721 – Iosif Pinelis Oct 20 '22 at 21:04