
Moved from MathOverflow because it was not regarded as research-level.

Note: I am looking in particular at real-valued functions of a real variable, considered at all points regardless of differentiability.

In this question a series of axioms or postulates governing calculus is proposed. Granted, that is abstract calculus rather than real-number calculus.

Is there any known set of postulates governing real-number calculus, covering derivatives, integrals, and (ideally) the construction of differential equations, in which the following statement can be taken as one of the axioms without redundancy or contradiction?

"if and only if a function is constant does it have a derivative of 0 for all real numbers"

My ultimate purpose is to negate the aforementioned axiom. Having a complete set of axioms would make it convenient to convey the actual meaning of negating the statement, since one could fall back on the remaining axioms, much as non-Euclidean geometry was developed by negating the parallel postulate.

Some potentially relevant axioms I thought of:

"All elements of a derivation set are the inverse of the antiderivative where defined"

(This might be better proposed as a conjecture.) The derivation set of a function is never the empty set.

Update:

After discussing this more deeply with a few others, and noticing some non-uniqueness properties, I have realized that the derivative need not be unique given the sort of objects I would want to exist. The following definitions deal with that issue:

A derivation set of a function is the set of functions that can result from applying differentiation to it.

A derivative is an operator that, applied to a function, yields some element of that function's derivation set.
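
To formalize (the symbols $D$ and $\partial$ are my own shorthand): writing $D(f)$ for the derivation set of $f$, a derivative is any operator $\partial$ with

$$\partial f\in D(f)\quad\text{for every function }f,$$

and the conjecture above becomes $D(f)\neq\emptyset$ for every $f$.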

In this sense, altered forms of derivatives would be solution sets of functions satisfying some equation, rather than necessarily a unique operator. However, I suspect the equation itself is neither trivially apparent nor something one could derive quickly.

user64742
  • You should probably start at the epsilon-delta method of limits. – Simply Beautiful Art Jan 01 '17 at 01:46
  • Not too much, but perhaps linear functions? – Simply Beautiful Art Jan 01 '17 at 01:51
  • I'm not good with axioms and all, but it is the case that if $P(x)$ is a polynomial with arbitrary rational exponents and is non-linear, then $Q(x)$ is the derivative of $P(x)$ for $x\in A$ iff $P(x)-Q(c)x-b$ has a root of multiplicity greater than or equal to $2$ at $x=c$ for all $c\in A$. This can then probably be extended to give statements about derivatives of analytic functions. – Simply Beautiful Art Jan 01 '17 at 02:09
  • I would mention that in normed vector spaces you define the derivative on a dense subset (e.g. the polynomials or the analytic functions) and extend it by linearity and norm closure ($L^p$, $L^\infty$, $H^1$, ...). @SimpleArt – reuns Jan 01 '17 at 02:36
  • I do not see the connection. Note my comment requires $P(x)$ to be non-linear, so a constant function is actually beyond its grasp. – Simply Beautiful Art Jan 01 '17 at 02:57
  • The original MO question is http://mathoverflow.net/questions/258375/is-there-a-set-of-axioms-governing-calculus-that-include-this-particular-axiom-a – Gerry Myerson Jan 01 '17 at 14:55
  • See also http://mathoverflow.net/questions/44774/do-these-properties-characterize-differentiation and http://mathoverflow.net/questions/157847/algebraic-characterization-of-real-differentiation and https://en.wikipedia.org/wiki/K%C3%A4hler_differential –  Jan 14 '17 at 02:52

1 Answer


Disclaimer: not a full answer, my notation is rough, and I'm not well versed in axiomatics. See the end for a half-decent explanation of why this works; I hope it proves useful.

If $P(x^d)$ is a polynomial with integer exponents (negative ones included) and is non-linear, for some natural number $d$, then $Q(x)$ is the derivative of $P(x)$ over the domain if, for every $c$ in the domain, $f(x^d)$ has a root of multiplicity greater than or equal to $2$ at $x^d=c$, where $f(x)=P(x)-Q(c)x-b$ and $b=P(c)-Q(c)c$.

For example, if $P(x)=x^2$ and $Q(x)=2x$, then notice that

$$f(x)=x^2-2cx+c^2=(x-c)^2$$

It has a root with multiplicity $2$ at $x=c$, so $Q(x)$ is the derivative of $P(x)$.
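
As a sanity check, here is a small sketch in Python with sympy (my own addition, not part of the original argument); it builds $f$ exactly as defined above and confirms the double root:

    import sympy as sp

    x, c = sp.symbols('x c')

    P = x**2                     # the function being differentiated
    Q = 2*x                      # candidate derivative
    Qc = Q.subs(x, c)
    b = P.subs(x, c) - Qc*c      # b = P(c) - Q(c)c
    f = sp.expand(P - Qc*x - b)  # f(x) = P(x) - Q(c)x - b

    print(sp.factor(f))          # (c - x)**2, equivalently (x - c)**2: a double root at x = c
    print(sp.roots(f, x))        # {c: 2}, i.e. multiplicity 2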

Similarly, if $P(x^2)=x$ with $x\in[0,+\infty)$ and $Q(x)=\frac12x^{-1/2}$, then

$$f(x^2)=x-\frac12c^{-1/2}x^2-\frac12c^{1/2}=-\frac12c^{-1/2}\left(x-c^{1/2}\right)^2$$

which has a root of multiplicity $2$ at $x=c^{1/2}$, that is, at $x^2=c$.
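
The same check works here if one substitutes $x\mapsto x^2$ first to clear the half-integer exponents (again a hedged sympy sketch of my own):

    import sympy as sp

    x, c = sp.symbols('x c', positive=True)

    P = sp.sqrt(x)            # P(x) = x**(1/2), so P(x**2) = x on [0, oo)
    Q = 1/(2*sp.sqrt(x))      # candidate derivative Q(x) = (1/2) x**(-1/2)
    Qc = Q.subs(x, c)
    b = P.subs(x, c) - Qc*c   # b = P(c) - Q(c)c
    f = P - Qc*x - b          # f(x) = P(x) - Q(c)x - b

    g = sp.expand(f.subs(x, x**2))  # f(x^2), now a polynomial in x
    print(sp.roots(g, x))           # {sqrt(c): 2}: double root at x = sqrt(c), i.e. x^2 = c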

This also need not be used merely for checking derivatives. For example, if $P(x)=x^3$, then demanding

$$f(x)=x^3-Q(c)x-c^3+Q(c)c\equiv(x-a)(x-c)^2$$

guarantees a root of multiplicity greater than or equal to $2$ at $x=c$. Expanding the right-hand side and matching coefficients gives $a=-2c$ and $Q(c)=3c^2$, that is, $Q(x)=3x^2$.
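
The coefficient matching can be delegated to sympy as well (a sketch; the symbol names are mine):

    import sympy as sp

    x, a, c, Qc = sp.symbols('x a c Q_c')

    # f(x) = P(x) - Q(c)x - b with b = P(c) - Q(c)c, for P(x) = x**3
    f = x**3 - Qc*x - (c**3 - Qc*c)

    # demand that f(x) equal (x - a)(x - c)**2 identically in x
    target = sp.expand((x - a)*(x - c)**2)
    eqs = sp.Poly(sp.expand(f) - target, x).all_coeffs()

    print(sp.solve(eqs, [a, Qc], dict=True))  # [{a: -2*c, Q_c: 3*c**2}]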

If one is able to derive from this that $Q(c)$ is unique, then the sum rule for derivatives follows: if $f_1$ and $f_2$ each have a root of multiplicity at least $2$ at $x=c$, then so does $f_1+f_2$, so $Q_1+Q_2$ satisfies the condition for $P_1+P_2$.


I thought of this when I first entered a calculus class, where I saw the fundamental definition of a derivative in what I would call regular space or beginner space: a derivative is the slope of the tangent line, which is, by definition, the limit of secant lines.

But rather than evaluating limits, I looked at the more algebraic side of this and realized that if we are differentiating an algebraic function, then I can solve for where a secant line intersects the original function. And by the conjugate roots theorem, I can deduce how many roots there are (not necessarily distinct), i.e. how many times the line crosses the function.

But clearly, if we take the limit as two of those roots approach $x=c$, then a tangent line should have a root of multiplicity greater than or equal to $2$ at $x=c$.
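
Said with one formula (my paraphrase): with tangent line $L(x)=P(c)+Q(c)(x-c)$, tangency at $x=c$ amounts to the factorization

$$P(x)-L(x)=(x-c)^2\,g(x)$$

for some polynomial $g$, and since $f(x)=P(x)-Q(c)x-b=P(x)-L(x)$, this is exactly the multiplicity condition above.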

And then I modified this for arbitrary rational exponents and negative exponents.

This is also clearly correct for analytic functions thanks to Taylor's theorem, though I haven't seen it useful there since you can't "factor" functions that aren't polynomials.