
Hi, inspired by the question *Prove that if $a+b=1$, then $\forall n \in \mathbb{N},\ a^{(2b)^{n}} + b^{(2a)^{n}} \leq 1$*, I propose the following:

Let $a,b>0$ be such that $a+b=1$. Then for every natural number $n\geq 2$: $$\Big(a^{(2b)^n}-b^{(2a)^n}\Big)^2\geq \Big(a^{(2b)^{n-1}}-b^{(2a)^{n-1}}\Big)\Big(a^{(2b)^{n+1}}-b^{(2a)^{n+1}}\Big)$$
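Before looking for a proof, it may help to test the claim numerically. The sketch below (my own illustrative check, not part of a proof) defines $P(n)=a^{(2b)^n}-b^{(2a)^n}$ and verifies the conjectured Turán-type inequality at sampled values of $a$ and $n$; the tolerance `tol` is an arbitrary guard against floating-point noise.

```python
# Numerical sanity check of the conjectured Turan-type inequality
# P(n)^2 >= P(n-1) * P(n+1), where P(n) = a^{(2b)^n} - b^{(2a)^n} and a + b = 1.
# Illustrative sketch only; the sample points and tolerance are arbitrary choices.

def P(n, a):
    """P(n) = a^{(2b)^n} - b^{(2a)^n} with b = 1 - a."""
    b = 1.0 - a
    return a ** ((2 * b) ** n) - b ** ((2 * a) ** n)

def turan_holds(n, a, tol=1e-12):
    """Check P(n)^2 >= P(n-1) * P(n+1) up to floating-point tolerance."""
    return P(n, a) ** 2 >= P(n - 1, a) * P(n + 1, a) - tol

if __name__ == "__main__":
    for a in (0.1, 0.2, 0.3, 0.4, 0.45):   # avoid a = 1/2, where P(n) vanishes
        for n in range(2, 8):
            print(f"a={a:.2f}, n={n}: inequality holds -> {turan_holds(n, a)}")
```

Of course a finite sample proves nothing; it only makes the conjecture more plausible.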

It is a Turán-type inequality of the form:

$$P_n^2(x)\geq P_{n-1}(x)\,P_{n+1}(x)$$

My work :

The trick: define $g(x)=\ln\Big(\big|a^{(2b)^{x}}-b^{(2a)^{x}}\big|\Big)$. I conjecture (based on a numerical routine) that $g$ is concave on $[2,\infty)$; it then remains to apply Jensen's inequality at the points $n-1$ and $n+1$ (i.e. $2g(n)\geq g(n-1)+g(n+1)$) and exponentiate to get the desired result.
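This conjectured concavity can also be probed numerically. The sketch below (my own, with arbitrarily chosen step `h` and sample points) evaluates the symmetric second difference of $g$; consistently negative values are consistent with concavity, and the midpoint condition $2g(n)\geq g(n-1)+g(n+1)$ is exactly the Turán-type inequality after exponentiating.

```python
# Numerical probe of the concavity of g(x) = ln|a^{(2b)^x} - b^{(2a)^x}| on [2, inf).
# Sketch only: `h` and the sample points are arbitrary; this is evidence, not proof.
import math

def g(x, a):
    """g(x) = ln|a^{(2b)^x} - b^{(2a)^x}| with b = 1 - a (requires a != 1/2)."""
    b = 1.0 - a
    return math.log(abs(a ** ((2 * b) ** x) - b ** ((2 * a) ** x)))

def second_difference(x, a, h=0.25):
    """Discrete proxy for g''(x): negative values suggest concavity near x."""
    return g(x - h, a) - 2.0 * g(x, a) + g(x + h, a)

if __name__ == "__main__":
    a = 0.3
    for x in (2.0, 2.5, 3.0, 4.0, 6.0, 10.0):
        print(f"x={x:4.1f}: second difference = {second_difference(x, a):+.3e}")
```

Note that $g$ is undefined at $a=b=\tfrac12$, where the argument of the logarithm vanishes.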

Implicit question:

How can one prove that the function $g(x)$ is concave, or is there another approach?

Any help is greatly appreciated.

Thanks a lot for your contributions !
