
Let $p, q, r \in [1, \infty]$ with $r \neq \infty$ be such that $1/p + 1/q = 1/r$, and let $f \in L^p(X)$ and $g \in L^q(X)$. Is it true that $|f|^{p/r} \leq |f|^p$ pointwise? I believe not: take $X = [0,1]$, $f \equiv 1/2$ constant, and $p = q = 4$, $r = 2$. Then $\int |f|^4 < \infty$, but $|1/2|^{p/r} = |1/2|^2 = 1/4 > 1/16 = |1/2|^4$, so the claimed inequality fails. Is that right?
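A minimal numeric sketch of the counterexample above (the constant $1/2$, exponents $p = q = 4$, $r = 2$, are taken from the question):

```python
# Check the claimed pointwise inequality |f|^{p/r} <= |f|^p
# for the counterexample f = 1/2 constant, p = q = 4, r = 2.
p, q, r = 4, 4, 2
f = 0.5

lhs = abs(f) ** (p / r)  # |f|^{p/r} = (1/2)^2 = 0.25
rhs = abs(f) ** p        # |f|^p    = (1/2)^4 = 0.0625

print(lhs, rhs, lhs <= rhs)  # 0.25 0.0625 False
```

Since $|f| < 1$, raising it to a larger power makes it smaller, which is exactly why the inequality reverses here.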

eraldcoil
    What you have done is right but you are very likely to have copied the statement wrongly from somewhere. You can never expect an inequality for powers of the function from integrability. The inequality probably was stated for some norms of $f$ in $L^{p}$ and $L^{q}$. – Kavi Rama Murthy Jan 12 '19 at 00:02
  • Exactly. The problem is: $p, q, r \in [1, \infty]$ with $r \neq \infty$ such that $1/p + 1/q = 1/r$. If $f \in L^p$ and $g \in L^q$, then $\|fg\|_r \leq \|f\|_p \|g\|_q$. – eraldcoil Jan 12 '19 at 01:47
  • https://math.stackexchange.com/questions/159887/generalized-h%C3%B6lder-inequality –  Jan 12 '19 at 02:55
  • Thanks, but I already solved that problem. My question is about the statement itself. – eraldcoil Jan 12 '19 at 02:57
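For the corrected statement in the comments (the generalized Hölder inequality $\|fg\|_r \leq \|f\|_p \|g\|_q$, a norm inequality rather than a pointwise one), here is a small numeric check using the same example $f = g \equiv 1/2$ on $X = [0,1]$ with $p = q = 4$, $r = 2$; the closed-form norms below use the fact that a constant $c$ on $[0,1]$ has $\|c\|_s = |c|$ for every $s$:

```python
# Verify ||fg||_r <= ||f||_p * ||g||_q for f = g = 1/2 constant
# on [0,1], with p = q = 4 and r = 2 (so 1/p + 1/q = 1/r).
p, q, r = 4.0, 4.0, 2.0
f = g = 0.5

# For a constant c on [0,1]: ||c||_s = (integral of |c|^s)^(1/s) = |c|.
norm_f = (abs(f) ** p) ** (1 / p)        # = 0.5
norm_g = (abs(g) ** q) ** (1 / q)        # = 0.5
norm_fg = (abs(f * g) ** r) ** (1 / r)   # = 0.25

print(norm_fg <= norm_f * norm_g)  # True
```

With constants the inequality holds with equality ($1/4 = 1/2 \cdot 1/2$), which is consistent with the norm version being the correct statement while the pointwise version fails.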

0 Answers