Let $g: (0,\infty)\rightarrow\mathbb{R}$ satisfy $\lim_{x\to 0^+}g(x)=0$ and $\lim_{x\to 0^+} \frac{g(x)-g(\frac{x}{2})}{\sqrt{x}}=1$. Show that $$\lim_{x\to 0^+}\frac{g(x)}{\sqrt{x}}=2+\sqrt{2}.$$
Here is what I have tried so far.
If I let $l= \lim_{x\to 0}\frac{g(x)}{\sqrt{x}}$ and *assume this limit exists*, then I can compute that $l=2+\sqrt{2}$. But that only finds the value; it does not prove the limit exists, so it is not a proof.
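To be concrete, the calculation I mean is the following (valid only under the assumption that $l$ exists): since $\sqrt{x}=\sqrt{2}\,\sqrt{x/2}$,
$$\frac{g(x)-g(\frac{x}{2})}{\sqrt{x}}=\frac{g(x)}{\sqrt{x}}-\frac{1}{\sqrt{2}}\cdot\frac{g(\frac{x}{2})}{\sqrt{x/2}}\;\longrightarrow\; l-\frac{l}{\sqrt{2}}=1,$$
so
$$l=\frac{1}{1-\frac{1}{\sqrt{2}}}=\frac{\sqrt{2}}{\sqrt{2}-1}=2+\sqrt{2}.$$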
To actually prove it, I tried using the definition of the limit:
Given $\epsilon>0$, there exists $\delta>0$ such that $0<x<\delta$ implies $$\left|\frac{g(x)-g(\frac{x}{2})}{\sqrt{x}}-1\right|<\epsilon.$$
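As a quick numerical sanity check on the value $2+\sqrt{2}$, I also tried a test function. This is only a scratch check with a *hypothetical* choice $g(x)=(2+\sqrt{2})\sqrt{x}$ (not the unknown $g$ from the problem), chosen because it makes $g(x)-g(x/2)=\sqrt{x}$ exactly:

```python
import math

# Hypothetical test function: g(x) = (2 + sqrt(2)) * sqrt(x).
# This is an assumption for the check, not the unknown g in the problem.
C = 2 + math.sqrt(2)

def g(x):
    return C * math.sqrt(x)

for x in (1e-2, 1e-4, 1e-6):
    diff_ratio = (g(x) - g(x / 2)) / math.sqrt(x)  # hypothesis: should be ~1
    ratio = g(x) / math.sqrt(x)                    # conclusion: should be ~2 + sqrt(2)
    print(x, diff_ratio, ratio)
```

For this test function both ratios match the statement, which at least confirms the arithmetic $l(1-\tfrac{1}{\sqrt{2}})=1$ is consistent.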
But I don't know how to proceed from here. Thanks in advance!