Recurrence vs Transience of Markov Chains
We say that an irreducible chain is recurrent if the return time from some state to itself is finite almost surely (and transient otherwise). Without loss of generality you can pick any state: for an irreducible chain, recurrence is a class property, so if one state is recurrent then every state is.
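To fix notation: for the chain started at a state $x$ (so $X_0 = x$), the return time used below is
$$\tau = \inf\{ n \geq 1 : X_n = x \},$$
and recurrence of $x$ means $\mathbb{P}(\tau < \infty) = 1$.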
Let's look at the two chains you gave:
Let $X_0 = 1$. To show that $\mathbb{P}(\tau < \infty) = 1$, note that if $X_1 = 2^n$, then $\tau = 2^n + 1$. So $P(\tau > 2^n) = \sum_{i=n}^\infty 2^{-i} = \frac{1}{2^{n-1}}$. Writing $A_{2^n} = \{ \tau > 2^n \}$, these probabilities are summable, so by Borel–Cantelli only finitely many of the events $\{ A_{2^i} \}_{i=1}^\infty$ occur almost surely, which means $\tau$ is finite almost surely. (Alternatively, you can observe that $P(\tau > 2^n) \rightarrow 0$ as $n \rightarrow \infty$, and then note that $\{ \tau = \infty \} \subset \{ \tau > 2^n \}$ for every $n$.)
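Spelling out the summability that Borel–Cantelli needs here:
$$\sum_{n=1}^\infty P(A_{2^n}) = \sum_{n=1}^\infty \frac{1}{2^{n-1}} = 2 < \infty,$$
so almost surely only finitely many of the $A_{2^n}$ occur; past the last one we have $\tau \leq 2^n$ for some finite $n$, i.e. $\tau < \infty$.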
For the second chain, let $Y_0 = 1$. Now $P(\tau > n) = \prod_{i=1}^n \frac{i}{i+1} = \frac{1}{n+1}$. From here, the same tail argument as before ($P(\tau > n) \rightarrow 0$ and $\{ \tau = \infty \} \subset \{ \tau > n \}$) gives recurrence.
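Written out, the product telescopes and the tail argument closes the proof:
$$P(\tau > n) = \frac{1}{2}\cdot\frac{2}{3}\cdots\frac{n}{n+1} = \frac{1}{n+1} \longrightarrow 0, \qquad P(\tau = \infty) \leq P(\tau > n) \text{ for every } n.$$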
Null Recurrence vs Positive Recurrence
For this, we will look at the expected return time from some state, and we will say that a recurrent chain is null recurrent if the expected return time is infinite, and positive recurrent otherwise. Much like before, we can look at any state: for an irreducible chain, positive recurrence is again a class property, so if the expected return time is finite for one state, it is finite for every state. So let us compute it for the two examples:
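In symbols, with the same return time $\tau$ as above:
$$\text{positive recurrent: } \mathbb{E}[\tau] < \infty, \qquad \text{null recurrent: } \mathbb{P}(\tau < \infty) = 1 \text{ but } \mathbb{E}[\tau] = \infty.$$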
Let $X_0 = 1$ as before.
\begin{align*}
\mathbb{E}[\tau] &= \sum_{i=1}^\infty P(X_1 = 2^i)\, \mathbb{E}[\tau \mid X_1 = 2^i]\\
&= \sum_{i=1}^\infty 2^{-i} (2^i+1) \\
&\geq \sum_{i=1}^\infty 1 = \infty.
\end{align*}
So the first chain is null recurrent: $\tau$ is finite almost surely, but its expectation is infinite. For the second chain,
\begin{align*}
\mathbb{E}[\tau] &= \sum_{i=0}^\infty P(\tau > i) \\
&= \sum_{i=0}^\infty \frac{1}{i+1},
\end{align*}
which is the harmonic series and therefore diverges (the first equality is the tail-sum formula $\mathbb{E}[\tau] = \sum_{i \geq 0} P(\tau > i)$, valid for any random variable taking values in the nonnegative integers). So the second chain is null recurrent as well.
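If you want to see the difference numerically, here is a minimal Python sketch (entirely my own; it samples $\tau$ directly from the distributions derived above rather than simulating the chains step by step, and the function names are just illustrative). Every sampled return time is finite, illustrating recurrence, but the running average never settles, illustrating $\mathbb{E}[\tau] = \infty$:

```python
import random

def sample_tau_chain1():
    # First chain: tau = 2**n + 1, where P(N = n) = 2**(-n).
    # This is the distribution read off from the computation above,
    # not a step-by-step simulation of the chain itself.
    n = 1
    while random.random() < 0.5:  # each extra step halves the probability
        n += 1
    return 2 ** n + 1

def sample_tau_chain2():
    # Second chain: P(tau > n) = 1/(n+1), i.e. P(tau >= m) = 1/m.
    # Inverse-CDF sampling: tau = floor(1/U) with U uniform on (0, 1].
    u = 1.0 - random.random()  # uniform on (0, 1], avoids division by zero
    return int(1.0 / u)

def running_means(sampler, n_samples):
    # Running averages of n_samples i.i.d. return times.
    total, means = 0, []
    for k in range(1, n_samples + 1):
        total += sampler()
        means.append(total / k)
    return means

if __name__ == "__main__":
    random.seed(0)
    for name, sampler in (("chain 1", sample_tau_chain1), ("chain 2", sample_tau_chain2)):
        means = running_means(sampler, 100_000)
        # Every sampled tau is finite (recurrence), yet the running mean keeps
        # drifting upward instead of converging (E[tau] = infinity).
        print(name, [round(means[m - 1], 1) for m in (100, 1_000, 10_000, 100_000)])
```

For the first chain the running mean is dominated by the single largest jump $2^n$ seen so far, which is exactly why it keeps climbing as you draw more samples.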