Knowing a basis of a vector space $V$ allows us to reduce computations in $V$ to computations with matrices. If $B=\{v_1,\dots,v_n\}$ is a basis of $V$, we can define a map $C_B\colon V\to\mathbb{R}^n$ by setting
$$
C_B(v)=\begin{bmatrix}\alpha_1 \\ \vdots \\ \alpha_n\end{bmatrix}
\text{ if and only if }
v=\alpha_1v_1+\dots+\alpha_nv_n
$$
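For example, if $V$ is the space of polynomials of degree at most $2$ and $B=\{1,x,x^2\}$ (the basis we will use below), then
$$
C_B(a_0+a_1x+a_2x^2)=\begin{bmatrix}a_0\\a_1\\a_2\end{bmatrix}
$$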
The map $C_B$ is linear and bijective, so a set $\{w_1,\dots,w_m\}$ of vectors in $V$ is linearly independent if and only if the set
$$
\{C_B(w_1),\dots,C_B(w_m)\}
$$
is linearly independent in $\mathbb{R}^n$. But for this we have Gaussian elimination and row echelon forms.
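If you want to automate this check, here is a minimal sympy sketch (the helper name `coords_independent` is mine, just for illustration): stack the coordinate vectors as columns and compare the rank with the number of vectors.

```python
from sympy import Matrix

def coords_independent(*vectors):
    # Stack the coordinate vectors (given as lists) as the columns of
    # a matrix; they are linearly independent exactly when the rank of
    # that matrix equals the number of vectors.
    M = Matrix.hstack(*(Matrix(v) for v in vectors))
    return M.rank() == len(vectors)
```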
Here we take $B=\{1,x,x^2\}$, so that the set of vectors in $\mathbb{R}^3$ corresponding to $\{-1+x-2x^2,\,3+3x+6x^2,\,9\}$ is
$$
\left\{
\begin{bmatrix}-1\\1\\-2\end{bmatrix}\,,
\begin{bmatrix}3\\3\\6\end{bmatrix}\,,
\begin{bmatrix}9\\0\\0\end{bmatrix}\,
\right\}
$$
and you just need to find a row echelon form of the matrix whose columns are these vectors:
$$
\begin{bmatrix}
-1 & 3 & 9 \\
1 & 3 & 0 \\
-2 & 6 & 0
\end{bmatrix}
$$
Without fully reducing the pivots (just add row 1 to row 2, then subtract twice row 1 from row 3), it turns out to be
$$
\begin{bmatrix}
-1 & 3 & 9 \\
0 & 6 & 9 \\
0 & 0 & -18
\end{bmatrix}
$$
which has three nonzero pivots, so the columns are linearly independent, and hence so is the first set of polynomials.
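As a machine double-check (a minimal sympy sketch using the same matrix):

```python
from sympy import Matrix

# Columns: coordinate vectors of -1 + x - 2x^2, 3 + 3x + 6x^2, and 9.
M = Matrix([[-1, 3, 9],
            [ 1, 3, 0],
            [-2, 6, 0]])
print(M.rank())     # 3: full rank, so the three polynomials are independent
print(M.rref()[0])  # the 3x3 identity: three pivots, one per column
```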
The matrix to consider for case b is
$$
\begin{bmatrix}
1&0&-2&0\\
1&0&0&-3\\
0&1&2&0
\end{bmatrix}
$$
You'll easily find that the fourth column is a linear combination of the first three, which form a linearly independent set; indeed, four vectors in $\mathbb{R}^3$ can never be linearly independent.
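If you also want the coefficients of that combination, a short sympy computation produces them (a sketch; the variable names are mine):

```python
from sympy import Matrix

# A holds the first three columns of the matrix for case b;
# b is its fourth column.
A = Matrix([[1, 0, -2],
            [1, 0,  0],
            [0, 1,  2]])
b = Matrix([0, -3, 0])
print(A.solve(b))  # Matrix([[-3], [3], [-3/2]])
```

so the fourth column is $-3$ times the first, plus $3$ times the second, minus $\tfrac{3}{2}$ times the third.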