When the matrix is in echelon form, for each non-zero row $R_i$, you can divide the row by its leading non-zero value, which makes the leading value $1$. Next, for each row $R_n$ above $R_i$, you can subtract from $R_n$ the row $R_i$ multiplied by the entry of $R_n$ in the same column as that leading $1$; this leaves $R_n$ with a $0$ in that column. If you follow this procedure starting with the first row and working down, by the time you are done the matrix will be in reduced echelon form, and guess what! Those leading $1$s that define the pivot positions are exactly the locations of the leading non-zero values in each row before it was reduced.
In other words, when a matrix is in echelon form, the pivot positions are exactly the leading non-zero values in each row. Quite frankly, if I had written the definition, that's how I would have defined it, since the two are equivalent and you need to know the pivots before you get to reduced echelon form.
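The procedure above can be sketched in a few lines of Python. This is a minimal illustration, not part of the original answer; the name `echelon_to_rref` and the use of exact `Fraction` arithmetic are my own choices:

```python
from fractions import Fraction

def echelon_to_rref(rows):
    """Reduce a matrix already in echelon form to reduced echelon form.

    rows: list of lists of numbers, assumed to be in echelon form
    (each leading entry strictly to the right of the one above it).
    """
    A = [[Fraction(x) for x in row] for row in rows]
    for i, row in enumerate(A):
        # Find the leading non-zero entry of this row, if any.
        pivot_col = next((j for j, x in enumerate(row) if x != 0), None)
        if pivot_col is None:
            continue  # all-zero row: nothing to do
        # Scale the row so its leading entry becomes 1.
        lead = row[pivot_col]
        A[i] = [x / lead for x in row]
        # Clear the entries above the new leading 1.
        for n in range(i):
            factor = A[n][pivot_col]
            A[n] = [a - factor * b for a, b in zip(A[n], A[i])]
    return A
```

Running it on the echelon matrix below reproduces the reduced form computed by hand in this answer.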
For example, in your matrix, I marked the leading non-zero entries in red:
$$\begin{bmatrix}
\color{red}1 &4 &5 &-9 &7 \\
0 &\color{red}2 &4 &-6 &-6 \\
0 &0 &0 &\color{red}{-5} &0 \\
0 &0 &0 &0 &0 \\
\end{bmatrix}$$
First, divide each row by its leading non-zero value:
$$\begin{bmatrix}
\color{red}1 &4 &5 &-9 &7 \\
0 &\color{red}1 &2 &-3 &-3 \\
0 &0 &0 &\color{red}1 &0 \\
0 &0 &0 &0 &0 \\
\end{bmatrix}$$
Subtract $4$ times Row 2 from Row 1:
$$\begin{bmatrix}
\color{red}1 &0 &-3 &3 &19 \\
0 &\color{red}1 &2 &-3 &-3 \\
0 &0 &0 &\color{red}1 &0 \\
0 &0 &0 &0 &0 \\
\end{bmatrix}$$
Subtract $3$ times Row 3 from Row 1, and add $3$ times Row 3 to Row 2:
$$\begin{bmatrix}
\color{red}1 &0 &-3 &0 &19 \\
0 &\color{red}1 &2 &0 &-3 \\
0 &0 &0 &\color{red}1 &0 \\
0 &0 &0 &0 &0 \\
\end{bmatrix}$$
And now we are in reduced echelon form. See that the pivot points from the definition are in the same locations as the echelon-form leading non-zero values.
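The claim can also be checked mechanically: the column of the leading non-zero entry in each non-zero row is the same before and after reduction. A small sketch (the helper `leading_cols` is my own, not from the original answer):

```python
echelon = [[1, 4, 5, -9, 7],
           [0, 2, 4, -6, -6],
           [0, 0, 0, -5, 0],
           [0, 0, 0, 0, 0]]

rref = [[1, 0, -3, 0, 19],
        [0, 1, 2, 0, -3],
        [0, 0, 0, 1, 0],
        [0, 0, 0, 0, 0]]

def leading_cols(matrix):
    """Column index of the first non-zero entry in each non-zero row."""
    return [next(j for j, x in enumerate(row) if x != 0)
            for row in matrix if any(x != 0 for x in row)]

# The pivot columns of the reduced form coincide with the columns of
# the leading non-zero entries of the original echelon form.
print(leading_cols(echelon))  # [0, 1, 3]
print(leading_cols(rref))     # [0, 1, 3]
```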
Let $ S = \{(1, 0, 0), (0, 1, 0)\} $. Then $ S $ is linearly independent, as is easily seen; on the other hand, $ (0, 0, 1) \notin \textrm{span}\, S $, so $ S $ does not span $ \mathbb{R}^3 $.
On the other hand, your intuition is partly correct due to the following result:
Theorem. Let $ V $ be a vector space, $ L $ a linearly independent subset and $ S $ a subset that spans $ V $. Then, $ |L| \leq |S| $.
Proof. Let $ S = S_0 = \{ s_i : 1 \leq i \leq n \} $ be finite and let $ L = \{ b_i : 1 \leq i \leq m \} $. We construct a sequence of spanning sets. Given the spanning set $ S_k $, construct $ S_{k+1} $ as follows: since $ S_k $ spans $ V $, we may write $ b_{k+1} = \sum c_i t_i $ where $ t_i \in S_k $ and the $ c_i $ are members of the field of scalars. As $ L $ is linearly independent, $ b_{k+1} $ is not a linear combination of $ b_1, \ldots, b_k $ alone, so some $ t_j $ with $ c_j \neq 0 $ is not an element of $ L $. Removing $ t_j $ from $ S_k $ and replacing it with $ b_{k+1} $ gives a spanning set with the same number of elements as $ S_k $ (since $ t_j $ can be expressed as a linear combination of the elements of this new set). Define this set to be $ S_{k+1} $.
With this construction, a new element of $ L $ is added at each step, while the cardinality of the sets remains unchanged. The construction halts at $ S_m $, which contains all elements of $ L $; therefore $ L \subseteq S_m $ and $|L| \leq |S_m| = |S|$, which establishes the result.
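As a concrete illustration of the exchange step (my own example, not from the original answer), take $ V = \mathbb{R}^2 $, $ S = S_0 = \{(1,0), (0,1)\} $ and $ L = \{(1,1), (1,-1)\} $. Then:
$$b_1 = (1,1) = 1\cdot(1,0) + 1\cdot(0,1) \implies S_1 = \{(1,1),\, (0,1)\}$$
$$b_2 = (1,-1) = 1\cdot(1,1) - 2\cdot(0,1) \implies S_2 = \{(1,1),\, (1,-1)\}$$
At each step the element swapped out, $(1,0)$ and then $(0,1)$, has a non-zero coefficient and is not in $ L $, so each $ S_k $ still spans $ \mathbb{R}^2 $, and indeed $ |L| = 2 \leq 2 = |S| $.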
Corollary. Let $ V $ have dimension $ n $ over its field of scalars and let $ L $ be a linearly independent subset of $ V $ which has $ n $ elements. Then, $ L $ is a basis of $ V $.
Proof. Let $ B $ be a basis for $ V $; then $ |B| = n $. Consider the set $ L' = L \cup \{v\} $ for any $ v \in V $ with $ v \notin L $. This set has $ n+1 $ elements; however, any linearly independent subset of $ V $ has at most $ n $ elements by the above theorem, since $ B $ is a spanning subset. Therefore $ L' $ is linearly dependent, so there is a non-trivial dependence relation among its elements. The coefficient of $ v $ in that relation must be non-zero (otherwise $ L $ itself would be linearly dependent), so $ v $ can be expressed as a linear combination of the elements of $ L $. This establishes that $\textrm{span}\, L = V $, and by the definition of a basis, $ L $ is a basis of $ V $.
Therefore, if your linearly independent subset has as many elements as the dimension of your vector space, then it has to span your space.
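For $ \mathbb{R}^3 $, the corollary can be checked numerically: three vectors are linearly independent iff the matrix they form has non-zero determinant, and by the corollary such a triple is automatically a basis. A small sketch (the vectors and the helper `det3` are my own illustration):

```python
# Three vectors in R^3; they are independent, hence a basis by the corollary.
v1, v2, v3 = (1, 1, 0), (0, 1, 1), (1, 0, 1)

def det3(a, b, c):
    """Determinant of the 3x3 matrix with rows a, b, c (cofactor expansion)."""
    return (a[0] * (b[1] * c[2] - b[2] * c[1])
            - a[1] * (b[0] * c[2] - b[2] * c[0])
            + a[2] * (b[0] * c[1] - b[1] * c[0]))

# Non-zero determinant <=> the three vectors are linearly independent,
# which for three vectors in a 3-dimensional space means they span it.
print(det3(v1, v2, v3))  # 2
```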
Best Answer
The better question is: for which values of $h$ does the set of vectors not span $\mathbb R^3$? And to answer this question, note that three vectors in $\mathbb R^3$ fail to span iff they are linearly dependent.
If they are linearly dependent, there exist some $x,y$ such that $a_1=xa_2+ya_3$. Hence, we have the following relations: $$4x+y=-1,\qquad-2x-6y=h,\qquad 5x+2y=7$$Solving the first and third relations gives $x=-3$ and $y=11$, and substituting into the second gives $h=-60$.
So, the set of vectors spans $\mathbb R^3$ iff $h\neq-60$.
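The arithmetic is easy to verify mechanically: solve the first and third relations for $x$ and $y$ by Cramer's rule, then read $h$ off the second. A minimal sketch using exact rational arithmetic (my own check, not part of the original answer):

```python
from fractions import Fraction

# The two relations that do not involve h:
#   4x +  y = -1
#   5x + 2y =  7
a, b, e = 4, 1, -1
c, d, f = 5, 2, 7

det = a * d - b * c                 # 4*2 - 1*5 = 3
x = Fraction(e * d - b * f, det)    # (-2 - 7) / 3 = -3
y = Fraction(a * f - e * c, det)    # (28 + 5) / 3 = 11

# Substitute into the middle relation -2x - 6y = h.
h = -2 * x - 6 * y                  # 6 - 66 = -60
print(x, y, h)  # -3 11 -60
```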