Your question is a very natural one (if I understand it correctly), and at the same time it touches on a rather difficult issue.
As you say, it makes sense to identify the operators $A : X \to Y$ of finite rank with elements of $X^{\ast} \otimes Y$. Now you want the operator norm of $A$ to coincide with its norm in $X^{\ast} \otimes Y$ with respect to some tensor norm. I leave it to you to check that the injective tensor norm defined for $\omega = \sum x_{i}^{\ast} \otimes y_{i}$ by
$$\Vert \omega \Vert_{\varepsilon} = \sup_{\substack{\Vert \phi \Vert_{X^{\ast\ast}} \leq 1 \\ \Vert \psi \Vert_{Y^{\ast}} \leq 1}}{\left\vert \sum \phi(x_{i}^{\ast}) \, \psi(y_{i})\right\vert}$$
does what you want (it is independent of the representation of $\omega$ as a finite sum of elementary tensors and $\|A\| = \|A\|_{\varepsilon}$ for operators of finite rank). Edit: To see the second claim, use Goldstine's theorem that allows you to replace the supremum over $\phi \in X^{\ast\ast}$ with $\Vert\phi\Vert_{X^{\ast\ast}} \leq 1$ by the supremum over $\operatorname{ev}_{x}$ with $x \in X$ and $\Vert x \Vert_{X} \leq 1$.
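To make the equality $\|A\| = \|A\|_{\varepsilon}$ concrete, here is a small finite-dimensional sketch (my own choice of spaces, not part of the argument above): take $X = \mathbb{R}^2$ with the $\ell^1$ norm and $Y = \mathbb{R}^3$ with the sup norm. Then $X^{\ast\ast} = X$, $Y^{\ast}$ carries the $\ell^1$ norm, both suprema are attained at the extreme points $\pm e_j$ of the relevant $\ell^1$ balls, and both norms reduce to $\max_{i,j} |a_{ij}|$:

```python
# A finite rank operator A : R^2 -> R^3, written as a matrix (rows = Y side).
# The domain R^2 carries the l^1 norm, the codomain R^3 the sup norm.
A = [[1.0, -3.0], [2.0, 0.5], [0.0, 4.0]]

def operator_norm_l1_to_sup(a):
    # sup over ||x||_1 <= 1 of ||A x||_inf is attained at the extreme
    # points +-e_j of the l^1 ball, so it equals max_j max_i |a_ij|
    return max(max(abs(a[i][j]) for i in range(len(a)))
               for j in range(len(a[0])))

def injective_norm(a):
    # sup over phi in the l^1 ball of X** and psi in the l^1 ball of Y*
    # of |psi^T A phi|, again attained at +- basis vectors
    return max(abs(a[i][j]) for i in range(len(a)) for j in range(len(a[0])))

assert operator_norm_l1_to_sup(A) == injective_norm(A) == 4.0
```

For general Banach spaces the suprema are of course not attained at finitely many points; this only illustrates the coincidence of the two norms in a case where both can be computed exactly.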
The projective tensor norm of a finite rank operator is usually much larger than its operator norm (see the discussion on pp. 41ff. in Ryan's book, for example).
Given this, we can identify the completion $X^{\ast} \otimes_{\varepsilon} Y$ of $X^{\ast} \otimes Y$ with respect to the injective tensor norm with a space $K_{0}(X,Y) \subset L(X,Y)$ of operators $X \to Y$ and we will freely do so from now on. Note that $K_{0}(X,Y)$ is nothing but the closure of the operators of finite rank in $L(X,Y)$.
Now the question is: which operators lie in $K_{0}(X,Y)$?
This is genuinely difficult, and I'll outline the closest thing to an answer that I know.
As a first observation, note that the compact operators $K(X,Y) \subset L(X,Y)$ form a closed subspace of $L(X,Y)$ containing $X^{\ast} \otimes_{\varepsilon} Y = K_{0}(X,Y)$. Looking at the examples of Hilbert spaces or the classical Banach spaces, one finds that quite often $K(X,Y) = K_{0}(X,Y)$ holds. However, this may fail in general, and that's where the famous Approximation Property comes in. I'll refrain from delving into the numerous equivalent formulations and use it as a black box. We have the following theorem due to Grothendieck:
Theorem. The Banach space $X^{\ast}$ has the approximation property if and only if $K_{0}(X,Y) = K(X,Y)$ holds for all Banach spaces $Y$.
Edit 2: (in response to a comment of the OP) It follows that for a reflexive Banach space $X$ with the approximation property we have $K(X,Y) = X^{\ast} \otimes_{\varepsilon} Y$ for all Banach spaces $Y$.
Now most of the Banach spaces you'll run into have the approximation property, e.g. $L^{p}$, $C(K)$ and so on. However, P. Enflo (in a veritable tour de force) showed that there exist Banach spaces failing the approximation property. An explicit example (identified by Szankowski) is the space $L(H,H)$ of bounded operators on a separable Hilbert space $H$. Note that this space is the dual space of the trace class operators. A famously open question is whether the space $H^{\infty}(D)$ of bounded holomorphic functions on the open unit disk has the approximation property.
I hope this answers your question. The approximation property is discussed in detail in any book that treats the tensor products of Banach spaces. In particular, this is well treated in Ryan's book.
The Hilbert tensor product is in general not equal to the projective tensor product:
If $H$ is a Hilbert space and $H^*$ its dual space,
then
- $H \hat \otimes_\pi H^*$ (the projective tensor product) is (isometrically isomorphic to)
the trace class (nuclear) operators with the trace norm
- $H \hat \otimes_\epsilon H^*$ (the injective tensor product) is (isometrically isomorphic to)
the compact operators with the operator norm
- $H \hat \otimes_h H^*$ (the Hilbert tensor product, which is a Hilbert space again) is (isometrically isomorphic to) the Hilbert-Schmidt operators with the Hilbert-Schmidt norm
Since the spaces of Hilbert-Schmidt, compact and trace class operators are never the same in infinite dimensions, it follows that the tensor norms cannot be the same either.
To show that they are never the same in infinite dimensions:
Let $(e_n)_{n \in \mathbb{N}}$ be an orthonormal system in $H$.
Let $(x_n)_{n \in \mathbb{N}}$ be any sequence of complex numbers that converges to $0$ but is not square summable (i.e. $\sum_n |x_n|^2 = \infty$). For example $x_n = 1/\sqrt{n}$.
Define a linear operator $T: H\to H$ by $Ty = \sum_{n \in \mathbb{N}} x_n \langle e_n, y \rangle e_n$. Then $T$ is compact, but not Hilbert-Schmidt.
Let $(s_n)_{n \in \mathbb{N}}$ be any square summable sequence of complex numbers that is not absolutely summable (i.e. $\sum_n |s_n|^2 < \infty$ but $\sum_n |s_n| = \infty$).
For example $s_n := 1/n$.
Define a linear operator $S : H \to H$ by $Sy = \sum_{n \in \mathbb{N}} s_n \langle e_n, y \rangle e_n$. Then $S$ is Hilbert-Schmidt, but not trace class.
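Both constructions can be checked numerically via truncated sums over the diagonal (a sketch; the truncation length `N` and the thresholds are my own choices). For a diagonal operator with entries $s_n \geq 0$, the operator, Hilbert-Schmidt and trace norms are $\sup_n s_n$, $(\sum_n s_n^2)^{1/2}$ and $\sum_n s_n$ respectively:

```python
import math

N = 100_000
x = [1 / math.sqrt(n) for n in range(1, N + 1)]  # diagonal of T
s = [1 / n for n in range(1, N + 1)]             # diagonal of S

# T is compact (its diagonal tends to 0) but not Hilbert-Schmidt:
# sum x_n^2 is the harmonic series, which diverges (~ log N).
hs_T_sq = sum(v * v for v in x)
assert x[-1] < 0.01 and hs_T_sq > 10

# S is Hilbert-Schmidt (sum s_n^2 <= pi^2/6 < infinity) but not trace
# class: sum s_n is again the divergent harmonic series.
hs_S_sq = sum(v * v for v in s)
tr_S = sum(s)
assert hs_S_sq < math.pi ** 2 / 6 and tr_S > 10
```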
Intuitively, this requirement ensures that $V \otimes W$, equipped with the norm $\|\cdot\|_{V \otimes W}$, is a Banach space, at least when $V$ and $W$ are finite-dimensional.
Banach spaces are normed vector spaces that are complete, i.e. closed under taking limits. In other words, if you take any sequence of vectors $\{v_1, v_2, v_3, \dots\}$ in a normed space $V$ that gets arbitrarily close to some vector $l$ (measuring distance with the norm $\|\cdot\|_V$), then $l$ lies in $V$. We can formalize this by saying that as $i$ goes to infinity, $\|l - v_i\|_V$ tends to $0$: $$\lim_{i \to \infty} \|l - v_i\|_V = 0$$
Say we have two such spaces, $V$ and $W$. If we have a converging sequence $\{v_1, v_2, v_3, \dots\}$ in $V$, it is natural to expect that the sequence $\{v_1 \otimes w, v_2 \otimes w, v_3 \otimes w, \dots\}$ should also converge (for an arbitrary fixed $w \in W$).$^{[1]}$ Moreover, it should converge to $l \otimes w$. Using the notation above, we can write this as $$\lim_{i \to \infty} \|l \otimes w - v_i \otimes w\|_{V \otimes W} = 0$$ However, no norm exists by default on $V \otimes W$, so this doesn't have to be true!
Let's take another look at your inequality: $$\|v \otimes w\|_{V \otimes W} \leq \|v\|_V \|w\|_W$$ Take any convergent sequence $\{v_i\}$ in the Banach space $V$ with limit $l$ and replace $v$ with $l - v_i$. That gives us $$\|(l-v_i) \otimes w\|_{V \otimes W} \leq \|l-v_i\|_V \|w\|_W$$ or, using the bilinearity of the tensor product, $$\|l \otimes w - v_i \otimes w\|_{V \otimes W} \leq \|l-v_i\|_V \|w\|_W$$ As $i \to \infty$, the right side goes to $0$. Since norms are non-negative, the left side must also go to $0$. But this is $\textit{exactly the definition}$ of what it means for $\{v_1 \otimes w, v_2 \otimes w, v_3 \otimes w, \dots\}$ to converge to $l \otimes w$!
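For intuition, here is a numerical sketch of this squeeze (my own toy data). It uses the Frobenius norm on outer products of vectors in $\mathbb{R}^n$, which satisfies your inequality, in fact with equality: $\|v \otimes w\|_F = \|v\|_2 \|w\|_2$.

```python
import math

def outer(v, w):        # the elementary tensor v ⊗ w as a matrix
    return [[a * b for b in w] for a in v]

def frob(m):            # Frobenius norm, a cross norm on R^n ⊗ R^m
    return math.sqrt(sum(x * x for row in m for x in row))

def norm2(v):
    return math.sqrt(sum(x * x for x in v))

l = [1.0, 2.0]
w = [3.0, -1.0, 0.5]
for i in range(1, 6):
    v_i = [l[0] - 1.0 / i, l[1] + 1.0 / i]   # v_i -> l in V
    d = [l[k] - v_i[k] for k in range(2)]    # l - v_i
    lhs = frob(outer(d, w))                  # ||l ⊗ w - v_i ⊗ w||
    rhs = norm2(d) * norm2(w)                # ||l - v_i|| * ||w||
    assert lhs <= rhs + 1e-12                # the cross-norm inequality
assert lhs < 1.0                             # both sides shrink as i grows
```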
Furthermore, since $l \in V$ (by the completeness of $V$), we know $l \otimes w$ is a vector in $V \otimes W$. Since we picked $\{v_i\}$ arbitrarily, any converging sequence in $V$ gives a corresponding converging sequence in $V \otimes W$, and the limit $l \otimes w$ exists in $V \otimes W$. In other words, $V \otimes W$ behaves like a Banach space, at least for limits of elementary tensors. This works precisely when$^{[2]}$ the norm we choose satisfies the inequality.
$[1]$: The flip side is also true: a converging sequence $\{w_1, w_2, w_3, \dots\}$ in $W$ should imply that the sequence $\{v \otimes w_1, v \otimes w_2, v \otimes w_3, \dots\}$ converges to $v \otimes l_w$ for any $v \in V$. Otherwise we could just as well write your inequality as $\|v \otimes w\|_{V \otimes W} \leq \|v\|_V$.
$[2]$: Necessity comes from considering the converse. If we were allowed to choose a norm such that $\|v \otimes w\|_{V \otimes W} > \|v\|_V \|w\|_W$, then $\{v_1 \otimes w, v_2 \otimes w, v_3 \otimes w, \dots\}$ need not converge to $l \otimes w$ at all; it might even converge to something that isn't in $V \otimes W$ -- which would be bad news for our hopes of a Banach space.
EDIT: I just realized I forgot to address general Cauchy sequences in $V \otimes W$. Elementary tensors span $V \otimes W$, and rescaling, $v \otimes w = (cv) \otimes (\tfrac{1}{c} w)$ for nonzero $c \in \mathbb{C}$, does not change the tensor; so consider a sequence of elementary tensors $\{v_1 \otimes w_1, v_2 \otimes w_2, v_3 \otimes w_3, \dots\}$ built from two Cauchy sequences $\{v_1, v_2, v_3, \dots\}$ in $V$ and $\{w_1, w_2, w_3, \dots\}$ in $W$ with limits $l_v$ and $l_w$. We want to show $$\lim_{i\to\infty} \|l_v \otimes l_w - v_i \otimes w_i \|_{V \otimes W} = 0$$ Using the inequality: $$\|(l_v-v_i) \otimes (l_w - w_i)\|_{V \otimes W} \leq \|l_v-v_i\|_V \|l_w - w_i\|_W$$ We can expand the tensor product on the left to $$\|l_v \otimes l_w - l_v \otimes w_i - v_i \otimes l_w + v_i \otimes w_i\|_{V \otimes W} \leq \|l_v-v_i\|_V \|l_w - w_i\|_W$$ By the one-variable case shown in the main answer, the middle two tensor products both converge to $l_v \otimes l_w$, so by the triangle inequality the left side differs from $\|l_v \otimes l_w - v_i \otimes w_i\|_{V \otimes W}$ by terms that tend to $0$. Since the right side also tends to $0$ (and norms are non-negative), the desired limit follows.
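The four-term expansion used above is just bilinearity of the tensor product; here is a quick numerical check on outer products of toy vectors (my own data):

```python
def outer(v, w):
    return [[a * b for b in w] for a in v]

lv, lw = [1.0, -2.0], [0.5, 3.0]   # the limits
v, w = [0.9, -1.8], [0.4, 3.2]     # one term of the Cauchy sequences

# (l_v - v) ⊗ (l_w - w) computed directly ...
left = outer([lv[k] - v[k] for k in range(2)],
             [lw[k] - w[k] for k in range(2)])
# ... and via the expansion l_v⊗l_w - l_v⊗w - v⊗l_w + v⊗w
t = [outer(lv, lw), outer(lv, w), outer(v, lw), outer(v, w)]
right = [[t[0][i][j] - t[1][i][j] - t[2][i][j] + t[3][i][j]
          for j in range(2)] for i in range(2)]

for i in range(2):
    for j in range(2):
        assert abs(left[i][j] - right[i][j]) < 1e-12
```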