We will use the infinity (max) norm on $V\times V$, that is, for $(u,v)\in V\times V$, $\lVert (u,v)\rVert_{V\times V}=\max\{\lVert u\rVert_V,\lVert v\rVert_V\}$, where the norm on $V$ is the norm induced by the inner product $\langle\cdot,\cdot\rangle$. This norm does induce the product topology: convergence in the max norm is exactly coordinatewise convergence, and the same argument works for any finite product of normed spaces. For a function to be jointly continuous means simply that it is continuous in the product topology, rather than only continuous in each variable separately with the others held fixed. That is, $f:X\times Y\to Z$ is jointly continuous if, for all $(x_0,y_0)\in X\times Y$ and any $\epsilon>0$, there exists a $\delta>0$ such that, for all $(x,y)\in X\times Y$, $d_{X\times Y}((x,y),(x_0,y_0))<\delta$ implies $d_Z(f(x,y),f(x_0,y_0))<\epsilon$. So let's do it, using the induced norm as our metric.
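To see why the "jointly" matters, here is a quick numerical illustration (a sketch using the classic counterexample, not part of the proof below): $f(x,y)=xy/(x^2+y^2)$ with $f(0,0)=0$ is continuous in each variable separately at the origin, but not jointly continuous there, since it is constantly $1/2$ along the diagonal $y=x$.

```python
# Classic example: continuous in each variable separately, but not jointly.
# f(x, y) = xy / (x^2 + y^2) for (x, y) != (0, 0), and f(0, 0) = 0.

def f(x, y):
    if x == 0 and y == 0:
        return 0.0
    return x * y / (x ** 2 + y ** 2)

# Separate continuity at (0, 0): each section through the origin is identically 0.
print([f(t, 0.0) for t in (1.0, 0.1, 0.001)])   # [0.0, 0.0, 0.0]
print([f(0.0, t) for t in (1.0, 0.1, 0.001)])   # [0.0, 0.0, 0.0]

# Joint discontinuity: along the diagonal y = x the value stays at 1/2,
# so f(x, y) does not approach f(0, 0) = 0 as (x, y) -> (0, 0).
print([f(t, t) for t in (1.0, 0.1, 0.001)])     # [0.5, 0.5, 0.5]
```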
Fix $(x_0,y_0)\in V\times V$, and let $\epsilon>0$ be given. Choose $\delta=\min\{1,\frac{\epsilon}{2(\lVert y_0\rVert_V + 1)},\frac{\epsilon}{2(\lVert x_0\rVert_V+1)}\}$. Then for any $y\in V$ with $\lVert y-y_0\rVert_V<\delta\leq 1$, the reverse triangle inequality gives $\lvert\lVert y\rVert_V-\lVert y_0\rVert_V\rvert\leq \lVert y-y_0\rVert_V<1$, which implies $\lVert y\rVert_V<1+\lVert y_0\rVert_V$.
Let $x,y\in V$, and suppose $\lVert (x,y)-(x_0,y_0)\rVert_{V\times V}=\lVert(x-x_0,y-y_0)\rVert_{V\times V}<\delta$, implying $\lVert x-x_0\rVert_V<\delta$ and $\lVert y-y_0\rVert_V<\delta$. Then
$\lvert\langle x,y\rangle-\langle x_0,y_0\rangle\rvert = \lvert\langle x,y\rangle-\langle x_0,y\rangle+\langle x_0,y\rangle-\langle x_0,y_0\rangle\rvert$
$\leq \lvert\langle x,y\rangle-\langle x_0,y\rangle\rvert+\lvert\langle x_0,y\rangle-\langle x_0,y_0\rangle\rvert$
$= \lvert\langle x-x_0,y\rangle\rvert+\lvert\langle x_0,y-y_0\rangle\rvert$
$\leq \lVert x-x_0\rVert_V\lVert y\rVert_V + \lVert x_0\rVert_V \lVert y-y_0\rVert_V$ (by Cauchy-Bunyakovsky-Schwarz inequality)
$< (\frac{\epsilon}{2(\lVert y_0\rVert_V+1)})(1+\lVert y_0\rVert_V) + (\lVert x_0\rVert_V+1)(\frac{\epsilon}{2(\lVert x_0\rVert_V+1)})$, (since $\lVert x_0\rVert_V<\lVert x_0\rVert_V+1$, and by our choice of $\delta$)
$= \frac{\epsilon}{2}+\frac{\epsilon}{2} = \epsilon$.
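As a numerical sanity check on the choice of $\delta$ (a sketch in $\mathbb R^5$ with the standard dot product as the inner product, not a proof):

```python
import numpy as np

rng = np.random.default_rng(0)

def delta_for(x0, y0, eps):
    # The delta chosen in the proof above.
    return min(1.0,
               eps / (2 * (np.linalg.norm(y0) + 1)),
               eps / (2 * (np.linalg.norm(x0) + 1)))

eps = 1e-3
for _ in range(1000):
    x0 = rng.normal(size=5)
    y0 = rng.normal(size=5)
    d = delta_for(x0, y0, eps)
    # Random perturbations keeping max{||x - x0||, ||y - y0||} < delta:
    # each coordinate moves by less than d / sqrt(5), so the norm stays below d.
    x = x0 + rng.uniform(-1, 1, size=5) * d / np.sqrt(5)
    y = y0 + rng.uniform(-1, 1, size=5) * d / np.sqrt(5)
    assert abs(x @ y - x0 @ y0) < eps
print("all perturbations stayed within eps")
```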
First note that $$ N = \{a \in A \mid f(b^*a) = 0 \text{ for all }b \in A\}, $$ since by Cauchy-Schwarz we have $$ \lvert f(b^*a) \rvert^2 \leq f(b^*b)f(a^*a) = 0 $$ for all $a \in N$, $b \in A$, and the reverse inclusion follows by taking $b = a$. Furthermore, $N$ is a left ideal, as you already noted.
Now if $a + N = a' + N$ and $b + N = b' + N$ then \begin{align*} \langle a+N, b+N \rangle &= \langle a-a'+N, b+N \rangle + \langle a'+N,b+N \rangle\\ &= \underbrace{f(b^*\underbrace{(a-a')}_{\in N})}_{=0} + \langle a'+N, b+N \rangle\\ &= \langle a'+N, b-b'+N \rangle + \langle a'+N, b'+N \rangle\\ &= f((b-b')^*a') + \langle a'+N, b'+N \rangle\\ &= \overline{f((a')^*(b-b'))} + \langle a'+N, b'+N \rangle\\ &= \langle a'+N, b'+N \rangle, \end{align*} where the second-to-last step uses that $f(x^*) = \overline{f(x)}$ for all $x \in A$, which holds for every positive linear functional $f:A\rightarrow \mathbb C$, and the last step uses that $f((a')^*(b-b')) = 0$ because $b-b' \in N$ and by the characterization of $N$ above.
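As a concrete sanity check of this well-definedness (a numerical sketch, not part of the argument): take $A = M_2(\mathbb C)$ with $f(x) = \operatorname{tr}(\rho x)$ for $\rho = \operatorname{diag}(1,0)$, a positive linear functional. Here $f(a^*a) = (a^*a)_{11}$ is the squared norm of the first column of $a$, so $N$ consists of the matrices whose first column vanishes, and one can check that $\langle a+N, b+N \rangle = f(b^*a)$ is independent of the representatives.

```python
import numpy as np

rng = np.random.default_rng(1)

rho = np.diag([1.0, 0.0])          # positive semidefinite, so f below is positive

def f(x):
    # Positive linear functional f(x) = tr(rho x) on A = M_2(C).
    return np.trace(rho @ x)

def inner(a, b):
    # <a + N, b + N> := f(b* a)
    return f(b.conj().T @ a)

def random_matrix():
    return rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))

def random_null():
    # N = {a in A : f(a* a) = 0} = matrices whose first column is zero.
    n = np.zeros((2, 2), dtype=complex)
    n[:, 1] = rng.normal(size=2) + 1j * rng.normal(size=2)
    return n

a, b = random_matrix(), random_matrix()
n, m = random_null(), random_null()

# Changing representatives a -> a + n, b -> b + m (with n, m in N)
# leaves the inner product unchanged.
print(abs(inner(a + n, b + m) - inner(a, b)) < 1e-12)   # True
```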