What is true is that if $V$ is a Euclidean space, and $\beta=\{\mathbf{e}_1,\ldots,\mathbf{e}_n\}$ is an orthonormal basis, then for any vectors $v$ and $w$ we will have
$$\tau(v,w) = [v]_{\beta}\cdot [w]_{\beta},$$
where $[x]_{\beta}$ is the coordinate vector with respect to the basis $\beta$.
To see this, note that if $v=\alpha_1\mathbf{e}_1+\cdots+\alpha_n\mathbf{e}_n$ and $w = a_1\mathbf{e}_1+\cdots+a_n\mathbf{e}_n$, then
$$\begin{align*}
\tau(v,w) &= \tau(\alpha_1\mathbf{e}_1+\cdots+\alpha_n\mathbf{e}_n,a_1\mathbf{e}_1+\cdots+a_n\mathbf{e}_n)\\
&= \sum_{i=1}^n\sum_{j=1}^n\tau(\alpha_i\mathbf{e}_i,a_j\mathbf{e}_j)\\
&= \sum_{i=1}^n\sum_{j=1}^n \alpha_ia_j\tau(\mathbf{e}_i,\mathbf{e}_j)\\
&= \sum_{i=1}^n\sum_{j=1}^n \alpha_ia_j\delta_{ij} &\text{(Kronecker's }\delta\text{)}\\
&= \alpha_1a_1+\cdots+\alpha_na_n\\
&= (\alpha_1,\ldots,\alpha_n)\cdot (a_1,\ldots,a_n)\\
&= [v]_{\beta}\cdot [w]_{\beta}.
\end{align*}$$
However, in terms of the standard basis for $V$, the inner product may "look" different.
While it is not true that every positive definite symmetric bilinear form on $\mathbb{R}^n$ is equal to the standard dot product, it is true that $(\mathbb{R}^n,\tau)$ is isomorphic to $\mathbb{R}^n$ with the standard dot product: there exists an invertible linear transformation $T\colon\mathbb{R}^n\to\mathbb{R}^n$ such that $\tau(v,w) = T(v)\cdot T(w)$ for all $v,w\in\mathbb{R}^n$. Namely, pick an orthonormal basis for $(\mathbb{R}^n,\tau)$ and let $T$ be the map that sends each $v$ to its coordinate vector with respect to that basis.
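If you want to see such a $T$ concretely, here is a small numerical sketch (the matrix $A$ is just an example choice): when $\tau(v,w) = v^T A w$ with $A$ symmetric positive definite, the Cholesky factorisation $A = LL^T$ gives $T(v) = L^T v$, since then $T(v)\cdot T(w) = v^T L L^T w = v^T A w = \tau(v,w)$.

```python
import numpy as np

# tau(v, w) = v^T A w for a symmetric positive definite A (example choice).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

L = np.linalg.cholesky(A)      # A = L @ L.T
T = L.T                        # T(v) = L^T v, an invertible linear map

rng = np.random.default_rng(0)
v, w = rng.standard_normal(2), rng.standard_normal(2)

tau_vw = v @ A @ w             # tau(v, w)
dot_Tv_Tw = (T @ v) @ (T @ w)  # T(v) . T(w)
print(np.isclose(tau_vw, dot_Tv_Tw))   # -> True
```

Here the columns of $(L^T)^{-1}$ form a $\tau$-orthonormal basis, and $T(v)=L^Tv$ is exactly the coordinate vector of $v$ with respect to it.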
Your question essentially reduces to the spectral theorem for symmetric bilinear forms. Use $\theta$, the positive definite form, as an inner product. This makes $(V,\theta)$ a (real) inner product space, and hence the spectral theorem applied to $\psi$ gives the answer.
For a sketch of the proof of the spectral theorem, look at the set of vectors $S := \{ v\in V \mid \theta(v,v) = 1\}$. Note that by positive definiteness every nonzero vector $w\in V$ is a positive multiple of some $s\in S$. In fact, $S$ is a topological sphere and is compact, so we can let $e_1$ be a vector in $S$ such that $\psi(e_1,e_1) = \inf_S \psi(s,s)$ (the infimum is attained by compactness). Let $S_1 = S \cap \{e_1\}^\perp$, where $\perp$ is defined relative to $\theta$; define $e_2$ as a vector in $S_1$ such that $\psi(e_2,e_2) = \inf_{S_1} \psi(s,s)$, and so on. By induction we arrive at a collection of vectors which are orthonormal with respect to $\theta$. That they are also $\psi$-orthogonal follows by minimisation: if there exists $s\in S_1$ such that $\psi(e_1,s) \neq 0$, then for $a^2 + b^2 = 1$ we have $a e_1 + b s \in S$ (since $\theta(e_1,s) = 0$) and
$$ \psi(a e_1 + b s, a e_1 + b s) = a^2 \psi(e_1,e_1) + b^2 \psi(s,s) + 2ab \psi(e_1,s) = \psi(e_1,e_1) + b^2 \bigl(\psi(s,s) - \psi(e_1,e_1)\bigr) + 2ab \psi(e_1,s). $$
By choosing $|b| < 1/2$ sufficiently small such that
$$ \left|\frac{1}{b}\right| > \left|\frac{\psi(s,s) - \psi(e_1,e_1)}{\psi(e_1,s)}\right| $$
and choosing the sign of $b$ so that $ab\,\psi(e_1,s) < 0$, we can make
$$ \psi(a e_1 + bs, a e_1 + bs) < \psi(e_1,e_1) $$
contradicting the minimisation assumption. By induction the same can be said of all $e_i$, and hence they are mutually orthogonal relative to $\psi$.
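For a numerical sanity check of the statement (not of the proof), one can build a simultaneously diagonalising basis by first $\theta$-orthonormalising via a Cholesky factor and then applying the ordinary spectral theorem to $\psi$ in those coordinates. The matrices below are arbitrary example choices.

```python
import numpy as np

# theta positive definite, psi merely symmetric (example matrices).
theta = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
psi = np.array([[1.0, -3.0],
                [-3.0, 0.0]])

# Step 1: theta = L L^T, so the columns of (L^T)^{-1} are theta-orthonormal.
L = np.linalg.cholesky(theta)
Linv = np.linalg.inv(L)

# Step 2: in those coordinates psi becomes the symmetric matrix M; diagonalise it.
M = Linv @ psi @ Linv.T
eigvals, Q = np.linalg.eigh(M)

# Columns of B are theta-orthonormal and psi-orthogonal.
B = Linv.T @ Q
print(np.allclose(B.T @ theta @ B, np.eye(2)))        # -> True
print(np.allclose(B.T @ psi @ B, np.diag(eigvals)))   # -> True
```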
It is important to note that the assumption that $\theta$ is positive definite is essential. In the proof above we used the fact that for a positive definite form, its corresponding "unit sphere" is a topological sphere and is a compact set in $V$. For an indefinite form or a degenerate form, the corresponding "sphere" would be non-compact (imagine some sort of hyperboloid or cylinder), and hence it can happen that the infimum of a continuous function on the surface is not achieved, breaking the argument.
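To see this failure concretely (with matrices chosen here purely for illustration), take $\theta$ with matrix $\begin{pmatrix} 1 & 0 \\ 0 & -1\end{pmatrix}$ and $\psi$ with matrix $\begin{pmatrix} 0 & 1 \\ 1 & 0\end{pmatrix}$ on $\mathbb{R}^2$. The "unit sphere" $S = \{(x,y) : x^2 - y^2 = 1\}$ is an unbounded hyperbola, parametrised by $(\pm\cosh t, \sinh t)$, and on it
$$ \psi(s,s) = 2xy = \pm 2\cosh t\,\sinh t = \pm\sinh 2t, $$
which is unbounded below, so no minimiser $e_1$ exists.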
In fact, given two symmetric bilinear forms without the assumption that at least one of them is positive definite, it is possible that they cannot be simultaneously diagonalised. An example: let
$$ \theta = \begin{pmatrix} 1 & 0 \\ 0 & -1\end{pmatrix} \qquad \psi = \begin{pmatrix} 1 & -1 \\ -1 & -1\end{pmatrix}$$
Suppose you want basis vectors $(x,y)$ and $(z,w)$ (both necessarily nonzero) to simultaneously diagonalise the matrices. This requires in particular
$$ xz = wy \qquad xz - wy - xw - zy = 0 $$
for the cross terms to vanish. Hence we have
$$ xw + zy = 0 $$
Assuming $x \neq 0$, we solve by substitution: $z = wy/x$, which implies $w(x^2 + y^2) = 0$. Since $x^2 + y^2 \neq 0$ when $(x,y)$ is not the zero vector, this means $w = 0$. But then $xz = wy = 0$, so $xz = 0$, and since $x \neq 0$ this gives $z = 0$; hence $(z,w)$ is the zero vector, contradicting the assumption that it is a basis vector. (If instead $x = 0$, then $y \neq 0$, and the equations $xz = wy$ and $xw + zy = 0$ force $w = 0$ and then $z = 0$ directly, giving the same contradiction.)
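If you want to double-check this computation mechanically, here is a small sympy sketch: it treats the two vanishing cross terms as a linear system in $(z,w)$ and confirms that its determinant is $-(x^2+y^2)$, which is nonzero whenever $(x,y) \neq (0,0)$, so $(z,w) = (0,0)$ is forced.

```python
import sympy as sp

x, y, z, w = sp.symbols('x y z w', real=True)
theta = sp.Matrix([[1, 0], [0, -1]])
psi = sp.Matrix([[1, -1], [-1, -1]])
u, v = sp.Matrix([x, y]), sp.Matrix([z, w])

# The two cross terms that must vanish for simultaneous diagonalisation.
cross_theta = (u.T * theta * v)[0]   # x*z - y*w
cross_psi = (u.T * psi * v)[0]       # x*z - x*w - y*z - y*w

# View them as a linear system in (z, w); a nonzero determinant forces z = w = 0.
M, _ = sp.linear_eq_to_matrix([cross_theta, cross_psi], [z, w])
print(sp.expand(M.det()))            # -> -x**2 - y**2
```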
A similar proof can be used to show that
$$ \theta = \begin{pmatrix} 0 & 1 \\ 1 & 0\end{pmatrix} \qquad \psi = \begin{pmatrix} 1 & 2 \\ 2 & 0 \end{pmatrix} $$
also cannot be simultaneously diagonalised.
Best Answer
This is not true in characteristic $2$. Let $\Bbb F$ be any field of that characteristic, and, for example, take $V = \Bbb F^2$ and the bilinear form $H$ with matrix representation $$[H] = \pmatrix{0&1\\1&0}$$ with respect to the standard basis; then, the quadratic form $Q_H : {\bf x} \mapsto H({\bf x}, {\bf x})$ is the zero form. Indeed, we can see directly that $H$ is not diagonalizable: Computing directly for any $P \in \textrm{GL}(2, \Bbb F)$ gives $$P^T [H] P = (\det P) [H],$$ which is not diagonal.
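To spell out that computation: writing $P = \pmatrix{a&b\\c&d}$, we get $$P^T [H] P = \pmatrix{2ac & ad+bc \\ ad+bc & 2bd} = \pmatrix{0 & ad-bc \\ ad-bc & 0} = (\det P)[H],$$ since $2 = 0$ and $bc = -bc$ in characteristic $2$.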
Put another way, it is not true in characteristic $2$ that the map $H \mapsto Q_H$ is injective. In other characteristics it is injective, as we can recover $H$ from $Q$ via the Polarization Identity $$H({\bf x}, {\bf y}) = \tfrac{1}{4}[Q_H({\bf x} + {\bf y}) - Q_H({\bf x} - {\bf y})] ,$$ but in characteristic $2$ one cannot divide by $4$ (which in that setting coincides with $0$).
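Indeed, expanding by bilinearity and symmetry gives $Q_H({\bf x} \pm {\bf y}) = Q_H({\bf x}) \pm 2 H({\bf x}, {\bf y}) + Q_H({\bf y})$, so subtracting yields $Q_H({\bf x} + {\bf y}) - Q_H({\bf x} - {\bf y}) = 4 H({\bf x}, {\bf y})$, and one must be able to divide by $4$ to recover $H$.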
Remark The above facts imply (since the spaces of symmetric bilinear forms on $V$ and quadratic forms on $V$ both have dimension $\frac{1}{2} (\dim V)(\dim V + 1)$) that (only) in characteristic $2$ there are quadratic forms that are not induced by symmetric bilinear forms (that is, are not in the image of the map $H \mapsto Q_H$); the simplest example is $V = \Bbb F^2$ and $$\pmatrix{x\\y} \mapsto x y .$$
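For the last claim one can even check things by brute force over $\Bbb F_2$; here is a short Python sketch (the helper name is mine) that runs through every symmetric $2 \times 2$ matrix over $\Bbb F_2$ and verifies that none of them induces the quadratic form $(x, y) \mapsto xy$.

```python
from itertools import product

# Evaluate the bilinear form with matrix H at (u, v), working mod 2.
def H_of(H, u, v):
    return sum(u[i] * H[i][j] * v[j] for i in range(2) for j in range(2)) % 2

# All symmetric matrices [[p, q], [q, r]] over F_2.
hits = []
for p, q, r in product((0, 1), repeat=3):
    H = [[p, q], [q, r]]
    if all(H_of(H, (x, y), (x, y)) == (x * y) % 2
           for x, y in product((0, 1), repeat=2)):
        hits.append(H)

print(hits)   # -> []: no symmetric H over F_2 has H(v, v) = x*y
```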