Note: The following is not an answer, but merely some thoughts which
might or might not be helpful to you.
First note that you confused(?) your inequality signs. I think you
want
$$
\gamma_{n}\left(\left\{ x\in\mathbb{R}^{n}\,\mid\,\left\Vert x\right\Vert ^{2}\geq\frac{n}{1-\varepsilon}\right\} \right){\color{red}\leq}e^{-\varepsilon n/4}
$$
and
$$
\gamma_{n}\left(\left\{ x\in\mathbb{R}^{n}\,\mid\,\left\Vert x\right\Vert ^{2}\geq\frac{{\rm trace}\left(\Sigma\right)}{1-\varepsilon}\right\} \right){\color{red}\leq}e^{-\varepsilon n/4}.
$$
Also note that this inequality would improve with larger values
of $n$. But in general, the estimate cannot be true in this form. To see this, use e.g.
$$
\Sigma=\left(\begin{matrix}1\\
& 0\\
& & \ddots\\
& & & 0
\end{matrix}\right),
$$
or, if you want your $\Sigma$ to be positive definite, use $\frac{1}{L\left(n-1\right)}$
instead of the zeros on the diagonal, where $L$ is large. Your estimate
would then imply (since $\left\Vert x\right\Vert ^{2}\geq\left|x_{1}\right|^{2}$)
that
$$
\mathbb{P}\left(\left|x_{1}\right|^{2}\geq\frac{1+\frac{1}{L}}{1-\varepsilon}\right)\leq\mathbb{P}\left(\left\Vert x\right\Vert ^{2}\geq\frac{1+\frac{1}{L}}{1-\varepsilon}\right)\leq e^{-\varepsilon n/4}\xrightarrow[n\to\infty]{}0,
$$
which is absurd.
Hence, the (exponent of the) right hand side of your estimate somehow
needs to involve ${\rm trace}\left(\Sigma\right)$ instead of $n$
(I think).
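To make the counterexample concrete, here is a minimal numerical sketch in Python (the values of $L$, $\varepsilon$ and $n$ are arbitrary choices): the left-hand side is a constant independent of $n$, while the claimed bound tends to $0$.

```python
import numpy as np
from scipy.stats import norm

# Counterexample: Sigma = diag(1, 1/(L(n-1)), ..., 1/(L(n-1))), so trace(Sigma) = 1 + 1/L.
# Since x_1 ~ N(0,1), P(|x_1|^2 >= (1+1/L)/(1-eps)) is a constant independent of n,
# while the claimed bound exp(-eps*n/4) tends to 0 as n grows.
L, eps = 10.0, 0.1
thresh = (1 + 1 / L) / (1 - eps)
lhs = 2 * norm.sf(np.sqrt(thresh))  # P(|x_1| >= sqrt(thresh)) for x_1 ~ N(0, 1)
for n in (10, 100, 1000):
    print(f"n={n:4d}:  P(|x_1|^2 >= {thresh:.3f}) = {lhs:.4f},  claimed bound = {np.exp(-eps * n / 4):.2e}")
```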
What follows is an adaptation of the argument you linked, but I
eventually get stuck when trying to find a good (or optimal) value of $\lambda$.
First, since $\Sigma$ is symmetric positive semidefinite, there is
an orthogonal matrix $O\in\mathbb{R}^{n\times n}$ with $\Sigma=O \cdot {\rm diag}\left(\lambda_{1},\dots,\lambda_{n}\right)\cdot O^{T}$,
where $\lambda_{1},\dots,\lambda_{n}\geq0$ are the eigenvalues of
$\Sigma$. We can now define the square root $\sqrt{\Sigma}:=O\cdot {\rm diag}\left(\sqrt{\lambda_{1}},\dots,\sqrt{\lambda_{n}}\right) \cdot O^T\in\mathbb{R}^{n\times n}$
which satisfies $\sqrt{\Sigma}^{T}=\sqrt{\Sigma}$ and $\sqrt{\Sigma}\sqrt{\Sigma}=\Sigma$.
Now, by well-known properties of the normal distribution, we conclude
that $X:=\sqrt{\Sigma}g\sim N\left(0,\Sigma\right)$, where $g\sim N\left(0,{\rm id}\right)$
is a standard normally distributed random vector.
We also know that the standard normal distribution is invariant under
orthogonal transformations, i.e. $h:=O^{T}g\sim N\left(0,{\rm id}\right)$.
Finally,
$$
\left\Vert X\right\Vert ^{2}=\left\Vert O{\rm diag}\left(\sqrt{\lambda_{1}},\dots,\sqrt{\lambda_{n}}\right)O^{T}g\right\Vert ^{2}=\left\Vert {\rm diag}\left(\sqrt{\lambda_{1}},\dots,\sqrt{\lambda_{n}}\right)h\right\Vert ^{2}=\sum_{i=1}^{n}\lambda_{i}h_{i}^{2},
$$
so that $\left\Vert X\right\Vert ^{2}$ has (as you noted yourself) expectation
$$
\mathbb{E}\left\Vert X\right\Vert ^{2}=\sum_{i=1}^{n}\lambda_{i}\mathbb{E}h_{i}^{2}=\sum_{i=1}^{n}\lambda_{i}={\rm trace}\left(\Sigma\right),
$$
because $\mathbb{E}h_{i}^{2}={\rm Var}\left(h_{i}\right)=1$ for
$h\sim N\left(0,{\rm id}\right)$.
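As a quick numerical check of this construction and of $\mathbb{E}\left\Vert X\right\Vert ^{2}={\rm trace}\left(\Sigma\right)$, here is a minimal sketch (the random $\Sigma$ below is an arbitrary test case):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 500_000

# Random symmetric positive semidefinite Sigma and its square root via the
# eigendecomposition Sigma = O diag(lambda_1, ..., lambda_n) O^T, as in the text.
A = rng.standard_normal((n, n))
Sigma = A @ A.T
lam, O = np.linalg.eigh(Sigma)          # eigenvalues lam >= 0, orthogonal O
sqrt_Sigma = O @ np.diag(np.sqrt(lam)) @ O.T

g = rng.standard_normal((m, n))         # rows are samples of g ~ N(0, id)
X = g @ sqrt_Sigma                      # X = sqrt(Sigma) g (sqrt_Sigma is symmetric)
print(np.mean(np.sum(X ** 2, axis=1)))  # ~ trace(Sigma)
print(np.trace(Sigma))
```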
By reordering, we can assume $\lambda_{1}\geq\dots\geq\lambda_{j}>0=\lambda_{j+1}=\dots=\lambda_{n}$,
where $j\in\left\{ 0,\dots,n\right\} $.
Now observe that the Markov/Chebyshev inequality yields, for arbitrary
$\lambda>0$,
\begin{eqnarray*}
\mathbb{P}\left(\left\Vert X\right\Vert ^{2}\geq{\rm trace}\left(\Sigma\right)+\delta\right) & = & \mathbb{P}\left(e^{\lambda\left\Vert X\right\Vert ^{2}}\geq e^{\lambda\left({\rm trace}\left(\Sigma\right)+\delta\right)}\right)\\
& \leq & e^{-\lambda\left({\rm trace}\left(\Sigma\right)+\delta\right)}\cdot\mathbb{E}\left(e^{\lambda\left\Vert X\right\Vert ^{2}}\right),
\end{eqnarray*}
where
\begin{eqnarray*}
\mathbb{E}\left(e^{\lambda\left\Vert X\right\Vert ^{2}}\right) & = & \mathbb{E}\left(e^{\sum_{i=1}^{n}\lambda\lambda_{i}h_{i}^{2}}\right)\\
& = & \prod_{i=1}^{j}\mathbb{E}\left(e^{\lambda\lambda_{i}h_{i}^{2}}\right),
\end{eqnarray*}
by stochastic independence of $\left(h_{1},\dots,h_{n}\right)$; the factors
with $i>j$ are equal to $1$, since $\lambda_{i}=0$ for those $i$. The
main point of the introduction of the $e^{\dots}$ term is this final
identity, where we can pull the product out of the expectation by
independence.
Finally,
\begin{eqnarray*}
\mathbb{E}\left(e^{\gamma h_{i}^{2}}\right) & = & \frac{1}{\sqrt{2\pi}}\cdot\int_{\mathbb{R}}e^{\gamma x^{2}}\cdot e^{-x^{2}/2}\,{\rm d}x\\
& = & \frac{1}{\sqrt{2\pi}}\cdot\int_{\mathbb{R}}e^{-\left(\sqrt{\frac{1}{2}-\gamma}x\right)^{2}}\,{\rm d}x\\
& \overset{\omega=\sqrt{\frac{1}{2}-\gamma}x}{=} & \frac{1}{\sqrt{2\pi}\cdot\sqrt{\frac{1}{2}-\gamma}}\cdot\int_{\mathbb{R}}e^{-\omega^{2}}\,{\rm d}\omega\\
& = & \frac{1}{\sqrt{1-2\gamma}}
\end{eqnarray*}
for $\gamma<\frac{1}{2}$.
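This moment formula is easy to sanity-check by numerical quadrature (a sketch; $\gamma=0.3$ is an arbitrary value below $\frac{1}{2}$):

```python
import numpy as np
from scipy.integrate import quad

gamma = 0.3  # any gamma < 1/2
# E exp(gamma * h^2) for h ~ N(0, 1), by numerical integration of the density.
val, _ = quad(lambda x: np.exp(gamma * x ** 2 - x ** 2 / 2) / np.sqrt(2 * np.pi),
              -np.inf, np.inf)
print(val, 1 / np.sqrt(1 - 2 * gamma))  # both ~ 1.5811
```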
All in all, we arrive at
$$
\mathbb{P}\left(\left\Vert X\right\Vert ^{2}\geq{\rm trace}\left(\Sigma\right)+\delta\right)\leq e^{-\lambda\left({\rm trace}\left(\Sigma\right)+\delta\right)}\cdot\prod_{i=1}^{j}\frac{1}{\sqrt{1-2\lambda\lambda_{i}}}.
$$
The problem is now to optimize this w.r.t. $0<\lambda<\frac{1}{2\lambda_{1}}$.
One way to simplify(?) this is to use
$$
e^{-\lambda\left({\rm trace}\left(\Sigma\right)+\delta\right)}\cdot\prod_{i=1}^{j}\frac{1}{\sqrt{1-2\lambda\lambda_{i}}}=e^{-\left[\lambda\left({\rm trace}\left(\Sigma\right)+\delta\right)+\frac{1}{2}\sum_{i=1}^{j}\ln\left(1-2\lambda\lambda_{i}\right)\right]},
$$
where one only has to optimize the exponent. Still, I see neither
an easy way to determine the optimal value of $\lambda$ nor a really
convenient choice of $\lambda$.
One choice inspired by your linked lecture notes is to use $\lambda=\frac{\delta/2}{{\rm trace}\left(\Sigma\right)+\delta}$
(because in the standard Gaussian case we have $n={\rm trace}\left(\Sigma\right)$,
so this reduces to exactly the choice used in the lecture notes). This would
yield
\begin{eqnarray*}
\mathbb{P}\left(\left\Vert X\right\Vert ^{2}\geq{\rm trace}\left(\Sigma\right)+\delta\right) & \leq & e^{-\delta/2}\cdot\prod_{i=1}^{j}\sqrt{\frac{{\rm trace}\left(\Sigma\right)+\delta}{{\rm trace}\left(\Sigma\right)+\delta-\delta\lambda_{i}}},
\end{eqnarray*}
which still does not really seem that great.
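Lacking a closed form, one can at least compare the heuristic choice with a numerically optimized $\lambda$. A sketch with arbitrary test eigenvalues follows; note that the heuristic $\lambda$ is only admissible, i.e. satisfies $\lambda<\frac{1}{2\lambda_{1}}$, when $\delta\lambda_{1}<{\rm trace}\left(\Sigma\right)+\delta$.

```python
import numpy as np
from scipy.optimize import minimize_scalar

lams = np.array([2.0, 1.0, 0.5, 0.5])  # eigenvalues lambda_1 >= ... > 0 (arbitrary)
tr, delta = lams.sum(), 2.0

# The tail bound is exp(-E(lam)) with
# E(lam) = lam*(trace + delta) + 0.5 * sum(log(1 - 2*lam*lams)),
# valid for 0 < lam < 1/(2*lambda_1); we maximize E, i.e. minimize -E.
def neg_exponent(lam):
    return -(lam * (tr + delta) + 0.5 * np.sum(np.log(1 - 2 * lam * lams)))

res = minimize_scalar(neg_exponent, bounds=(1e-12, 1 / (2 * lams[0]) - 1e-12),
                      method="bounded")
lam_heur = (delta / 2) / (tr + delta)  # choice inspired by the lecture notes
assert 2 * lam_heur * lams[0] < 1      # admissible here: delta*lambda_1 < trace + delta
print("optimal   lambda =", res.x,    "  bound =", np.exp(res.fun))
print("heuristic lambda =", lam_heur, "  bound =", np.exp(neg_exponent(lam_heur)))
```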
I will try to find a good choice of $\lambda$ here. If I come up with something, I will edit the post.
Let $$\frac{\Sigma_d}{\operatorname{Tr}\Sigma_d} = \mathrm{diag}(a_1, a_2, \dots, a_d),$$ where $a_i = a_1/i$ (as used in Fact 1 below), and let $x \sim N(0, \Sigma_d/\operatorname{Tr}\Sigma_d)$, so that $\sum_{i=1}^d a_i = 1$ and $\mathbb{E}[\|x\|^2] = 1$.
By Jensen's inequality (applied to the convex map $u \mapsto 1/u^2$ on $u > 0$), we have
$$\mathbb{E}\left[\frac{1}{\|x\|^4}\right]
\ge \frac{1}{(\mathbb{E}[\|x\|^2])^2} = 1. \tag{1}$$
Using the known identity ($q > 0$)
$$\frac{1}{q^2} = \int_0^\infty t\, \mathrm{e}^{-tq} \, \mathrm{d} t,$$
we have
\begin{align*}
\mathbb{E}\left[\frac{1}{\|x\|^4}\right] &= \mathbb{E}\left[ \int_0^\infty t\,\mathrm{e}^{-t\|x\|^2} \,\mathrm{d} t \right]\\
&= \int_0^\infty t\, \mathbb{E}[\mathrm{e}^{-t\|x\|^2}]\, \mathrm{d} t \\
&= \int_0^\infty t\, \prod_{i=1}^d \mathbb{E}[\mathrm{e}^{-tx_i^2}]\, \mathrm{d} t\\
&= \int_0^\infty t\, \prod_{i=1}^d \frac{1}{\sqrt{1 + 2a_i t}} \,\mathrm{d} t \\
&= \int_0^\infty t\, \mathrm{e}^{-\frac12\sum_{i=1}^d \ln (1 + 2a_i t)} \,\mathrm{d} t \tag{2}
\end{align*}
where we use
$\mathbb{E}[\mathrm{e}^{-tx_i^2}] = \int_{-\infty}^\infty \mathrm{e}^{-ty^2}\cdot \frac{1}{\sqrt{2\pi a_i}}\mathrm{e}^{-\frac{y^2}{2a_i}}\,\mathrm{d} y = \frac{1}{\sqrt{1 + 2a_i t}}$.
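A numerical cross-check of (2) (a sketch with $d = 10$ and $a_i \propto 1/i$, comparing a Monte Carlo estimate against the integral):

```python
import numpy as np
from scipy.integrate import quad

rng = np.random.default_rng(0)
d, m = 10, 2_000_000

a = 1 / np.arange(1, d + 1)
a /= a.sum()                             # a_i = a_1 / i with sum(a) = 1

# Monte Carlo estimate of E[1/||x||^4] for x ~ N(0, diag(a)).
x = rng.standard_normal((m, d)) * np.sqrt(a)
mc = np.mean(1 / np.sum(x ** 2, axis=1) ** 2)

# Right-hand side of (2): integral of t * prod_i (1 + 2*a_i*t)^(-1/2).
integral, _ = quad(lambda t: t * np.exp(-0.5 * np.sum(np.log1p(2 * a * t))), 0, np.inf)
print(mc, integral)  # the two values should agree
```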
Fact 1: $\ln(1 + 2a_i t) \ge \frac{1}{i}\ln (1 + 2a_1 t)$ for all $i$ and all $t \ge 0$.
(Proof: Note that $a_i = a_1/i$. Let $f(t) = \ln(1 + 2a_i t) - \frac{1}{i}\ln (1 + 2a_1 t)$. We have $f'(t) = \frac{4a_1^2 t (i - 1)}{i(2a_1t + i)(2a_1 t + 1)} \ge 0$.
Also, $f(0) = 0$. The desired result follows.)
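Fact 1 can also be spot-checked numerically on a grid (a sketch with an arbitrary $a_1$):

```python
import numpy as np

# Grid check of Fact 1: ln(1 + 2*a_i*t) >= (1/i) * ln(1 + 2*a_1*t) with a_i = a_1/i.
a1 = 0.7
t = np.linspace(0.0, 50.0, 10_001)
print(all(np.all(np.log1p(2 * (a1 / i) * t) >= np.log1p(2 * a1 * t) / i)
          for i in range(1, 20)))  # True
```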
Denote $H_d = \sum_{i=1}^d \frac{1}{i}$; since $\sum_{i=1}^d a_i = 1$ and $a_i = a_1/i$, we have $a_1 = 1/H_d$. By Fact 1, we have
$$-\frac12\sum_{i=1}^d \ln (1 + 2a_i t)
\le -\frac12 \ln(1 + 2a_1 t) \cdot \sum_{i=1}^d \frac{1}{i}
= - \frac12 H_d\ln(1 + 2t/H_d). \tag{3}$$
From (3), we have
$$
\int_0^\infty t\, \mathrm{e}^{-\frac12\sum_{i=1}^d \ln (1 + 2a_i t)} \,\mathrm{d} t
\le \int_0^\infty t\, \mathrm{e}^{- \frac12 H_d\ln(1 + 2t/H_d)} \,\mathrm{d} t. \tag{4}
$$
From (1) and (2) and (4), we have
$$1\le \mathbb{E}\left[\frac{1}{\|x\|^4}\right] \le \int_0^\infty t\, \mathrm{e}^{- \frac12 H_d\ln(1 + 2t/H_d)} \,\mathrm{d} t. $$
By the Dominated Convergence Theorem, we have
$$\lim_{d\to \infty} \int_0^\infty t\, \mathrm{e}^{- \frac12 H_d\ln(1 + 2t/H_d)} \,\mathrm{d} t = 1.$$
(Note: $u \mapsto u \ln (1 + 2t/u)$
is non-decreasing in $u > 0$. Let $f_d(t) = t\, \mathrm{e}^{- \frac12 H_d\ln(1 + 2t/H_d)}$. Then $f_2(t) \ge f_3(t) \ge \cdots$, $\lim_{d\to \infty} f_d(t) = t\mathrm{e}^{-t}$,
and $\int_0^\infty f_m(t)\,\mathrm{d} t < \infty$ for some $m$.
See: monotone convergence theorem for decreasing sequences.)
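One can also watch this convergence numerically. A sketch: the substitution $s = 2t/H_d$ shows the integral equals $H_d^2/\left(\left(H_d-2\right)\left(H_d-4\right)\right)$ whenever $H_d > 4$ (i.e. $d \ge 31$), so the convergence is only logarithmic in $d$.

```python
import numpy as np
from scipy.integrate import quad

def H(d):  # harmonic number H_d
    return np.sum(1.0 / np.arange(1, d + 1))

# Finite once H_d > 4; the closed form H_d^2 / ((H_d - 2)(H_d - 4)) -> 1
# matches the limit integral of t * exp(-t) over (0, inf), which is 1.
for d in (1_000, 10_000, 1_000_000):
    Hd = H(d)
    val, _ = quad(lambda t: t * np.exp(-0.5 * Hd * np.log1p(2 * t / Hd)), 0, np.inf)
    print(d, val, Hd ** 2 / ((Hd - 2) * (Hd - 4)))
```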
Thus, we have
$$\lim_{d\to \infty} \mathbb{E}\left[\frac{1}{\|x\|^4}\right] = 1.$$
We are done.
Best Answer
So, the works of Bobkov et al. (1999), Otto and Villani (2000), and more recently Otto and Reznikoff (2006), contain elements of a solution to the questions I asked. I'll provide a quasi self-contained answer, hoping it will be helpful to a general public (like myself) without kung-fu-panda-level knowledge of concentration inequalities.
N.B.: By "Riemannian manifold" I'll actually mean a smooth Riemannian manifold which is complete with the geodesic distance
$$d_p(x,y) := \inf_{\omega \in \mathcal C^1([0, 1], M) | \omega(0)=x,\omega(1) = y}\sqrt{\int_0^1\|\dot\omega(t)\|^2dt}. $$
And now, the real deal
Application
If $M = \mathbb R^d$ and $\mu = d\gamma_{c,\Sigma} \propto e^{-V(x)}dx$ where $V(x) = \frac{1}{2}(x-c)^T\Sigma^{-1}(x-c)$, for some positive definite matrix $\Sigma$ with largest singular value $\sigma^2$ and some vector $c \in \mathbb R^d$, then $\operatorname{Hess}(V) + \operatorname{Ricc}(M) \succeq (1/\sigma^2 + 0)I_d = (1/\sigma^2)I_d$. Thus every multivariate Gaussian distribution $d\gamma_{c,\Sigma}$ on $\mathbb R^d$ satisfies GIPIC($1/\sigma^2$); more precisely, it holds that
$$
\gamma_{c,\Sigma}(B_\epsilon) \ge 1 - \exp\left(-\frac{1}{2\sigma^2}\left(\epsilon-\sigma\sqrt{2\log(1/\gamma_{c,\Sigma}(B))}\right)^2\right),\quad\forall \epsilon \ge \sigma\sqrt{2\log(1/\gamma_{c,\Sigma}(B))}.
$$
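As a plausibility check of this inequality, here is a small Monte Carlo sketch. It assumes the simplest geometry: $B$ is a Euclidean ball centered at $c$, so that the $\epsilon$-enlargement $B_\epsilon$ is just the ball with radius enlarged by $\epsilon$; the covariance, radius, and sample size below are arbitrary test values.

```python
import numpy as np

rng = np.random.default_rng(0)
d, m = 3, 1_000_000
Sigma = np.diag([1.0, 0.5, 0.25])   # arbitrary positive definite covariance
sigma = np.sqrt(np.max(np.linalg.eigvalsh(Sigma)))  # sigma^2 = largest singular value
c = np.zeros(d)

X = rng.multivariate_normal(c, Sigma, size=m)
dist = np.linalg.norm(X - c, axis=1)
r = 1.0                              # B = ball of radius r centered at c
pB = np.mean(dist <= r)              # Monte Carlo estimate of gamma_{c,Sigma}(B)
eps0 = sigma * np.sqrt(2 * np.log(1 / pB))

for eps in (eps0 + 0.5, eps0 + 1.0, eps0 + 2.0):
    lhs = np.mean(dist <= r + eps)   # gamma_{c,Sigma}(B_eps), B_eps = ball of radius r+eps
    rhs = 1 - np.exp(-((eps - eps0) ** 2) / (2 * sigma ** 2))
    print(f"eps = {eps:.2f}:  gamma(B_eps) = {lhs:.4f}  >=  bound = {rhs:.4f}")
```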