This quickly got longer than I thought it would be, and I'm sure I inadvertently skipped something. If you have any questions, feel free to ask.
Definition: (Lie Algebra) A Lie algebra is a real vector space $V$ equipped with an operation $$[\cdot,\cdot]:V\times V\to V,$$ called the bracket of the Lie algebra, such that
- $[\cdot,\cdot]$ is bilinear. That is, for $u,v,w\in V$ and $r\in\mathbb{R}$, $[u+v,w]=[u,w]+[v,w]$, $[u,v+w]=[u,v]+[u,w]$, and $[ru,v]=[u,rv]=r[u,v]$.
- $[\cdot,\cdot]$ is alternating. That is, for $v\in V$, $[v,v]=0$.
- $[\cdot,\cdot]$ satisfies the Jacobi identity. That is, for $u,v,w\in V$, $$[u,[v,w]]+[v,[w,u]]+[w,[u,v]]=0.$$
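If you want to see these three axioms in action before we meet them again below, here is a small numerical sketch (numpy is my choice here, not part of the definition) checking them for the prototypical bracket $[X,Y]=XY-YX$ on $3\times 3$ real matrices:

```python
import numpy as np

rng = np.random.default_rng(0)

def bracket(X, Y):
    """Matrix commutator [X, Y] = XY - YX."""
    return X @ Y - Y @ X

U, V, W = (rng.standard_normal((3, 3)) for _ in range(3))
r = 2.5

# Bilinearity in each slot
assert np.allclose(bracket(U + V, W), bracket(U, W) + bracket(V, W))
assert np.allclose(bracket(U, V + W), bracket(U, V) + bracket(U, W))
assert np.allclose(bracket(r * U, V), r * bracket(U, V))

# Alternating: [V, V] = 0
assert np.allclose(bracket(V, V), 0)

# Jacobi identity
jac = bracket(U, bracket(V, W)) + bracket(V, bracket(W, U)) + bracket(W, bracket(U, V))
assert np.allclose(jac, 0)
```

Of course, a numerical check on random matrices is not a proof, but it is a useful sanity check when you first meet the axioms.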
This is an abstract definition of a Lie algebra. As we can see, a Lie algebra is a vector space by definition.$^*$ But, what about the Lie algebra associated to a Lie group?
The bracket on the Lie algebra of a Lie group can be defined using the typical Lie bracket $[X,Y]=XY-YX$ on vector fields, where we treat vector fields as derivations mapping smooth functions to smooth functions. We simply extend $X_\mathrm{e}\in T_\mathrm{e}G$ to the left-invariant vector field $X:g\mapsto X_g=(g)^L_*X_\mathrm{e}$, where $(g)^L_*$ is the pushforward of left multiplication by $g\in G$. (Using right translations instead changes the bracket only by a sign, so it yields an isomorphic Lie algebra.) Then, we can simply define $[X_\mathrm{e},Y_\mathrm{e}]=[X,Y]_\mathrm{e}$.
Proposition 1: The tangent space $T_\mathrm{e}G$ to the identity $\mathrm{e}$ of the Lie group $G$, equipped with the bracket operation defined in the previous paragraph, is a Lie algebra.
For the sake of helping you practice (Practice is the only way to truly understand Lie theory), I will leave this proof as an exercise. It is just a matter of checking the axioms. If you have trouble, I will be happy to help you below, in the comments.
There are several definitions of the exponential map. My primary choice, when I go to prove something, is the one I consider most useful. To give that definition, we need one more bit of terminology.
Definition: (1-parameter Subgroup) A 1-parameter subgroup on a Lie group $G$ is a Lie group homomorphism (a smooth homomorphism between Lie groups) between $\mathbb{R}$ and $G$.
Note, first and foremost, that this is NOT a subgroup. It is a homomorphism. Sometimes, it is useful to think of these as copies of $\mathbb{R}$ in $G$, and that is why the term comes up, but they are not themselves subgroups.
The set of all 1-parameter subgroups, on the other hand, does form a vector space. In fact, this vector space is isomorphic to $T_\mathrm{e}G$. The proof of this that I know is rather tedious, but the intuition is straightforward: there is a bijection between the tangent vectors in $T_\mathrm{e}G$ (thought of, as in high school, as "magnitude and direction") and the paths through $\mathrm{e}\in G$ with a particular velocity ("magnitude and direction") at $\mathrm{e}$. With this intuition in hand, we define our exponential map.
Definition: (Exponential Map) Let $\mathfrak{g}$ be the Lie algebra of a Lie group $G$. Then, the exponential map $\exp:\mathfrak{g}\to G$ maps $X_\mathrm{e}\in\mathfrak{g}$ to $\theta(1)\in G$, where $\theta:\mathbb{R}\to G$ is the unique 1-parameter subgroup such that $\theta_*(\left.\frac{\partial}{\partial t}\right|_0)=X_\mathrm{e}$. (That is, $\theta$ is the unique 1-parameter subgroup of $G$ such that its tangent vector at $\mathrm{e}$ is $X_\mathrm{e}$.)
Thus, by definition, $\exp$ maps into $G$. But, what about the matrix exponential?
Proposition 2: The map $$\exp:\mathfrak{gl}_n(\mathbb{R})\to GL_n(\mathbb{R}),~X\mapsto\sum_{k=0}^\infty\frac{1}{k!}X^k$$ satisfies the definition of the exponential map.
Proof: Consider the map $\gamma_X:\mathbb{R}\to GL_n(\mathbb{R}),~t\mapsto\exp(tX)$, where $X\in\mathfrak{gl}_n(\mathbb{R})$ (so, $X$ can be any real $n\times n$ matrix). Note that $\gamma_X$ does, indeed, map into $GL_n(\mathbb{R})$, since $\exp(tX)^{-1}=\exp(-tX)$, so $\gamma_X(t)$ is invertible. Clearly, $\gamma_X$ is smooth, and $(\gamma_X)_*(\left.\frac{\partial}{\partial t}\right|_0)=X$. It remains to show that $\gamma_X(t+s)=\gamma_X(t)\gamma_X(s)$, i.e., that $\gamma_X$ is a homomorphism. But, a classic property of the matrix exponential is that $\exp((t+s)X)=\exp(tX)\exp(sX)$, so we are done. Thus, $\gamma_X$ is a 1-parameter subgroup, and $\gamma_X(1)=\exp(X)$ is the image of $X$ under the exponential map. $\blacksquare$
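Each step of this proof can be checked numerically for a random matrix. The sketch below uses `scipy.linalg.expm` (my choice of implementation of the power series, not part of the proof) to verify the homomorphism property, invertibility, and the velocity at $t=0$:

```python
import numpy as np
from scipy.linalg import expm  # matrix exponential via the power series

rng = np.random.default_rng(1)
X = rng.standard_normal((3, 3))
t, s = 0.7, -1.3

# Homomorphism property: gamma_X(t+s) = gamma_X(t) * gamma_X(s)
assert np.allclose(expm((t + s) * X), expm(t * X) @ expm(s * X))

# exp(tX) is invertible with inverse exp(-tX), so gamma_X lands in GL_n
assert np.allclose(expm(t * X) @ expm(-t * X), np.eye(3))

# Velocity at t = 0 is X (crude finite-difference check)
h = 1e-6
assert np.allclose((expm(h * X) - np.eye(3)) / h, X, atol=1e-4)
```

Note that the two exponentials in the homomorphism check commute because both are power series in the same matrix $X$; for two different matrices $X$ and $Y$, $\exp(X+Y)=\exp(X)\exp(Y)$ generally fails.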
Thus, the matrix exponential is the exponential map of $GL_n(\mathbb{R})$ (and $GL_n(\mathbb{C})$, but I am sticking to reals for this explanation). Extending this, we get that the matrix exponential is the exponential map for all matrix Lie groups (matrix Lie groups are Lie subgroups of $GL_n(\mathbb{R})$ for some $n$).
Now, rather than prove to you that the matrix Lie algebra of each matrix Lie group is isomorphic to the abstract Lie algebra of that Lie group, I will show how we use the latter idea to pseudo-rigorously obtain the former, as I think this is more instructive. As before, if more rigor is requested, more will be given.
When I think about Lie algebras of Lie groups, I think of the elements of the Lie algebra as tangent vectors, which leads to the thought of "infinitesimal" group elements. For example, given the orthogonal group $O_n(\mathbb{R})$, we can think of its Lie algebra elements as imperceptibly small rotations. In other words, $X\in\mathfrak{o}_n(\mathbb{R})$ implies that, for some $\varepsilon > 0$ such that $\varepsilon^2=0$ (note that no such $\varepsilon$ exists in $\mathbb{R}$), $(I+\varepsilon X)\,``\in"O_n(\mathbb{R})$, where $I$ is the identity matrix. But, since $A\in O_n(\mathbb{R})$ if and only if $AA^T=I$, $X\in\mathfrak{o}_n(\mathbb{R})$ "if and only if" $$(I+\varepsilon X)(I+\varepsilon X)^T=I+\varepsilon(X+X^T)+\varepsilon^2XX^T=I+\varepsilon(X+X^T)=I,$$ so $X+X^T=0$, which is exactly the defining condition of $\mathfrak{o}_n(\mathbb{R})$. Thus, thinking of Lie algebra elements as tangent vectors gives us the matrix Lie algebra interpretation.
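One can make this heuristic concrete numerically: exponentiating a skew-symmetric matrix really does land in the orthogonal group. A small sketch (scipy is my choice here, not part of the argument):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))
X = A - A.T            # X + X^T = 0, so X lies in o_n(R)

Q = expm(X)
assert np.allclose(Q @ Q.T, np.eye(3))    # Q is orthogonal
assert np.isclose(np.linalg.det(Q), 1.0)  # in fact Q lands in SO(n)
```

The determinant check reflects the identity $\det(\exp X)=e^{\operatorname{tr} X}$ together with $\operatorname{tr} X = 0$ for skew-symmetric $X$: the exponential of $\mathfrak{o}_n(\mathbb{R})$ only reaches the identity component $SO_n(\mathbb{R})$.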
Since you have a reference-request tag, I will suggest J. Frank Adams's Lectures on Lie Groups, which provides an easily comprehensible introduction to abstract Lie groups. Also, Chapter 10 of Spivak's A Comprehensive Introduction to Differential Geometry, Volume 1 (I use the 3rd edition) is, as intended, fairly comprehensive.
$^*$For you fans of category theory, I mean that there is a forgetful functor from the category of Lie algebras to the category of vector spaces, sending each Lie algebra to its underlying vector space.
I assume you are trying to show that the two natural brackets on $\mathfrak g$ (one Lie-theoretic, the other via commutators of matrices) agree. As Charlie Frohman points out in the comments above, the "set of derivatives of all paths on $G$" is not the right object to think about; that is the collection of all $AX$, where $A \in G$ and $X \in \mathfrak g$. This set is much larger, is not a linear space unless you pass to its span, and carries no natural or interesting Lie bracket. What you want is the set of derivatives $\gamma'(0)$ of curves with $\gamma(0) = e$.
Starting from your second bullet point, recall that a homomorphism $f: G \to H$ of Lie groups induces a homomorphism $df_e: \mathfrak g \to \mathfrak h$ of Lie algebras. In particular, take $\rho: G \to GL_n$ to be your defining faithful representation (that is, $\rho$ is injective); then $d\rho_e: \mathfrak g \to \mathfrak{gl}_n$ gives the isomorphism of $\mathfrak g$ with its image inside the space of matrices; its image is what you call $G'$.
In particular, because $d\rho_e$ is injective, it gives an isomorphism of Lie algebras between $\mathfrak g$ and your $G'$, the latter equipped with the bracket on $\mathfrak{gl}_n$. But you already know that the bracket on $\mathfrak{gl}_n$ is the usual commutator of matrices, so this gives an isomorphism between $\mathfrak g$ with the left-invariant vector field bracket and $G'$ equipped with the matrix-commutator bracket. This is what you wanted.
(Because the Lie-theoretic exponential map is also natural under homomorphisms, one also sees immediately from this that $\exp(G') \subset \rho(G)$; and because the Lie-theoretic exponential map on $\mathfrak{gl}_n$ is $X \mapsto \sum_{k \geq 0} \frac{X^k}{k!}$, the same is true on the subspace $G'$.)
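A concrete instance of this naturality, sketched numerically (numpy/scipy are my choice, not part of the answer): take the homomorphism $\det: GL_n \to GL_1$, whose derivative at the identity is the trace. Naturality of $\exp$ then says $\det(\exp X) = e^{\operatorname{tr} X}$:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(3)
X = rng.standard_normal((4, 4))

# det is a Lie group homomorphism GL_n -> GL_1 with derivative tr at I,
# so naturality of exp gives det(exp X) = exp(tr X).
assert np.isclose(np.linalg.det(expm(X)), np.exp(np.trace(X)))
```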
For self-containedness, here is a proof that the Lie bracket on $\mathfrak{gl}(n)$ is the matrix commutator.
Given any Lie group $G$, there is a map $G \to GL(T_e G)$, given by taking the derivative at $e$ of the conjugation action of $G$ on itself. Taking the derivative of this map, we obtain a Lie algebra map $\mathfrak g \to \mathfrak{gl}(\mathfrak g)$; this is the map $X \mapsto [X, -]$, the Lie algebra commutator.
For $G = GL_n$, let $x(t) = I + tX$ for some matrix $X$, and similarly $y(s) = I + sY$. These lie in $GL_n$ for small $t$, and $x^{-1}(t) = I - tX + O(t^2)$. Then $$x(t)\, y(s)\, x^{-1}(t) = I + sY + ts(XY - YX) + O(s^2, t^2).$$ Thus, differentiating the conjugation map $G \to \text{Aut}(G)$, one obtains the map $G \to \text{Aut}(\mathfrak g)$ sending $x(t) = I + tX$ to the operator $Y \mapsto Y + t[X, Y] + O(t^2)$, where $[X, Y] = XY - YX$ is the matrix commutator. Now, taking the derivative at $t = 0$, we obtain that the map $\mathfrak{gl}_n \to \text{End}(\mathfrak{gl}_n)$ is given by $X \mapsto [X, -]$.
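The differentiation step above can be sanity-checked with a finite difference (a numpy sketch of my own, not part of the proof): conjugating $Y$ by the curve $x(t) = I + tX$ and taking $(x(t)\,Y\,x(t)^{-1} - Y)/t$ for small $t$ should approximate $[X, Y] = XY - YX$.

```python
import numpy as np

rng = np.random.default_rng(4)
X, Y = rng.standard_normal((2, 3, 3))

t = 1e-6
x = np.eye(3) + t * X              # first-order curve through I with velocity X
conj = x @ Y @ np.linalg.inv(x)    # conjugation action on Y

# (x(t) Y x(t)^{-1} - Y) / t  ->  [X, Y] = XY - YX  as t -> 0
assert np.allclose((conj - Y) / t, X @ Y - Y @ X, atol=1e-4)
```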
If $k$ is any field, the $k$-algebra $M_n(k)$ is Morita equivalent (as a $k$-algebra) to $k$. It follows that $M_n(k)$ and $k$ have isomorphic Hochschild homologies. In particular, they have isomorphic $0$th Hochschild homology.
In general, if $A$ is a $k$-algebra, the zeroth homology is $HH_0(A)=A/[A,A]$, the quotient of $A$ by the subspace generated by commutators. Since $k$ is a commutative $k$-algebra, it is obvious that $HH_0(k)=k$. The first paragraph, then, tells us that $$M_n(k)/[M_n(k),M_n(k)]\cong k.$$
Now, the trace is a non-zero linear map $\mathrm{tr}:M_n(k)\to k$ which vanishes on $[M_n(k),M_n(k)]$. It follows by the above isomorphism that $[M_n(k),M_n(k)]$ is precisely the kernel of the trace.
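Both halves of this statement are easy to see in small cases; here is a numerical sketch (numpy is my choice): the trace kills every commutator since $\mathrm{tr}(AB)=\mathrm{tr}(BA)$, and, conversely, a simple trace-zero matrix is exhibited as an actual commutator of matrix units.

```python
import numpy as np

rng = np.random.default_rng(5)
A, B = rng.standard_normal((2, 4, 4))

# The trace vanishes on commutators, since tr(AB) = tr(BA)
assert np.isclose(np.trace(A @ B - B @ A), 0.0)

# Conversely, a trace-zero matrix can be a commutator:
# diag(1, -1) = [E12, E21], with E12, E21 the 2x2 matrix units
E12 = np.array([[0., 1.], [0., 0.]])
E21 = np.array([[0., 0.], [1., 0.]])
assert np.allclose(E12 @ E21 - E21 @ E12, np.diag([1., -1.]))
```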
N.B.: The details underlying my first paragraph above are explained in Jean-Louis Loday's book on cyclic homology or in Chuck Weibel's book on homological algebra, among other places.
Later. Olivier wanted trace-zero matrices to be actual commutators, not just sums of commutators. It is a result of Albert and Muckenhoupt that this is always possible over a field. See http://projecteuclid.org/euclid.mmj/1028990168