Low-mass M dwarfs are the only main-sequence stars that are fully convective, but most stars have at least some convection, either in the core or in the outer envelope.
Convection occurs where the magnitude of the temperature gradient exceeds that of the adiabatic temperature gradient, so that the gas becomes susceptible to convective instability.
If a star has a temperature gradient exactly equal to the adiabatic temperature gradient, then a parcel of rising gas in pressure equilibrium with its surroundings will change its temperature in exactly the same way as its surroundings and nothing really happens. However, if the modulus of the (negative) temperature gradient of the surroundings is larger, then a rising parcel, expanding adiabatically while staying in pressure equilibrium, ends up hotter and therefore less dense than the gas around it. This makes it buoyant, so it rises further and transports heat outwards. This is a convective instability.
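To make the criterion concrete, here is a minimal sketch in Python. It uses the standard dimensionless gradients $\nabla = d\ln T / d\ln P$ rather than $dT/dr$, and the numerical values are purely illustrative assumptions:

```python
def convectively_unstable(grad_rad, grad_ad):
    """Schwarzschild criterion: a layer convects when the actual (radiative)
    gradient exceeds the adiabatic one.  Both are nabla = d ln T / d ln P."""
    return grad_rad > grad_ad

# For an ideal, fully ionised monatomic gas nabla_ad = 0.4; partial ionisation
# (e.g. of hydrogen near the photosphere) pushes nabla_ad well below this.
print(convectively_unstable(grad_rad=0.5, grad_ad=0.4))   # True  -> convective
print(convectively_unstable(grad_rad=0.25, grad_ad=0.4))  # False -> radiative
```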
The key to your question is to examine the conditions under which the temperature gradient in a star becomes large enough to trigger convective instability. There are basically three cases where this happens.
1. The opacity of the gas to radiation becomes large. The temperature gradient must then become steeper to carry the same energy flux. Roughly speaking,
$$\frac{dT}{dr} \propto \kappa,$$
where $\kappa$ is the opacity of the gas.

2. The adiabatic temperature gradient becomes smaller because of changes in the adiabatic index, for instance where the ionisation state of the gas changes near the photosphere.

3. The heat generation in the core of a star is very temperature sensitive, which induces a very steep temperature gradient. Main sequence stars more massive than the Sun (above roughly $1.3\,M_{\odot}$) generate most of their energy through the CNO cycle, which is far more temperature sensitive than the pp chain, and hence have convective cores, as the short example below illustrates.
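As a toy illustration of point 3 (the exponents are representative values I am assuming, roughly $\alpha \sim 4$ for the pp chain and $\alpha \sim 18$ for the CNO cycle): when the energy generation rate scales as $T^{\alpha}$, a large $\alpha$ concentrates the energy release, and hence the outward flux and the required temperature gradient, towards the centre.

```python
# Fraction of the central energy generation rate remaining where the
# temperature has fallen to 90% of its central value, for epsilon ~ T**alpha.
for alpha, label in [(4, "pp chain"), (18, "CNO cycle")]:
    print(f"{label} (alpha={alpha}): epsilon/epsilon_c ~ {0.9**alpha:.2f}")

# pp chain (alpha=4):   ~0.66 of the central rate
# CNO cycle (alpha=18): ~0.15 of the central rate
```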
In low-mass M dwarfs it is mechanism (1) that is in operation. The opacity in the stellar interior is approximated by Kramers' opacity law,
$$\kappa \propto \rho T^{-7/2},$$
where $\rho$ is the density and $T$ the temperature.
M dwarfs are denser than more massive stars and have lower interior temperatures. The opacity of the gas is then so high that convective instability is present throughout the star (except right at the photosphere). In higher-mass main sequence stars the opacity in the deep interior is low enough (because of higher temperatures and lower densities) to avoid convective instability, but convection then occurs in the cooler outer layers (e.g. in roughly the outer 30% of the Sun by radius).
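As a rough sketch of how strongly this works, here is an order-of-magnitude comparison based on the Kramers scaling above. The central density and temperature I plug in for a ~0.1 solar-mass M dwarf, and the solar-centre values, are representative numbers I am assuming rather than precise model output:

```python
def kramers_ratio(rho1, T1, rho2, T2):
    """Ratio kappa1/kappa2 for a Kramers-law opacity, kappa ~ rho * T**-3.5."""
    return (rho1 / rho2) * (T1 / T2) ** -3.5

# Assumed ~0.1 M_sun M dwarf centre: rho ~ 500 g/cm^3, T ~ 5e6 K
# Solar centre (rough values):       rho ~ 150 g/cm^3, T ~ 1.5e7 K
print(kramers_ratio(500.0, 5e6, 150.0, 1.5e7))   # ~150x more opaque
```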
Why does the luminosity increase?
As core hydrogen burning proceeds, the number of mass units per particle in the core increases: 4 protons plus 4 electrons (8 particles) become 1 helium nucleus plus 2 electrons (3 particles).
But pressure depends on both temperature and the number density of particles. If the number of mass units per particle is $\mu$, then
$$ P = \frac{\rho k_B T}{\mu m_u}, \ \ \ \ \ \ \ \ \ (1)$$
where $m_u$ is the atomic mass unit and $\rho$ is the mass density.
As hydrogen burning proceeds, $\mu$ increases from about 0.6 for the initial H/He mixture towards 4/3 for a pure (ionised) helium core. Thus the pressure would fall unless $\rho T$ increases.
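For concreteness, the standard approximation for the mean molecular weight of a fully ionised gas is $1/\mu \simeq 2X + 3Y/4 + Z/2$, where $X$, $Y$ and $Z$ are the hydrogen, helium and metal mass fractions. A short sketch (with roughly solar composition values that I am assuming) reproduces the numbers above:

```python
def mu_ionised(X, Y, Z):
    """Mean molecular weight of a fully ionised gas (standard approximation)."""
    return 1.0 / (2.0 * X + 0.75 * Y + 0.5 * Z)

print(mu_ionised(0.70, 0.28, 0.02))  # ~0.62  initial H/He mixture
print(mu_ionised(0.00, 1.00, 0.00))  # ~1.33  pure ionised helium (= 4/3)
```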
An increase in $\rho T$ naturally leads to an increase in the rate of nuclear fusion (which goes as something like $\rho^2 T^4$ in the Sun) and hence an increase in luminosity.
This is the crude argument used in most basic texts, but there is a better one.
The luminosity of a core-burning star whose energy output is transferred to the surface mainly by radiation (which is the case for the Sun, in which radiative transport dominates over the bulk of its mass) depends only on its mass and composition. It is easy to show, using the virial theorem for hydrostatic equilibrium and the relevant radiative transport equation (e.g. see p.105 of these lecture notes), that
$$ L \propto \frac{\mu^4}{\kappa}M^3,\ \ \ \ \ \ \ \ \ \ (2)$$
where $\kappa$ is the average opacity in the star.
Thus the luminosity of a radiative star does not depend on the energy generation mechanism at all. As $\mu$ increases (and $\kappa$ decreases, because burning hydrogen reduces the number of free electrons per unit mass), the luminosity must increase.
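To get a feel for this sensitivity, here is an illustrative sketch of equation (2) at fixed mass. It takes electron scattering, $\kappa \propto (1+X)$, as a stand-in for the average opacity, and the before/after compositions are round numbers I am assuming:

```python
def lum_ratio(mu2, mu1, X2, X1):
    """L2/L1 from equation (2) at fixed mass, with kappa ~ (1 + X)."""
    return (mu2 / mu1) ** 4 * (1.0 + X1) / (1.0 + X2)

# A ~10% rise in the star-averaged mu plus a modest drop in the average X
print(lum_ratio(mu2=0.66, mu1=0.60, X2=0.60, X1=0.70))  # ~1.6, i.e. >50% brighter
```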
Why does the radius increase?
Explaining this is more difficult and ultimately does depend on the details of the nuclear fusion reactions. Hydrostatic equilibrium and the virial theorem tell us that the central temperature depends on mass, radius and composition as
$$T_c \propto \frac{\mu M}{R}$$
Thus, for a fixed mass, as $\mu$ increases the product $T_c R \propto \mu$ must also increase.
Using equation (2), which fixes the luminosity, we can see that if the nuclear generation rate (and hence the luminosity) scales as $\rho^2 T_c^{\alpha}$, then for large $\alpha$ the central temperature can remain almost constant, because a very small increase in $T_c$ is enough to supply the increased luminosity. Hence, since $RT_c$ must increase in proportion to $\mu$, $R$ must increase significantly. Thus massive main sequence stars, in which CNO-cycle burning dominates and $\alpha>15$, experience a large change in radius during main sequence evolution. In contrast, for stars like the Sun, where H burning via the pp chain has $\alpha \sim 4$, the central temperature increases much more as $\mu$ and $\rho$ increase, and so the radius goes up, but not by very much.
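The same argument can be made semi-quantitative with a crude homology estimate. This is my own sketch, under assumptions not spelled out above: $T_c \propto \mu M/R$, $\rho \propto M/R^3$, constant $\kappa$, and a burning rate per unit volume $\propto \rho^2 T_c^{\alpha}$ over a core volume $\propto R^3$. Equating the nuclear luminosity to equation (2) at fixed mass then gives power-law exponents for how $R$ and $T_c$ respond to $\mu$:

```python
def mu_exponents(alpha):
    """Return (a, b) such that R ~ mu**a and T_c ~ mu**b at fixed mass,
    from mu**alpha * R**-(3 + alpha) = mu**4 (crude homology)."""
    a = (alpha - 4.0) / (alpha + 3.0)
    b = 7.0 / (alpha + 3.0)
    return a, b

for alpha, label in [(4, "pp chain (Sun-like)"), (16, "CNO cycle (massive star)")]:
    a, b = mu_exponents(alpha)
    print(f"{label}: R ~ mu^{a:.2f}, T_c ~ mu^{b:.2f}")

# pp chain (Sun-like):      R ~ mu^0.00, T_c ~ mu^1.00
# CNO cycle (massive star): R ~ mu^0.63, T_c ~ mu^0.37
```

With a large $\alpha$, almost all of the required increase in $RT_c$ is taken up by the radius; with $\alpha \sim 4$ it is taken up by the central temperature, in line with the argument above.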
First, a star does not become a red giant when helium fusion begins. Instead, it becomes a red giant earlier, when an inert, degenerate helium core forms and hydrogen begins fusing in a shell around it; it is the onset of this shell hydrogen burning that makes the star expand into a red giant.
The core is degenerate (supported against collapse by electron degeneracy pressure) and therefore cannot cool by expansion, as explained here:
http://burro.astr.cwru.edu/Academics/Astr221/LifeCycle/redgiant.html
Later, helium fusion begins, at which point the star is a horizontal branch star rather than merely a red giant. At that point the core can cool by expansion, as explained in the reference.
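As a rough numerical sketch of why the degenerate core cannot use the ideal-gas "expand and cool" thermostat (the density and temperature below are representative red-giant-branch core values I am assuming, not figures from the linked page): the electron Fermi energy sits well above $k_BT$, so the pressure is set almost entirely by the density rather than the temperature.

```python
from math import pi
from scipy.constants import hbar, m_e, m_u, k, eV

rho  = 1e5 * 1e3   # assumed core density: 1e5 g/cm^3, converted to kg/m^3
T    = 5e7         # assumed core temperature in K
mu_e = 2.0         # mass units per free electron for ionised helium

n_e = rho / (mu_e * m_u)                                        # electron number density
E_F = hbar**2 * (3.0 * pi**2 * n_e)**(2.0 / 3.0) / (2.0 * m_e)  # Fermi energy

print(f"E_F ~ {E_F / (1e3 * eV):.0f} keV,  k_B T ~ {k * T / (1e3 * eV):.1f} keV")
# E_F ~ 35 keV versus k_B T ~ 4.3 keV: degeneracy pressure dominates.
```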