[Physics] Why do only red dwarf stars have convective currents

fusion · stars · stellar-evolution

Stellar models indicate that red dwarfs less than 0.35 M☉ are fully convective.[3] Hence the helium produced by the thermonuclear fusion of hydrogen is constantly remixed throughout the star, avoiding its buildup at the core and prolonging the period of fusion. Red dwarfs therefore develop very slowly, maintaining a constant luminosity and spectral type for trillions of years, until their fuel is depleted. Because of the comparatively short age of the universe, no red dwarfs exist at advanced stages of evolution. – Wikipedia

Red dwarf stars are tiny. Is this why they can have convection currents?

Red Dwarf Gliese 623b:


Best Answer

Low-mass M dwarfs are the only stars that are fully convective, but most stars have at least some convection going on either in the core or in the outer envelope.

Convection occurs where the magnitude of the actual temperature gradient exceeds that of the adiabatic temperature gradient; such a layer is susceptible to convective instability.

If a star has a temperature gradient exactly equal to the adiabatic temperature gradient, then a parcel of rising gas in pressure equilibrium with its surroundings will change its temperature in exactly the same way as its surroundings and nothing really happens. However, if the magnitude of the (negative) temperature gradient of the surroundings is larger, then a rising parcel, which expands and cools adiabatically, ends up hotter and therefore less dense than the gas around it. This makes it buoyant, so it rises further, transporting heat outwards. This is a convective instability.
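This stability test (the Schwarzschild criterion) can be sketched in a few lines of Python; the gradient values below are purely illustrative, not taken from any real stellar model:

```python
# Schwarzschild criterion sketch: a layer is convectively unstable when the
# magnitude of its actual (negative) temperature gradient exceeds the
# magnitude of the adiabatic gradient.

def is_convective(dT_dr_actual, dT_dr_adiabatic):
    """Both gradients are negative (T falls outward); compare magnitudes."""
    return abs(dT_dr_actual) > abs(dT_dr_adiabatic)

# Illustrative gradients in K/m (hypothetical numbers)
print(is_convective(-0.03, -0.02))  # steeper than adiabatic -> True
print(is_convective(-0.01, -0.02))  # shallower than adiabatic -> False
```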

The key to your question is to examine the conditions under which the temperature gradient in a star becomes large enough to trigger convective instability. There are basically three cases where this happens.

  1. The opacity of the gas to radiation becomes large. The temperature gradient then must become larger to carry the same energy flux. Roughly speaking $$\frac{dT}{dr} \propto \kappa,$$ where $\kappa$ is the opacity of the gas.

  2. The adiabatic temperature gradient could become smaller due to changes in the adiabatic index - for instance where ionisation state of the gas changes near the photosphere.

  3. If the heat generation in the core of a star is very temperature sensitive then this induces a very steep temperature gradient. Main sequence stars more massive than the Sun generate energy through the CNO cycle, which is more temperature sensitive than the pp chain, and hence have convective cores.
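The temperature sensitivity in case (3) can be made concrete with approximate power-law exponents near solar-core temperatures (pp chain roughly $\epsilon \propto T^4$, CNO cycle roughly $\epsilon \propto T^{18}$; both exponents are rough and temperature-dependent):

```python
# How energy generation responds to a 10% temperature rise, using rough
# power-law exponents near ~1.5e7 K: pp chain ~ T^4, CNO cycle ~ T^18.
# A far steeper response means energy release is concentrated in a small,
# hot central region, forcing a steep temperature gradient there.

def boost(exponent, dT_fraction=0.10):
    """Factor by which the energy generation rate rises for a fractional
    temperature increase, assuming epsilon ~ T**exponent."""
    return (1 + dT_fraction) ** exponent

print(f"pp chain:  x{boost(4):.1f}")   # ~1.5x
print(f"CNO cycle: x{boost(18):.1f}")  # ~5.6x
```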

In low-mass M-dwarfs it is mechanism (1) that is in operation. The opacity in a star is approximated by the Kramers opacity law $$\kappa \propto \rho T^{-7/2},$$ where $\rho$ is the density and $T$ the temperature.
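To see how strongly this scaling favours high opacity in M-dwarf interiors, here is a quick ratio under the Kramers law; the central density and temperature values are illustrative order-of-magnitude guesses, not model outputs:

```python
# Kramers-law scaling: kappa ~ rho * T**(-3.5). Compare an M-dwarf interior
# to the solar interior using illustrative (order-of-magnitude) values.

def kramers_ratio(rho1, T1, rho2, T2):
    """Opacity ratio kappa1/kappa2 under the Kramers scaling."""
    return (rho1 / rho2) * (T1 / T2) ** (-3.5)

# Hypothetical central values: M dwarf ~ (500 g/cm^3, 5e6 K),
# Sun ~ (150 g/cm^3, 1.5e7 K)
ratio = kramers_ratio(500, 5e6, 150, 1.5e7)
print(f"kappa(M dwarf) / kappa(Sun) ~ {ratio:.0f}")
```

Even with these rough numbers the M-dwarf interior comes out more than a hundred times more opaque, which is why the radiative gradient is pushed past the adiabatic one throughout the star.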

M-dwarfs are denser than more massive stars and have lower interior temperatures. The opacity of the gas is then so high that convective instability is present throughout the star (except right at the photosphere). In higher mass main sequence stars, the opacity in the deep interior is low enough (because of higher temperatures and lower densities) to avoid convective instability. But convection then happens in the cooler outer layers (e.g. in the outer 20% or so of the Sun).
