The effect of including relativistic corrections in the dispersion relation for a low-density plasma consisting of moderately energetic electrons (energy ~2.5 keV) is investigated. Such a low-density plasma is presumed to exist at the altitudes on auroral zone field lines where auroral kilometric radiation is generated. Two different dispersion relations are employed to study the wave dispersion. The simpler is based on a "ring" distribution, in which both the ambient and hot electrons are assumed to be cold (i.e., to have no thermal spread). Using this dispersion relation, we find that for sufficiently low densities the most unstable mode is a "trapped" mode that is decoupled from the freely propagating R-X mode found in a cold plasma. Since this dispersion relation neglects thermal effects, we also study wave dispersion using a spherically symmetric Dory-Guest-Harris (DGH) distribution. This "shell" distribution shows that, provided the peak momentum of the DGH distribution is larger than the thermal spread, the most unstable mode at low wave vectors is still the trapped mode. Our analysis indicates that when the wave dispersion is given by a weakly relativistic dispersion relation, the trapped mode is usually the most unstable mode and the freely propagating R-X mode is not driven unstable. We show that relativistic effects may be important for typical auroral electron distributions, and that the particle data provide insufficient evidence to justify the usual assumption that the cold plasma approximation adequately describes wave dispersion near the gyrofrequency on auroral zone field lines.