We analyze the classic Hanbury Brown–Twiss effect for thermal electromagnetic fields in the space–frequency domain.
We compare two different approaches and show that the normalized correlation of intensity fluctuations is fully
characterized by the spectral electromagnetic degree of coherence, a result analogous to the scalar analysis of the
effect. Differences between the two approaches are discussed.
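In the scalar case, the result referred to above follows from the Gaussian moment theorem: for thermal (circular Gaussian) light the normalized correlation of intensity fluctuations at two points equals the squared modulus of the degree of coherence. The following sketch checks this numerically; the sample size, seed, and the assumed value of the degree of coherence μ are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000          # number of independent field realizations (assumed)
mu = 0.6             # assumed degree of coherence between the two points

# Circular complex Gaussian field at point 1, unit mean intensity
e1 = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
# Independent Gaussian field used to build the partially correlated field
e_ind = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
# Field at point 2 with <E1* E2> = mu and unit mean intensity
e2 = mu * e1 + np.sqrt(1 - abs(mu) ** 2) * e_ind

i1, i2 = np.abs(e1) ** 2, np.abs(e2) ** 2
# Normalized correlation of intensity fluctuations:
# <dI1 dI2> / (<I1><I2>), which should approach |mu|^2
c = np.mean((i1 - i1.mean()) * (i2 - i2.mean())) / (i1.mean() * i2.mean())
print(c, abs(mu) ** 2)
```

For thermal light the printed estimate converges to |μ|² = 0.36 as the number of realizations grows, illustrating the scalar Hanbury Brown–Twiss relation that the paper generalizes to electromagnetic fields.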