Typical radio-frequency-style millimeter-wave detectors for passive millimeter-wave imaging are limited by the thermal noise floor inherent in microwave and millimeter-wave circuits. This thermal noise, or Johnson noise, can be derived from the Rayleigh-Jeans approximation to Planck's blackbody radiation law and sets a fundamental limit on the noise levels in such systems. For this reason, low-noise amplifiers are typically placed at the front end of such high-sensitivity receivers to improve their noise-equivalent temperature difference. However, such approaches are undesirable in high-noise environments, where strong signal levels could damage the amplifiers, or in large-pixel-count arrays, where the cost of amplification for each pixel becomes prohibitive. Herein, we present an alternative approach to achieving low noise-equivalent temperature differences: upconversion to optical frequencies with subsequent filtering and square-law detection. Through this process, the usual thermal noise floor limitations no longer apply because, at such high frequencies, the Rayleigh-Jeans approximation is no longer valid. In fact, thermal noise in the optical regime is diminished to negligible levels at room temperature, and quantum noise becomes the fundamental limit on system performance. In this paper, we present the theoretical and practical limits to the reduction of noise equivalent power using optical amplifiers and relate the noise contributions of such amplifiers to the quantum noise limit.
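The claim that the thermal (kT) noise floor becomes negligible after upconversion can be checked numerically from Planck's law: the single-mode thermal noise power spectral density is hν/(exp(hν/kT) − 1), which approaches kT in the Rayleigh-Jeans limit hν ≪ kT but collapses exponentially at optical frequencies. The sketch below illustrates this; the specific frequencies (94 GHz and 193.4 THz, i.e. ~1550 nm) are illustrative choices, not values taken from the text.

```python
import math

h = 6.62607015e-34  # Planck constant, J*s
k = 1.380649e-23    # Boltzmann constant, J/K
T = 300.0           # room temperature, K

def mean_occupancy(nu):
    """Bose-Einstein mean photon number of a thermal mode at frequency nu (Hz)."""
    return 1.0 / math.expm1(h * nu / (k * T))

def thermal_noise_psd(nu):
    """Single-mode thermal noise power spectral density h*nu*n_bar, in W/Hz."""
    return h * nu * mean_occupancy(nu)

nu_mmw = 94e9      # illustrative millimeter-wave frequency
nu_opt = 193.4e12  # illustrative optical (telecom-band) frequency

# Ratio of the Planck-law noise density to the Rayleigh-Jeans value kT:
print(thermal_noise_psd(nu_mmw) / (k * T))  # close to 1: the kT floor applies
print(thermal_noise_psd(nu_opt) / (k * T))  # many orders of magnitude below kT
```

At 94 GHz the ratio is essentially unity, recovering the familiar Johnson noise floor, while at 193.4 THz the Boltzmann factor exp(−hν/kT) suppresses thermal occupancy so strongly that quantum (shot) noise, not thermal noise, dominates, consistent with the argument above.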