This paper presents a design methodology for a G<sub>m</sub>C filter in a Continuous-Time (CT) Sigma-Delta A/D converter. It focuses on the challenges the designer faces when a deep sub-micron technology is used. Following the proposed methodology, a 1-bit, 3<sup>rd</sup>-order CT modulator is designed. The modulator achieves an accuracy of 10 bits within a signal band of 8 MHz. The design is implemented in a 90 nm standard CMOS process; the small transistor dimensions enable a clock rate of 1 GHz. An analytical comparison between RC filters and G<sub>m</sub>C filters is presented based on their power consumption. It is shown that an RC filter requires an integrator loop gain-bandwidth equal to the sampling rate, which places a severe lower limit on the power consumption of this filter type. Therefore, a G<sub>m</sub>C filter implementation is chosen, as it meets the design specifications at the lowest power consumption. Mathematical expressions for harmonic distortion and thermal noise are derived and interpreted in terms of a low-power design approach. Since the input signal swing scales down with the supply voltage, harmonic distortion becomes less critical in a deep sub-micron technology; the thermal noise requirements therefore mainly determine the overall power consumption of the CT modulator. Small transistor lengths enable high sampling rates, but lower the integrator
output impedance. This results in a reduced DC gain of the filter. Consequently, the proposed G<sub>m</sub>C filter architecture is adjusted to provide sufficient suppression of in-band quantization noise leakage. All proposed design choices are verified by numerical simulations.