Clock drivers with low current consumption

buffer, charge-pump, clock, driver, mosfet-driver

I am trying to drive a 30 pF load capacitance (1 pF × 30, 2 pF × 15, and so on) from a ring-oscillator-generated clock. The frequency I want is 250 MHz. What is the best approach to keep the input current as low as possible? Using more clock buffers or a buffer tree draws more current, while if I build a buffer myself from CMOS inverters, spikes at the edges get amplified and ruin the current consumption at the final stage. On the other hand, fewer buffers result in incomplete charging and discharging of the load cap.

I pass the clock signal through some gate and then see a small dent at the output (the cause is still unknown; it is not noise, since it is periodic), which naturally gets amplified when fed to the inverter drivers. How do I reduce the inverter gain? And if I reduce the gain, charging of the load cap suffers, so is there a way to minimize current consumption while charging and discharging the cap without distorting the final clock too much? I know this may be a basic question, but I wondered whether there is a solution I am not aware of. I want to keep my input current below 500 µA, and the design is in 350 nm technology. The load caps are for a charge pump.
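
(For scale, a quick back-of-the-envelope sketch of the average supply current this load would need at full rail swing; the 3.3 V figure below is an assumed supply for a 0.35 µm process, not something stated in the question.)

```python
# Rough average supply current to charge a capacitive load once per cycle:
#   I_avg = f * C * V
f = 250e6    # clock frequency, Hz
C = 30e-12   # total load capacitance, F
V = 3.3      # assumed supply rail for a 0.35 um process (not given in the question), V

i_avg = f * C * V
print(f"Average supply current at full swing: {i_avg * 1e3:.2f} mA")  # ~24.75 mA
```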

Best Answer

Charging and discharging a capacitance C at a frequency f is equivalent to having an effective load resistance \$R = \frac{1}{Cf}\$. (This relationship is the basis of switched-capacitor filters.) Using 30 pF and 250 MHz, this works out to 133.3 Ω. There's no getting around this relationship: if you need to limit your current to 500 µA, then you must limit the voltage swing to 133.3 Ω × 500 µA ≈ 66.7 mVP-P.
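
If it helps, here is a minimal numerical sketch of that arithmetic, using the question's 30 pF, 250 MHz and 500 µA figures:

```python
# Effective resistance of a switched capacitor and the swing allowed by a current budget:
#   R_eff = 1 / (C * f),   V_swing = R_eff * I_max
C = 30e-12        # load capacitance, F
f = 250e6         # clock frequency, Hz
I_max = 500e-6    # supply-current budget, A

R_eff = 1.0 / (C * f)      # ~133.3 ohm
V_swing = R_eff * I_max    # ~66.7 mV peak-to-peak

print(f"Effective load resistance: {R_eff:.1f} ohm")
print(f"Maximum swing within {I_max * 1e6:.0f} uA: {V_swing * 1e3:.1f} mVpp")
```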