I have a 96 V DC line, and I am trying to step it down to 3.3 V to feed into an op-amp, and then into an ADC so that I can read the voltage on the line. The 96 V line will be prone to spikes when it is switched on and off.

I am thinking of using a voltage divider to get it down to between 0 and 5 V. The line can sit at up to 110 V for long periods, so I chose R1 = 11 kΩ and R2 = 415 Ω. Any thoughts on problems with this design?

I also need to suppress short voltage spikes of up to 220 V. I am thinking about adding a capacitor from the voltage divider's output to ground, but I am worried that this may reduce the accuracy of the voltage measurement.
Would this be the correct way to go, or is there a better solution?

Thank you!

What you have will work to attenuate 110 V down to 4.0 V. However, there will be considerable power dissipation. R1 will dissipate about 1 W at 110 V in, so you definitely can't use an ordinary 0805 resistor. If you really need the low output impedance of 400 Ω, then get a resistor that can handle the power.
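A quick sanity check of those numbers (just a sketch; resistor values are the ones from the question, with 110 V taken as the worst-case steady input):

```python
# Divider from the question: R1 = 11 kΩ on top, R2 = 415 Ω on bottom.
R1, R2 = 11_000.0, 415.0   # ohms
V_in = 110.0               # worst-case sustained input, volts

I = V_in / (R1 + R2)               # current through the divider
V_out = V_in * R2 / (R1 + R2)      # voltage at the tap
P_R1 = I**2 * R1                   # power dissipated in R1
Z_out = R1 * R2 / (R1 + R2)        # Thevenin output impedance

print(f"V_out = {V_out:.2f} V")    # about 4.0 V
print(f"P_R1  = {P_R1:.2f} W")     # about 1.0 W
print(f"Z_out = {Z_out:.0f} ohm")  # about 400 ohm
```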

If you can live with a higher output impedance, then you can use larger resistors. For example, 100 kΩ on top and 3.77 kΩ on the bottom will dissipate less than 120 mW total with 110 V in. The output impedance in that case is 3.63 kΩ, which is still low enough to drive the ADC inputs of many microcontrollers directly.
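The same check for the higher-impedance version (again a sketch, same 110 V worst-case assumption):

```python
# Scaled-up divider: 100 kΩ on top, 3.77 kΩ on bottom.
R1, R2 = 100_000.0, 3_770.0  # ohms
V_in = 110.0                 # worst-case sustained input, volts

V_out = V_in * R2 / (R1 + R2)      # still about 4.0 V at the tap
P_total = V_in**2 / (R1 + R2)      # total divider dissipation
Z_out = R1 * R2 / (R1 + R2)        # Thevenin output impedance

print(f"V_out   = {V_out:.2f} V")      # about 4.0 V
print(f"P_total = {P_total*1e3:.0f} mW")  # under 120 mW
print(f"Z_out   = {Z_out:.0f} ohm")    # about 3.63 kohm
```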

The capacitor as you show it will filter short-term spikes. No, it won't hurt your accuracy at all; it may actually improve it by reducing high-frequency noise. Make the low-pass filter rolloff frequency as low as possible without cutting into the signal you actually want. For example, with the 100 kΩ and 3.77 kΩ resistors, a 1 µF cap makes a low-pass filter with a rolloff of about 44 Hz. If your valid signal only extends to 20 Hz or so, that would work fine.
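For reference, the rolloff comes from the standard RC corner-frequency formula f = 1 / (2πRC), where R is the divider's Thevenin output impedance. A quick check with the example values:

```python
import math

# RC low-pass formed by the divider's output impedance and the added cap.
R = 100_000.0 * 3_770.0 / (100_000.0 + 3_770.0)  # Thevenin impedance, ~3.63 kΩ
C = 1e-6                                          # 1 µF filter cap

f_c = 1.0 / (2.0 * math.pi * R * C)  # -3 dB corner frequency

print(f"f_c = {f_c:.1f} Hz")  # about 44 Hz
```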