Electrical – Circuit to increase current of arbitrary waveform generator

Tags: amplifier, function generator, microwave, waveform

I am trying to create a setup for some physics experiments – my background is in physics, not electrical engineering, so I can't do circuit design, but I can put a circuit together. I need to amplify the current of a signal from an arbitrary waveform generator, and I also need to make sure that any amplification circuit I attach cannot damage the waveform generator.

The current hardware I have is a Tektronix AWG610 arbitrary waveform generator and a Tekpower TP3005T variable linear DC power supply – a constant-current or constant-voltage supply that can go up to 5 amps. The waveform generator is capable of 400 fs edge timing, which I need to preserve as much as possible on the amplified side.

So I am looking for a circuit diagram that would use the signal from the AWG610 to modulate the current from the power supply – or, alternatively, to amplify the current of the AWG610-generated waveform using the separate constant-current power source. If possible, I would like the amplifying circuit to be a little more flexible and able to handle higher currents for when the power supply is upgraded – maybe 100 amps in the future. I also need the amplifier to work from low frequency up to at least 1 GHz.

The waveform may be arbitrary – it might be just pulses, or a really noisy, erratic waveform.

My budget is around $300.

Thanks, any help is much appreciated.

Best Answer

Let's try to put this in perspective.

For the moment, let's ignore the amplifier itself, and assume you already have it. Let's just look at what it takes to get your 5 V, 100 amp, 1 GHz signal from the output of your amplifier to a load. For the sake of argument, let's assume this is in a lab, where you can arrange things conveniently, so your load is, say, 4 inches away from the output terminals of your amplifier.

To carry a 100 amp load, you normally want at least 4 AWG copper wire (and, depending on distance, 3 AWG may be preferred)1.

As noted above, we're assuming you need to transmit your signal for 4 inches. A quick check shows that 4 inches of 4 AWG copper (at 1 GHz) will have about 72 nH of inductance.
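That 72 nH figure can be reproduced with the standard formula for the low-frequency self-inductance of a straight round wire, L ≈ 2·l·(ln(2l/r) − 0.75) nH with l and r in centimetres (the exact dimensions below – 10.16 cm length, ~5.19 mm diameter for 4 AWG – are my own numbers, not from the answer):

```python
import math

def wire_inductance_nH(length_cm, radius_cm):
    """Self-inductance of a straight round wire:
    L ~ 2 * l * (ln(2l/r) - 0.75) nH, with l and r in centimetres."""
    return 2.0 * length_cm * (math.log(2.0 * length_cm / radius_cm) - 0.75)

# 4 inches of 4 AWG copper: length 10.16 cm, diameter ~5.19 mm
L_nH = wire_inductance_nH(10.16, 0.519 / 2)
print(f"{L_nH:.0f} nH")  # prints "73" -- in line with the ~72 nH quoted above
```

So the quoted inductance is not pessimistic; it is just what a few inches of fat wire gives you.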

If you plan to drive 100 amps with only 5 volts, the input impedance of your load can be a maximum of R = 5/100 = 0.05 ohms. So, your effective circuit looks like this:

[Schematic: 5 V amplifier output driving a 0.05 Ω load through 72 nH of series wire inductance – created using CircuitLab]

Now, let's think about that circuit for a second. To (even an amateur) EE, that looks a lot like a low-pass filter. Treating it as a first-order L–R filter, those values give a cut-off frequency of f_c = R/(2πL) ≈ 110 kHz. By 1 GHz, that works out to roughly 79 dB of attenuation. So, your 5 volts at 1 GHz coming out of the amplifier is down to about half a millivolt at 1 GHz by the time it gets to the load.
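The filter numbers follow directly from f_c = R/(2πL) and the transfer ratio |H| = R/√(R² + (2πfL)²); a quick check using the 72 nH and 0.05 Ω values above:

```python
import math

L = 72e-9   # wire inductance, henries
R = 0.05    # load resistance, ohms
f = 1e9     # signal frequency, hertz

fc = R / (2 * math.pi * L)                   # -3 dB cutoff of the L-R divider
H = R / math.hypot(R, 2 * math.pi * f * L)   # voltage transfer ratio at f
atten_db = -20 * math.log10(H)
v_out = 5.0 * H                              # for a 5 V drive at the amp output

print(f"cutoff {fc/1e3:.0f} kHz, "
      f"attenuation {atten_db:.0f} dB at 1 GHz, "
      f"load sees {v_out*1e3:.2f} mV")
```

Whatever the exact digits, the conclusion is the same: the wiring alone kills the 1 GHz content long before it reaches the load.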

Bottom line: with the kind of specs you're talking about (100 amps at 1 GHz), just getting the electricity from the amp to the load successfully becomes quite a non-trivial undertaking (and, of course, at a greater distance, the inductance increases, and with it the impedance).


1. Note: those figures are based on 50-60 Hz power transmission, though. Due to skin effect, at 1 GHz you'd probably need something even larger (or something of that effective size, but made up of many fine strands).
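To put a rough number on that footnote: the skin depth of a conductor is δ = √(ρ/(π·f·μ₀·μᵣ)), and for copper (resistivity assumed here as 1.68×10⁻⁸ Ω·m) it shrinks to only a couple of micrometres at 1 GHz – so almost none of a solid 4 AWG conductor's cross-section would actually carry current:

```python
import math

def skin_depth_m(freq_hz, resistivity=1.68e-8, mu_r=1.0):
    """Skin depth delta = sqrt(rho / (pi * f * mu0 * mu_r)), in metres."""
    mu0 = 4e-7 * math.pi
    return math.sqrt(resistivity / (math.pi * freq_hz * mu0 * mu_r))

for f in (60, 1e6, 1e9):
    print(f"{f:>12g} Hz: skin depth {skin_depth_m(f) * 1e6:8.1f} um")
```

At mains frequency the skin depth (several millimetres) exceeds the wire radius, so the ampacity tables hold; at 1 GHz it is around 2 µm, which is why RF current-carrying conductors end up as Litz wire, tube, or wide strap rather than solid rod.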
