You have 3 times as many LEDs as resistors, so it makes sense that they are arranged in series strings of 3.
Assuming they are all running at the same current, the strings with the 200Ω resistors will contain the LEDs with the highest Vf. If we assume typical 20mA-30mA maximum LEDs running at around 10mA, and a blue or white colour which drops around 3.2V at this current, then:
(3.2V * 3) + (200Ω * 10mA) = 11.6V
If it's 12mA, then we have:
(3.2V * 3) + (200Ω * 12mA) = 12V
For the 470Ω strings and red LEDs:
(2.1V * 3) + (470Ω * 12mA) = ~11.94V
So 12V looks like a reasonable bet.
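The string arithmetic above can be sketched in a few lines of Python; the Vf and current figures are the assumed values from the text, not measurements:

```python
def string_voltage(led_vf, leds_per_string, resistor_ohms, current_a):
    """Supply voltage needed for a series string of LEDs plus resistor."""
    return led_vf * leds_per_string + resistor_ohms * current_a

# Assumed 3.2V blue/white and 2.1V red LEDs at 12mA
blue_white = string_voltage(3.2, 3, 200, 0.012)  # 200 ohm strings
red = string_voltage(2.1, 3, 470, 0.012)         # 470 ohm strings

print(round(blue_white, 2))  # 12.0
print(round(red, 2))         # 11.94
```

Both strings converging on roughly 12V is what makes that supply voltage the likely candidate.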
According to this link, the Vf for the same colour can vary considerably between manufacturers (it shows six white LEDs driven at 3.4V drawing anywhere from 10mA to 44mA), so until you test yours you can only guess. 12V is a commonly available supply voltage though, so this looks plausible.
I would ramp the voltage up slowly (ideally with a bench supply) whilst monitoring the voltage across one resistor of each value with a multimeter (e.g. notch up a bit, test, repeat).
Using Ohm's law as above, head for 10-15mA (e.g. for the 470Ω resistor at 10mA you are looking for a 470Ω * 10mA = 4.7V drop) and note what supply voltage you are at when you reach that level. Then you can make a good guess at the original supply voltage.
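A quick sketch of the Ohm's-law targets for the ramp-up test, assuming the 470Ω resistors and the current range suggested above:

```python
def target_drop(resistor_ohms, current_a):
    """Voltage expected across the current-limiting resistor (Ohm's law)."""
    return resistor_ohms * current_a

# Stop ramping the supply when the resistor drop reaches one of these
for i_ma in (10, 12, 15):
    print(f"{i_ma}mA through 470 ohms -> {target_drop(470, i_ma / 1000):.2f}V")
```

The same function works for the 200Ω strings; just swap in 200 for the resistance.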
Here's a table of Vf for various colours (from Dangerous Prototypes):

78L05
Another reason to assume the supply is not much higher than 12V is the drop across the 78L05. According to the datasheet, it has a junction-to-ambient thermal resistance (θja) of 150°C/W. So if it were supplying 50mA (it can go up to 100mA) and the supply were something like 20V, then:
(20V - 5V) * 50mA = 0.75W
150°C/W * 0.75W = 112.5 °C rise above ambient.
Its absolute maximum operating temperature is 150°C, so it would be very hot with little ambient headroom. 12V is more reasonable:
(12V - 5V) * 50mA = 0.35W
150°C/W * 0.35W = 52.5 °C rise above ambient. Much better.
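The regulator dissipation estimate above can be sketched as follows; the 150°C/W figure is the datasheet θja quoted in the text:

```python
THETA_JA = 150.0  # degC/W, junction-to-ambient, from the 78L05 datasheet

def temp_rise(v_in, v_out, i_load_a):
    """Temperature rise above ambient for a linear regulator."""
    power_w = (v_in - v_out) * i_load_a
    return THETA_JA * power_w

print(temp_rise(20, 5, 0.05))            # 112.5
print(round(temp_rise(12, 5, 0.05), 1))  # 52.5
```

At 20V in, even a modest 25°C ambient would push the junction close to its 150°C limit, which is why 12V is the more believable supply.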
Best Answer
There are two basic problems with your circuit.
The first is that the transistor's Base sits well below its ~0.6V turn-on voltage, so small audio signals can't switch it on. You can solve this problem by using a voltage divider which raises the DC Base voltage up close to 0.6V.
The second is that the audio source's coupling capacitor charges up on signal peaks, shifting the Base bias. This effect can be reduced by providing a path to ground for the capacitor to discharge into, and by adding a resistor in series with the Base to reduce the peak current.
The following circuit has been found to work reasonably well:-
(Schematic created using CircuitLab)
R3 and R4 create a voltage divider with a ratio of 5.7:1, dropping 3.3V down to 0.58V. This is enough to just barely turn the transistor on, so the LED will glow weakly with no signal. Any signal will now cause the transistor to turn on harder and make the LED glow brighter.
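A sketch of the divider arithmetic; the schematic isn't reproduced here, so the R3/R4 values below are hypothetical ones chosen to give the 5.7:1 ratio mentioned above:

```python
R3 = 47_000  # ohms, assumed upper divider resistor
R4 = 10_000  # ohms, assumed lower divider resistor
V_SUPPLY = 3.3  # volts, from the text

ratio = (R3 + R4) / R4
v_base = V_SUPPLY * R4 / (R3 + R4)

print(ratio)             # 5.7
print(round(v_base, 2))  # 0.58
```

Any pair of resistors with the same ratio gives the same Base voltage; the absolute values mainly set the divider's current draw and output impedance.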
R2 reduces Base current during positive peaks of the signal waveform, reducing the amount of charge stored in the audio source's coupling capacitor. R4 provides a discharge path for this charge during the rest of the waveform. As R4 is much lower than R2, the capacitor will only build up a small bias voltage that doesn't affect the LED brightness much.
Note that even with these improvements this circuit will never produce maximum possible brightness, because it only turns the LED on during positive peaks of the audio waveform.