I don't think there is a precise answer, especially without seeing the datasheet for the device. PWM is usually used for efficiency, dimming, or overdriving. Overdriving is what you're interested in: you pulse the device with more current than it can handle continuously, then turn it off before the heat destroys it so it can cool down. But like I said, you need to see the datasheet to know what it can take. See this question about High Current Pulse on LED.
I have tested several large LED arrays before, and I found they hit the sweet spot for lumens versus power (measured with test equipment, not perceived brightness) at a duty cycle of around 85-90% and a frequency somewhere between 600 Hz and 8 kHz, though I can't recall the exact value.
"there must be a sweet spot where our eyes cannot identify there is a reduction in total power even though there is. For example if you PWM the laser at 10 kHz with a 25% duty cycle this is as good as DC to our eyes as we cannot see the on/off cycle."
To clarify your thinking about what our eyes perceive as not flashing (persistence of vision): anything over about 16 flashes per second should be sufficient as long as the source is not moving, which is why films are 24 frames per second (at least they used to be). However, the eye perceiving a pulsed source as steady light is a very different thing from not noticing a change in brightness. When you look at an LED pulsed at a sufficiently high frequency, the eye can no longer resolve the flashing, but it perceives the time-averaged intensity, meaning you will still notice a difference in brightness. At some point the source is bright enough that the eye can't notice the change, but with a laser you will probably damage the eye first, at least if you are looking straight at it. The brighter the laser is, the farther away and the more noticeable it will be.
Quick answer - no, you can't make this work without some hardware changes.
I certainly hope you're not connecting your PICs to 15 volts. They're only rated for 3.6 volts.
Your MOSFET is not an IRL734. It might be an IRF734, in which case I don't see how you're getting any output at all from the LEDs. What you want to do is connect the top of the LED chain to +12 V and the bottom to a resistor. The other end of the resistor goes to the drain of the MOSFET, and the source goes to ground. If you know the operating current and voltage of the LEDs (and I hope you do), then the value of R is
R = (12 - (3 x Vf)) / i, where Vf is the forward voltage of one LED and i is the operating current in amps.
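If it helps, here is a minimal sketch of that calculation. The 3.2 V forward voltage and 20 mA current are assumed values for illustration only; take the real numbers from your LED datasheet.

    # Sketch of the series-resistor calculation above.
    # Vf = 3.2 V and i = 20 mA are assumed example values; use your
    # LED's datasheet figures instead.

    def series_resistor(v_supply, v_forward, n_leds, i_amps):
        """Current-limiting resistor for n_leds LEDs in series."""
        return (v_supply - n_leds * v_forward) / i_amps

    R = series_resistor(v_supply=12.0, v_forward=3.2, n_leds=3, i_amps=0.020)
    print(f"R = {R:.0f} ohms")  # 3 LEDs at 3.2 V, 20 mA -> 120 ohms

Round up to the nearest standard resistor value and check the power dissipation (i squared times R) before picking a part.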
Your driver would work (very briefly) if the PIC output were 12 volts, but it's not. I say briefly, because in very short order at least one of the LEDs would burn out. That's why you need a current limiting resistor.
Even with these changes, the circuit may not work well. The problem is that the PIC runs off 3.6 volts (max), while the gate threshold voltage of an IRF734 is two to four volts, so the PIC may not be able to turn it fully on. Besides, the IRF734 is a 450 volt MOSFET, which is way overkill here.
Given your obvious errors in circuit description (PIC voltage and MOSFET model), I suggest you go back to your source and provide a more complete (and accurate) description.
For what it's worth, 780 Hz is about 10 times faster than you need, but it ought to work just fine.
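To put rough numbers on that timing (the 780 Hz comes from the question; the duty cycles below are just illustrative):

    # Rough PWM timing at 780 Hz for a few example duty cycles.
    f_pwm = 780.0                 # Hz, from the question
    period_ms = 1000.0 / f_pwm    # ~1.28 ms per cycle

    for duty in (0.25, 0.50, 0.90):
        on_ms = period_ms * duty
        print(f"{duty:.0%} duty: {on_ms:.2f} ms on, {period_ms - on_ms:.2f} ms off")

Even the shortest of these on-times is far above anything the eye can resolve as flicker.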
Since you appear to be concerned with flicker: if there is relative motion between the light source and your eye, the image may appear to "break up". The sensors in your retina act as if they integrate over a period of a few milliseconds, but you can see a much briefer flash; it just appears dimmer for shorter durations below the flicker fusion frequency. For example, if you whack yourself over the head with a book you may be able to see a 1 kHz flash break up, whereas a camera's frame rate will at best produce some aliasing with the shutter speed unless it is a high-speed camera. This is not a coincidence: camera frame rates are made just fast enough to appear steady, which is much slower than what can be picked up with persistence of vision. Flicker is also more perceptible in peripheral vision (100 Hz is annoying for many people), probably an evolutionary adaptation to help our ancestors pick up the rapid motion of predators or prey out of the corners of their eyes.
If there's no motion, the light may appear steady above some frequency in the tens of Hz, but that threshold will vary a bit with duty cycle, I think, probably getting somewhat worse as the duty cycle gets shorter (because of the eye's logarithmic response). Above the flicker fusion frequency, perceived brightness is lower at lower duty cycles, so a 10% duty cycle source needs roughly 10x the peak brightness to be perceived the same as a source at 100% duty cycle.
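A small sketch of that time-average argument (the peak intensities are arbitrary units, purely illustrative):

    # Above the flicker fusion frequency the eye integrates, so perceived
    # brightness tracks the time-averaged intensity, roughly peak * duty.

    def average_intensity(peak, duty_cycle):
        """Time-averaged intensity of a pulsed source."""
        return peak * duty_cycle

    steady = average_intensity(peak=1.0, duty_cycle=1.0)    # 100% duty baseline
    pulsed = average_intensity(peak=10.0, duty_cycle=0.10)  # 10% duty, 10x peak

    print(steady, pulsed)  # both 1.0: equal averages, similar perceived brightness

This is only a first-order picture; it ignores the eye's logarithmic response mentioned above, which shifts the perceived match somewhat.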