Java – Increasing screen capture speed when using Java and awt.Robot

awt, robot, java, java-2d, javax.imageio

Edit: If anyone has any other recommendations for increasing screen capture performance, please feel free to share them, as that might fully address my problem!

Hello Fellow Developers,

I'm working on some basic screen capture software for myself. As of right now I've got some proof of concept/tinkering code that uses java.awt.Robot to capture the screen as a BufferedImage. Then I do this capture for a specified amount of time and afterwards dump all of the pictures to disk. From my tests I'm getting about 17 frames per second.

Trial #1

Length: 15 seconds
Images Captured: 255

Trial #2

Length: 15 seconds
Images Captured: 229

Obviously this isn't nearly good enough for a real screen capture application, especially since these captures were just of me selecting some text in my IDE, nothing graphically intensive.

I have two classes right now: a Main class and a "Monitor" class. The Monitor class contains the method for capturing the screen. My Main class has a loop based on time that calls the Monitor class and stores the BufferedImage it returns into an ArrayList of BufferedImages.
If I modify my main class to spawn several threads that each execute that loop, and also collect information about the system time at which each image was captured, could I increase performance? My idea is to use a shared data structure that automatically sorts the frames by capture time as I insert them, instead of a single loop that inserts successive images into an ArrayList.
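Roughly what I have in mind (just a sketch; the ConcurrentSkipListMap keyed by capture time is one option I'm considering for the shared, auto-sorted structure):

    import java.awt.image.BufferedImage;
    import java.util.concurrent.ConcurrentSkipListMap;

    public class CaptureWorker implements Runnable {

        // Shared map that keeps frames ordered by capture timestamp (millis).
        // Note: two frames captured in the same millisecond would overwrite each other.
        private final ConcurrentSkipListMap<Long, BufferedImage> frames;
        private final Monitor monitor = new Monitor();
        private final long stopTimeMillis;

        public CaptureWorker(ConcurrentSkipListMap<Long, BufferedImage> frames, long stopTimeMillis) {
            this.frames = frames;
            this.stopTimeMillis = stopTimeMillis;
        }

        @Override
        public void run() {
            while (System.currentTimeMillis() <= stopTimeMillis) {
                BufferedImage image = monitor.captureScreen();
                if (image != null) {
                    frames.put(System.currentTimeMillis(), image);
                }
            }
        }
    }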

Code:

Monitor

import java.awt.AWTException;
import java.awt.Rectangle;
import java.awt.Robot;
import java.awt.Toolkit;
import java.awt.image.BufferedImage;

public class Monitor {

    /**
     * Captures the full primary screen.
     * @return the captured frame, or null if the Robot could not be created
     */
    public BufferedImage captureScreen() {
        Rectangle screenRect = new Rectangle(Toolkit.getDefaultToolkit().getScreenSize());
        BufferedImage capture = null;

        try {
            capture = new Robot().createScreenCapture(screenRect);
        } catch (AWTException e) {
            e.printStackTrace();
        }

        return capture;
    }
}

Main

import java.awt.image.BufferedImage;
import java.util.ArrayList;

public class Main {

    public static void main(String[] args) throws InterruptedException {
        String outputLocation = "C:\\Users\\ewillis\\Pictures\\screenstreamer\\";
        String namingScheme = "image";
        String mediaFormat = "jpeg";
        DiscreteOutput output = DiscreteOutputFactory.createOutputObject(outputLocation, namingScheme, mediaFormat);

        ArrayList<BufferedImage> images = new ArrayList<BufferedImage>();
        Monitor m1 = new Monitor();

        long startTimeMillis = System.currentTimeMillis();
        long recordTimeMillis = 15000;

        while ((System.currentTimeMillis() - startTimeMillis) <= recordTimeMillis) {
            images.add(m1.captureScreen());
        }

        output.saveImages(images);
    }
}

Best Answer

Re-using the screen rectangle and Robot instances will save you a little overhead. The real bottleneck is storing all of your BufferedImages in an ArrayList.
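A minimal sketch of that change, keeping the question's Monitor class (the Robot constructor throws AWTException, so it is declared on the constructor here):

    import java.awt.AWTException;
    import java.awt.Rectangle;
    import java.awt.Robot;
    import java.awt.Toolkit;
    import java.awt.image.BufferedImage;

    public class Monitor {

        // Created once and reused for every frame instead of per capture.
        private final Robot robot;
        private final Rectangle screenRect;

        public Monitor() throws AWTException {
            this.robot = new Robot();
            this.screenRect = new Rectangle(Toolkit.getDefaultToolkit().getScreenSize());
        }

        public BufferedImage captureScreen() {
            return robot.createScreenCapture(screenRect);
        }
    }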

I would first benchmark how fast your robot.createScreenCapture(screenRect) call is without any I/O (no saving or storing of the buffered image). This will give you the ideal throughput of the Robot class.

long frameCount = 0;
while ((System.currentTimeMillis() - startTimeMillis) <= recordTimeMillis) {
    BufferedImage image = m1.captureScreen();
    if (image != null) {
        frameCount++;
    }
    // Thread.yield() does not throw, so no try/catch is needed here.
    Thread.yield();
}

If it turns out that captureScreen can reach the FPS you want, there is no need to multi-thread Robot instances.

Rather than an ArrayList of BufferedImages, I'd keep an ArrayList of the Futures returned by AsynchronousFileChannel.write (a rough sketch follows the list below).

  • Capture loop
    • Get BufferedImage
    • Convert BufferedImage to byte array containing JPEG data
    • Create an async channel to the output file
    • Start a write and add the immediate return value (the future) to your ArrayList
  • Wait loop
    • Go through your ArrayList of Futures and make sure they all finished
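A rough sketch of that flow, assuming the question's Monitor class and output directory (file naming and error handling are simplified):

    import java.awt.image.BufferedImage;
    import java.io.ByteArrayOutputStream;
    import java.nio.ByteBuffer;
    import java.nio.channels.AsynchronousFileChannel;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.StandardOpenOption;
    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.Future;
    import javax.imageio.ImageIO;

    public class AsyncCapture {

        public static void main(String[] args) throws Exception {
            Monitor m1 = new Monitor();
            List<Future<Integer>> pendingWrites = new ArrayList<>();
            List<AsynchronousFileChannel> channels = new ArrayList<>();

            long startTimeMillis = System.currentTimeMillis();
            long recordTimeMillis = 15000;
            int frame = 0;

            // Capture loop: encode each frame to JPEG bytes and hand them to an async write.
            while ((System.currentTimeMillis() - startTimeMillis) <= recordTimeMillis) {
                BufferedImage image = m1.captureScreen();
                if (image == null) {
                    continue;
                }

                ByteArrayOutputStream baos = new ByteArrayOutputStream();
                ImageIO.write(image, "jpeg", baos);
                ByteBuffer buffer = ByteBuffer.wrap(baos.toByteArray());

                Path path = Paths.get("C:\\Users\\ewillis\\Pictures\\screenstreamer\\image" + frame++ + ".jpeg");
                AsynchronousFileChannel channel = AsynchronousFileChannel.open(
                        path, StandardOpenOption.CREATE, StandardOpenOption.WRITE);
                channels.add(channel);

                // write() returns immediately; keep the Future so completion can be checked later.
                pendingWrites.add(channel.write(buffer, 0));
            }

            // Wait loop: block until every queued write has finished, then close the channels.
            for (Future<Integer> write : pendingWrites) {
                write.get();
            }
            for (AsynchronousFileChannel channel : channels) {
                channel.close();
            }
        }
    }

This keeps the capture loop free of blocking disk I/O; the only in-memory cost per frame is the encoded JPEG buffer rather than a full BufferedImage.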