How to render an SKNode to a UIImage

This will capture the entire scene:

// Capture the SKView (self.scene.view) into an image context
CGRect bounds = self.scene.view.bounds;
UIGraphicsBeginImageContextWithOptions(bounds.size, NO, [UIScreen mainScreen].scale);
[self.scene.view drawViewHierarchyInRect:bounds afterScreenUpdates:YES];
UIImage *screenshotImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

If you want just a particular node branch, hide every node you don't want to capture before taking the screenshot. Alternatively, take the node's accumulated frame, convert it to UIKit coordinates, and crop the screenshot to just that node and its children.
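A minimal sketch of the accumulated-frame approach, assuming the scene's size matches the view's bounds (e.g. the .resizeFill scale mode) and the node's parent is the scene itself; the `snapshot(of:)` method and `uiKitRect` helper are my own names, not SpriteKit API:

```swift
import SpriteKit
import UIKit

/// Converts a rect from SpriteKit's bottom-left-origin, y-up coordinates
/// to UIKit's top-left-origin, y-down coordinates.
func uiKitRect(for frame: CGRect, inSceneOfHeight sceneHeight: CGFloat) -> CGRect {
    CGRect(x: frame.origin.x,
           y: sceneHeight - frame.origin.y - frame.height,
           width: frame.width,
           height: frame.height)
}

extension SKView {
    /// Snapshots the whole view, then crops to `node`'s accumulated frame.
    func snapshot(of node: SKNode) -> UIImage? {
        guard let scene = scene else { return nil }
        let crop = uiKitRect(for: node.calculateAccumulatedFrame(),
                             inSceneOfHeight: scene.size.height)
        UIGraphicsBeginImageContextWithOptions(bounds.size, false, UIScreen.main.scale)
        defer { UIGraphicsEndImageContext() }
        drawHierarchy(in: bounds, afterScreenUpdates: true)
        guard let full = UIGraphicsGetImageFromCurrentImageContext(),
              let cg = full.cgImage,
              // CGImage.cropping(to:) works in pixels, so scale the point rect up.
              let cropped = cg.cropping(to: crop.applying(
                  CGAffineTransform(scaleX: full.scale, y: full.scale)))
        else { return nil }
        return UIImage(cgImage: cropped, scale: full.scale, orientation: .up)
    }
}
```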

Alternatively, you can get an SKTexture from a specific part of the node hierarchy:

SKTexture* tex = [self.scene.view textureFromNode:yourNode];

Prior to iOS 9, there was no way to convert an SKTexture back to a UIImage. As of iOS 9, however, it's trivial:

UIImage *image = [UIImage imageWithCGImage:tex.CGImage];

To anyone looking at this thread 7 years later: I tried the methods above, but they were relatively slow for my use case. I needed to render an entire SKScene offscreen, because the images were frames for a video. That ruled out the first method suggested by LearnCocos2D, which requires the view to be drawn onscreen, and the second method's SKTexture-to-UIImage conversion took a very long time. Instead, you can use the SKRenderer class introduced in iOS 11.0 to render the whole scene to a UIImage; it uses Metal, so renders are fast. I was able to render a 1920x1080 SKScene in about 0.013 seconds!

You can use this extension:

  • Make sure you import MetalKit.
  • The ignoreScreenScale parameter controls whether the output image is scaled for the screen. When it's false (the default), the output image's size is scaled by the device's screen scale, so each point in the scene occupies the same number of pixels in the image as it would onscreen; use this if you'll display the image back on the screen. When it's true, the output image's size in pixels equals the scene's size in points.

Cheers!

import MetalKit
import SpriteKit

extension SKScene {
    /// Renders the scene offscreen with SKRenderer (iOS 11+).
    func toImage(ignoreScreenScale: Bool = false) -> UIImage? {
        guard let device = MTLCreateSystemDefaultDevice(),
              let commandQueue = device.makeCommandQueue(),
              let commandBuffer = commandQueue.makeCommandBuffer() else { return nil }

        let scale = ignoreScreenScale ? 1 : UIScreen.main.scale
        let size = self.size.applying(CGAffineTransform(scaleX: scale, y: scale))
        let renderer = SKRenderer(device: device)
        let renderPassDescriptor = MTLRenderPassDescriptor()

        // Clear to the scene's background color.
        var r = CGFloat.zero, g = CGFloat.zero, b = CGFloat.zero, a = CGFloat.zero
        backgroundColor.getRed(&r, green: &g, blue: &b, alpha: &a)

        let textureDescriptor = MTLTextureDescriptor()
        textureDescriptor.usage = [.renderTarget, .shaderRead]
        textureDescriptor.width = Int(size.width)
        textureDescriptor.height = Int(size.height)
        guard let texture = device.makeTexture(descriptor: textureDescriptor) else { return nil }

        renderPassDescriptor.colorAttachments[0].loadAction = .clear
        renderPassDescriptor.colorAttachments[0].texture = texture
        renderPassDescriptor.colorAttachments[0].clearColor = MTLClearColor(
            red: Double(r),
            green: Double(g),
            blue: Double(b),
            alpha: Double(a)
        )

        renderer.scene = self
        renderer.render(withViewport: CGRect(origin: .zero, size: size),
                        commandBuffer: commandBuffer,
                        renderPassDescriptor: renderPassDescriptor)
        commandBuffer.commit()
        // Make sure the GPU has finished before the texture is read back.
        commandBuffer.waitUntilCompleted()

        // Metal textures have a top-left origin; flip vertically for UIKit.
        guard let image = CIImage(mtlTexture: texture, options: nil) else { return nil }
        let transformed = image.transformed(by: CGAffineTransform(scaleX: 1, y: -1)
            .translatedBy(x: 0, y: -image.extent.size.height))
        return UIImage(ciImage: transformed)
    }
}
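To make the ignoreScreenScale rule concrete, here is the sizing logic on its own. `outputPixelSize` is a hypothetical helper of mine that mirrors what toImage does to the scene size; it is not part of the extension:

```swift
import Foundation

/// Mirrors toImage's sizing rule: the pixel size of the rendered image
/// for a given scene size (in points) and screen scale. (Hypothetical helper.)
func outputPixelSize(for sceneSize: CGSize,
                     screenScale: CGFloat,
                     ignoreScreenScale: Bool) -> CGSize {
    let scale: CGFloat = ignoreScreenScale ? 1 : screenScale
    return CGSize(width: sceneSize.width * scale, height: sceneSize.height * scale)
}

// On a 3x device, a 1920x1080-point scene renders at:
//   ignoreScreenScale: false → 5760x3240 pixels (matches onscreen density)
//   ignoreScreenScale: true  → 1920x1080 pixels (one pixel per point)
```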