A while back I posted a handful of simple iOS utilities. Among them was a basic ScreenCaptureView implementation that would periodically render the contents of its subview(s) into a UIImage that was exposed as a publicly accessible property. This provides the ability to quickly and easily take a snapshot of your running application, or of any arbitrary component within it. And while not superbly impressive (the iPhone has a built-in screenshot feature, after all), I noted that the control theoretically allowed for captured frames to be fed to an AVAssetWriter in order to record live video of a running application.
Recently I returned to this bit of code, and the ability to record live video of an application is theoretical no longer. To get straight to the point, here is the revised code:
//
//ScreenCaptureView.h
//
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
/**
* Delegate protocol. Implement this if you want to receive a notification when the
* view completes a recording.
*
* When a recording is completed, the ScreenCaptureView will notify the delegate, passing
* it the path to the created recording file if the recording was successful, or a value
* of nil if the recording failed/could not be saved.
*/
@protocol ScreenCaptureViewDelegate <NSObject>
- (void) recordingFinished:(NSString*)outputPathOrNil;
@end
/**
* ScreenCaptureView, a UIView subclass that periodically samples its current display
* and stores it as a UIImage available through the 'currentScreen' property. The
* sample/update rate can be configured (within reason) by setting the 'frameRate'
* property.
*
* This class can also be used to record real-time video of its subviews, using the
* 'startRecording' and 'stopRecording' methods. A new recording will overwrite any
* previously made recording file, so if you want to create multiple recordings per
* session (or across multiple sessions) then it is your responsibility to copy/back-up
* the recording output file after each session.
*
* To use this class, you must link against the following frameworks:
*
* - AssetsLibrary
* - AVFoundation
* - CoreGraphics
* - CoreMedia
* - CoreVideo
* - QuartzCore
*
*/
@interface ScreenCaptureView : UIView {
    //video writing
    AVAssetWriter *videoWriter;
    AVAssetWriterInput *videoWriterInput;
    AVAssetWriterInputPixelBufferAdaptor *avAdaptor;

    //recording state
    BOOL _recording;
    NSDate* startedAt;
    void* bitmapData;
}

//for recording video
- (bool) startRecording;
- (void) stopRecording;

//for accessing the current screen and adjusting the capture rate, etc.
@property(retain) UIImage* currentScreen;
@property(assign) float frameRate;
@property(nonatomic, assign) id<ScreenCaptureViewDelegate> delegate;

@end
//
//ScreenCaptureView.m
//
#import "ScreenCaptureView.h"
#import <QuartzCore/QuartzCore.h>
#import <MobileCoreServices/UTCoreTypes.h>
#import <AssetsLibrary/AssetsLibrary.h>
@interface ScreenCaptureView(Private)
- (void) writeVideoFrameAtTime:(CMTime)time;
@end
@implementation ScreenCaptureView
@synthesize currentScreen, frameRate, delegate;
- (void) initialize {
    // Initialization code
    self.clearsContextBeforeDrawing = YES;
    self.currentScreen = nil;
    self.frameRate = 10.0f;     //10 frames per second
    _recording = false;
    videoWriter = nil;
    videoWriterInput = nil;
    avAdaptor = nil;
    startedAt = nil;
    bitmapData = NULL;
}
- (id) initWithCoder:(NSCoder *)aDecoder {
    self = [super initWithCoder:aDecoder];
    if (self) {
        [self initialize];
    }
    return self;
}

- (id) init {
    self = [super init];
    if (self) {
        [self initialize];
    }
    return self;
}

- (id) initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    if (self) {
        [self initialize];
    }
    return self;
}
- (CGContextRef) createBitmapContextOfSize:(CGSize)size {
    CGContextRef context = NULL;
    CGColorSpaceRef colorSpace;
    int bitmapByteCount;
    int bitmapBytesPerRow;

    bitmapBytesPerRow = (size.width * 4);
    bitmapByteCount = (bitmapBytesPerRow * size.height);
    colorSpace = CGColorSpaceCreateDeviceRGB();
    if (bitmapData != NULL) {
        free(bitmapData);
    }
    bitmapData = malloc(bitmapByteCount);
    if (bitmapData == NULL) {
        CGColorSpaceRelease(colorSpace);
        fprintf(stderr, "Memory not allocated!");
        return NULL;
    }

    context = CGBitmapContextCreate(bitmapData,
                                    size.width,
                                    size.height,
                                    8,      //bits per component
                                    bitmapBytesPerRow,
                                    colorSpace,
                                    kCGImageAlphaNoneSkipFirst);
    CGColorSpaceRelease(colorSpace);
    //check for a NULL context *before* configuring it
    if (context == NULL) {
        free(bitmapData);
        bitmapData = NULL;
        fprintf(stderr, "Context not created!");
        return NULL;
    }
    CGContextSetAllowsAntialiasing(context, NO);
    return context;
}
//static int frameCount = 0;           //debugging
- (void) drawRect:(CGRect)rect {
    NSDate* start = [NSDate date];

    CGContextRef context = [self createBitmapContextOfSize:self.frame.size];
    if (context == NULL) {
        return;
    }
    //CoreGraphics bitmap contexts put the origin at the lower-left, while UIKit draws
    //from the upper-left, so flip the context vertically before rendering the layer
    CGAffineTransform flipVertical = CGAffineTransformMake(1, 0, 0, -1, 0, self.frame.size.height);
    CGContextConcatCTM(context, flipVertical);
    [self.layer renderInContext:context];

    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    UIImage* background = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    self.currentScreen = background;

    //debugging
    //if (frameCount < 40) {
    //    NSString* filename = [NSString stringWithFormat:@"Documents/frame_%d.png", frameCount];
    //    NSString* pngPath = [NSHomeDirectory() stringByAppendingPathComponent:filename];
    //    [UIImagePNGRepresentation(self.currentScreen) writeToFile:pngPath atomically:YES];
    //    frameCount++;
    //}

    //NOTE: to record a scrollview while it is scrolling you need to implement your UIScrollViewDelegate such that it calls
    //      'setNeedsDisplay' on the ScreenCaptureView.
    if (_recording) {
        float millisElapsed = [[NSDate date] timeIntervalSinceDate:startedAt] * 1000.0;
        [self writeVideoFrameAtTime:CMTimeMake((int)millisElapsed, 1000)];
    }

    float processingSeconds = [[NSDate date] timeIntervalSinceDate:start];
    float delayRemaining = (1.0 / self.frameRate) - processingSeconds;

    CGContextRelease(context);

    //redraw at the specified framerate
    [self performSelector:@selector(setNeedsDisplay) withObject:nil afterDelay:delayRemaining > 0.0 ? delayRemaining : 0.01];
}
- (void) cleanupWriter {
    [avAdaptor release];
    avAdaptor = nil;
    [videoWriterInput release];
    videoWriterInput = nil;
    [videoWriter release];
    videoWriter = nil;
    [startedAt release];
    startedAt = nil;
    if (bitmapData != NULL) {
        free(bitmapData);
        bitmapData = NULL;
    }
}

- (void) dealloc {
    [self cleanupWriter];
    [super dealloc];
}
- (NSURL*) tempFileURL {
    NSString* outputPath = [[NSString alloc] initWithFormat:@"%@/%@", [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0], @"output.mp4"];
    NSURL* outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
    NSFileManager* fileManager = [NSFileManager defaultManager];
    if ([fileManager fileExistsAtPath:outputPath]) {
        NSError* error;
        if ([fileManager removeItemAtPath:outputPath error:&error] == NO) {
            NSLog(@"Could not delete old recording file at path: %@", outputPath);
        }
    }
    [outputPath release];
    return [outputURL autorelease];
}
- (BOOL) setUpWriter {
    NSError* error = nil;
    //use an MPEG-4 container so the file matches its '.mp4' extension
    videoWriter = [[AVAssetWriter alloc] initWithURL:[self tempFileURL] fileType:AVFileTypeMPEG4 error:&error];
    NSParameterAssert(videoWriter);

    //Configure video
    NSDictionary* videoCompressionProps = [NSDictionary dictionaryWithObjectsAndKeys:
                                           [NSNumber numberWithDouble:1024.0*1024.0], AVVideoAverageBitRateKey,
                                           nil];
    NSDictionary* videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:self.frame.size.width], AVVideoWidthKey,
                                   [NSNumber numberWithInt:self.frame.size.height], AVVideoHeightKey,
                                   videoCompressionProps, AVVideoCompressionPropertiesKey,
                                   nil];
    videoWriterInput = [[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings] retain];
    NSParameterAssert(videoWriterInput);
    videoWriterInput.expectsMediaDataInRealTime = YES;
    NSDictionary* bufferAttributes = [NSDictionary dictionaryWithObjectsAndKeys:
                                      [NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey, nil];
    avAdaptor = [[AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput sourcePixelBufferAttributes:bufferAttributes] retain];

    //add input
    [videoWriter addInput:videoWriterInput];
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:CMTimeMake(0, 1000)];

    return YES;
}
- (void) completeRecordingSession {
    NSAutoreleasePool* pool = [[NSAutoreleasePool alloc] init];

    [videoWriterInput markAsFinished];

    // Wait for the writer's status to leave 'unknown'
    int status = videoWriter.status;
    while (status == AVAssetWriterStatusUnknown) {
        NSLog(@"Waiting...");
        [NSThread sleepForTimeInterval:0.5f];
        status = videoWriter.status;
    }

    @synchronized(self) {
        BOOL success = [videoWriter finishWriting];
        if (!success) {
            NSLog(@"finishWriting returned NO");
        }

        [self cleanupWriter];

        id delegateObj = self.delegate;
        NSString *outputPath = [[NSString alloc] initWithFormat:@"%@/%@", [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0], @"output.mp4"];
        NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];

        NSLog(@"Completed recording, file is stored at: %@", outputURL);
        if ([delegateObj respondsToSelector:@selector(recordingFinished:)]) {
            //the delegate protocol declares an NSString* parameter, so pass the path rather than the URL
            [delegateObj performSelectorOnMainThread:@selector(recordingFinished:) withObject:(success ? outputPath : nil) waitUntilDone:YES];
        }

        [outputPath release];
        [outputURL release];
    }

    [pool drain];
}
- (bool) startRecording {
    bool result = NO;
    @synchronized(self) {
        if (!_recording) {
            result = [self setUpWriter];
            startedAt = [[NSDate date] retain];
            _recording = true;
        }
    }
    return result;
}

- (void) stopRecording {
    @synchronized(self) {
        if (_recording) {
            _recording = false;
            [self completeRecordingSession];
        }
    }
}
- (void) writeVideoFrameAtTime:(CMTime)time {
    if (![videoWriterInput isReadyForMoreMediaData]) {
        NSLog(@"Not ready for video data");
    }
    else {
        @synchronized (self) {
            UIImage* newFrame = [self.currentScreen retain];
            CVPixelBufferRef pixelBuffer = NULL;
            CGImageRef cgImage = CGImageCreateCopy([newFrame CGImage]);
            CFDataRef image = CGDataProviderCopyData(CGImageGetDataProvider(cgImage));

            int status = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, avAdaptor.pixelBufferPool, &pixelBuffer);
            if (status != 0) {
                //could not get a buffer from the pool; skip this frame rather than touching a NULL buffer
                NSLog(@"Error creating pixel buffer: status=%d", status);
            }
            else {
                //set image data into pixel buffer
                CVPixelBufferLockBaseAddress(pixelBuffer, 0);
                uint8_t* destPixels = CVPixelBufferGetBaseAddress(pixelBuffer);
                CFDataGetBytes(image, CFRangeMake(0, CFDataGetLength(image)), destPixels);  //XXX: will work if the pixel buffer is contiguous and has the same bytesPerRow as the input data

                BOOL success = [avAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:time];
                if (!success) {
                    NSLog(@"Warning: Unable to write buffer to video");
                }

                CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
                CVPixelBufferRelease(pixelBuffer);
            }

            //clean up
            [newFrame release];
            CFRelease(image);
            CGImageRelease(cgImage);
        }
    }
}
@end
This class will let you record high-quality video of any other view in your application. To use it, simply set it up as the superview of the UIView(s) that you want to record, add a reference to it in your corresponding UIViewController (using Interface Builder or whatever your preferred method happens to be), and then call ‘startRecording’ when you are ready to start recording video. When you’ve recorded enough, call ‘stopRecording’ to complete the process. You will get a nice .mp4 file stored under your application’s ‘Documents’ directory that you can copy off or do whatever else you want with.
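For illustration, here is roughly what that wiring might look like in a view controller. This is a minimal sketch, not part of the class itself: the ‘captureView’ outlet and ‘donePressed:’ action are hypothetical names that I'm assuming are connected in Interface Builder, and the controller is assumed to declare conformance to ScreenCaptureViewDelegate in its header.
//in MyViewController.m (sketch); MyViewController.h is assumed to declare
//<ScreenCaptureViewDelegate> conformance and an IBOutlet named 'captureView'
- (void) viewDidLoad {
    [super viewDidLoad];
    captureView.delegate = self;    //so we hear about completed recordings
}

- (void) viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [captureView startRecording];
}

//hypothetical button action that ends the recording
- (IBAction) donePressed:(id)sender {
    [captureView stopRecording];
}

//ScreenCaptureViewDelegate callback; invoked on the main thread
- (void) recordingFinished:(NSString*)outputPathOrNil {
    if (outputPathOrNil != nil) {
        NSLog(@"Recording saved to: %@", outputPathOrNil);
    }
    else {
        NSLog(@"Recording failed or could not be saved");
    }
}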
Note that if you want to record a UIScrollView while it is scrolling, you will need to implement your UIScrollViewDelegate such that it calls ‘setNeedsDisplay’ on the ScreenCaptureView while the scroll-view is scrolling. For instance:
- (void) scrollViewDidScroll:(UIScrollView*)scrollView {
    [captureView setNeedsDisplay];
}
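On a related note, the class already links against AssetsLibrary, so a natural thing for the delegate to do with the finished file is copy it into the device’s photo library. Here is a hedged sketch of a more useful ‘recordingFinished:’ implementation than the bare-bones one above (error handling kept minimal):
- (void) recordingFinished:(NSString*)outputPathOrNil {
    if (outputPathOrNil == nil) {
        NSLog(@"Recording failed");
        return;
    }
    NSURL* fileURL = [NSURL fileURLWithPath:outputPathOrNil];
    ALAssetsLibrary* library = [[ALAssetsLibrary alloc] init];
    if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:fileURL]) {
        [library writeVideoAtPathToSavedPhotosAlbum:fileURL
                                    completionBlock:^(NSURL* assetURL, NSError* error) {
                                        if (error != nil) {
                                            NSLog(@"Error saving to photo library: %@", error);
                                        }
                                        [library release];  //balance the alloc above
                                    }];
    }
    else {
        NSLog(@"Video is not compatible with the saved photos album");
        [library release];
    }
}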
I haven’t tested this code on a physical device yet, but there’s no reason why it should not work on any device that includes H.264 video codec support (iPhone 3GS and later). However, given the amount of drawing that it does, it’s safe to say that the more horsepower behind it, the better.
Here is a rather unimpressive 30-second recording of a UITableView that I created using this class (if your browser doesn’t support HTML5, use the link below):
Example iPhone Recording
Lastly, I haven’t tested this class with any OpenGL-based subviews, so I can’t say if it will work in that case. If you try it in this configuration, please feel free to reply with your results.
Update
For anyone looking for a working example, you can download this sample project. This project simply creates a 30-second recording of a ‘UITableView’.
