[Objective-C + Cocoa] iPhone Screen Capture Revisited

A while back I posted a handful of simple iOS utilities. Among them was a basic ScreenCaptureView implementation that would periodically render the contents of its subview(s) into a UIImage exposed as a publicly accessible property. This provides the ability to quickly and easily take a snapshot of your running application, or of any arbitrary component within it. And while not superbly impressive (the iPhone has a built-in screenshot feature, after all), I noted that the control theoretically allowed for captured frames to be sent off to an AVCaptureSession in order to record live video of a running application.

Recently I returned to this bit of code, and the ability to record live video of an application is theoretical no longer. To get straight to the point, here is the revised code:

//
//ScreenCaptureView.h
//
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

/**
 * Delegate protocol.  Implement this if you want to receive a notification when the
 * view completes a recording.
 *
 * When a recording is completed, the ScreenCaptureView will notify the delegate, passing
 * it the path to the created recording file if the recording was successful, or a value
 * of nil if the recording failed/could not be saved.
 */
@protocol ScreenCaptureViewDelegate <NSObject>
- (void) recordingFinished:(NSString*)outputPathOrNil;
@end

/**
 * ScreenCaptureView, a UIView subclass that periodically samples its current display
 * and stores it as a UIImage available through the 'currentScreen' property.  The
 * sample/update rate can be configured (within reason) by setting the 'frameRate'
 * property.
 *
 * This class can also be used to record real-time video of its subviews, using the
 * 'startRecording' and 'stopRecording' methods.  A new recording will overwrite any
 * previously made recording file, so if you want to create multiple recordings per
 * session (or across multiple sessions) then it is your responsibility to copy/back-up
 * the recording output file after each session.
 *
 * To use this class, you must link against the following frameworks:
 *
 *  - AssetsLibrary
 *  - AVFoundation
 *  - CoreGraphics
 *  - CoreMedia
 *  - CoreVideo
 *  - QuartzCore
 *
 */

@interface ScreenCaptureView : UIView {
   //video writing
   AVAssetWriter *videoWriter;
   AVAssetWriterInput *videoWriterInput;
   AVAssetWriterInputPixelBufferAdaptor *avAdaptor;

   //recording state
   BOOL _recording;
   NSDate* startedAt;
   void* bitmapData;
}

//for recording video
- (BOOL) startRecording;
- (void) stopRecording;

//for accessing the current screen and adjusting the capture rate, etc.
@property(retain) UIImage* currentScreen;
@property(assign) float frameRate;
@property(nonatomic, assign) id<ScreenCaptureViewDelegate> delegate;

@end

//
//ScreenCaptureView.m
//
#import "ScreenCaptureView.h"
#import <QuartzCore/QuartzCore.h>
#import <MobileCoreServices/UTCoreTypes.h>
#import <AssetsLibrary/AssetsLibrary.h>

@interface ScreenCaptureView(Private)
- (void) writeVideoFrameAtTime:(CMTime)time;
@end

@implementation ScreenCaptureView

@synthesize currentScreen, frameRate, delegate;

- (void) initialize {
   // Initialization code
   self.clearsContextBeforeDrawing = YES;
   self.currentScreen = nil;
   self.frameRate = 10.0f;     //10 frames per second
   _recording = false;
   videoWriter = nil;
   videoWriterInput = nil;
   avAdaptor = nil;
   startedAt = nil;
   bitmapData = NULL;
}

- (id) initWithCoder:(NSCoder *)aDecoder {
   self = [super initWithCoder:aDecoder];
   if (self) {
       [self initialize];
   }
   return self;
}

- (id) init {
   self = [super init];
   if (self) {
       [self initialize];
   }
   return self;
}

- (id)initWithFrame:(CGRect)frame {
   self = [super initWithFrame:frame];
   if (self) {
       [self initialize];
   }
   return self;
}

- (CGContextRef) createBitmapContextOfSize:(CGSize) size {
   CGContextRef    context = NULL;
   CGColorSpaceRef colorSpace;
   int             bitmapByteCount;
   int             bitmapBytesPerRow;

   bitmapBytesPerRow   = (size.width * 4);
   bitmapByteCount     = (bitmapBytesPerRow * size.height);
   colorSpace = CGColorSpaceCreateDeviceRGB();
   if (bitmapData != NULL) {
       free(bitmapData);
   }
   bitmapData = malloc( bitmapByteCount );
   if (bitmapData == NULL) {
       fprintf (stderr, "Memory not allocated!");
       return NULL;
   }

   context = CGBitmapContextCreate (bitmapData,
                                    size.width,
                                    size.height,
                                    8,      // bits per component
                                    bitmapBytesPerRow,
                                    colorSpace,
                                    kCGImageAlphaNoneSkipFirst);
   CGColorSpaceRelease( colorSpace );

   if (context == NULL) {
       free (bitmapData);
       bitmapData = NULL;
       fprintf (stderr, "Context not created!");
       return NULL;
   }
   CGContextSetAllowsAntialiasing(context, NO);

   return context;
}

//static int frameCount = 0;            //debugging
- (void) drawRect:(CGRect)rect {
   NSDate* start = [NSDate date];
   CGContextRef context = [self createBitmapContextOfSize:self.frame.size];
   if (context == NULL) {
       return;
   }

   //CoreGraphics contexts put their origin at the bottom-left, while UIKit uses the
   //top-left; flip the context vertically so the image is not upside-down and mirrored
   CGAffineTransform flipVertical = CGAffineTransformMake(1, 0, 0, -1, 0, self.frame.size.height);
   CGContextConcatCTM(context, flipVertical);

   [self.layer renderInContext:context];

   CGImageRef cgImage = CGBitmapContextCreateImage(context);
   UIImage* background = [UIImage imageWithCGImage: cgImage];
   CGImageRelease(cgImage);

   self.currentScreen = background;

   //debugging
   //if (frameCount < 40) {
   //      NSString* filename = [NSString stringWithFormat:@"Documents/frame_%d.png", frameCount];
   //      NSString* pngPath = [NSHomeDirectory() stringByAppendingPathComponent:filename];
   //      [UIImagePNGRepresentation(self.currentScreen) writeToFile: pngPath atomically: YES];
   //      frameCount++;
   //}

   //NOTE:  to record a scrollview while it is scrolling you need to implement your UIScrollViewDelegate such that it calls
   //       'setNeedsDisplay' on the ScreenCaptureView.
   if (_recording) {
       float millisElapsed = [[NSDate date] timeIntervalSinceDate:startedAt] * 1000.0;
       [self writeVideoFrameAtTime:CMTimeMake((int)millisElapsed, 1000)];
   }

   float processingSeconds = [[NSDate date] timeIntervalSinceDate:start];
   float delayRemaining = (1.0 / self.frameRate) - processingSeconds;

   CGContextRelease(context);

   //redraw at the specified framerate
   [self performSelector:@selector(setNeedsDisplay) withObject:nil afterDelay:delayRemaining > 0.0 ? delayRemaining : 0.01];
}

- (void) cleanupWriter {
   [avAdaptor release];
   avAdaptor = nil;

   [videoWriterInput release];
   videoWriterInput = nil;

   [videoWriter release];
   videoWriter = nil;

   [startedAt release];
   startedAt = nil;

   if (bitmapData != NULL) {
       free(bitmapData);
       bitmapData = NULL;
   }
}

- (void)dealloc {
   [self cleanupWriter];
   [currentScreen release];
   [super dealloc];
}

- (NSURL*) tempFileURL {
   NSString* outputPath = [[NSString alloc] initWithFormat:@"%@/%@", [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0], @"output.mp4"];
   NSURL* outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
   NSFileManager* fileManager = [NSFileManager defaultManager];
   if ([fileManager fileExistsAtPath:outputPath]) {
       NSError* error;
       if ([fileManager removeItemAtPath:outputPath error:&error] == NO) {
           NSLog(@"Could not delete old recording file at path:  %@", outputPath);
       }
   }

   [outputPath release];
   return [outputURL autorelease];
}

-(BOOL) setUpWriter {
   NSError* error = nil;
   videoWriter = [[AVAssetWriter alloc] initWithURL:[self tempFileURL] fileType:AVFileTypeQuickTimeMovie error:&error];
   NSParameterAssert(videoWriter);

   //Configure video
   NSDictionary* videoCompressionProps = [NSDictionary dictionaryWithObjectsAndKeys:
                                          [NSNumber numberWithDouble:1024.0*1024.0], AVVideoAverageBitRateKey,
                                          nil ];

   NSDictionary* videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                  AVVideoCodecH264, AVVideoCodecKey,
                                  [NSNumber numberWithInt:self.frame.size.width], AVVideoWidthKey,
                                  [NSNumber numberWithInt:self.frame.size.height], AVVideoHeightKey,
                                  videoCompressionProps, AVVideoCompressionPropertiesKey,
                                  nil];

   videoWriterInput = [[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings] retain];

   NSParameterAssert(videoWriterInput);
   videoWriterInput.expectsMediaDataInRealTime = YES;
   NSDictionary* bufferAttributes = [NSDictionary dictionaryWithObjectsAndKeys:
                                     [NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey, nil];

   avAdaptor = [[AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput sourcePixelBufferAttributes:bufferAttributes] retain];

   //add input
   [videoWriter addInput:videoWriterInput];
   [videoWriter startWriting];
   [videoWriter startSessionAtSourceTime:CMTimeMake(0, 1000)];

   return YES;
}

- (void) completeRecordingSession {
   NSAutoreleasePool* pool = [[NSAutoreleasePool alloc] init];

   [videoWriterInput markAsFinished];

   // Wait for the video
   int status = videoWriter.status;
   while (status == AVAssetWriterStatusUnknown) {
       NSLog(@"Waiting...");
       [NSThread sleepForTimeInterval:0.5f];
       status = videoWriter.status;
   }

   @synchronized(self) {
       BOOL success = [videoWriter finishWriting];
       if (!success) {
           NSLog(@"finishWriting returned NO");
       }

       [self cleanupWriter];

       id delegateObj = self.delegate;
       NSString *outputPath = [[NSString alloc] initWithFormat:@"%@/%@", [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0], @"output.mp4"];
       NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];

       NSLog(@"Completed recording, file is stored at:  %@", outputURL);
        if ([delegateObj respondsToSelector:@selector(recordingFinished:)]) {
            [delegateObj performSelectorOnMainThread:@selector(recordingFinished:) withObject:(success ? outputPath : nil) waitUntilDone:YES];
        }

       [outputPath release];
       [outputURL release];
   }

   [pool drain];
}

- (BOOL) startRecording {
   BOOL result = NO;
   @synchronized(self) {
       if (! _recording) {
           result = [self setUpWriter];
           startedAt = [[NSDate date] retain];
           _recording = true;
       }
   }

   return result;
}

- (void) stopRecording {
   @synchronized(self) {
       if (_recording) {
           _recording = false;
           [self completeRecordingSession];
       }
   }
}

-(void) writeVideoFrameAtTime:(CMTime)time {
   if (![videoWriterInput isReadyForMoreMediaData]) {
       NSLog(@"Not ready for video data");
   }
   else {
       @synchronized (self) {
           UIImage* newFrame = [self.currentScreen retain];
           CVPixelBufferRef pixelBuffer = NULL;
           CGImageRef cgImage = CGImageCreateCopy([newFrame CGImage]);
           CFDataRef image = CGDataProviderCopyData(CGImageGetDataProvider(cgImage));

           int status = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, avAdaptor.pixelBufferPool, &pixelBuffer);
            if(status != 0){
                //could not get a buffer from the pool; clean up and skip this frame
                NSLog(@"Error creating pixel buffer:  status=%d", status);
                CFRelease(image);
                CGImageRelease(cgImage);
                [newFrame release];
                return;
            }

            // set image data into pixel buffer
           CVPixelBufferLockBaseAddress( pixelBuffer, 0 );
           uint8_t* destPixels = CVPixelBufferGetBaseAddress(pixelBuffer);
           CFDataGetBytes(image, CFRangeMake(0, CFDataGetLength(image)), destPixels);  //XXX:  will work if the pixel buffer is contiguous and has the same bytesPerRow as the input data
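            //NOTE (sketch):  if the pixel buffer turns out NOT to be contiguous, or its
            //bytes-per-row differs from the CGImage's, a row-by-row copy is safer, e.g.:
            //    size_t destStride = CVPixelBufferGetBytesPerRow(pixelBuffer);
            //    size_t srcStride = CGImageGetBytesPerRow(cgImage);
            //    const UInt8* srcPixels = CFDataGetBytePtr(image);
            //    for (size_t row = 0; row < CVPixelBufferGetHeight(pixelBuffer); row++) {
            //        memcpy(destPixels + (row * destStride), srcPixels + (row * srcStride), MIN(srcStride, destStride));
            //    }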

            BOOL success = [avAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:time];
            if (!success)
                NSLog(@"Warning:  Unable to write buffer to video");

           //clean up
           [newFrame release];
           CVPixelBufferUnlockBaseAddress( pixelBuffer, 0 );
           CVPixelBufferRelease( pixelBuffer );
           CFRelease(image);
           CGImageRelease(cgImage);
       }

   }

}

@end

This class will let you record high-quality video of any other view in your application. To use it, simply set it up as the superview of the UIView(s) that you want to record, add a reference to it in your corresponding UIViewController (using Interface Builder or whatever your preferred method happens to be), and then call ‘startRecording’ when you are ready to start recording video. When you’ve recorded enough, call ‘stopRecording’ to complete the process. You will get a nice .mp4 file stored under your application’s ‘Documents’ directory that you can copy off or do whatever else you want with.
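For example, a minimal setup in a view controller might look like the following (just a sketch; the ‘captureView’ and ‘contentView’ names are placeholders for your own ivars):

- (void) viewDidLoad {
    [super viewDidLoad];

    //wrap whatever should be recorded inside the ScreenCaptureView
    captureView = [[ScreenCaptureView alloc] initWithFrame:self.view.bounds];
    captureView.delegate = self;
    [captureView addSubview:contentView];     //'contentView' is the view to record
    [self.view addSubview:captureView];

    [captureView startRecording];
}

- (void) recordingFinished:(NSString*)outputPathOrNil {
    NSLog(@"Recording finished:  %@", outputPathOrNil);
}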

Note that if you want to record a UIScrollView while it is scrolling, you will need to implement your UIScrollViewDelegate such that it calls ‘setNeedsDisplay’ on the ScreenCaptureView while the scroll-view is scrolling. For instance:

- (void) scrollViewDidScroll: (UIScrollView*)scrollView {
       [captureView setNeedsDisplay];
}

I haven’t tested this code on a physical device yet, but there’s no reason why it should not work on any device that includes H.264 video codec support (iPhone 3GS and later). However, given the amount of drawing that it does, it’s safe to say that the more horsepower behind it, the better.

Here is a rather unimpressive 30-second recording of a UITableView that I created using this class (if your browser doesn’t support HTML5, use the link below):

Example iPhone Recording

Lastly, I haven’t tested this class with any OpenGL-based subviews, so I can’t say if it will work in that case. If you try it in this configuration, please feel free to reply with your results.

Update

For anyone looking for a working example, you can download this sample project. This project simply creates a 30-second recording of a ‘UITableView’.


121 Responses to [Objective-C + Cocoa] iPhone Screen Capture Revisited

  1. Bear says:

    Hi, Aroth
    I tested your code, and it works well. However, I received a "memory warning: level=1" message while running on the device. I use an iPhone 3GS, which has 140 MB of RAM free before starting the program.

    It eats about 12 MB of RAM every second, so I think the memory issue may be a big problem.

    Anyway, thanks for your code, and if I find some way to resolve this issue, I'll reply to you asap.

    Bear

    • aroth says:

      Thanks for the feedback. I was able to confirm your memory leak, and the code should be fixed now.

      The problem was that the data allocated to back the temporary graphics context was never being released. In my defense, the code which introduced this leak was taken from a popular example I found online. Given its popularity I assumed that it must include proper memory management. My mistake.

      In any case, the memory leak should be fixed now if you take the current version of the code. If it isn’t please let me know.

      • Bear says:

        Hi, Aroth

        You’re the man! It works, and the memory issue doesn’t exist anymore.

        Thanks.

        Bear

      • Puma says:

        Hi Aroth,

        I appreciate this great example, but I found that memory allocation keeps growing and is not released. As you say it is fixed in the latest version; if possible, can I get a link to the latest code?

        Thanks in advance.

  2. Mika says:

    Hello.
    I am Japanese. In Japan, there is very little information on iPhone development.
    I am very happy to find this article.
    And it works well.
    Very helpful.

    Excuse my English.
    Thank you.

  3. Mika says:

    I’m sorry to comment again and again.

    One piece of information:

    About this code
    //not sure why this is necessary…image renders upside-down and mirrored
    CGAffineTransform flipVertical = CGAffineTransformMake(1, 0, 0, -1, 0, self.frame.size.height);
    CGContextConcatCTM(context, flipVertical);

    UIView and CGContextRef have different coordinate system orientations.
    In the UIView (UIImage) coordinate system, Y increases downward,
    while in the CGContextRef coordinate system, Y increases upward.

    A reference site, although it is written in Japanese:
    http://ameblo.jp/xcc/entry-10460079167.html

    I used Google Translate.
    Sorry for my bad English.

    Mika

  4. Brandon says:

    Hey aroth,

    Great writeup. I’m getting some errors when trying to build ("request for member … not a structure or union") and I think I do not have my superview set up in IB correctly. Can you explain how I do the following from your post?

    simply set it up as the superview of the UIView(s) that you want to record, add a reference to it in your corresponding UIViewController (using Interface Builder or whatever your preferred method happens to be)

    I have a view inside of another view, but can’t figure out how to reference that in IB. Tried connecting it to File’s Owner, but still get the errors.

  5. newlance says:

    I used this code like this, but got a black movie.

    #import <UIKit/UIKit.h>

    @class ScreenCaptureView;

    @protocol ScreenCaptureViewDelegate;

    @interface TestScreencaptureViewController : UIViewController <ScreenCaptureViewDelegate> {
        ScreenCaptureView *scrview;
        int n;
    }

    - (IBAction)startCapture:(id)sender;

    @end

    @implementation TestScreencaptureViewController

    - (void)viewDidLoad {
        [super viewDidLoad];
        scrview = [[ScreenCaptureView alloc] init];
        scrview.delegate = self;
        scrview.frame = CGRectMake(0, 50, 320, 420);
        [self.view addSubview:scrview];
    }

    • aroth says:

      I suspect your code is not working because you have not added any subviews to your ‘scrview’. The ScreenCaptureView only records its immediate subviews, so if you don’t add any subviews to it, it will not record anything.

      Also, I updated the main post with a link to a sample project. The direct link is:

      http://codethink.no-ip.org/ScreenCaptureViewTest.zip

      Taking a quick look at how the ScreenCaptureView is set up in the example project might prove helpful.

      • Minesh Purohit says:

        Hello,

        I have used your code. It’s working fine, and thanks for the good code.
        But the video does not have any sound. If I am playing audio in the background, it will only record the screen, not the audio.

        Can you please help me with this?

        Thanks,
        Minesh Purohit.

  6. listingboat says:

    Hello,
    I’ve been trying to do this myself with my own code, but consistently had an issue on the device (more about that in a sec). So, I unwired my code and plugged in yours. Lo and behold, the same issue occurs. It’s this:

    I start recording using your code, but as soon as I play audio with AVAudioPlayer WHILE recording, the very next attempt to grab a frame dies here:

    int status = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, avAdaptor.pixelBufferPool, &pixelBuffer);

    The avAdaptor.pixelBufferPool which was there and fine all along, suddenly gets set to nil and disappears once the audio starts. Exactly the same problem I had with my own code.

    This only occurs on the device, and not the sim. Which led me to think it’s some type of memory issue, but profiling reveals nothing relevant in instruments. I even left in the allocation of AVAudioPlayer, but once you call the ‘play’ method, boom.

    Any clues would be greatly appreciated.

    • Lakshmikanth Reddy says:

      Hi Listingboat,

      Even I am facing the same problem; suddenly my pixelBufferPool is becoming empty. This is happening only on the device. Did you find any solution? Please help me.

  7. ToradoLeo says:

    Hey, nice tutorial.
    I have to capture the screen of my app that’s written in OpenGL. I successfully did that.
    Except for one problem: I need to rotate my video from portrait mode to landscape mode. The transform property of the AVAssetWriterInput is not working. Is there another way to do so?

    • aroth says:

      It would be a fair bit of work and might not give the smoothest result, but perhaps you could manually apply whatever transformations you want in ‘drawRect’ or ‘writeVideoFrameAtTime’? I think that would work, though I’m not sure how forgiving the AVAssetWriter is with respect to things like input frame dimensions and aspect ratios.
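      E.g., somewhere in ‘drawRect’, roughly like this (an untested sketch; the exact translate/rotate pair depends on which way you need to turn):

      //rotate the context a quarter-turn before rendering, so a portrait
      //layer lands in a landscape-oriented frame
      CGContextTranslateCTM(context, self.frame.size.width, 0);
      CGContextRotateCTM(context, M_PI_2);
      [self.layer renderInContext:context];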

    • ekinsol says:

      How did you actually capture the OpenGL screen? I never get past “Not ready for video data”

    • localToGlobal says:

      hi

      I’m also interested in capturing OpenGL. I just set up a cocos2d scene, but the video is always black. I added a UIButton which is visible, but there’s no trace of my cocos2d scene.

      Are there any suggestions to get this done? I would be more than happy!

  8. Mike Hill says:

    Looks interesting, but I still don’t understand where the video file actually went.
    Sorry just getting into this sort of thing.
    Thanks

    • aroth says:

      The video is saved to a file called “output.mp4” inside of the application’s “Documents” directory. The exact path to this folder will vary per installation, but if you are running in debug mode with your console open you will get a log message when the recording is saved that will include the exact path to the output file. It will look something like:

      “Completed recording, file is stored at: /path/to/output.mp4”

      …just go to the path that it lists and your video file should be there.

      • Mike Hill says:

        Excellent, so I’m producing a video now; I think I’m almost there. But when my video gets made, it records all the button presses on screen and also shows my UIImageView’s default image, but does not record my UIImageView’s changing animation images when I change them….

        Any ideas?

        • aroth says:

          Have you tried calling ‘setNeedsDisplay’ on the ScreenCaptureView while your animations are running? I suspect you are probably running into the same issue that prevents UIScrollView animations from being captured, namely that the superview does not get redrawn while the subview is animating. Probably this was done as a performance optimization, but it obviously has a detrimental impact on this sort of recording application.
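          For example, a CADisplayLink could drive the redraws while the animation runs (just a sketch; ‘displayLink’ and ‘captureView’ are illustrative names):

          displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(forceRedraw:)];
          [displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];

          //...then, elsewhere in the same class:
          - (void) forceRedraw:(CADisplayLink*)link {
              [captureView setNeedsDisplay];
          }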

          • Mike Hill says:

            I guess you’re right, but I can’t work out how to set it up.
            I even tried just an NSTimer calling [captureView setNeedsDisplay]; over and over, but can’t get any movement.

            Oh well back to the drawing board.

            Thanks

          • Sandy says:

            Hey aroth, thanks for submitting your zip files… it was very helpful… But my question is: is there any way to record the screen with sound and make an .mp4 or any other format?

  9. Satish says:

    Thanks bro, it’s an awesome article… I’ll be in touch with your posts for the most exciting things… I was looking at streaming live camera video from iPhone A to iPhone B through an intermediate streaming server. Can you please suggest how I can achieve this?

    • yunas says:

      hey satish…
      I think for this you have to record the screen for a few seconds, say 10, and then upload it; meanwhile, while you are uploading, you record the view again, then upload that.
      Or are you trying some other approach?

  10. yunas says:

    It’s very neat and clean code, plus pretty easy to reuse.
    In my app I am just recording what the user is doing. The user sees a view which is a “ScreenCaptureView”, and there is a button on this view which, on tap, opens the photo library. The problem is that when the library is opened, it is not recorded; only the output (the selected image) is shown, as I am displaying the selected image on the “ScreenCaptureView”. Any ideas how to tackle this problem? How can I record the photo library picker too?

    • aroth says:

      Hm…I think you’d have to get the photo picker set up as a subview of the ScreenCaptureView. I’m not sure how doable that is.

      Alternately, you might try creating a UIWindow subclass which manages a ScreenCaptureView instance and which forces the ScreenCaptureView to be used as the root view of whatever the currently active view controller happens to be. This would obviously involve a lot of shuffling around of subviews if/when the view controller changes, but if it works it should let you record pretty much anything that happens in the application.
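      A rough skeleton of that idea (purely illustrative and untested):

      @interface RecordingWindow : UIWindow {
          ScreenCaptureView* captureView;
      }
      - (void) installCaptureViewOver:(UIViewController*)controller;
      @end

      @implementation RecordingWindow
      - (void) installCaptureViewOver:(UIViewController*)controller {
          //re-parent the controller's root view inside a full-window ScreenCaptureView
          if (captureView == nil) {
              captureView = [[ScreenCaptureView alloc] initWithFrame:self.bounds];
              [self addSubview:captureView];
          }
          [controller.view removeFromSuperview];
          [captureView addSubview:controller.view];
      }
      @end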

  11. sumeet says:

    I’m using a Mac mini which does not have a built-in mic… so I just wanted to know: will the video also have audio if I attach an external mic to my machine?

  12. Lakshmikanth says:

    Hi Aroth,

    I need your help badly; it’s crashing on the device due to a memory problem. If possible, can you send me the memory fixes that need to be done?

  13. Dallas Brown says:

    Hey aroth,

    Thanks for the code sample.

    I am trying to implement the above but as soon as I start recording I get a never-ending stream of log messages saying: “Not ready for video data”
    Then when I stop recording I get just a black video.

    I am trying the following:

    capture = [[HBIScreenCaptureView alloc] initWithFrame:CGRectMake(0, 0, 1024, 768)];
    [self.view addSubview:capture];

    playerView = [[MRSPlayerView alloc] initWithFrame:CGRectMake(0, 0, 1024, 768)];
    [playerView setBackgroundColor:[UIColor clearColor]];
    [playerView setTag:1];
    [capture addSubview:playerView];

    [capture performSelector:@selector(startRecording) withObject:nil afterDelay:1.0];

    My MRSPlayerView plays a video using AVFoundation and AVPlayer.

    Any ideas?
    Thanks!

    • Vikas Gupta says:

      Hey Dallas Brown,

      I am also trying this to record my player video which is playing on screen, but it gives me a black screen video.
      If you manage to record the video screen, please let me know how you did it.

  14. John says:

    Hey, this is awesome stuff: I used it to grab video from a simulator I’m working on. Thanks!

    A question: I’d actually prefer to capture one frame of the video each time the simulation updates (which can be a while for some sims–so boring to watch an unchanging screen). The way I copy/pasted your code, I could fire off a writeVideoFrameAtTime at each sim update, but when I (probably naively) tried to do this, the video is unplayable. I expect this has something to do with frame rates and when the mp4 file expects itself to be fed new data.

    Do you see a quick way to redo your code to allow me to capture each frame when I prefer, rather than at a preset time interval? Will this screw up the resulting file’s timecode? Neophyte questions, I’m sure!

    Thanks again for the code!

  15. happyday67 says:

    Your blog is one of a kind, I love the way you organize the topics.

  16. jsmith says:

    Thank you a lot for this great article!
    But I cannot use it on the iPhone 3. Do you have any idea? Thanks again :p

  17. magtak says:

    Hello, just stumbled upon your code while googling for the following thing:

    I was asked to create an app that records whatever is on the screen while in the background.
    Essentially the goal seems to be to create screencasts of other apps (possibly with user commentary).
    Since you’ve already meddled with video recording in general, would you happen to know if this is possible and/or guide me to any source about that?

    Thanks in advance.

  18. sachith says:

    Thank you for the great tutorial. Can you guide me on how to record audio while recording the screen?

  19. sachith says:

    To record a screen in which Core Animation is used,
    use [[self.layer presentationLayer] renderInContext:context] in drawRect: instead of [self.layer renderInContext:context] in the file ScreenCaptureView.m

  20. Armando says:

    Great work. What would I need to capture the popovers?

  21. Some truly prize posts on this internet site , saved to my bookmarks .

  22. Hi, I’m trying to use your code to capture video from an app that uses a PageViewController. I’ve added the controller as a subview of your ScreenCaptureView … but in the output I don’t have the page-turning effects. Can you help me a bit?

  23. Kevin Xue says:

    Awesome!! That works fine for me, but I faced another crash problem. I don’t think I have any UIWebView in my subviews, but this issue always crashes for me, whether in the simulator or on the device. Any idea about this?

    4 WebCore _ZN7WebCore9TileCache13doLayoutTilesEv 23
    5 WebCore -[TileHostLayer renderInContext:] 52
    6 QuartzCore -[CALayer _renderSublayersInContext:]

  24. Jaya says:

    Is there any possibility to embed audio along with the captured video?

  25. Hello! I simply would like to give an enormous thumbs up for the good data you’ve right here on this post. I will probably be coming back to your weblog for more soon.

  26. I’m impressed, I have to say. Actually rarely do I encounter a weblog that’s both educative and entertaining, and let me tell you, you’ve gotten hit the nail on the head. Your idea is outstanding; the difficulty is one thing that not sufficient persons are talking intelligently about. I am very comfortable that I stumbled across this in my search for something regarding this.

  27. Pingback: Capture AVPlayer movie frames in realtime

  28. sky says:

    Hi.
    m_playerSlow = [[AVPlayer alloc] initWithPlayerItem:playerItem];
    m_playerSlow.actionAtItemEnd = AVPlayerActionAtItemEndNone;
    m_view = [[UIView alloc] initWithFrame:CGRectMake(0, 0, m_viewCapture.frame.size.width, m_viewCapture.frame.size.height)];
    AVPlayerLayer * _avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:m_playerSlow];
    _avPlayerLayer.frame = CGRectMake(0, 0, m_view.frame.size.width, m_view.frame.size.height);
    [m_view.layer addSublayer:_avPlayerLayer];
    [/*self.view*/m_viewCapture addSubview:m_view];
    [playerItem release];

    // add Draw View
    m_viewDraw = [[DrawView alloc] initWithFrame:m_view.frame];
    m_viewDraw.m_parent = self;
    m_viewDraw.userInteractionEnabled = YES;
    m_viewDraw.backgroundColor = [UIColor clearColor];
    [m_viewCapture addSubview:m_viewDraw];

    m_playerSlow is the AVPlayer, m_viewDraw is a UIView used to draw some shapes, and m_viewCapture is the ScreenCaptureView.

    When recording, I can record the shapes of m_viewDraw, but nothing with the m_playerSlow(AVPlayer).

    What can I do? Please help me.

  29. Pingback: Convert UIView in to EAGLView?

  30. Tj Fallon says:

    For some reason this code won’t correctly record 3D, like text with depth.

    If I fix it, I’ll post again.

  31. Tj Fallon says:

    So yeah, this code is wonderful; I’ve learned quite a bit from it. But alas, the way renderInContext is called, any CATransform3Ds that have been applied to the layer are not rendered. This means that all the perspective transforms I’ve put into my app don’t show up in the video.

    I’ve tried calling renderInContext on presentationLayer, and on superLayer.presentationLayer, both with the same results. Any ideas? I really can’t rewrite the application from the ground up with OpenGL, and I’ve been chugging on strong for about 6 hours trying to find a solution.

    I did manage to augment this code to record audio playing from the speakers, however it’s a bit of a hack, if anyone needs it.

    • frowing says:

      Hey TJ Fallon,

      I would love to see that code of yours recording audio & video. Could you post it here or somewhere else and leave a link to it?

      It would be really appreciated

      Thanks!

      • Tj Fallon says:

        The trick is to record the audio completely separately and then mix them afterwards. Record the audio however you want, save it to the documents folder, and then call this with the path for your video and path for your audio.

        -(void)mixAudio:(NSString*)audio withVideo:(NSString*)video {

            AVURLAsset* audioAsset = [[AVURLAsset alloc] initWithURL:[NSURL URLWithString:audio] options:nil];
            AVURLAsset* videoAsset = [[AVURLAsset alloc] initWithURL:[NSURL URLWithString:[NSString stringWithFormat:@"file://localhost%@", video]] options:nil];

            AVMutableComposition* mixComposition = [AVMutableComposition composition];

            AVMutableCompositionTrack *compositionCommentaryTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                                preferredTrackID:kCMPersistentTrackID_Invalid];
            [compositionCommentaryTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration)
                                                ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                                                 atTime:kCMTimeZero error:nil];

            AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                           preferredTrackID:kCMPersistentTrackID_Invalid];
            [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                           ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                                            atTime:kCMTimeZero error:nil];

            AVAssetExportSession* _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                                  presetName:AVAssetExportPresetPassthrough];

            NSString* videoName = @"export.mov";

            NSString *exportPath = [NSTemporaryDirectory() stringByAppendingPathComponent:videoName];
            NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];

            if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath]) {
                [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil];
            }

            _assetExport.outputFileType = @"com.apple.quicktime-movie";
            _assetExport.outputURL = exportUrl;
            _assetExport.shouldOptimizeForNetworkUse = YES;
            NSLog(@"exportURL:%@", exportUrl);
            [_assetExport exportAsynchronouslyWithCompletionHandler:^(void) {

                NSString *path = [NSString stringWithString:[exportUrl path]];
                if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(path)) {
                    NSLog(@"Path:%@", path);
                    UISaveVideoAtPathToSavedPhotosAlbum(path, nil, nil, nil);
                }

                UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Success!" message:@"Your video has been saved to your camera roll!" delegate:self cancelButtonTitle:@"Fucking Sweet." otherButtonTitles:nil];
                [alert setTag:666];

                [alert show];
                [self.view setUserInteractionEnabled:YES];
                [UIView animateWithDuration:1 animations:^{
                    [introText setAlpha:1];
                    [introView setAlpha:1];
                    [showButtonsButton setHidden:NO];
                } completion:^(BOOL finished) {
                    [self expandIntroView:nil];
                    creatingVideo = NO;
                    recordWithAudio = NO;
                }];
                /*
                UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Success!" message:@"Your video has been saved to your camera roll, would you like to share it now?" delegate:self cancelButtonTitle:@"No" otherButtonTitles:@"Email", @"Facebook", nil];
                [alert setTag:666];

                [alert show];
                */
                // your completion code here
            }];
        }
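        Assuming the video came out of the article’s code and the audio was saved separately (the ‘audio.caf’ name is just an example), the call might look like this — note that the method above expects a URL string for the audio but a bare path for the video:

        NSString* docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
        NSString* audioURL = [NSString stringWithFormat:@"file://localhost%@", [docs stringByAppendingPathComponent:@"audio.caf"]];
        [self mixAudio:audioURL withVideo:[docs stringByAppendingPathComponent:@"output.mp4"]];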

        • Mustafa says:

          … and how do you record the audio? The audio recording documentation suggests that you can record the audio coming in through the microphone. I’ll appreciate if you can post your solution of recording the audio (which is being played) as well. I’m sure it’ll be extremely helpful to others as well.

        • Johnykutty says:

          I also need what @mustafa said. Can you explain or give a way to do that?

    • Jalan says:

      Hey TJ Fallon,

      Did you succeed in recording a layer with CATransform3Ds applied?

      If yes, then can you please share it :)

      Regards
      Jalan

  32. Pingback: Record audio while recording OpenGL view

  33. Pingback: Record Audio using AVCaptureSession

  34. Kevin says:

    Hi Aroth,

    I noticed that in createBitmapContextOfSize you free/malloc on every function call. This seems inefficient. I understand why this is important if the given size changes, but for many practical purposes, when would that ever happen?

    In this code specifically, self.frame.size is always being passed in. Would that only ever change if the screen orientation changed?
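    Something like this is the kind of reuse I mean (just a sketch, with hypothetical ‘cachedContext’/‘cachedSize’ ivars added to the class, and with drawRect’s per-frame CGContextRelease call removed):

    - (CGContextRef) reusableBitmapContextOfSize:(CGSize)size {
        //rebuild the context only when the requested size actually changes
        if (cachedContext == NULL || !CGSizeEqualToSize(size, cachedSize)) {
            if (cachedContext != NULL) {
                CGContextRelease(cachedContext);
            }
            cachedContext = [self createBitmapContextOfSize:size];
            cachedSize = size;
        }
        return cachedContext;
    }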

    Thanks,
    Kevin

  35. Jalan says:

    Hi Aroth

    Very nice sample; it works well in the normal case.
    But I tried to record a screen on which several animations were going on, and it was not able to record the animations running on it.

    I assume this line of the code is used to control the capture interval?
    [videoWriter startSessionAtSourceTime:CMTimeMake(0, 1000)];

    So I tried it after changing the value as below, but with no success.
    [videoWriter startSessionAtSourceTime:CMTimeMake(0, 10)];

    Can you suggest what could be the problem?

    Regards
    Jalan

  36. Alex says:

    Hi, thanks for your code. I tried to add a simple animation (consecutive images) as a subview of a UITableViewTest controller, but only the table view with touches is recorded, not the subview animation. Here is my code:

    - (void)viewDidLoad {
        [super viewDidLoad];
        [captureView performSelector:@selector(startRecording) withObject:nil afterDelay:1.0];

        imageArray = [[NSMutableArray alloc] initWithCapacity:IMAGE_COUNT];

        // Build array of images, cycling through image names
        for (int i = 0; i < IMAGE_COUNT; i++)
            [imageArray addObject:[UIImage imageNamed:[NSString stringWithFormat:@"Bullet%d.tif", i]]];

        // Animated images – centered on screen
        animatedImages = [[UIImageView alloc]
                          initWithFrame:CGRectMake(100, -120, 320, 480)];
        animatedImages.animationImages = [NSArray arrayWithArray:imageArray];

        animatedImages.animationDuration = 4.0;
        animatedImages.animationRepeatCount = 5;

        // Add subview
        [self.view addSubview:animatedImages];

        // Start it up
        [animatedImages startAnimating];

        [captureView performSelector:@selector(stopRecording) withObject:nil afterDelay:10.0];
    }

    Any help?

  37. Pingback: Recording from the iPad screen | LiquidSketch development blog

  38. Jackie says:


    @protocol ScreenCaptureViewDelegate <NSObject>
    - (void) recordingFinished:(NSString*)outputPathOrNil;
    @end

    should the outputPathOrNil be an NSURL object? I’ve found this in the downloaded code too.

    Thanks for the great code; it works like a charm :)

  39. peter says:

    hello, developer

    I am Korean; this helped me greatly.
    But I want to add an audio recorder while recording the video.

    Is there any solution? If there is, please help me…

    Thank you in advance…

    park3314@naver.com

    You are a real developer… oh~~..

    Sincerely..

  40. ramya says:

    I’m using ScreenCaptureView in my application for recording a video that is playing with MPMoviePlayerController.
    The video is recording, but it is getting only a black screen, not the video content.

    What do I have to do to get the video recording?

    Please help me…

  41. Raja says:

    Excellent, so I’m producing a video now; I think I’m almost there. But when my video gets made, it records all the button presses on screen and also shows my UIImageView’s default image, but does not record my UIImageView’s changing animation images when I change them….

    Please give me a reply……..
    Thanks & Regards

  42. siva says:

    Hi, this is excellent.
    Just give me a solution to play audio in the background…

    thanks a lot….

  43. Mohit says:

    Please tell me how to record the animation of images on screen…

  44. Prasad Lodha says:

    Hi Aroth,
    Great post. Exactly what we needed. Thanks. Much appreciated.

  45. Shaktising Pardeshi says:

    Hi Aroth,
    Is it possible with your code to record live video running on the screen with overlaying objects?
    Please reply.
    Thanks

  46. Arun says:

    Hi Aroth,

    Great post. Thanks.

  47. bob says:

    Hi, I was wondering if you know how to tweak it so that the movie .mp4 will save to the camera roll. What would you change/add?

    Thanks.

  48. T.D. says:

    Hey,

    I am having trouble connecting the code to the corresponding parts in Interface Builder. I am building a tabbed bar application and I would like to record every view the user clicks on simultaneously. So, I figured that I would need to add the ScreenCaptureView to every tab I have and then connect it in Interface Builder, but for some reason when I click on the connection outlets under File’s Owner and try to drag to the view, nothing happens.

    Can you help with with this problem?

    Thanks in advance,

    T.D.

  49. John says:

    Hi Aroth! I tried your code and it worked. But it’s difficult to implement on other views, especially when transitioning to different views. I found another way: I found this code and modified it. It works, but transitions to other views, like “UIModalTransitionStyleFlipHorizontal”, are not captured; the animations and some transition styles were captured.

    Can anyone help me figure this out?


    - (UIImage*)screenshot
    {
        // Create a graphics context with the target size
        // On iOS 4 and later, use UIGraphicsBeginImageContextWithOptions to take the scale into consideration
        // On iOS prior to 4, fall back to use UIGraphicsBeginImageContext
        CGSize imageSize = [[UIScreen mainScreen] bounds].size;
        CGFloat imageScale = imageSize.width / FRAME_WIDTH;
        if (NULL != UIGraphicsBeginImageContextWithOptions)
            UIGraphicsBeginImageContextWithOptions(imageSize, NO, imageScale);
        else
            UIGraphicsBeginImageContext(imageSize);

        CGContextRef context = UIGraphicsGetCurrentContext();

        // Iterate over every window from back to front
        for (UIWindow *window in [[UIApplication sharedApplication] windows])
        {
            if (![window respondsToSelector:@selector(screen)] || [window screen] == [UIScreen mainScreen])
            {
                // -renderInContext: renders in the coordinate space of the layer,
                // so we must first apply the layer's geometry to the graphics context
                CGContextSaveGState(context);
                // Center the context around the window's anchor point
                CGContextTranslateCTM(context, [window center].x, [window center].y);
                // Apply the window's transform about the anchor point
                CGContextConcatCTM(context, [window transform]);
                // Offset by the portion of the bounds left of and above the anchor point
                CGContextTranslateCTM(context,
                                      -[window bounds].size.width * [[window layer] anchorPoint].x,
                                      -[window bounds].size.height * [[window layer] anchorPoint].y);

                // Render the layer hierarchy to the current context
                //[[window layer] renderInContext:context];
                [[[window layer] presentationLayer] renderInContext:context];

                // Restore the context
                CGContextRestoreGState(context);
            }
        }

        // Retrieve the screenshot image
        UIImage *image = UIGraphicsGetImageFromCurrentImageContext();

        UIGraphicsEndImageContext();

        return image;
    }

    #pragma mark helpers
    -(NSString*) pathToDocumentsDirectory {
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = [paths objectAtIndex:0];
        return documentsDirectory;
    }

    -(void) writeSample: (NSTimer*) _timer {
        if (assetWriterInput.readyForMoreMediaData) {
            // CMSampleBufferRef sample = nil;

            CVReturn cvErr = kCVReturnSuccess;

            // get screenshot image!
            CGImageRef image = (CGImageRef) [[self screenshot] CGImage];
            NSLog (@"made screenshot");

            // prepare the pixel buffer
            CVPixelBufferRef pixelBuffer = NULL;
            CFDataRef imageData = CGDataProviderCopyData(CGImageGetDataProvider(image));
            NSLog (@"copied image data");
            cvErr = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                                 FRAME_WIDTH,
                                                 FRAME_HEIGHT,
                                                 kCVPixelFormatType_32BGRA,
                                                 (void*)CFDataGetBytePtr(imageData),
                                                 CGImageGetBytesPerRow(image),
                                                 NULL,
                                                 NULL,
                                                 NULL,
                                                 &pixelBuffer);
            NSLog (@"CVPixelBufferCreateWithBytes returned %d", cvErr);

            // calculate the time
            CFAbsoluteTime thisFrameWallClockTime = CFAbsoluteTimeGetCurrent();
            CFTimeInterval elapsedTime = thisFrameWallClockTime - firstFrameWallClockTime;
            NSLog (@"elapsedTime: %f", elapsedTime);
            CMTime presentationTime = CMTimeMake (elapsedTime * TIME_SCALE, TIME_SCALE);

            // write the sample
            BOOL appended = [assetWriterPixelBufferAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:presentationTime];

            if (appended) {
                NSLog (@"appended sample at time %lf", CMTimeGetSeconds(presentationTime));
            } else {
                NSLog (@"failed to append");
                [self stopRecording];
                //self.startStopButton.selected = NO;
            }
        }
    }

    -(void) startRecording {

        // // create the AVComposition
        // [mutableComposition release];
        // mutableComposition = [[AVMutableComposition alloc] init];

        // create the AVAssetWriter
        NSString *moviePath = [[self pathToDocumentsDirectory] stringByAppendingPathComponent:OUTPUT_FILE_NAME];
        if ([[NSFileManager defaultManager] fileExistsAtPath:moviePath]) {
            [[NSFileManager defaultManager] removeItemAtPath:moviePath error:nil];
        }

        NSURL *movieURL = [NSURL fileURLWithPath:moviePath];
        NSError *movieError = nil;
        //[assetWriter release];
        assetWriter = [[AVAssetWriter alloc] initWithURL:movieURL
                                                fileType: AVFileTypeQuickTimeMovie
                                                   error: &movieError];
        NSDictionary *assetWriterInputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                                  AVVideoCodecH264, AVVideoCodecKey,
                                                  [NSNumber numberWithInt:FRAME_WIDTH], AVVideoWidthKey,
                                                  [NSNumber numberWithInt:FRAME_HEIGHT], AVVideoHeightKey,
                                                  nil];
        assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType: AVMediaTypeVideo
                                                              outputSettings:assetWriterInputSettings];
        assetWriterInput.expectsMediaDataInRealTime = YES;
        [assetWriter addInput:assetWriterInput];

        //[assetWriterPixelBufferAdaptor release];
        assetWriterPixelBufferAdaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc]
                                         initWithAssetWriterInput:assetWriterInput
                                         sourcePixelBufferAttributes:nil];
        [assetWriter startWriting];

        firstFrameWallClockTime = CFAbsoluteTimeGetCurrent();
        [assetWriter startSessionAtSourceTime: CMTimeMake(0, TIME_SCALE)];

        // start writing samples to it
        //[assetWriterTimer release];
        assetWriterTimer = [NSTimer scheduledTimerWithTimeInterval:0.1
                                                            target:self
                                                          selector:@selector(writeSample:)
                                                          userInfo:nil
                                                           repeats:YES];
    }

    -(void) stopRecording {
        [assetWriterTimer invalidate];
        assetWriterTimer = nil;

        [assetWriter finishWriting];
        NSLog (@"finished writing");
    }

  50. Ozzy says:

    Hey Aroth! I found another memory leak in the code. Took a while to figure out what it was.

    In:

    -(void) writeVideoFrameAtTime:(CMTime)time {
    ...

    CFDataRef image = CGDataProviderCopyData(CGImageGetDataProvider(cgImage));

    //The CGImageGetDataProvider returns a pointer to new data, which must be released.
    //So:
    CGDataProviderRef provider = CGImageGetDataProvider(cgImage);
    CFDataRef image = CGDataProviderCopyData(provider);

    ...
    CGDataProviderRelease(provider);

  51. Harshal Kothari says:

    Hey Aroth,

    The code works like a charm on iPad 2 but not on iPad 3.
    It hangs while running on iPad 3.
    Any suggestion??

    Thank you
    Harshal.

  52. sam says:

    Hey! I love this code, and I’m about to integrate it (temporarily) into my app to make some screen recordings for promos. However, I noticed that the output video doesn’t actually appear in iTunes anymore. The solution is simple, though — add “UIFileSharingEnabled” to the info.plist! Just thought you might want to know.
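    That is, adding this pair of entries to the app’s Info.plist:

    <key>UIFileSharingEnabled</key>
    <true/>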

  53. Jason says:

    First of all, your code works great. I’m able to record the contents of a UIView without any problems.
    I’m trying to add a front camera preview to my UIView, kinda like the video in FaceTime. I’m using AVCaptureSession and AVCaptureVideoPreviewLayer to display the video feed. I add the AVCaptureVideoPreviewLayer to a UIView (videoPreviewView below) and add that to my superview. The UIView (videoPreviewView) records, but the feed from the camera does not.

    Any ideas?

    Here’s my code (it uses ARC):


    -(void)startFrontCamera {
        AVCaptureDevice *device = [self frontFacingCameraIfAvailable];

        if (device) {
            AVCaptureSession *session = [[AVCaptureSession alloc] init];
            session.sessionPreset = AVCaptureSessionPresetMedium;

            NSError *error = nil;
            AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
            if (!input) {
                // Handle the error appropriately.
                NSLog(@"ERROR: trying to open camera: %@", error);
            } else {
                if ( [session canAddInput:input] ) {
                    [session addInput:input];

                    // Add the video frame output
                    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
                    [session addOutput:output];

                    // Configure your output.
                    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
                    [output setSampleBufferDelegate:self queue:queue];

                    // Specify the pixel format
                    output.videoSettings =
                        [NSDictionary dictionaryWithObject:
                            [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                    forKey:(id)kCVPixelBufferPixelFormatTypeKey];

                    // If you wish to cap the frame rate to a known value, such as 15 fps, set
                    // minFrameDuration.
                    output.minFrameDuration = CMTimeMake(1, 15);

                    AVCaptureConnection *videoConnection = [output connectionWithMediaType:AVMediaTypeVideo];
                    if ([videoConnection isVideoOrientationSupported]) {
                        [videoConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
                    }

                    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];

                    float kViewHeight = CGRectGetHeight([UIScreen mainScreen].applicationFrame);
                    //iPhone 5!!!!!!!!!!!!!!!
                    if (kViewHeight >= 548) {
                        videoPreviewView = [[UIView alloc] initWithFrame:CGRectMake(195, 293, 120, 160)];
                    } else {
                        videoPreviewView = [[UIView alloc] initWithFrame:CGRectMake(195, 215, 120, 150)];
                    }
                    [videoPreviewView setBackgroundColor:[UIColor blackColor]];
                    videoPreviewView.layer.cornerRadius = 5;
                    videoPreviewView.layer.masksToBounds = YES;

                    captureVideoPreviewLayer.frame = self.videoPreviewView.bounds;
                    captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;

                    [self.videoPreviewView.layer addSublayer:captureVideoPreviewLayer];

                    [session startRunning];

                    [self.view addSubview:videoPreviewView];
                } else {
                    NSLog(@"Couldn't add input");
                }
            }
        } else {
            NSLog(@"Front camera not available");
        }
    }

  54. Jack Solomon says:

    Hi,
    Is there a way to save the video to camera roll once recorded, or have a playback button? Any help would be hugely appreciated, and I am happy to give you credit in the finished app (on the app store).
    Feel free to email at jasolomon@99centsappdevelopment.com

    • saurabh Singh says:

      In the recording-completion method, put
      UISaveVideoAtPathToSavedPhotosAlbum(outputPath, nil, nil, nil);
      and eliminate the line saving to the Documents directory.
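      In other words, a sketch of the delegate callback:

      - (void) recordingFinished:(NSString*)outputPathOrNil {
          if (outputPathOrNil != nil && UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(outputPathOrNil)) {
              //copy the finished recording into the device's camera roll
              UISaveVideoAtPathToSavedPhotosAlbum(outputPathOrNil, nil, nil, nil);
          }
      }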

  55. Mohit says:

    Is it possible in Windows Phone 8? If you have any idea, please share it with me.
    Thanks

  56. Mohit says:

    Is it possible on Windows Phone? Please suggest to me how it would be possible.

  57. D says:

    Is it possible in Windows Phone 8?

  58. saurabh Singh says:

    UISaveVideoAtPathToSavedPhotosAlbum(outputPath, nil, nil, nil);

  59. saurabh Singh says:

    I am unable to record the camera overlay view; it shows only a black screen. Everything else is working well. Anyone have an idea?

  60. Jan says:

    Hi Aroth,

    This is a nice piece of code. Could you tell me under which license is the code?

  61. Dhiren Shah says:

    Hi,

    First of all, Thanks for sharing the code…

    I am able to capture screen video and audio, and able to drag and drop UIImages on the ScreenCaptureView. Everything is working fine for now, except the performance of the drawRect: method.

    Anyone know how to improve performance in the drawRect: method?

    I have already tried running this piece of code in the background… which somewhat improves the performance… (but not really enough).

    Also tried CGBitmapContextCreateImage(context); and getting the image that way… but there my video quality gets degraded… though performance is good… Anyone have an idea how to use CGBitmapContextCreateImage(context); smartly enough to get the same video output with good enough performance?

  62. anta says:

    How can I perform screen capture without the XIB file? I need to record a view that is created programmatically.

    captureView = [[ScreenCaptureView alloc] initWithFrame:CGRectMake(0, 0, 200, 200)];
    [captureView addSubview:self.videoPlayerView];
    [captureView performSelector:@selector(startRecording) withObject:nil afterDelay:1.0];
    [captureView performSelector:@selector(stopRecording) withObject:nil afterDelay:3.0];

  63. Diyanshi says:

    Is it possible in Windows Phone 8? If anyone has an idea, please share it with me.


  65. srinu says:

    Hi, I have tried the same code on a view in which I have an OpenGL view (landscape mode), but it is giving some weird results, i.e. some cross lines across the view in the video. Please help me to solve this issue.

  66. srinu says:

    Hi, I have tried the same code on a view in which I have an OpenGL view (landscape mode), but it is giving some weird results, i.e. some cross lines across the view in the video. Please help me to solve this issue (this problem occurs when the view is in landscape mode, but there are no issues when the view is in portrait mode).

  67. Jignesh says:

    I have downloaded the sample project and tried to run it. It works completely and also generates the output.mp4 file, but that file can’t be played. Please give a solution.

    Thanks in advance.
    Jignesh

    • srinu says:

      Hi Jignesh, I have also faced the same issue. Let me know which player you are using. It is better to use the VLC player.

  68. Deepa Mittal says:

    Hello ,
    When I call the startRecording method, it is not recording whatever is happening on my screen.
    What should I do?

  69. Neha Mangal says:

    Hello ,

    My output.mp4 is generated properly, but it cannot be played with any player. :(
    What is the problem? Please give a proper solution…

  70. Sandip says:

    Hi Aroth,
    this code is nice and very helpful for me.
    Now I need some method through which I can pause the screen capturing process; i.e., when I press the pause button, the screen will not be sent to the buffer, and after some time when I press record, it starts recording again from the resumed point.
    I searched a lot but didn’t find any code that solves my problem. Please help me if you can.
    thanks
    sandip trivedi

  71. CryptX says:

    hello,
    Thanks a lot for your information. I tried this in my app, and it makes the app super laggy; the frame rate drops very low. It’s a drawing app and I want to record the user’s strokes and shape manipulations. Do you have other suggestions for this kind of app? Or perhaps a tweak?

  72. Vikas Gupta says:

    Hi Aroth,

    Can we record the screen of an MPMoviePlayerController? If it can be done with this project, how can I achieve it? Because when I’m using the ScreenCaptureView class, I get only a black screen video.

  73. Ehsan says:

    hello

    I tried your code, but how do I start the recording, stop it, and then play the recorded video?

    thanks

  74. Pingback: Creating a smooth screencast video on iOS from UIView - QueryPost.com | QueryPost.com

  75. Jonathan says:

    Does anyone have code to record audio… I tested some, but I do not understand it. Thanks, thanks.

    My email is jonathanng1989@gmail.com

  76. Sean says:

    I feel like if you recorded a brief tutorial video it would clear up most of these questions and really help users implement this code correctly. Thanks for the great code!

  77. Pingback: iPhone: Cocos2d & UIKit Code Integration - QueryPost.com | QueryPost.com

  78. Pingback: Cropping an image without using an image mask? | Technology & Programming

  79. Gilad K says:

    Hi
    This code works perfectly, but it doesn’t record my streaming camera video.
    I work with GPUImage for rendering the camera video to the screen, and I want to record the screen with that video. What happens is that everything is recorded except the camera video.
    Does anyone know how to fix this?
    Thanks in advance
    Gilad

  80. Usman Nisar says:

    I am trying to capture a video of a subview, not the full screen — like a 320x320 subview at the screen’s middle. The problem I face is that the created video blinks. Can you please help with it?

    • Usman Nisar says:

      Let me explain my point in brief:
      In your example project, I added a transparent UIView with frame (0, 70, 320, 320) in the xib file and moved the table view from the parent into this added view, so the table view is now a subview of my newly added UIView. I removed the reference to the ScreenCaptureView class from the parent and assigned it to the newly added view.

      Now the code works perfectly if I do not scroll the table view. When clicking any row, the generated video quality is OK, but if I scroll during recording, the quality of the generated video is disturbed (some green-shaded blinking, etc.).

      Hope you guys got my point. Can anyone tell me what I am doing wrong? Does the video generation code need to be changed when changing the frame size of the capture view?

      Thanks in advance

      • Usman Nisar says:

        Hi everyone, here is an update about my issue. The problem occurs only on iOS 7; on earlier versions everything works fine (whatever size we set for our view). I don’t know the exact cause; any help regarding this is appreciated.

  81. Ganidu says:

    Hi

    Thanks for the tutorial. It’s really great. Is it possible to record for more than 30 seconds?

    Thank you

  82. joan says:

    Hi, pre-iOS 6 it worked fine, but now I get:

    WARNING: -finishWriting should not be called on the main thread.

    I fixed it like this; maybe you can implement it?:

    #if (!defined(__IPHONE_6_0) || (__IPHONE_OS_VERSION_MAX_ALLOWED < __IPHONE_6_0))
        // Not iOS 6 SDK
        BOOL success = [videoWriter finishWriting];
    #else
        // iOS 6 SDK
        [videoWriter finishWritingWithCompletionHandler:^(){
            // Running iOS 6
            NSLog (@"finished writing");
        }];
    #endif

  83. Syah says:

    Hi Aroth,

    I tried your code on an iOS 7 iPad 3; there were a couple of changes that I made, because the app kept on crashing whenever I tried to record.

    1. I specified fixed dimensions in the video settings:

    NSDictionary* videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:320], AVVideoWidthKey,
                                   [NSNumber numberWithInt:480], AVVideoHeightKey,
                                   videoCompressionProps, AVVideoCompressionPropertiesKey,
                                   nil];

    2. Once I specified the height and width, it no longer crashes, but I cannot play the video after running your app on the iPad simulator. When I ran the app on the iPad 3, I could not see the video under the app’s documents either.

    3. BOOL success = [videoWriter finishWriting]; needs to be changed too, as it is deprecated.

    Kindly assist. :)

  84. wizwik says:

    I am trying to run the demo code under iOS 7. I set up a test program using a view controller in place of a table view controller. Then I added ScreenCaptureView as a subview, and finally a simple drawing class that draws a red and a green line as a second subview. I type-cast the CGBitmapInfo in CGBitmapContextCreate with (CGBitmapInfo)kCGImageAlphaNoneSkipFirst. I then started the movie recording with a delay in viewDidLoad(). A recorded file is produced as expected, but the red and green lines appear as multiple sloping dashed lines across the movie. I think it is a pixel alignment problem. I have tried all the CGBitmapInfo settings in the documentation, but no luck. I have run the demo code OK. Has anyone seen this before?

  85. wizwik says:

    More… I went into the drawRect routine and turned on the debugging lines of code. This writes out the images to PNG files for the first forty frames. I inspected the frames and they are correct. My capture view size is set to 200 x 200, and that is the size of the images. So the problem must be where I pass the images to the video recorder.

  86. wizwik says:

    More… It seems like the width is the problem. My initial size was 200 x 200 and it had a problem. Based on Syah’s comments (see above) I changed my dimensions. I started at 320 x 480 and then went to 300 x 300. Here are some results:
    //view_rect=CGRectMake(10,30, 300, 480); worked
    //view_rect=CGRectMake(10,30, 200, 200); // does not work
    //view_rect=CGRectMake(10,30, 200, 300); // Failed
    //view_rect=CGRectMake(10,30, 300, 300); // ok
    //view_rect=CGRectMake(10,30, 300, 100); // ok green flash!
    //view_rect=CGRectMake(10,30, 300, 50); // ok !
    //view_rect=CGRectMake(10,30, 250, 50); // ok green flash!
    //view_rect=CGRectMake(10,30, 225, 50); // Vertical failed!
    // view_rect=CGRectMake(10,30, 220, 50); // ok
    //view_rect=CGRectMake(10,30, 210, 50); //failed
    //view_rect=CGRectMake(10,30, 205, 50); // ok
    //view_rect=CGRectMake(10,30, 200, 50); // failed
    view_rect=CGRectMake(10,30, 300, 300); // Ok Ipad, Iphone
    //view_rect=CGRectMake(10,30, 190, 50); // ok
    Note how at 210 it failed and then it was OK at 205. I thought it may be an aspect ratio problem, but I could not prove that. I am going to keep my widths up! I think the code is fine.
