Unity 3D Game Development by Example Video Review

Unity 3D Game Development by Example is a video lecture by Adam Maxwell (Packt has videos now, apparently).

It has 8 sections, each around 20 minutes long, which together cover the basics of the Unity 3D game engine.

The narrator explains everything through examples, from the very basics, such as what Unity’s user interface looks like and where to find everything, to more complex topics like how to write scripts, how to make title screens and menus, and how to save and load data for your games.

The narration is paired with slides that use bullet points or diagrams to help explain some topics. And of course, the video also demonstrates concepts in the game engine itself, so it’s easy for viewers to follow and understand.

The narration, for me though, is a little flat, but it’s still better than reading a book and following along through screenshots.

For beginners, I think this video lecture is a good place to start. But for people who are already familiar with Unity, it doesn’t offer much more.

You can find the video lecture on Packt’s website, or check out some sample sections on YouTube.

Mini Tutorial: How to capture video of iPhone app in Cocos2D? with audio

Okay, so I figured out how to add audio to my video.

In my previous blog post (http://purplelilgirl.tumblr.com/post/10974345146/mini-tutorial-how-to-capture-video-of-iphone-app-in), I managed to take a video of my app and save it to a file. However, I was just stringing together screenshots of my app taken every 0.1 seconds, so it didn’t capture any audio.

So I have a separate function that captures my audio (using AVAudioRecorder) and saves it to a file.
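
For reference, the recording side might look something like this; it’s a minimal sketch under my own assumptions (the file name, format, and settings below are placeholders, not from my actual project):

    NSError *error = nil;
    
    // record into a Core Audio file in Documents (file name is a placeholder)
    NSString *audioPath = [[self pathToDocumentsDirectory] stringByAppendingPathComponent:@"recording.caf"];
    
    // assumed settings: mono IMA4 at 44.1 kHz keeps the file small
    NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                              [NSNumber numberWithInt:kAudioFormatAppleIMA4], AVFormatIDKey,
                              [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
                              [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
                              nil];
    
    audioRecorder = [[AVAudioRecorder alloc] initWithURL:[NSURL fileURLWithPath:audioPath]
                                                settings:settings
                                                   error:&error];
    [audioRecorder prepareToRecord];
    [audioRecorder record];
    // ... later, call [audioRecorder stop]; and remember when it started, for audioInfoArray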

Now, to combine the files. Since iOS 4.1, AVFoundation has included AVMutableComposition, which lets you make composites of media, such as combining separate video and audio files into a new video file that has audio.

So, code bits (I found parts of this code on StackOverflow):

-(void) processVideo: (NSURL*) videoUrl
{
    AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:videoUrl options:nil];
    
    AVMutableComposition *mixComposition = [AVMutableComposition composition];
    
    AppDelegate *appDelegate = (AppDelegate *)[[UIApplication sharedApplication] delegate];
    
    NSError *error = nil;
    
    // add one audio track to the composition for each recorded audio file,
    // placed at the time it was recorded relative to the start of the video
    for (NSMutableDictionary *audioInfo in appDelegate.audioInfoArray)
    {
        NSString *pathString = [[NSHomeDirectory() stringByAppendingString:@"/Documents/"] stringByAppendingString:[audioInfo objectForKey:@"fileName"]];
        
        AVURLAsset *urlAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:pathString] options:nil];
        
        AVAssetTrack *audioAssetTrack = [[urlAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
        AVMutableCompositionTrack *compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                       preferredTrackID:kCMPersistentTrackID_Invalid];
        
        NSLog(@"%lf", [[audioInfo objectForKey:@"startTime"] doubleValue]);
        
        CMTime audioStartTime = CMTimeMake(([[audioInfo objectForKey:@"startTime"] doubleValue] * TIME_SCALE), TIME_SCALE);
        
        [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, urlAsset.duration) ofTrack:audioAssetTrack atTime:audioStartTime error:&error];
    }
    
    // add the screen recording as the composition's video track
    AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                   preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                   ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                                    atTime:kCMTimeZero error:nil];
    
    // passthrough preset: tracks are copied as-is, with no re-encoding
    AVAssetExportSession *_assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                          presetName:AVAssetExportPresetPassthrough];
    
    NSString *videoName = @"export.mov";
    
    NSString *exportPath = [[self pathToDocumentsDirectory] stringByAppendingPathComponent:videoName];
    NSURL    *exportUrl = [NSURL fileURLWithPath:exportPath];
    
    // the export session won't overwrite an existing file, so delete any previous export
    if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath])
    {
        [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil];
    }
    
    _assetExport.outputFileType = AVFileTypeQuickTimeMovie; // @"com.apple.quicktime-movie"
    NSLog(@"file type %@", _assetExport.outputFileType);
    _assetExport.outputURL = exportUrl;
    _assetExport.shouldOptimizeForNetworkUse = YES;
    
    [_assetExport exportAsynchronouslyWithCompletionHandler:
     ^(void) {
         switch (_assetExport.status)
         {
             case AVAssetExportSessionStatusCompleted:
                 // export complete
                 NSLog(@"Export Complete");
                 //[self uploadToYouTube];
                 break;
             case AVAssetExportSessionStatusFailed:
                 NSLog(@"Export Failed");
                 NSLog(@"ExportSessionError: %@", [_assetExport.error localizedDescription]);
                 break;
             case AVAssetExportSessionStatusCancelled:
                 NSLog(@"Export Cancelled");
                 NSLog(@"ExportSessionError: %@", [_assetExport.error localizedDescription]);
                 break;
             default:
                 break;
         }
     }];
}
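
To tie the two posts together: once the screen recorder (from the post below) has finished writing its movie file, you would hand that file’s URL to processVideo:. A hypothetical call site, assuming the video.mov name used in the recording code:

    // hypothetical: run after stopScreenRecording has finished writing the movie
    NSString *moviePath = [[self pathToDocumentsDirectory] stringByAppendingPathComponent:@"video.mov"];
    [self processVideo:[NSURL fileURLWithPath:moviePath]];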

I have more than one audio file that I want to combine with my video, so I keep an array (the appDelegate.audioInfoArray used above) with an entry for each audio file, storing where the file is located and when it should start playing.
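
For illustration, an entry in that array might be built like this (the file name and start time are made-up values; the keys are the ones processVideo: reads):

    // hypothetical entry; "fileName" and "startTime" match the keys used above
    NSMutableDictionary *audioInfo = [NSMutableDictionary dictionary];
    [audioInfo setObject:@"recording1.caf" forKey:@"fileName"];
    [audioInfo setObject:[NSNumber numberWithDouble:2.5] forKey:@"startTime"]; // seconds into the video
    [appDelegate.audioInfoArray addObject:audioInfo];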

And that’s it 🙂 You have a video of your app 🙂 with audio 🙂

Mini Tutorial: How to capture video of iPhone app in Cocos2D?

Someone asked me before if I knew how to record the screen in Cocos2D as a video. I didn’t know how to record a video, so this guy sent me some code, but his problem was that it captured its screenshots from a UIWindow, which misses the OpenGL content that Cocos2D actually draws. So my suggestion was to replace his screenshot code with AWScreenshot (by Manucorporat; search the Cocos2D forums for his code).

And here are the code bits:

#import <AVFoundation/AVFoundation.h>
#import <AVFoundation/AVAssetWriter.h>
#import <CoreVideo/CVPixelBuffer.h>
#import <CoreMedia/CMTime.h>

#import "AWScreenshot.h"

#define FRAME_WIDTH 320
#define FRAME_HEIGHT 480
#define TIME_SCALE 60 // frames per second
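
The code below also assumes a few instance variables that I haven’t shown; a sketch of the declarations (the class name and CCLayer superclass are just my assumption about where this code lives):

    @interface GameLayer : CCLayer
    {
        AVAssetWriter *assetWriter;
        AVAssetWriterInput *assetWriterInput;
        AVAssetWriterInputPixelBufferAdaptor *assetWriterPixelBufferAdaptor;
        NSTimer *assetWriterTimer;
        CFAbsoluteTime firstFrameWallClockTime;
    }
    @end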

-(void) startScreenRecording
{  
    NSLog(@"start screen recording");
   
    // create the AVAssetWriter
    NSString *moviePath = [[self pathToDocumentsDirectory] stringByAppendingPathComponent:@"video.mov"];
    if ([[NSFileManager defaultManager] fileExistsAtPath:moviePath])
    {   [[NSFileManager defaultManager] removeItemAtPath:moviePath error:nil];
    }
   
    NSURL *movieURL = [NSURL fileURLWithPath:moviePath];
    NSError *movieError = nil;
   
    [assetWriter release];
    assetWriter = [[AVAssetWriter alloc] initWithURL:movieURL
                                            fileType: AVFileTypeQuickTimeMovie
                                               error: &movieError];
    NSDictionary *assetWriterInputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                              AVVideoCodecH264, AVVideoCodecKey,
                                              [NSNumber numberWithInt:FRAME_WIDTH], AVVideoWidthKey,
                                              [NSNumber numberWithInt:FRAME_HEIGHT], AVVideoHeightKey,
                                              nil];
    assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType: AVMediaTypeVideo
                                                          outputSettings:assetWriterInputSettings];
    assetWriterInput.expectsMediaDataInRealTime = YES;
    [assetWriter addInput:assetWriterInput];
   
    [assetWriterPixelBufferAdaptor release];
    assetWriterPixelBufferAdaptor =  [[AVAssetWriterInputPixelBufferAdaptor  alloc]
                                     initWithAssetWriterInput:assetWriterInput
                                     sourcePixelBufferAttributes:nil];
    [assetWriter startWriting];
   
    firstFrameWallClockTime = CFAbsoluteTimeGetCurrent();
    [assetWriter startSessionAtSourceTime: CMTimeMake(0, TIME_SCALE)];
   
    // start writing samples to it
    [assetWriterTimer release];
    assetWriterTimer = [NSTimer scheduledTimerWithTimeInterval:0.1
                                                        target:self
                                                      selector:@selector (writeSample:)
                                                      userInfo:nil
                                                       repeats:YES];
   
}

-(void) stopScreenRecording
{   [assetWriterTimer invalidate];
    assetWriterTimer = nil;
   
    [assetWriter finishWriting];
    NSLog(@"finished writing");
}
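
Both startScreenRecording and processVideo: call a pathToDocumentsDirectory helper that I haven’t shown; a minimal sketch, assuming it just returns the app’s Documents directory:

    - (NSString*) pathToDocumentsDirectory
    {   NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        return [paths objectAtIndex:0];
    }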

As you can see, startScreenRecording schedules a timer that calls writeSample: every 0.1 seconds.

// release callback for CVPixelBufferCreateWithBytes: the pixel buffer only wraps
// the CFData's bytes, so the data must stay alive until the buffer is done with
// it, and is released here (otherwise every frame would leak its image data)
static void releaseImageData(void *releaseRefCon, const void *baseAddress)
{   CFRelease((CFDataRef) releaseRefCon);
}

-(void) writeSample: (NSTimer*) _timer
{   if (assetWriterInput.readyForMoreMediaData)
    {
        CVReturn cvErr = kCVReturnSuccess;
        
        // get screenshot image!
        CGImageRef image = (CGImageRef) [[self createARGBImageFromRGBAImage:[self screenshot]] CGImage];
        
        // prepare the pixel buffer: wrap the image's raw bytes without copying
        CVPixelBufferRef pixelBuffer = NULL;
        CFDataRef imageData = CGDataProviderCopyData(CGImageGetDataProvider(image));
        cvErr = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                             FRAME_WIDTH,
                                             FRAME_HEIGHT,
                                             kCVPixelFormatType_32ARGB,
                                             (void*)CFDataGetBytePtr(imageData),
                                             CGImageGetBytesPerRow(image),
                                             releaseImageData,
                                             (void*)imageData,
                                             NULL,
                                             &pixelBuffer);
        
        // calculate the presentation time from wall-clock time since the first frame
        CFAbsoluteTime thisFrameWallClockTime = CFAbsoluteTimeGetCurrent();
        CFTimeInterval elapsedTime = thisFrameWallClockTime - firstFrameWallClockTime;
        //NSLog(@"elapsedTime: %f", elapsedTime);
        CMTime presentationTime = CMTimeMake(elapsedTime * TIME_SCALE, TIME_SCALE);
        
        // write the sample
        BOOL appended = [assetWriterPixelBufferAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:presentationTime];
        
        // drop our reference; the adaptor retains the buffer until it has been written
        CVPixelBufferRelease(pixelBuffer);
        
        if (appended)
        {   NSLog(@"appended sample at time %lf", CMTimeGetSeconds(presentationTime));
        } else
        {   NSLog(@"failed to append");
            [self stopScreenRecording];
        }
    }
}
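
A note on the timing: the timer fires every 0.1 seconds, so the movie ends up at roughly 10 frames per second even though TIME_SCALE is 60. Each sample’s presentation time is computed from wall-clock time, though, so playback still stays in sync with real time.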

And the code I used to take the screenshot:

- (UIImage*)screenshot
{   return [AWScreenshot takeAsImage];
}

Notice that I call [self createARGBImageFromRGBAImage:[self screenshot]]. That’s because my UIImage’s pixel data is RGBA, while the CVPixelBuffer’s format type is kCVPixelFormatType_32ARGB, so I had to convert one to match the other, or else my video would come out in weird tints.

I Googled for the createARGBImageFromRGBAImage code, and here it is:

-(UIImage *) createARGBImageFromRGBAImage: (UIImage*)image
{   CGSize dimensions = [image size];
   
    NSUInteger bytesPerPixel = 4;
    NSUInteger bytesPerRow = bytesPerPixel * dimensions.width;
    NSUInteger bitsPerComponent = 8;
   
    unsigned char *rgba = malloc(bytesPerPixel * dimensions.width * dimensions.height);
    unsigned char *argb = malloc(bytesPerPixel * dimensions.width * dimensions.height);
   
    CGColorSpaceRef colorSpace = NULL;
    CGContextRef context = NULL;
   
    colorSpace = CGColorSpaceCreateDeviceRGB();
    context = CGBitmapContextCreate(rgba, dimensions.width, dimensions.height, bitsPerComponent, bytesPerRow, colorSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrderDefault); // kCGBitmapByteOrder32Big
    CGContextDrawImage(context, CGRectMake(0, 0, dimensions.width, dimensions.height), [image CGImage]);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
   
    // swap each pixel's channels: RGBA in, ARGB out (alpha moves to the front)
    for (int x = 0; x < dimensions.width; x++) {
        for (int y = 0; y < dimensions.height; y++) {
            NSUInteger offset = ((dimensions.width * y) + x) * bytesPerPixel;
            argb[offset + 0] = rgba[offset + 3];
            argb[offset + 1] = rgba[offset + 0];
            argb[offset + 2] = rgba[offset + 1];
            argb[offset + 3] = rgba[offset + 2];
        }
    }
   
    colorSpace = CGColorSpaceCreateDeviceRGB();
    context = CGBitmapContextCreate(argb, dimensions.width, dimensions.height, bitsPerComponent, bytesPerRow, colorSpace, kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrderDefault); // kCGBitmapByteOrder32Big
    CGImageRef imageRef = CGBitmapContextCreateImage(context);
    image = [UIImage imageWithCGImage: imageRef];
    CGImageRelease(imageRef);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
   
    free(rgba);
    free(argb);
   
    return image;
}

And there we go, I managed to record the screen of my Cocos2d app and then save it as a video file.

My next problem is, how do I add audio to my video?