TexturePacker gets a BIG FAT update

This is a really big fat update for TexturePacker: I packed 25 new features into it!

What do you get in detail?

Flash importer

Simply drag your Flash files (.swf) onto TexturePacker and it extracts the images and builds a sprite sheet from them! This requires an installed Flash Player plugin.
The feature works on Windows & MacOS – on Linux it does not work reliably yet.

.pvr and .pvr.ccz QuickLook plugin on MacOS

No need to open TexturePacker to view your .pvr and .pvr.ccz files anymore. TexturePacker now contains a QuickLook plugin so you can view the files right from the Finder!

Save defaults

Many people requested a way to save the default settings so that they don't have to set the exporter and other options again after each restart – here it is! (Pro only)

Ubuntu support

Since many users build for Android, I also added support for Linux – currently Ubuntu 10.x. It might still have some quirks (e.g. Flash import does not work yet) but it is a good start. I decided to build on Ubuntu since it seems to be the most widely used distribution.

New Exporters

I also added a ton of new exporters.

User interface improvements

I also added a lot of interface improvements to make working with TexturePacker easier.

  • New layout of the left side panel allowing easier change of exporter
  • Auto update sprite sheet when re-entering TexturePacker
  • Sprite highlighting
  • Reveal in Finder / Explorer
  • Preferences to change the checkerboard color and highlight color
  • New icons
  • Hotkeys
  • Backspace / delete to remove sprites
  • Dialog to select the Flash Player plugin to use
  • "Drag your sprites here" hint – to give new users a faster start
  • Allowing free choice of texture data file name
  • Fit button for zoom
  • Folders inside a smart folder are now blue
  • Auto populate file names from other names, e.g. saving blah.tps also sets blah.png and blah.plist

Image features

Here are some new image export features. The main additions are support for flipped PVRs and RGB888 as an output format.

  • Option to use flipped PVR files
  • Support for RGB888
  • Unflip flipped PVRs in the viewer
  • Allow free size for PVR textures (might not work on all targets)

Other stuff

  • Rotation support for AndEngine
  • Default padding is now 2 – this affects the command line client! (see the example after this list)
  • Corona exporter uses a better module concept (http://blog.anscamobile.com/2011/09/a-better-approach-to-external-modules/)
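
Since the padding default changed for the command line client too, a typical invocation now picks up padding 2 unless you override it. This is a hypothetical example – flag names may differ between TexturePacker versions, so check TexturePacker --help for the exact options of your install:

TexturePacker --format cocos2d --data sprites.plist --sheet sprites.png --padding 2 sprites/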


Mini Tutorial: How to capture video of iPhone app in Cocos2D, with audio?

Okay, so I figured out how to add audio to my video.

In my previous blog post (http://purplelilgirl.tumblr.com/post/10974345146/mini-tutorial-how-to-capture-video-of-iphone-app-in), I managed to take a video of my app and save it to a file. However, I was just stringing together screenshots of my app taken every 0.1 seconds, so it didn't capture the audio.

So I have a different function that captures my audio (using AVAudioRecorder) and saves it to a file.
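
The recording side can be as small as this – a sketch, assuming AVFoundation is linked and an IMA4 .caf output; startAudioClip: is a made-up name and the settings in my app may differ:

#import <AVFoundation/AVFoundation.h>

// minimal sketch: start recording one audio clip into Documents
-(AVAudioRecorder *) startAudioClip: (NSString*) fileName
{   NSString *path = [[NSHomeDirectory() stringByAppendingString:@"/Documents/"] stringByAppendingString:fileName];
    NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                              [NSNumber numberWithInt:kAudioFormatAppleIMA4], AVFormatIDKey,
                              [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
                              [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
                              nil];
    NSError *error = nil;
    AVAudioRecorder *recorder = [[AVAudioRecorder alloc] initWithURL:[NSURL fileURLWithPath:path]
                                                            settings:settings
                                                               error:&error];
    [recorder prepareToRecord];
    [recorder record];
    return recorder; // caller keeps it (and releases it) and calls -stop when done
}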

Now, to combine the files. Since iOS 4.1, AVFoundation has included this thing called AVMutableComposition, which lets you make composites of stuff – for example, combining video and audio files into a new video file that has audio.

So, code bits (I found parts of the code on StackOverflow):

-(void) processVideo: (NSURL*) videoUrl
{  
    AVURLAsset* videoAsset = [[AVURLAsset alloc]initWithURL: videoUrl options:nil];
   
    AVMutableComposition* mixComposition = [AVMutableComposition composition];
   
    AppDelegate *appDelegate = (AppDelegate *)[[UIApplication sharedApplication] delegate];
   
    NSError * error = nil;
   
    for (NSMutableDictionary * audioInfo in appDelegate.audioInfoArray)
    {
        NSString *pathString = [[NSHomeDirectory() stringByAppendingString:@"/Documents/"] stringByAppendingString: [audioInfo objectForKey: @"fileName"]];
       
        AVURLAsset * urlAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:pathString] options:nil];
       
        AVAssetTrack * audioAssetTrack = [[urlAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
        AVMutableCompositionTrack *compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                       preferredTrackID: kCMPersistentTrackID_Invalid];
       
        NSLog(@"%lf", [[audioInfo objectForKey: @"startTime"] doubleValue]);
       
        CMTime audioStartTime = CMTimeMake(([[audioInfo objectForKey: @"startTime"] doubleValue]*TIME_SCALE), TIME_SCALE);
       
        [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero,urlAsset.duration) ofTrack:audioAssetTrack atTime:audioStartTime error:&error];     
    }
   
   
    AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                   preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                   ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                                    atTime:kCMTimeZero error:nil];
   
    [videoAsset release]; // balance the alloc/init above; the composition keeps its own references to the source
   
    AVAssetExportSession* _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                          presetName:AVAssetExportPresetPassthrough];
   
    NSString* videoName = @"export.mov";
   
    NSString *exportPath = [[self pathToDocumentsDirectory] stringByAppendingPathComponent:videoName];
    NSURL    *exportUrl = [NSURL fileURLWithPath:exportPath];
   
    if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath])
    {
        [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil];
    }
   
    _assetExport.outputFileType = @"com.apple.quicktime-movie"; // i.e. AVFileTypeQuickTimeMovie
    NSLog(@"file type %@",_assetExport.outputFileType);
    _assetExport.outputURL = exportUrl;
    _assetExport.shouldOptimizeForNetworkUse = YES;
   
    [_assetExport exportAsynchronouslyWithCompletionHandler:
     ^(void ) {
         switch (_assetExport.status)
         {
             case AVAssetExportSessionStatusCompleted:
                 //export complete
                  NSLog(@"Export Complete");
                 //[self uploadToYouTube];
                
                 break;
             case AVAssetExportSessionStatusFailed:
                  NSLog(@"Export Failed");
                  NSLog(@"ExportSessionError: %@", [_assetExport.error localizedDescription]);
                 //export error (see exportSession.error) 
                 break;
             case AVAssetExportSessionStatusCancelled:
                  NSLog(@"Export Cancelled");
                  NSLog(@"ExportSessionError: %@", [_assetExport.error localizedDescription]);
                 //export cancelled 
                 break;
         }
     }];   
}
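
By the way, to tie this together with the previous post: once the screen recorder has finished writing video.mov, the muxing can be kicked off with something like this – a hypothetical wrapper, assuming the same pathToDocumentsDirectory helper from the recording code:

-(void) stopRecordingAndProcess
{   // hypothetical glue: stop the screen recorder, then mux in the audio
    [self stopScreenRecording];
    NSString *moviePath = [[self pathToDocumentsDirectory] stringByAppendingPathComponent:@"video.mov"];
    [self processVideo:[NSURL fileURLWithPath:moviePath]];
}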

I have more than one audio file that I want to combine with my video, so I keep an array (audioInfoArray above) that holds information for each audio file, such as where the file is located and when to start playing it.
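
For illustration, each entry could be built like this. The keys fileName and startTime are the ones processVideo: reads; fileName and startTime here stand for values you track yourself (startTime being seconds since the video recording began):

// hypothetical bookkeeping for one audio clip
AppDelegate *appDelegate = (AppDelegate *)[[UIApplication sharedApplication] delegate];
NSMutableDictionary *audioInfo = [NSMutableDictionary dictionary];
[audioInfo setObject:fileName forKey:@"fileName"];
[audioInfo setObject:[NSNumber numberWithDouble:startTime] forKey:@"startTime"];
[appDelegate.audioInfoArray addObject:audioInfo];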

And that’s it 🙂 You have a video of your app 🙂 with audio 🙂

Mini Tutorial: How to capture video of iPhone app in Cocos2D?

Someone asked me before if I knew how to record the screen in Cocos2d as a video. I didn't know how, but this guy sent me some code. His problem was that his code records the screen (taking screenshots) as a UIWindow, so my idea for him was to replace his screenshot code with AWScreenshot (by Manucorporat – search the Cocos2d forums for his code).

And here are the code bits:

#import <AVFoundation/AVFoundation.h>
#import <AVFoundation/AVAssetWriter.h>
#import <CoreVideo/CVPixelBuffer.h>
#import <CoreMedia/CMTime.h>

#import "AWScreenshot.h"

#define FRAME_WIDTH 320
#define FRAME_HEIGHT 480
#define TIME_SCALE 60 // frames per second

-(void) startScreenRecording
{  
    NSLog(@"start screen recording");
   
    // create the AVAssetWriter
    NSString *moviePath = [[self pathToDocumentsDirectory] stringByAppendingPathComponent: @"video.mov"];
    if ([[NSFileManager defaultManager] fileExistsAtPath:moviePath])
    {   [[NSFileManager defaultManager] removeItemAtPath:moviePath error:nil];
    }
   
    NSURL *movieURL = [NSURL fileURLWithPath:moviePath];
    NSError *movieError = nil;
   
    [assetWriter release];
    assetWriter = [[AVAssetWriter alloc] initWithURL:movieURL
                                            fileType: AVFileTypeQuickTimeMovie
                                               error: &movieError];
    NSDictionary *assetWriterInputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                              AVVideoCodecH264, AVVideoCodecKey,
                                              [NSNumber numberWithInt:FRAME_WIDTH], AVVideoWidthKey,
                                              [NSNumber numberWithInt:FRAME_HEIGHT], AVVideoHeightKey,
                                              nil];
    assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType: AVMediaTypeVideo
                                                          outputSettings:assetWriterInputSettings];
    assetWriterInput.expectsMediaDataInRealTime = YES;
    [assetWriter addInput:assetWriterInput];
   
    [assetWriterPixelBufferAdaptor release];
    assetWriterPixelBufferAdaptor =  [[AVAssetWriterInputPixelBufferAdaptor  alloc]
                                     initWithAssetWriterInput:assetWriterInput
                                     sourcePixelBufferAttributes:nil];
    [assetWriter startWriting];
   
    firstFrameWallClockTime = CFAbsoluteTimeGetCurrent();
    [assetWriter startSessionAtSourceTime: CMTimeMake(0, TIME_SCALE)];
   
    // start writing samples to it
    [assetWriterTimer release];
    assetWriterTimer = [NSTimer scheduledTimerWithTimeInterval:0.1
                                                        target:self
                                                      selector:@selector (writeSample:)
                                                      userInfo:nil
                                                       repeats:YES] ;
   
}

-(void) stopScreenRecording
{   [assetWriterTimer invalidate];
    assetWriterTimer = nil;
   
    [assetWriter finishWriting];
    NSLog (@"finished writing");
}

As you can see, startScreenRecording schedules a timer that calls writeSample every 0.1 seconds.

// release callback for the pixel buffer below: frees the CFData once the writer is done with it
static void releaseImageData(void *releaseRefCon, const void *baseAddress)
{   CFRelease((CFDataRef) releaseRefCon);
}

-(void) writeSample: (NSTimer*) _timer
{   if (assetWriterInput.readyForMoreMediaData)
    {
        CVReturn cvErr = kCVReturnSuccess;
       
        // get screenshot image!
        CGImageRef image = (CGImageRef) [[self createARGBImageFromRGBAImage:[self screenshot]] CGImage];
       
        // prepare the pixel buffer
        CVPixelBufferRef pixelBuffer = NULL;
        CFDataRef imageData= CGDataProviderCopyData(CGImageGetDataProvider(image));
        cvErr = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                             FRAME_WIDTH,
                                             FRAME_HEIGHT,
                                             kCVPixelFormatType_32ARGB,
                                             (void*)CFDataGetBytePtr(imageData),
                                             CGImageGetBytesPerRow(image),
                                             NULL,
                                             NULL,
                                             NULL,
                                             &pixelBuffer);
       
        // calculate the time
        CFAbsoluteTime thisFrameWallClockTime = CFAbsoluteTimeGetCurrent();
        CFTimeInterval elapsedTime = thisFrameWallClockTime - firstFrameWallClockTime;
        //NSLog (@"elapsedTime: %f", elapsedTime);
        CMTime presentationTime =  CMTimeMake (elapsedTime * TIME_SCALE, TIME_SCALE);
       
        // write the sample
        BOOL appended = [assetWriterPixelBufferAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:presentationTime];
   
        if (appended)
        {   NSLog (@”appended sample at time %lf”, CMTimeGetSeconds(presentationTime));
        } else
        {   NSLog (@”failed to append”);
            [self stopScreenRecording];
        }
    }
}

And the code I used to take screenshot:

- (UIImage*)screenshot
{   return [AWScreenshot takeAsImage];
}
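
AWScreenshot itself isn't mine to paste here (search the Cocos2d forums for Manucorporat's code), but for the curious, a glReadPixels-based capture of the OpenGL framebuffer looks roughly like this. This is a sketch from memory, not AWScreenshot's actual code – openGLScreenshot is a made-up name, and I assume a framebuffer matching FRAME_WIDTH/FRAME_HEIGHT:

#import <OpenGLES/ES1/gl.h>

// hypothetical stand-in for AWScreenshot: read the GL framebuffer into a UIImage
- (UIImage *) openGLScreenshot
{   NSInteger width = FRAME_WIDTH;
    NSInteger height = FRAME_HEIGHT;
    NSInteger dataLength = width * height * 4;

    GLubyte *buffer = (GLubyte *) malloc(dataLength);
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, buffer);

    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, buffer, dataLength, NULL);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGImageRef imageRef = CGImageCreate(width, height, 8, 32, width * 4, colorSpace,
                                        kCGBitmapByteOrder32Big | kCGImageAlphaPremultipliedLast,
                                        provider, NULL, NO, kCGRenderingIntentDefault);

    // OpenGL rows run bottom-up; UIKit's flipped coordinate system cancels that out
    UIGraphicsBeginImageContext(CGSizeMake(width, height));
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetBlendMode(context, kCGBlendModeCopy);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    CGImageRelease(imageRef);
    CGColorSpaceRelease(colorSpace);
    CGDataProviderRelease(provider);
    free(buffer);

    return image;
}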

Notice that I call [self createARGBImageFromRGBAImage:[self screenshot]]. That's because my UIImage's pixel data is RGBA, while the CVPixelBuffer's format type is kCVPixelFormatType_32ARGB, so I had to convert it so the channel orders match – or else my video would come out with weird tints.

I Googled for the createARGBImageFromRGBAImage code, and here it is:

-(UIImage *) createARGBImageFromRGBAImage: (UIImage*)image
{   CGSize dimensions = [image size];
   
    NSUInteger bytesPerPixel = 4;
    NSUInteger bytesPerRow = bytesPerPixel * dimensions.width;
    NSUInteger bitsPerComponent = 8;
   
    unsigned char *rgba = malloc(bytesPerPixel * dimensions.width * dimensions.height);
    unsigned char *argb = malloc(bytesPerPixel * dimensions.width * dimensions.height);
   
    CGColorSpaceRef colorSpace = NULL;
    CGContextRef context = NULL;
   
    colorSpace = CGColorSpaceCreateDeviceRGB();
    context = CGBitmapContextCreate(rgba, dimensions.width, dimensions.height, bitsPerComponent, bytesPerRow, colorSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrderDefault); // kCGBitmapByteOrder32Big
    CGContextDrawImage(context, CGRectMake(0, 0, dimensions.width, dimensions.height), [image CGImage]);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
   
    for (int x = 0; x < dimensions.width; x++) {
        for (int y = 0; y < dimensions.height; y++) {
            NSUInteger offset = ((dimensions.width * y) + x) * bytesPerPixel;
            argb[offset + 0] = rgba[offset + 3];
            argb[offset + 1] = rgba[offset + 0];
            argb[offset + 2] = rgba[offset + 1];
            argb[offset + 3] = rgba[offset + 2];
        }
    }
   
    colorSpace = CGColorSpaceCreateDeviceRGB();
    context = CGBitmapContextCreate(argb, dimensions.width, dimensions.height, bitsPerComponent, bytesPerRow, colorSpace, kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrderDefault); // kCGBitmapByteOrder32Big
    CGImageRef imageRef = CGBitmapContextCreateImage(context);
    image = [UIImage imageWithCGImage: imageRef];
    CGImageRelease(imageRef);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
   
    free(rgba);
    free(argb);
   
    return image;
}
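
As an aside, the same RGBA-to-ARGB shuffle can be done with the Accelerate framework's vImage (available since iOS 5) instead of the per-pixel loop. This is just a sketch of the alternative, reusing the rgba buffer and bytesPerRow from the function above – not what I actually used:

#import <Accelerate/Accelerate.h>

// destPixel[i] = srcPixel[permuteMap[i]]: A from 3, R from 0, G from 1, B from 2
vImage_Buffer buf = { rgba, dimensions.height, dimensions.width, bytesPerRow };
const uint8_t permuteMap[4] = { 3, 0, 1, 2 };
vImagePermuteChannels_ARGB8888(&buf, &buf, permuteMap, kvImageNoFlags); // works in place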

And there we go, I managed to record the screen of my Cocos2d app and then save it as a video file.

My next problem is, how do I add audio to my video?

Review: iPhone JavaScript Cookbook


This book is written by Arturo Fernandez Montoro.

This book is written for web developers, I guess, who want to write native-looking web applications for the iPhone. The book introduces a lot of frameworks to make life easier. The frameworks include native-looking assets for user interface building, which you can use with just a line of code. This book also teaches readers how to work with data from SQL and AJAX.

Since this is a cookbook, it includes a lot of recipes that readers can copy and paste and revise.

If you want to make applications for the iPhone that look like native iPhone applications, without having to learn Objective-C, the iPhone SDK and all that, then this book is a great guide for you. 🙂

Link: http://www.packtpub.com/iphone-javascript-cookbook/book/mid/170811hge054