Open in Instagram

I was a little obsessed with trying out different Photo Captioning apps for a while, until I finally settled on Typic, and then deleted the rest.

What these apps (Overgram, Instaquote and Typic) had in common was that, at the end of the day, they all let you share your work to Instagram.

[image: the "Open in Instagram" action sheet]

I’m just gonna write a really short code bit on how to do that.

Instagram allows apps to interact with it using different iPhone hooks (http://instagram.com/developer/iphone-hooks).

I'm going to use the Document Interaction API.

It's pretty simple. According to the Instagram developer page, you need to save your picture with a ".ig" or a ".igo" extension, and it has to be at least 612 pixels in either width or height; anything less won't be accepted by Instagram.

When your picture is opened in Instagram, it goes straight to the filter screen. That means there's no crop option, so it's better if your picture is already square.
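Before any of that, the picture has to actually exist on disk with one of those extensions. Here's a minimal sketch of how the save could look, assuming picture is your square, at-least-612-pixel UIImage (the variable and file names here are just for illustration):

NSString *filename = @"myPicture.igo"; // ".ig" also works, more on that below
NSString *savePath = [[NSHomeDirectory() stringByAppendingPathComponent:@"Documents"] stringByAppendingPathComponent: filename];

// write the UIImage out as JPEG data into the Documents folder
[UIImageJPEGRepresentation(picture, 1.0) writeToFile:savePath atomically:YES];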

So code bits:

In the .h of your ViewController, declare a UIDocumentInteractionController:

@property (strong, nonatomic) UIDocumentInteractionController *documentInteractionController;

And then make your view controller a UIDocumentInteractionControllerDelegate, like this:

@interface ComicViewController : UIViewController <UIDocumentInteractionControllerDelegate>

And then in the .m of your view controller, add a button (or whatever you want to use to trigger "Open in Instagram") that calls this method:

-(void) openInInstagram
{
    // filename is the ".ig"/".igo" file saved into Documents earlier
    NSString *strImagePath = [[NSHomeDirectory() stringByAppendingPathComponent:@"Documents"] stringByAppendingPathComponent: filename];

    NSURL *url = [NSURL fileURLWithPath: strImagePath];

    self.documentInteractionController = [UIDocumentInteractionController interactionControllerWithURL: url];
    [self.documentInteractionController setDelegate:self];

    NSMutableDictionary *annotationDict = [[NSMutableDictionary alloc] init];

    // the caption Instagram will pre-fill for your post
    [annotationDict setValue: @"Instagram Caption" forKey: @"InstagramCaption"];

    self.documentInteractionController.UTI = @"com.instagram.photo";

    self.documentInteractionController.annotation = [[NSDictionary alloc] initWithDictionary: annotationDict];

    [self.documentInteractionController presentOpenInMenuFromRect: CGRectZero inView: self.view animated: YES];
}

You need to pass a URL to your document interaction controller; since my file is saved in my app's Documents folder, that's why my URL looks like that.

So if you want to open your picture in Instagram and any other app that supports opening image files, simply use the file extension ".ig" for your image; but if you only want to open it in Instagram, use ".igo". Also, the UTI I set in the code above is "com.instagram.photo"; if you want it to be exclusive, use "com.instagram.exclusivegram".
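For the exclusive version, only two things change from the code above (a quick sketch):

// only Instagram will show up in the "Open In..." menu
NSString *filename = @"myPicture.igo"; // note: .igo, not .ig
self.documentInteractionController.UTI = @"com.instagram.exclusivegram";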

You can set the caption you want to appear in Instagram. You can add hashtags in your caption too; that works.
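For example, the annotation line from the method above could just as well be (any caption string works; this one is made up):

[annotationDict setValue: @"Made with Typic #photo #caption" forKey: @"InstagramCaption"];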

And that's it: when you tap the button that calls the openInInstagram method, you'll see an action sheet similar to the screenshot above.


Mini Tutorial: How to post score of your Unity iOS game to Facebook?

-without shelling out $65 😉

Well, if you have $65 to spare, just check out Prime31’s Social Networking plugin (you can even get Twitter!).

Link: http://www.prime31.com/unity/

If you don’t, like 1-broke-girl/me, read on…

There are two parts to this: the Unity side and the Xcode side. We must find a way for Unity and Xcode to be friends and talk to one another, you know, call each other's functions, access each other's variables, etc.

First let’s take advantage of NSUserDefaults and PlayerPrefs to save some variables (you may encrypt the score variable if you are afraid of cheaters).

PlayerPrefs in Unity…

PlayerPrefs.SetString("score", score.ToString());

… can be read in Xcode using…

NSUserDefaults *defaults = [NSUserDefaults standardUserDefaults];
NSString *score = [defaults objectForKey:@"score"];

There, now we know how to pass variables. What about functions…?

In order to do that, I read this tut, which is in Simplified Chinese!  (http://xys289187120.blog.51cto.com/3361352/705415). The gist of that tutorial is that you create this other class (let’s just call it Facebook.cs):

using UnityEngine;
using System.Runtime.InteropServices;

public class Facebook : MonoBehaviour {

    [DllImport("__Internal")]
    private static extern void _PressButton0 ();

    public static void ActivateButton0 ()
    {   if (Application.platform != RuntimePlatform.OSXEditor)
        {   _PressButton0 ();
        }
    }
}

The _PressButton0() will actually call some code in Xcode (we’ll get to that).

Something else in Unity has to call ActivateButton0. A GUI button, perhaps?

if(GUI.Button(new Rect(0, 0, 130, 235), "Facebook"))
{       Facebook.ActivateButton0();
}

So when the player clicks on the GUI button, ActivateButton0() will be called which will in turn call _PressButton0().

But where’s _PressButton0()?

We create a ViewController class in Xcode (let’s just call it MyView.m):

#import "MyView.h"
#import "AppController.h"

@implementation MyView

void _PressButton0()  
{   AppController *appController = (AppController*)[[UIApplication sharedApplication] delegate];

    [appController feedDialogButtonClicked];
}  

@end

So there’s _PressButton0()!

Now let’s do the Facebook related things. Go to Facebook’s Developer site and follow the tutorial: https://developers.facebook.com/docs/mobile/ios/build/

Instead of putting some stuff in application:didFinishLaunchingWithOptions:… I placed everything in a function I called feedDialogButtonClicked.

- (void) feedDialogButtonClicked {

    facebook = [[Facebook alloc] initWithAppId:@"221872691249521" andDelegate:self];

    // restore a cached session if we have one
    NSUserDefaults *defaults = [NSUserDefaults standardUserDefaults];
    if ([defaults objectForKey:@"FBAccessTokenKey"]
        && [defaults objectForKey:@"FBExpirationDateKey"]) {
        facebook.accessToken = [defaults objectForKey:@"FBAccessTokenKey"];
        facebook.expirationDate = [defaults objectForKey:@"FBExpirationDateKey"];
    }

    /**
    if (!)
    {   ;
    }
    **/

    [[NSUserDefaults standardUserDefaults] synchronize];

    // these were written by PlayerPrefs on the Unity side
    NSString *level = [defaults objectForKey:@"level"];
    NSString *score = [defaults objectForKey:@"score"];

    NSMutableDictionary *params =
    [NSMutableDictionary dictionaryWithObjectsAndKeys:
     [NSString stringWithFormat: @"I just scored %@ in the %@ Level of Maru Penguin!", score, level], @"name",
     @"", @"caption",
     @"Get Maru Penguin for free in the iTunes Store", @"description",
     @"http://itunes.apple.com/tw/app/maru-penguin/id521096937?mt=8", @"link",
     @"http://a2.mzstatic.com/us/r1000/082/Purple/v4/3a/42/b5/3a42b5dc-5452-9dde-cdc8-24e24fb82363/486SkNsbbo2zaF3glfCuo0-temp-upload.iomrjeon.320x480-75.jpg", @"picture",
     nil];

    // present the feed dialog (assumption: the standard call from the old
    // Facebook iOS SDK goes here)
    [facebook dialog:@"feed" andParams:params andDelegate:self];
}

I had a little problem with fbDidLogin (the one mentioned in the tutorial); good thing this other tutorial solved it for me: http://ebrentnelson.blogspot.com/2012/02/fbdidlogin-never-calledwhy.html
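The gist of the fix, as far as I understand it, is that your app delegate has to hand the login callback URL back to the Facebook object, roughly like this (a sketch against the old SDK; you also need the fb<your-app-id> URL scheme registered in Info.plist):

// In AppController.m: without these, Safari (or the Facebook app) can't hand
// control back to your app after login, so fbDidLogin never gets called
- (BOOL)application:(UIApplication *)application handleOpenURL:(NSURL *)url
{   return [facebook handleOpenURL:url];
}

- (BOOL)application:(UIApplication *)application openURL:(NSURL *)url sourceApplication:(NSString *)sourceApplication annotation:(id)annotation
{   return [facebook handleOpenURL:url];
}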

I ended up commenting out:

/**
    if (!)
    {   ;
    }
    **/

Because I seemed to be able to post feeds to my Facebook wall even without it (I don't know why; anyone care to explain?).

Another helpful link: How to include a link in my feed post using FBConnect from an iPhone app? (http://stackoverflow.com/questions/5574433/how-to-include-a-link-in-my-feed-post-using-fbconnect-from-iphone-app) The answer in that post explains the stuff you can include in your Feed Dialog pretty clearly.

And um, I think that is it. That bunch of code can post your scores from Unity iOS to Facebook.

Now my other problem is, how to add a share link to the feed my app posted? Anyone, help?

Also check out an old blog post of mine about how to post pictures from Cocos2D iPhone to Facebook: http://purplelilgirl.tumblr.com/post/9406805856/howtoaddfacebooktococos2diphone

EDIT:

Since generated feeds don't get the Share button (I Googled for 2 days and found nothing; in the end, it was a Which Avenger are You? quiz that helped me solve my problem), do you know what I eventually did? I went back to that Cocos2D blog post and did it that way (that's what the Avengers app did, by the way; I'm Hawkeye :D). I posted a photo of the results screen, lol, which is actually what my boss suggested in the first place. And when you post photos, you can include captions (no advertising though), so there. Problem solved-ish.

EDIT:

Okay, I am apparently not a very good Googler, since I only saw this today: http://forum.unity3d.com/threads/122681-Free-facebook-Plugin-for-Unity-iOS Free, Facebook, Unity, iOS, all the keywords I've been searching for, all along in the Unity Forums!

.

.

.

By the way, we made an app (the one in the sample); it's a game, it's free, and it stars a penguin named Maru in search of yummy fishies around the world (so far he's only gotten to Asia)…

Link: http://itunes.apple.com/tw/app/maru-penguin/id521096937?mt=8

Mini Tutorial: How to capture video of iPhone app in Cocos2D? (with audio)

Okay, so I figured out how to add audio to my video.

In my previous blog post (http://purplelilgirl.tumblr.com/post/10974345146/mini-tutorial-how-to-capture-video-of-iphone-app-in), I managed to take a video of my app and save it into a file. However, I am just stringing together screenshots of my app taken every 0.1 seconds, so it doesn't capture the audio.

So I have a different function that captures my audio (using AVAudioRecorder) and saves it into a file.
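The recording side isn't the point of this post, but for context, it's roughly this (a sketch; the format settings and the audioRecorder ivar are illustrative, not the exact code from my app):

// record the microphone into a .caf file in Documents
NSString *audioPath = [[NSHomeDirectory() stringByAppendingPathComponent:@"Documents"] stringByAppendingPathComponent:@"audio0.caf"];

NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                          [NSNumber numberWithInt: kAudioFormatAppleIMA4], AVFormatIDKey,
                          [NSNumber numberWithFloat: 44100.0], AVSampleRateKey,
                          [NSNumber numberWithInt: 1], AVNumberOfChannelsKey,
                          nil];

NSError *error = nil;
audioRecorder = [[AVAudioRecorder alloc] initWithURL:[NSURL fileURLWithPath:audioPath] settings:settings error:&error];
[audioRecorder record];
// ... and later, [audioRecorder stop]; before combining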

Now, to combine the files together. Since iOS 4.1, AVFoundation has included this thing called AVMutableComposition, which lets you make composites of stuff, like combining video and audio files into a new video file that has audio.

So code bits (I found pieces of this code on StackOverflow):

-(void) processVideo: (NSURL*) videoUrl
{
    AVURLAsset* videoAsset = [[AVURLAsset alloc] initWithURL: videoUrl options:nil];

    AVMutableComposition* mixComposition = [AVMutableComposition composition];

    AppDelegate *appDelegate = (AppDelegate *)[[UIApplication sharedApplication] delegate];

    NSError * error = nil;

    // lay each recorded audio file down on its own track, starting at the
    // time it originally began playing
    for (NSMutableDictionary * audioInfo in appDelegate.audioInfoArray)
    {
        NSString *pathString = [[NSHomeDirectory() stringByAppendingString:@"/Documents/"] stringByAppendingString: [audioInfo objectForKey: @"fileName"]];

        AVURLAsset * urlAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:pathString] options:nil];

        AVAssetTrack * audioAssetTrack = [[urlAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
        AVMutableCompositionTrack *compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                       preferredTrackID: kCMPersistentTrackID_Invalid];

        NSLog(@"%lf", [[audioInfo objectForKey: @"startTime"] doubleValue]);

        CMTime audioStartTime = CMTimeMake(([[audioInfo objectForKey: @"startTime"] doubleValue] * TIME_SCALE), TIME_SCALE);

        [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, urlAsset.duration) ofTrack:audioAssetTrack atTime:audioStartTime error:&error];
    }

    // then the video track, starting at time zero
    AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                   preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                   ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                                    atTime:kCMTimeZero error:nil];

    // export the composition as a new QuickTime movie
    AVAssetExportSession* _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                          presetName:AVAssetExportPresetPassthrough];

    NSString* videoName = @"export.mov";

    NSString *exportPath = [[self pathToDocumentsDirectory] stringByAppendingPathComponent:videoName];
    NSURL    *exportUrl = [NSURL fileURLWithPath:exportPath];

    if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath])
    {
        [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil];
    }

    _assetExport.outputFileType = @"com.apple.quicktime-movie";
    NSLog(@"file type %@", _assetExport.outputFileType);
    _assetExport.outputURL = exportUrl;
    _assetExport.shouldOptimizeForNetworkUse = YES;

    [_assetExport exportAsynchronouslyWithCompletionHandler:
     ^(void ) {
         switch (_assetExport.status)
         {
             case AVAssetExportSessionStatusCompleted:
                 //export complete
                 NSLog(@"Export Complete");
                 //[self uploadToYouTube];

                 break;
             case AVAssetExportSessionStatusFailed:
                 NSLog(@"Export Failed");
                 NSLog(@"ExportSessionError: %@", [_assetExport.error localizedDescription]);
                 //export error (see exportSession.error)
                 break;
             case AVAssetExportSessionStatusCancelled:
                 NSLog(@"Export Cancelled");
                 NSLog(@"ExportSessionError: %@", [_assetExport.error localizedDescription]);
                 //export cancelled
                 break;
         }
     }];
}

I have more than one audio file that I want to combine with my video, so I created an array that contains information for each of the audio files (such as where the file is located and when to play that audio).
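Each entry in that array is just a dictionary using the same two keys that processVideo reads, something like this (startTime here is a hypothetical variable holding seconds since the video recording started):

NSMutableDictionary *audioInfo = [NSMutableDictionary dictionary];
[audioInfo setObject: @"audio0.caf" forKey: @"fileName"];
[audioInfo setObject: [NSNumber numberWithDouble: startTime] forKey: @"startTime"];
[appDelegate.audioInfoArray addObject: audioInfo];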

And that’s it 🙂 You have a video of your app 🙂 with audio 🙂

Mini Tutorial: How to capture video of iPhone app in Cocos2D?

Someone asked me before if I knew how to record the screen in Cocos2D as a video. I didn't know how to record a video, so this guy sent me some code, but his problem was that his code was recording the screen (taking screenshots) of a UIWindow. So my idea for him was to replace his screenshot code with AWScreenshot (by Manucorporat; search the Cocos2D forums for his code).

And here are the code bits:

#import <AVFoundation/AVFoundation.h>
#import <AVFoundation/AVAssetWriter.h>
#import <CoreVideo/CVPixelBuffer.h>
#import <CoreMedia/CMTime.h>

#import "AWScreenshot.h"

#define FRAME_WIDTH 320
#define FRAME_HEIGHT 480
#define TIME_SCALE 60 // frames per second

-(void) startScreenRecording
{
    NSLog(@"start screen recording");

    // create the AVAssetWriter
    NSString *moviePath = [[self pathToDocumentsDirectory] stringByAppendingPathComponent: @"video.mov"];
    if ([[NSFileManager defaultManager] fileExistsAtPath:moviePath])
    {   [[NSFileManager defaultManager] removeItemAtPath:moviePath error:nil];
    }

    NSURL *movieURL = [NSURL fileURLWithPath:moviePath];
    NSError *movieError = nil;

    [assetWriter release];
    assetWriter = [[AVAssetWriter alloc] initWithURL:movieURL
                                            fileType: AVFileTypeQuickTimeMovie
                                               error: &movieError];
    NSDictionary *assetWriterInputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                              AVVideoCodecH264, AVVideoCodecKey,
                                              [NSNumber numberWithInt:FRAME_WIDTH], AVVideoWidthKey,
                                              [NSNumber numberWithInt:FRAME_HEIGHT], AVVideoHeightKey,
                                              nil];
    assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType: AVMediaTypeVideo
                                                          outputSettings:assetWriterInputSettings];
    assetWriterInput.expectsMediaDataInRealTime = YES;
    [assetWriter addInput:assetWriterInput];

    [assetWriterPixelBufferAdaptor release];
    assetWriterPixelBufferAdaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc]
                                     initWithAssetWriterInput:assetWriterInput
                                     sourcePixelBufferAttributes:nil];
    [assetWriter startWriting];

    firstFrameWallClockTime = CFAbsoluteTimeGetCurrent();
    [assetWriter startSessionAtSourceTime: CMTimeMake(0, TIME_SCALE)];

    // start writing samples to it
    [assetWriterTimer release];
    assetWriterTimer = [NSTimer scheduledTimerWithTimeInterval:0.1
                                                        target:self
                                                      selector:@selector (writeSample:)
                                                      userInfo:nil
                                                       repeats:YES];
}

-(void) stopScreenRecording
{   [assetWriterTimer invalidate];
    assetWriterTimer = nil;
   
    [assetWriter finishWriting];
    NSLog (@"finished writing");
}

As you can see, startScreenRecording schedules a timer that calls writeSample every 0.1 seconds.

-(void) writeSample: (NSTimer*) _timer
{   if (assetWriterInput.readyForMoreMediaData)
    {
        CVReturn cvErr = kCVReturnSuccess;

        // get screenshot image!
        CGImageRef image = (CGImageRef) [[self createARGBImageFromRGBAImage:[self screenshot]] CGImage];

        // prepare the pixel buffer
        CVPixelBufferRef pixelBuffer = NULL;
        CFDataRef imageData = CGDataProviderCopyData(CGImageGetDataProvider(image));
        cvErr = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                             FRAME_WIDTH,
                                             FRAME_HEIGHT,
                                             kCVPixelFormatType_32ARGB,
                                             (void*)CFDataGetBytePtr(imageData),
                                             CGImageGetBytesPerRow(image),
                                             NULL,
                                             NULL,
                                             NULL,
                                             &pixelBuffer);

        // calculate the time
        CFAbsoluteTime thisFrameWallClockTime = CFAbsoluteTimeGetCurrent();
        CFTimeInterval elapsedTime = thisFrameWallClockTime - firstFrameWallClockTime;
        //NSLog (@"elapsedTime: %f", elapsedTime);
        CMTime presentationTime = CMTimeMake (elapsedTime * TIME_SCALE, TIME_SCALE);

        // write the sample
        BOOL appended = [assetWriterPixelBufferAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:presentationTime];

        // release the buffer and its backing data, or every frame leaks
        CVPixelBufferRelease(pixelBuffer);
        CFRelease(imageData);

        if (appended)
        {   NSLog (@"appended sample at time %lf", CMTimeGetSeconds(presentationTime));
        } else
        {   NSLog (@"failed to append");
            [self stopScreenRecording];
        }
    }
}

And the code I used to take a screenshot:

- (UIImage*)screenshot
{   return [AWScreenshot takeAsImage];
}

Notice how I called [self createARGBImageFromRGBAImage:[self screenshot]]; it's because my UIImage is an RGBA image, while the CVPixelBuffer's format type is kCVPixelFormatType_32ARGB, so I had to fix things so they match, or else my video would come out in weird tints.

I Googled for the createARGBImageFromRGBAImage code, and here it is:

-(UIImage *) createARGBImageFromRGBAImage: (UIImage*)image
{   CGSize dimensions = [image size];

    NSUInteger bytesPerPixel = 4;
    NSUInteger bytesPerRow = bytesPerPixel * dimensions.width;
    NSUInteger bitsPerComponent = 8;

    unsigned char *rgba = malloc(bytesPerPixel * dimensions.width * dimensions.height);
    unsigned char *argb = malloc(bytesPerPixel * dimensions.width * dimensions.height);

    CGColorSpaceRef colorSpace = NULL;
    CGContextRef context = NULL;

    // first pass: draw the image into an RGBA byte buffer
    colorSpace = CGColorSpaceCreateDeviceRGB();
    context = CGBitmapContextCreate(rgba, dimensions.width, dimensions.height, bitsPerComponent, bytesPerRow, colorSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrderDefault); // kCGBitmapByteOrder32Big
    CGContextDrawImage(context, CGRectMake(0, 0, dimensions.width, dimensions.height), [image CGImage]);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // shuffle the channels: RGBA -> ARGB, one pixel at a time
    for (int x = 0; x < dimensions.width; x++) {
        for (int y = 0; y < dimensions.height; y++) {
            NSUInteger offset = ((dimensions.width * y) + x) * bytesPerPixel;
            argb[offset + 0] = rgba[offset + 3];
            argb[offset + 1] = rgba[offset + 0];
            argb[offset + 2] = rgba[offset + 1];
            argb[offset + 3] = rgba[offset + 2];
        }
    }

    // second pass: rebuild a UIImage from the ARGB buffer
    colorSpace = CGColorSpaceCreateDeviceRGB();
    context = CGBitmapContextCreate(argb, dimensions.width, dimensions.height, bitsPerComponent, bytesPerRow, colorSpace, kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrderDefault); // kCGBitmapByteOrder32Big
    CGImageRef imageRef = CGBitmapContextCreateImage(context);
    image = [UIImage imageWithCGImage: imageRef];
    CGImageRelease(imageRef);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    free(rgba);
    free(argb);

    return image;
}

And there we go, I managed to record the screen of my Cocos2d app and then save it as a video file.

My next problem is, how do I add audio to my video?

How to make fancy labels using Cocos2D?

UPDATE:

It's been 2 years, and I am using Cocos2D again, and I discovered that the old Hiero binary seems to be buggy on OS X Lion. Anyway, I found a newer build: http://www.cocos2d-iphone.org/forum/topic/220/page/2#post-145909.

—-

A very short tutorial.

I'm reading Cocos2D for iPhone Beginner's Guide, and I'm on Chapter 4 right now, called Pasting Labels, because I'm working on my game's HUD. According to the book, there are three ways of displaying text in Cocos2D. First is CCLabel (which, according to the author, is not very efficient, and it only supports the iOS fonts, Arial, Helvetica and so on, which isn't much of a selection). Then there's CCLabelAtlas (which gets its characters from an image, so you can do a lot of fancy effects with it, but the font has to be fixed-width). And then there's option number three, CCBitmapFontAtlas.
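To make the first two concrete, here's roughly what they look like in code (a sketch against the 0.99-era API the book uses; fps_images.png is the fixed-width character map that ships with the Cocos2D templates):

// CCLabel: any iOS font, but slow to update and limited font choices
CCLabel *lblScore = [CCLabel labelWithString:@"SCORE: 0" fontName:@"Helvetica" fontSize:24];

// CCLabelAtlas: characters cut from an image strip, fixed width only
CCLabelAtlas *lblFps = [CCLabelAtlas labelWithString:@"0123" charMapFile:@"fps_images.png" itemWidth:12 itemHeight:32 startCharMap:'.'];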

The book introduced the Hiero Bitmap Font Tool for making CCBitmapFontAtlas (download link: http://slick.cokeandcode.com/demos/hiero.jnlp). So basically, I just followed the instructions in the book: download the software, run the software…

[image: Hiero's main window]

The book gives examples of how to use the Effects in Hiero, but I don't find them sufficient, so I'm going to leave my text plain; I will just add a bit of padding on each side (3, 3, 3, 3).

Go to File-> Save BMP font files…

It will save a .fnt and a .png. For some odd reason my .pngs are always inverted, so I open them in Photoshop…

[image: the exported .png, with the text flipped upside down]

I added the background so you can see my text more clearly. See, it's inverted, so I just go to Image-> Image Rotation-> Flip Canvas Vertical.

The next step is to double-click on the text layer, or go to Layer-> Layer Style-> …

And add some fancy layer effects to your text. I only added Drop Shadow, but you can add all sorts of fancy stuff, like Outer Glow, Bevel…

[image: Photoshop's Layer Style dialog]

And then your text will look like this:

[image: the text with a drop shadow applied]

Save your file.

Drag the edited .png file and the .fnt file into your Xcode project.

Some code for the CCBitmapFontAtlas:

CCBitmapFontAtlas *lblHighScoreTitle = [CCBitmapFontAtlas bitmapFontAtlasWithString:@"HIGH SCORE: " fntFile:@"helveticaCY.fnt"];
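After that it behaves like any other node: position it, add it as a child, and update the string whenever the score changes (a quick usage sketch; the position values are arbitrary):

lblHighScoreTitle.position = ccp(160, 440);
[self addChild: lblHighScoreTitle];

// updating a bitmap font atlas is cheap, unlike CCLabel
[lblHighScoreTitle setString: @"HIGH SCORE: 9001"];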

And voila! You’re good to go!