
iPhone 3D Programming : Holodeck Sample (part 5) - Overlaying with a Live Camera Image

4/2/2011 3:20:02 PM

5. Overlaying with a Live Camera Image

To make this a true augmented reality app, we need to bring the camera into play. If a camera isn’t available (as in the simulator), then the app can simply fall back to the “scrolling clouds” background.

The first step is adding another protocol to the GLView class; in fact, we need two new protocols! Add the new lines shown in Example 16, noting the two new data fields as well (m_viewController and m_cameraSupported).

Example 16. Adding camera support to GLView.h
#import "Interfaces.hpp"
#import "AccelerometerFilter.h"
#import <UIKit/UIKit.h>
#import <QuartzCore/QuartzCore.h>
#import <CoreLocation/CoreLocation.h>

@interface GLView : UIView <UIImagePickerControllerDelegate,
                            UINavigationControllerDelegate,
                            CLLocationManagerDelegate,
                            UIAccelerometerDelegate> {
@private
    IRenderingEngine* m_renderingEngine;
    IResourceManager* m_resourceManager;
    EAGLContext* m_context;
    CLLocationManager* m_locationManager;
    AccelerometerFilter* m_filter;
    UIViewController* m_viewController;
    bool m_cameraSupported;
    ...
}

- (void) drawView: (CADisplayLink*) displayLink;

@end

Next we need to enhance the initWithFrame and drawView methods. See Example 17. Until now, every sample in this book has set the opaque property in the EAGL layer to YES. In this sample, we decide its value at runtime; if a camera is available, don’t make the surface opaque to allow the image “underlay” to show through.

Example 17. Adding camera support to GLView.mm
- (id) initWithFrame: (CGRect) frame
{
    ...

    if (self = [super initWithFrame:frame]) {

        m_cameraSupported = [UIImagePickerController isSourceTypeAvailable:
                                UIImagePickerControllerSourceTypeCamera];

        CAEAGLLayer* eaglLayer = (CAEAGLLayer*) self.layer;
        eaglLayer.opaque = !m_cameraSupported;
        if (m_cameraSupported)
            NSLog(@"Camera is supported.");
        else
            NSLog(@"Camera is NOT supported.");

        ...

#if TARGET_IPHONE_SIMULATOR
        BOOL compassSupported = NO;
        BOOL accelSupported = NO;
#else
        BOOL compassSupported = m_locationManager.headingAvailable;
        BOOL accelSupported = YES;
#endif

        m_viewController = 0;

        ...

        m_timestamp = CACurrentMediaTime();

        bool opaqueBackground = !m_cameraSupported;
        m_renderingEngine->Initialize(opaqueBackground);

        // Delete the line [self drawView:nil];

        CADisplayLink* displayLink;
        displayLink = [CADisplayLink displayLinkWithTarget:self
                                     selector:@selector(drawView:)];

        ...
    }
    return self;
}

- (void) drawView: (CADisplayLink*) displayLink
{
    if (m_cameraSupported && m_viewController == 0)
        [self createCameraController];

    if (m_paused)
        return;

    ...

    m_renderingEngine->Render(m_theta, m_phi, buttonFlags);
    [m_context presentRenderbuffer:GL_RENDERBUFFER];
}




Next we need to implement the createCameraController method that was called from drawView. This is an example of lazy instantiation; we don’t create the camera controller until we actually need it. Example 18 shows the method, and a detailed explanation follows the listing. (The createCameraController method needs to be defined before the drawView method to avoid a compiler warning.)

Example 18. Creating the camera view controller
- (void) createCameraController
{
    UIImagePickerController* imagePicker =
        [[UIImagePickerController alloc] init];
    imagePicker.delegate = self;
    imagePicker.navigationBarHidden = YES;
    imagePicker.toolbarHidden = YES;
    imagePicker.sourceType = UIImagePickerControllerSourceTypeCamera;
    imagePicker.showsCameraControls = NO;
    imagePicker.cameraOverlayView = self;

    // The camera preview leaves a 54-pixel-tall empty band at the
    // bottom of the screen. Scale the image so that its 426-pixel
    // height stretches to fill all 480 pixels.

    float bandWidth = 54;
    float screenHeight = 480;
    float zoomFactor = screenHeight / (screenHeight - bandWidth);

    CGAffineTransform pickerTransform =
        CGAffineTransformMakeScale(zoomFactor, zoomFactor);
    imagePicker.cameraViewTransform = pickerTransform;

    m_viewController = [[UIViewController alloc] init];
    m_viewController.view = self;
    [m_viewController presentModalViewController:imagePicker animated:NO];
}



Since we’re using the camera API in a way that’s quite different from how Apple intended, we had to jump through a few hoops: hiding the UI, stretching the image, and implementing a protocol that never really gets used. This may seem a bit hacky, but ideally Apple will improve the camera API in the future to simplify the development of augmented reality applications.

You may have noticed in Example 17 that the view class now passes a boolean to the rendering engine’s Initialize method; this tells it whether the background should contain clouds as before, or whether it should be cleared to let the camera underlay show through. You must modify the declaration of Initialize in Interfaces.hpp accordingly. The only remaining changes are shown in Example 19.

Example 19. RenderingEngine modifications to support the camera “underlay”
...

class RenderingEngine : public IRenderingEngine {
public:
    RenderingEngine(IResourceManager* resourceManager);
    void Initialize(bool opaqueBackground);
    void Render(float theta, float phi, ButtonMask buttons) const;
private:
    ...
    bool m_opaqueBackground;
};

void RenderingEngine::Initialize(bool opaqueBackground)
{
    m_opaqueBackground = opaqueBackground;

    ...
}

void RenderingEngine::Render(float theta, float phi, ButtonMask buttons) const
{
    static float frameCounter = 0;
    frameCounter++;

    glPushMatrix();

    glRotatef(phi, 1, 0, 0);
    glRotatef(theta, 0, 1, 0);

    if (m_opaqueBackground) {
        glClear(GL_DEPTH_BUFFER_BIT);

        glPushMatrix();
        glScalef(100, 100, 100);
        glRotatef(frameCounter * 2, 0, 1, 0);
        glBindTexture(GL_TEXTURE_2D, m_textures.Sky);
        RenderDrawable(m_drawables.SkySphere);
        glPopMatrix();
    } else {
        glClearColor(0, 0, 0, 0);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    }

    ...
}




Note that the alpha value of the clear color is zero; this allows the underlay to show through. Also note that the color buffer is cleared only if there’s no sky sphere. Experienced OpenGL programmers make little optimizations like this as a matter of habit.

That’s it for the Holodeck sample! See Figure 6 for a depiction of the app as it now stands.

Figure 6. Holodeck with camera underlay

