5. Overlaying with a Live Camera Image
To make this a true augmented reality app, we
need to bring the camera into play. If a camera isn’t available (as in
the simulator), then the app can simply fall back to the “scrolling
clouds” background.
The first step is adding another protocol to
the GLView class—actually we need
two new protocols! Add the bold lines in Example 16, noting the new data fields as well
(m_viewController and
m_cameraSupported).
Example 16. Adding camera support to GLView.h
#import "Interfaces.hpp" #import "AccelerometerFilter.h" #import <UIKit/UIKit.h> #import <QuartzCore/QuartzCore.h> #import <CoreLocation/CoreLocation.h>
@interface GLView : UIView <UIImagePickerControllerDelegate, UINavigationControllerDelegate, CLLocationManagerDelegate, UIAccelerometerDelegate> { @private IRenderingEngine* m_renderingEngine; IResourceManager* m_resourceManager; EAGLContext* m_context; CLLocationManager* m_locationManager; AccelerometerFilter* m_filter; UIViewController* m_viewController; bool m_cameraSupported; ... }
- (void) drawView: (CADisplayLink*) displayLink;
@end
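Why two new protocols? As you’ll see in Example 18, the view assigns itself as the image picker’s delegate, and UIImagePickerController declares its delegate property with both UIImagePickerControllerDelegate and UINavigationControllerDelegate. We never actually implement any of those delegate methods; adopting the protocols simply satisfies the compiler.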
Next we need to enhance the
initWithFrame and drawView
methods. See Example 17. Until now, every
sample in this book has set the opaque property in
the EAGL layer to YES. In this sample, we decide its
value at runtime: if a camera is available, we leave the surface
non-opaque so that the camera “underlay” can show through.
Example 17. Adding camera support to GLView.mm
- (id) initWithFrame: (CGRect) frame
{
    ...
    if (self = [super initWithFrame:frame]) {

        m_cameraSupported = [UIImagePickerController isSourceTypeAvailable:
                                UIImagePickerControllerSourceTypeCamera];

        CAEAGLLayer* eaglLayer = (CAEAGLLayer*) self.layer;
        eaglLayer.opaque = !m_cameraSupported;
        if (m_cameraSupported)
            NSLog(@"Camera is supported.");
        else
            NSLog(@"Camera is NOT supported.");

        ...

#if TARGET_IPHONE_SIMULATOR
        BOOL compassSupported = NO;
        BOOL accelSupported = NO;
#else
        BOOL compassSupported = m_locationManager.headingAvailable;
        BOOL accelSupported = YES;
#endif

        m_viewController = 0;

        ...

        m_timestamp = CACurrentMediaTime();

        bool opaqueBackground = !m_cameraSupported;
        m_renderingEngine->Initialize(opaqueBackground);

        // Delete the line [self drawView:nil];

        CADisplayLink* displayLink;
        displayLink = [CADisplayLink displayLinkWithTarget:self
                                     selector:@selector(drawView:)];
        ...
    }
    return self;
}

- (void) drawView: (CADisplayLink*) displayLink
{
    if (m_cameraSupported && m_viewController == 0)
        [self createCameraController];

    if (m_paused)
        return;

    ...

    m_renderingEngine->Render(m_theta, m_phi, buttonFlags);
    [m_context presentRenderbuffer:GL_RENDERBUFFER];
}
Next we need to implement the
createCameraController method that was called from
drawView. This is an example of lazy
instantiation; we don’t create the camera controller until we
actually need it. Example 18 shows the
method, and a detailed explanation follows the listing. (The
createCameraController method needs to be defined
before the drawView method to avoid a compiler
warning.)
Example 18. Creating the camera view controller
- (void) createCameraController
{
    UIImagePickerController* imagePicker =
        [[UIImagePickerController alloc] init];
    imagePicker.delegate = self;
    imagePicker.navigationBarHidden = YES;
    imagePicker.toolbarHidden = YES;
    imagePicker.sourceType = UIImagePickerControllerSourceTypeCamera;
    imagePicker.showsCameraControls = NO;
    imagePicker.cameraOverlayView = self;

    // The 54 pixel wide empty spot is filled in by scaling the image.
    // The camera view's height gets stretched from 426 pixels to 480.

    float bandWidth = 54;
    float screenHeight = 480;
    float zoomFactor = screenHeight / (screenHeight - bandWidth);

    CGAffineTransform pickerTransform =
        CGAffineTransformMakeScale(zoomFactor, zoomFactor);
    imagePicker.cameraViewTransform = pickerTransform;

    m_viewController = [[UIViewController alloc] init];
    m_viewController.view = self;
    [m_viewController presentModalViewController:imagePicker animated:NO];
}
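Two details of this listing are worth calling out. Setting cameraOverlayView to the GLView itself is what layers the (now non-opaque) OpenGL surface on top of the live camera preview. The scale transform compensates for the band that the hidden camera toolbar would normally occupy: 480 / (480 − 54) ≈ 1.13, so the preview is enlarged by roughly 13 percent to fill the full height of the screen.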
Since we’re using the camera API in a way
that’s quite different from how Apple intended, we had to jump through a
few hoops: hiding the UI, stretching the image, and implementing a
protocol that never really gets used. This may seem a bit hacky, but
ideally Apple will improve the camera API in the future to simplify the
development of augmented reality applications.
You may have noticed in Example 17 that the view class
now passes a boolean to the rendering engine’s Initialize
method; this tells it whether the background should contain clouds as
before or whether it should be cleared to allow the camera underlay to
show through. You must modify the declaration of
Initialize in Interfaces.hpp
accordingly; a sketch of that change appears below. After that, the
only remaining changes are shown in Example 19.
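For reference, here is a minimal sketch of what the updated declaration in Interfaces.hpp might look like. This is only an illustration: it assumes IRenderingEngine is the usual pure-virtual interface and elides every member not discussed in this section.

// Sketch of the updated interface in Interfaces.hpp (not the complete header).
// ButtonMask is assumed to be declared earlier in the same header.
struct IRenderingEngine {
    virtual void Initialize(bool opaqueBackground) = 0;
    virtual void Render(float theta, float phi, ButtonMask buttons) const = 0;
    virtual ~IRenderingEngine() {}
};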
Example 19. RenderingEngine modifications to support the camera
“underlay”
...
class RenderingEngine : public IRenderingEngine {
public:
    RenderingEngine(IResourceManager* resourceManager);
    void Initialize(bool opaqueBackground);
    void Render(float theta, float phi, ButtonMask buttons) const;
private:
    ...
    bool m_opaqueBackground;
};

void RenderingEngine::Initialize(bool opaqueBackground)
{
    m_opaqueBackground = opaqueBackground;
    ...
}

void RenderingEngine::Render(float theta, float phi, ButtonMask buttons) const
{
    static float frameCounter = 0;
    frameCounter++;

    glPushMatrix();

    glRotatef(phi, 1, 0, 0);
    glRotatef(theta, 0, 1, 0);

    if (m_opaqueBackground) {
        glClear(GL_DEPTH_BUFFER_BIT);

        glPushMatrix();
        glScalef(100, 100, 100);
        glRotatef(frameCounter * 2, 0, 1, 0);
        glBindTexture(GL_TEXTURE_2D, m_textures.Sky);
        RenderDrawable(m_drawables.SkySphere);
        glPopMatrix();

    } else {
        glClearColor(0, 0, 0, 0);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    }

    ...
}
Note that the alpha value of the clear color
is zero; this allows the underlay to show through. Also note that the
color buffer is cleared only if there’s no sky sphere. Experienced
OpenGL programmers make little optimizations like this as a matter of
habit.
That’s it for the Holodeck sample! See Figure 6 for a depiction of the app as it now
stands.