As a starting point, we’ll
create a 3D environment that includes the aforementioned geodesic dome
with antialiased borders. We’ll also render a mossy ground plane and some
moving clouds in the background. Later we’ll replace the clouds with a
live camera image. Another interesting aspect of this sample is that it’s
designed for landscape mode; see Figure 1.
For rendering the AA lines in the dome, let’s
use a different trick than the one presented in the previous section.
Rather than filling a texture with a circle, let’s fill it with a
triangle, as shown in Figure 2. By choosing
texture coordinates in the right places (see the hollow circles in the
figure), we can create a thick border at every triangle.
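To make the trick concrete, here’s one way such a triangle texture could be generated at runtime. This is a sketch only: the texture size and corner placement are assumptions, and the actual sample may simply load the image from a file.

#include <OpenGLES/ES1/gl.h>

// Rasterize a filled triangle into an alpha texture using three edge
// tests. The size and corner positions are assumed for illustration.
const int size = 64;
unsigned char texels[size * size];
const float ax = 0,        ay = size - 1;  // bottom-left corner
const float bx = size - 1, by = size - 1;  // bottom-right corner
const float cx = size / 2, cy = 0;         // top-center corner

// Signed area test; the same sign for all three edges means the
// sample point lies inside the triangle.
auto edge = [](float px, float py, float qx, float qy, float x, float y) {
    return (qx - px) * (y - py) - (qy - py) * (x - px);
};

for (int y = 0; y < size; ++y) {
    for (int x = 0; x < size; ++x) {
        bool inside = edge(ax, ay, bx, by, x, y) <= 0 &&
                      edge(bx, by, cx, cy, x, y) <= 0 &&
                      edge(cx, cy, ax, ay, x, y) <= 0;
        texels[y * size + x] = inside ? 255 : 0;
    }
}

glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA, size, size, 0,
             GL_ALPHA, GL_UNSIGNED_BYTE, texels);

With GL_LINEAR filtering (plus mipmaps for minification), the hard edge stored in the texture becomes a smooth gradient on screen, which is what produces the antialiased look.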
For controlling the camera, the app should use
the compass and accelerometer APIs to truly qualify as an augmented
reality app. However, initially let’s just show four buttons in a HUD:
touching any button will cause the environment to “scroll.” Horizontal
buttons control azimuth (angle from north);
vertical buttons control altitude (angle above
horizon). These terms may be familiar to you if you’re an astronomy
buff.
Later we’ll replace the azimuth/altitude
buttons with the compass and accelerometer APIs. The benefit of this
approach is that we can easily provide a fallback option if the app
discovers that the compass or accelerometer APIs are not available. This
allows us to gracefully handle three scenarios (see the sketch after the list):
- iPhone Simulator
Show buttons for both azimuth and altitude.
- First- and second-generation iPhones
Show buttons for azimuth; use the accelerometer for
altitude.
- Third-generation iPhones
Hide all buttons; use the accelerometer for altitude and the
compass for azimuth.
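Here’s a rough sketch of how that capability check might look in the view code, using the ButtonFlags enumeration we’ll define later in Interfaces.hpp (Example 2). CompassSupported is a hypothetical helper, not part of the sample:

#include <TargetConditionals.h>
#include "Interfaces.hpp"

bool CompassSupported(); // hypothetical wrapper around a heading-availability
                         // check (e.g., CLLocationManager's headingAvailable)

ButtonMask DetermineVisibleButtons()
{
    ButtonMask buttons = 0;
#if TARGET_IPHONE_SIMULATOR
    // Scenario 1: the simulator has neither compass nor accelerometer.
    buttons = ButtonFlagsShowHorizontal | ButtonFlagsShowVertical;
#else
    if (!CompassSupported())
        // Scenario 2: first- and second-generation iPhones lack a
        // compass, so fall back to buttons for azimuth.
        buttons |= ButtonFlagsShowHorizontal;
    // Scenario 3: third-generation iPhones show no buttons at all.
#endif
    return buttons;
}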
In honor of my favorite TV show, the name of
this sample is Holodeck. Without further ado, let’s
begin!
1. Application Skeleton
There’s very little logic required for this app; most of
the heavy lifting is done in the rendering engine. Skipping the
application layer makes life easier when we add support for the
accelerometer, compass, and camera APIs.
Another difference lies in how we handle the
dome geometry. Rather than load the vertices from an OBJ file or
generate them at runtime, we use a Python script that generates a C++ header file
with the dome data, as shown in Example 1. This is perhaps the simplest possible way to load
geometry into an OpenGL
application, and some modeling tools can actually export their data as a
C/C++ header file!
Example 1. GeodesicDome.h
const int DomeFaceCount = 2782;
const int DomeVertexCount = DomeFaceCount * 3;
const float DomeVertices[DomeVertexCount * 5] = {
    -0.819207, 0.040640, 0.572056, 0.000000, 1.000000,
    ...
    0.859848, -0.065758, 0.506298, 1.000000, 1.000000,
};
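Each vertex in Example 1 occupies five floats. Assuming the first three are the position and the remaining two are texture coordinates, the entire dome could be drawn with the ES 1.1 fixed-function pipeline in a handful of calls. This is a sketch, not the sample’s actual rendering code:

#include <OpenGLES/ES1/gl.h>
#include "GeodesicDome.h"

// Interleaved layout inferred from Example 1: x, y, z, s, t per vertex.
const int stride = 5 * sizeof(float);
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glVertexPointer(3, GL_FLOAT, stride, DomeVertices);
glTexCoordPointer(2, GL_FLOAT, stride, DomeVertices + 3);
glDrawArrays(GL_TRIANGLES, 0, DomeVertexCount);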
Figure 3 shows
the overall structure of the Holodeck project.
Note that this app has quite a few textures
compared to our previous samples: six PNG files and two compressed PVRTC
files. You can also see from the screenshot that we’ve added a new
property to Info.plist called
UIInterfaceOrientation. Recall that this is a
landscape-only app; if you don’t set this property, you’ll have to
manually rotate the virtual iPhone every time you test it in the
simulator.
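If you prefer to edit Info.plist as XML rather than through Xcode’s property editor, the entry looks something like this (landscape-right is an assumption; landscape-left works equally well):

<key>UIInterfaceOrientation</key>
<string>UIInterfaceOrientationLandscapeRight</string>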
Interfaces.hpp is much
the same as in our other sample apps, except that the rendering engine
interface is somewhat different; see Example 2.
Example 2. Interfaces.hpp for Holodeck
...
enum ButtonFlags {
    ButtonFlagsShowHorizontal = 1 << 0,
    ButtonFlagsShowVertical = 1 << 1,
    ButtonFlagsPressingUp = 1 << 2,
    ButtonFlagsPressingDown = 1 << 3,
    ButtonFlagsPressingLeft = 1 << 4,
    ButtonFlagsPressingRight = 1 << 5,
};

typedef unsigned char ButtonMask;

struct IRenderingEngine {
    virtual void Initialize() = 0;
    virtual void Render(float theta, float phi, ButtonMask buttons) const = 0;
    virtual ~IRenderingEngine() {}
};
...
The new Render method
takes three parameters:
- float theta
Azimuth in degrees. This is the
horizontal angle off east.
- float phi
Altitude in degrees. This is the
vertical angle off the horizon.
- ButtonMask buttons
Bit mask of flags for the HUD.
The idea behind the
buttons mask is that the Objective-C code
(GLView.mm) can determine the capabilities of the
device and whether a button is being pressed, so it sends this
information to the rendering engine as a set of flags.
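For instance, the view’s draw handler might assemble the mask each frame along these lines (a sketch; m_theta, m_phi, and the m_pressing* fields are assumed names, not the sample’s actual members):

// Hypothetical excerpt from GLView.mm's per-frame draw method.
ButtonMask buttons = DetermineVisibleButtons();   // see the earlier sketch
if (m_pressingUp)    buttons |= ButtonFlagsPressingUp;
if (m_pressingDown)  buttons |= ButtonFlagsPressingDown;
if (m_pressingLeft)  buttons |= ButtonFlagsPressingLeft;
if (m_pressingRight) buttons |= ButtonFlagsPressingRight;
m_renderingEngine->Render(m_theta, m_phi, buttons);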