After starting the sub-series focused on iOS, I held off completing it until I could actually test the code on a physical iOS device. A big thanks to Andy Chang from our Toronto office for getting me set up with the ADP membership and getting my iPad 2 added to the list of usable development devices.
I won’t talk about the steps needed to provision apps for iOS devices – there seems to be enough information available on the web for that – but I will say it ended up being less complicated than I expected. That’s not to say it’s easy to package things up for posting to the App Store – I haven’t gone through that, myself – but just deploying the app to a physical device for testing was reasonably straightforward.
That said, there was some non-trivial work needed to get the app working on both the iOS Simulator and a physical device, mainly due to our use of iSGL3D. The issue, in this case, came from the fact that iSGL3D isn’t compatible with ARC (Automatic Reference Counting). The path of least resistance ended up being to build it into a universal static library, which could then be used to build an iOS app targeting both the iOS Simulator (which is i386-based, as it runs on the Mac) and a physical, ARM-based iOS device.
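As an aside, the same Simulator vs. device split shows up at compile time, too: the TargetConditionals.h header defines macros that distinguish the i386 Simulator build from the ARM device build. Here’s a tiny, illustrative snippet – not part of the project itself – showing how that might be used:

#import <Foundation/Foundation.h>
#import <TargetConditionals.h>

// Purely illustrative - not part of the Apollonian Viewer project
static void logBuildTarget()
{
#if TARGET_IPHONE_SIMULATOR
  // Compiled for the i386-based iOS Simulator
  NSLog(@"Running on the iOS Simulator");
#else
  // Compiled for an ARM-based iOS device
  NSLog(@"Running on a physical iOS device");
#endif
}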
I’m still not very happy with the lighting I was able to get working using iSGL3D: the “out-of-the-box” lighting capabilities – presumably built into the base material types – were much nicer with Three.js or Rajawali. I did my best by reading up on iSGL3D lighting, OpenGL lighting and even going back to basics on diffuse, ambient and specular lighting, but the results weren’t as good as I’d have liked. This is probably down to me, rather than the iSGL3D framework, per se, but still.
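For anyone else going back to basics, the standard per-light shading model combines those three terms roughly as follows. This is just an illustrative sketch – not code from the viewer or from iSGL3D:

#include <math.h>

// Illustrative per-light shading: ambient is a constant term,
// diffuse is scaled by the angle between the surface normal N and
// the light direction L, and specular by the angle between the
// reflected light direction R and the view direction V, raised to
// a shininess power
static float shadeWithLight(float ambient, float diffuse,
                            float specular, float NdotL,
                            float RdotV, float shininess)
{
  return ambient
       + diffuse * fmaxf(0.0f, NdotL)
       + specular * powf(fmaxf(0.0f, RdotV), shininess);
}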
Ah yes, and I also integrated some code to determine the screen size for the current orientation – mainly so I could implement a tap-based UI for changing level (tap near the top: go up a level, tap near the bottom: go down a level).
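The category providing the screen size isn’t included in the listing below, but it only needs to expose the +currentSize method called from tapGesture:. Here’s a minimal sketch of what it might look like – I’m assuming the common approach of swapping the screen bounds’ width and height when in landscape, which isn’t necessarily how the project’s version is implemented:

#import <UIKit/UIKit.h>

// UIApplication+AppDimensions - a minimal sketch, not necessarily
// the exact implementation used in the project
@interface UIApplication (AppDimensions)
+ (CGSize)currentSize;
@end

@implementation UIApplication (AppDimensions)
+ (CGSize)currentSize
{
  // Start from the portrait-oriented screen bounds...
  CGSize size = [UIScreen mainScreen].bounds.size;
  // ...and swap width and height when in landscape
  UIInterfaceOrientation orientation =
    [UIApplication sharedApplication].statusBarOrientation;
  if (UIInterfaceOrientationIsLandscape(orientation))
    size = CGSizeMake(size.height, size.width);
  return size;
}
@end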
Here’s the latest version in action on the iOS Simulator (which turns out to be comparable to the physical device, graphics-wise, even if the frame rate is much lower):
Here’s the updated source project and the updated Objective-C code for the main implementation file:
//
// ApollonianViewer.m
// Apollonian Viewer
//
// Created by Kean on 5/4/12.
// Copyright 2012 Autodesk. All rights reserved.
//
#import "ApollonianViewer.h"
#import "UIApplication+AppDimensions.h"
@implementation ApollonianViewer
// Our level number and its valid range
int _level = 5;
int minLevel = 1;
int maxLevel = 10;
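// Flag to avoid overlapping web-service calls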
bool _accessing = false;
// Our data member for the received data
NSMutableData * _receivedData = NULL;
// A response has been received from our web-service call
- (void)connection:(NSURLConnection *)connection
didReceiveResponse:(NSURLResponse *)response
{
// Initialise our member variable receiving data
if (_receivedData == NULL)
_receivedData = [[NSMutableData alloc] init];
else
[_receivedData setLength:0];
}
// Data has been received from our web-service call
- (void)connection:(NSURLConnection *)connection
didReceiveData:(NSData *)data
{
// Append the received data to our member
[_receivedData appendData:data];
}
// The web-service connection failed
- (void)connection:(NSURLConnection *)connection
didFailWithError:(NSError *)error
{
// Report an error in the log
NSLog(@"Connection failed: %@", [error description]);
UIAlertView* alert =
[[[UIAlertView alloc]
initWithTitle:@"Apollonian Viewer"
message:
@"Unable to access the web-service. "
"Please check you have internet connectivity."
delegate:self
cancelButtonTitle:@"Close"
otherButtonTitles:nil
] autorelease];
[alert show];
// Release the failed connection so a new request can be made
[connection release];
_accessing = false;
}
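// The user closed the connection-failure alert - exit the app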
- (void)alertView
:(UIAlertView *)alertView
clickedButtonAtIndex:(NSInteger)buttonIndex
{
if (buttonIndex == 0) exit (0);
}
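// Extract sphere definitions from the web-service's JSON response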
- (void)extract_spheres
:(NSString *)responseString
onlyOuter: (Boolean)outer
{
// Extract JSON data from our response string
NSData * jsonData =
[responseString
dataUsingEncoding:NSUTF8StringEncoding];
// Extract an array from our JSON data
NSError * e = nil;
NSArray * jsonArray =
[NSJSONSerialization
JSONObjectWithData: jsonData
options: NSJSONReadingMutableContainers
error: &e
];
if (!jsonArray)
{
NSLog(@"Error parsing JSON: %@", e);
}
else
{
// Loop through our JSON array, extracting spheres
for (NSDictionary *item in jsonArray)
{
// We'll need this data for each sphere
double x, y, z, radius;
int level;
// We use a single NSNumber to extract the data
NSNumber *num;
num = [item objectForKey:@"X"];
x = [num doubleValue];
num = [item objectForKey:@"Y"];
y = [num doubleValue];
num = [item objectForKey:@"Z"];
z = [num doubleValue];
num = [item objectForKey:@"R"];
radius = [num doubleValue];
num = [item objectForKey:@"L"];
level = [num intValue];
// Only create spheres for those at the edge of the
// outer sphere
double length = sqrt(x*x + y*y + z*z);
if (!outer || (length + radius > 0.99f))
[self createSphere:radius x:x y:y z:z level:level];
}
// Trigger the rotation updates
[self schedule:@selector(tick:)];
}
}
// The call to our web-service has completed
- (void)connectionDidFinishLoading
:(NSURLConnection *)connection
{
// Release the connection
[connection release];
// Get the response string from our data member then
// release it
NSString * responseString =
[[NSString alloc]
initWithData:_receivedData
encoding:NSUTF8StringEncoding
];
[_receivedData release];
[self extract_spheres:responseString onlyOuter:true];
[responseString release];
_accessing = false;
}
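// Call our web-service to request the spheres for a given level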
- (void)ask_for_spheres:(int)level
{
if (_accessing)
return;
_accessing = true;
// First we clear any existing spheres
[_container clearAll];
// Set up our web-service call
NSURL * url =
[NSURL
URLWithString:
[NSString
stringWithFormat:
@"http://apollonian.cloudapp.net/api/spheres/1/%d",
level
]
];
NSMutableURLRequest *request =
[NSMutableURLRequest
requestWithURL:url
cachePolicy:NSURLRequestUseProtocolCachePolicy
timeoutInterval:60.0
];
[request setHTTPMethod:@"GET"];
NSURLConnection * connection =
[[NSURLConnection alloc] initWithRequest:request delegate:self];
if (connection)
{
_receivedData = [[NSMutableData data] retain];
}
}
// Our main scene initialization method
- (id) init
{
if ((self = [super init]))
{
_preSpin = true;
_paused = false;
_rotation = 0.0;
_initialSpinAmount = 2;
// Create recognizers handling scene-level gestures
// Tap
_tapGestureRecognizer =
[[UITapGestureRecognizer alloc]
initWithTarget:self
action:@selector(tapGesture:)
];
_tapGestureRecognizer.delegate = self;
[[Isgl3dDirector sharedInstance]
addGestureRecognizer:_tapGestureRecognizer
forNode:nil
];
// Swipe
// (Add a recognizer for each of 4 directions)
_swipeLeftGestureRecognizer =
[[UISwipeGestureRecognizer alloc]
initWithTarget:self
action:@selector(swipeGesture:)
];
_swipeLeftGestureRecognizer.delegate = self;
[_swipeLeftGestureRecognizer
setDirection:UISwipeGestureRecognizerDirectionLeft
];
[[Isgl3dDirector sharedInstance]
addGestureRecognizer:_swipeLeftGestureRecognizer
forNode:nil
];
_swipeRightGestureRecognizer =
[[UISwipeGestureRecognizer alloc]
initWithTarget:self
action:@selector(swipeGesture:)
];
_swipeRightGestureRecognizer.delegate = self;
[_swipeRightGestureRecognizer
setDirection:UISwipeGestureRecognizerDirectionRight
];
[[Isgl3dDirector sharedInstance]
addGestureRecognizer:_swipeRightGestureRecognizer
forNode:nil
];
_swipeUpGestureRecognizer =
[[UISwipeGestureRecognizer alloc]
initWithTarget:self
action:@selector(swipeGesture:)
];
_swipeUpGestureRecognizer.delegate = self;
[_swipeUpGestureRecognizer
setDirection:UISwipeGestureRecognizerDirectionUp
];
[[Isgl3dDirector sharedInstance]
addGestureRecognizer:_swipeUpGestureRecognizer
forNode:nil
];
_swipeDownGestureRecognizer =
[[UISwipeGestureRecognizer alloc]
initWithTarget:self
action:@selector(swipeGesture:)
];
_swipeDownGestureRecognizer.delegate = self;
[_swipeDownGestureRecognizer
setDirection:UISwipeGestureRecognizerDirectionDown
];
[[Isgl3dDirector sharedInstance]
addGestureRecognizer:_swipeDownGestureRecognizer
forNode:nil
];
// Pinch
_pinchGestureRecognizer =
[[UIPinchGestureRecognizer alloc]
initWithTarget:self
action:@selector(pinchGesture:)
];
_pinchGestureRecognizer.delegate = self;
[[Isgl3dDirector sharedInstance]
addGestureRecognizer:_pinchGestureRecognizer
forNode:nil
];
// Rotate
_rotationGestureRecognizer =
[[UIRotationGestureRecognizer alloc]
initWithTarget:self
action:@selector(rotationGesture:)
];
_rotationGestureRecognizer.delegate = self;
[[Isgl3dDirector sharedInstance]
addGestureRecognizer:_rotationGestureRecognizer
forNode:nil
];
[self ask_for_spheres:_level];
// Move the default camera to the initial position
[self.camera setPosition:iv3(0, 0, -4)];
// Create a container for our spheres
_container = [self.scene createNode];
// We'll maintain an array of materials for our
// levels. Define the colors for those levels
NSArray * colors =
[NSArray arrayWithObjects:
/* white */ @"FFFFFF",
/* red */ @"FF0000",
/* yellow */ @"FFFF00",
/* green */ @"00FF00",
/* cyan */ @"00FFFF",
/* blue */ @"0000FF",
/* magenta */ @"FF00FF",
/* dark gray */ @"A9A9A9",
/* gray */ @"808080",
/* light gray */ @"D3D3D3",
/* white */ @"FFFFFF", nil
];
// Create and populate the array of materials
_materials = [[NSMutableArray alloc] init];
for (int i=0; i < 12; i++)
{
// Anything we don't have a color for will be white
NSString *col =
(i <= 10) ? [colors objectAtIndex:i] : @"FFFFFF";
// We have two entries per material - use one for diffuse
// and specular, the other for ambient
Isgl3dColorMaterial * mat =
[[Isgl3dColorMaterial alloc]
initWithHexColors: col
diffuse: @"222222"
specular: col
shininess:0.1
];
[_materials addObject:mat];
}
// Create a single sphere mesh
_sphereMesh =
[[Isgl3dSphere alloc] initWithGeometry:1 longs:16 lats:16];
float leftColors[3] = { 0.1, 0.4, 0.1 };
float rightColors[3] = { 0.4, 0.1, 0.4 };
double d = -7;
Isgl3dLight * left =
[Isgl3dLight lightWithColorArray:leftColors];
left.position = iv3(-d/3,d,-d);
left.lightType = DirectionalLight;
[left setDirection:1 y:-1 z:1];
[self.scene addChild:left];
Isgl3dLight * right =
[Isgl3dLight lightWithColorArray:rightColors];
right.position = iv3(d,d,d*3);
right.lightType = DirectionalLight;
[right setDirection:1 y:-1 z:1];
[self.scene addChild:right];
// Set the scene's ambient color
[self setSceneAmbient:@"444444"];
}
return self;
}
// Create a single sphere at the desired position with
// the desired radius and level
- (void)createSphere
:(double)radius
x:(double)x y:(double)y z:(double)z
level:(int)level
{
// Create the sphere based on our single mesh
Isgl3dMeshNode * sphere =
[_container
createNodeWithMesh:_sphereMesh
andMaterial:[_materials objectAtIndex:level]
];
// Position and scale it
sphere.position = iv3(x, y, z);
[sphere setScale:radius];
}
- (void) dealloc
{
// Make sure we release our materials and sphere mesh
[_materials release];
[_sphereMesh release];
[super dealloc];
}
// Respond to our timer tick by rotating the model
- (void) tick:(float)dt
{
// Rotate around the appropriate axis
if (!_paused && !_preSpin)
{
// Reset any rotation around Z, first
if (fabsf(_container.rotationZ) > 0)
{
_container.rotationZ = 0;
_rotation = 0;
}
if (_spinAroundY)
{
// If spinning around Y, reset any X rotation
if (fabsf(_container.rotationX) > 0)
_container.rotationX = 0;
_container.rotationY += _spinIncrement;
}
else
{
// If spinning around X, reset any Y rotation
if (fabsf(_container.rotationY) > 0)
_container.rotationY = 0;
_container.rotationX += _spinIncrement;
}
}
}
// Method to specify combination of gesture recognizers
- (BOOL)gestureRecognizer:
(UIGestureRecognizer *)gestureRecognizer
shouldRecognizeSimultaneouslyWithGestureRecognizer:
(UIGestureRecognizer *)otherGestureRecognizer
{
// If the gesture recognizers are on different views,
// don't allow simultaneous recognition
if (gestureRecognizer.view != otherGestureRecognizer.view)
return NO;
// Also stop combination of rotation with other gestures
if ((gestureRecognizer == _rotationGestureRecognizer) ||
(otherGestureRecognizer == _rotationGestureRecognizer))
return NO;
return YES;
}
// Action methods for our gestures
// Tap-pause/play
- (void)tapGesture
:(UITapGestureRecognizer *)gestureRecognizer
{
// If the tap happens near the top or bottom edge,
// change level
int maxDistFromEdge = 50;
CGPoint tapPoint =
[gestureRecognizer locationInView:gestureRecognizer.view];
CGSize screenSize = [UIApplication currentSize];
if (tapPoint.y < maxDistFromEdge)
{
if (_level < maxLevel)
[self ask_for_spheres:++_level];
}
else if (tapPoint.y > screenSize.height - maxDistFromEdge)
{
if (_level > minLevel)
[self ask_for_spheres:--_level];
}
else
{
// Toggle pause (if we have already had at least one spin)
if (!_preSpin)
_paused = !_paused;
}
}
// Swipe-spin
- (void)swipeGesture
:(UISwipeGestureRecognizer *)gestureRecognizer
{
switch(gestureRecognizer.direction)
{
case UISwipeGestureRecognizerDirectionDown:
if (
_preSpin || _spinAroundY ||
(!_spinAroundY && _spinIncrement > 0)
)
{
// Reset the axis and spin amount
_spinAroundY = false;
_spinIncrement = -_initialSpinAmount;
}
else
{
// Speed up the rate of spin
if (abs(_spinIncrement * 2) < 10)
_spinIncrement *= 2;
}
_preSpin = false;
_paused = false;
break;
case UISwipeGestureRecognizerDirectionUp:
if (
_preSpin || _spinAroundY ||
(!_spinAroundY && _spinIncrement < 0)
)
{
// Reset the axis and spin amount
_spinAroundY = false;
_spinIncrement = _initialSpinAmount;
}
else
{
// Speed up the rate of spin
if (abs(_spinIncrement * 2) < 10)
_spinIncrement *= 2;
}
_preSpin = false;
_paused = false;
break;
case UISwipeGestureRecognizerDirectionLeft:
if (
_preSpin || !_spinAroundY ||
(_spinAroundY && _spinIncrement > 0)
)
{
// Reset the axis and spin amount
_spinAroundY = true;
_spinIncrement = -_initialSpinAmount;
}
else
{
// Speed up the rate of spin
if (abs(_spinIncrement * 2) < 10)
_spinIncrement *= 2;
}
_preSpin = false;
_paused = false;
break;
case UISwipeGestureRecognizerDirectionRight:
if (
_preSpin || !_spinAroundY ||
(_spinAroundY && _spinIncrement < 0)
)
{
// Reset the axis and spin amount
_spinAroundY = true;
_spinIncrement = _initialSpinAmount;
}
else
{
// Speed up the rate of spin
if (abs(_spinIncrement * 2) < 10)
_spinIncrement *= 2;
}
_preSpin = false;
_paused = false;
break;
default:
break;
}
}
// Pinch-zoom
- (void)pinchGesture
:(UIPinchGestureRecognizer *)gestureRecognizer
{
if (
[gestureRecognizer state] == UIGestureRecognizerStateBegan ||
[gestureRecognizer state] == UIGestureRecognizerStateChanged
)
{
// Adjust the camera position based on the zoom scale
[self.camera setZ:self.camera.z * (1/gestureRecognizer.scale)];
[gestureRecognizer setScale:1];
}
}
// Rotate-rotate :-)
- (void)rotationGesture
:(UIRotationGestureRecognizer *)gestureRecognizer
{
if (
[gestureRecognizer state] == UIGestureRecognizerStateBegan ||
[gestureRecognizer state] == UIGestureRecognizerStateChanged
)
{
// Adjust the rotation around Z based on the rotation amount
if (_paused || _preSpin)
{
_rotation += (gestureRecognizer.rotation * 180.0 / M_PI);
[_container setRotationZ:_rotation];
[gestureRecognizer setRotation:0];
}
}
}
@end
That's it for the sub-series on iOS (for now, at least) and for the overall series on cloud & mobile (from a technical perspective). In the next post, we’ll take a look back at the series and comment on the journey we’ve been on for the last month or so.