Codea Talk 2015 Christmas Competition entry
Trench Run

This was my entry for the Codea Talk 2015 Christmas Competition, which had a Star Wars theme to celebrate the launch of The Force Awakens. It only took a day or two to put together, as I already had the .OBJ importer code and the wireframe shader ready. The models all came from BlendSwap (please see credits below) and needed only minimal processing in Blender before being exported as .OBJ files.

Animating a 3D Blender model in Codea - part 6
Toon redux

Over on the Codea Talk forum, we began wondering whether it might be possible to do the toon shader in just a single pass. If you can halve the number of draw calls you make, you will significantly speed up your code. I went back to my original inspiration for the shader, this GLSL Programming e-book, and found that, after the description of the toon shader that ships with Unity (which I adapted in the previous post), it does indeed describe a single-pass method.
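
The gist of the single-pass approach, sketched below in Swift rather than GLSL (the thresholds and names are illustrative, not taken from the post): the silhouette test and the banded diffuse colour are computed together, so the separate outline draw call disappears.

    import simd

    // Single-pass toon shading sketch: one fragment evaluation does both the
    // silhouette (outline) test and the cel-shaded colour.
    func toonFragment(normal: SIMD3<Float>, toLight: SIMD3<Float>, toEye: SIMD3<Float>,
                      baseColour: SIMD3<Float>) -> SIMD3<Float> {
        let n = simd_normalize(normal)
        // Fragments whose normal is nearly perpendicular to the view direction
        // become the outline.
        if abs(simd_dot(n, simd_normalize(toEye))) < 0.25 {
            return SIMD3<Float>(repeating: 0)            // black outline
        }
        // Quantise the diffuse term into a few flat bands.
        let diffuse = max(simd_dot(n, simd_normalize(toLight)), 0)
        let bands: Float = 3
        return baseColour * ((diffuse * bands).rounded(.up) / bands)
    }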

Animating a 3D Blender model in Codea - part 5
Toon Shaders

Who’s up for a ’toon shader? 3D graphics on a computer screen is, of course, an illusion of depth on a flat surface. Sometimes, though, it’s fun, or useful, to lay that illusion bare: to prod the user by pointing out that the painstakingly rendered 3D scene they’re looking at is, in fact, a depthless image. Or perhaps you want to integrate 3D elements with hand-drawn, 2D-style elements without the former looking too far out of place. Whatever the reason, we see many, many 3D visualisations that eschew photorealistic rendering in favour of something a little more, well, flat.

Animating a 3D Blender model in Codea - part 4
Managing multiple animations

In the previous posts in this series, we’ve looked at getting an animated model out of Blender, and then recreating that animation in Codea by interpolating between a set of keyframes with a Catmull-Rom spline in the vertex shader. We ended up with some code that loaded a model and played a continuous walk-cycle loop. Chances are, though, that for most of the objects in the game we won’t want them to animate in a continuous loop; rather, we’ll want them to react to the various inputs and triggers around them and animate accordingly.
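
For reference, the Catmull-Rom blend that the vertex shader evaluates for every vertex looks like the sketch below, written here in Swift rather than the post's GLSL; the parameter names are illustrative.

    import simd

    // Catmull-Rom interpolation between keyframes p1 and p2, with the neighbouring
    // keyframes p0 and p3 shaping the tangents; t runs from 0 (at p1) to 1 (at p2).
    func catmullRom(_ p0: SIMD3<Float>, _ p1: SIMD3<Float>,
                    _ p2: SIMD3<Float>, _ p3: SIMD3<Float>, t: Float) -> SIMD3<Float> {
        let t2 = t * t
        let t3 = t2 * t
        return 0.5 * ((2 * p1)
                    + (p2 - p0) * t
                    + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t2
                    + (3 * p1 - p0 - 3 * p2 + p3) * t3)
    }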

Animating a 3D Blender model in Codea - part 3
Keyframe spline shaders

In this post, we’ll look at how the keyframe interpolation technique actually works in the vertex shader we set up in Codea.

Animating a 3D Blender model in Codea - part 2
Out of Blender, into Codea

In this second post in the series, we’ll focus on getting our animation out of Blender and into Codea.

Animating a 3D Blender model in Codea - part 1
Approaches to Coding Animation

Fluidly animated characters are an important part of a 3D game, but getting the animated model out of your modelling software and into your code can be a challenge.

Animating a rigged model in SceneKit
First steps in SceneKit

So you’ve exported your rigged model from Blender as an Xcode-friendly .dae, as described in the previous post. In this post we’re going to look at bringing the .dae into a Swift SceneKit project in Xcode, and programmatically triggering the animations.
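
As a taste of what's involved, here is a minimal sketch (Swift 3-style API names, not code from the post) of loading an exported .dae and attaching one of its animations to a node; "character.dae", "Armature" and "run-cycle" are hypothetical identifiers that would need to match your own export.

    import SceneKit
    import QuartzCore

    // Load the exported scene and find the rigged character's node.
    let scene = SCNScene(named: "art.scnassets/character.dae")!
    let characterNode = scene.rootNode.childNode(withName: "Armature", recursively: true)!

    // Re-open the .dae as a scene source so animations can be pulled out by identifier.
    let daeURL = Bundle.main.url(forResource: "character", withExtension: "dae",
                                 subdirectory: "art.scnassets")!
    let source = SCNSceneSource(url: daeURL, options: nil)!
    if let run = source.entryWithIdentifier("run-cycle", withClass: CAAnimation.self) {
        run.repeatCount = .greatestFiniteMagnitude   // loop the run cycle
        characterNode.addAnimation(run, forKey: "run")
    }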

The Blender to Xcode workflow
3D Animation for developers

SceneKit is Apple’s high-level 3D API that first appeared on OS X Mountain Lion and made the jump to iPhone and iPad (and now Apple TV) with iOS 8. After having spent a lot of time with Banjax working on lighting shaders and implementing a very basic 2.5D physics engine, it’s something of a relief to use an API that has 3D physics built in and allows complex lighting effects to be achieved with just a line of code. It’s great that it can automatically choose an OpenGL ES or Metal rendering engine depending on the hardware and OS it’s running on. It’s also really exciting to use an API that has support for animating models with a rig/armature. But how quickly could I get a rigged character with a run-cycle animation out of Blender and into Xcode and SceneKit?
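
To illustrate the "line of code" point, here is my own sketch (not from the post): giving a node a dynamic physics body, or lighting a scene, each takes barely a line in SceneKit.

    import SceneKit

    let scene = SCNScene()
    let box = SCNNode(geometry: SCNBox(width: 1, height: 1, length: 1, chamferRadius: 0))
    scene.rootNode.addChildNode(box)

    // One line gives the box a dynamic physics body for gravity and collisions to act on.
    box.physicsBody = SCNPhysicsBody(type: .dynamic, shape: nil)

    // A light is just a node with an SCNLight attached.
    let lightNode = SCNNode()
    lightNode.light = SCNLight()
    lightNode.light?.type = .omni
    lightNode.position = SCNVector3(0, 10, 10)
    scene.rootNode.addChildNode(lightNode)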

Bringing split-screen 2-player gaming to the App Store
Beware the Banjax is out now!

Salt Pig’s first release, Beware the Banjax, is now available on the App Store for iPhone, iPad, iPad Pro, and iPod Touch.

Or, towards Xcode on iPad
Wish list for a future version of Swift Playgrounds

Time for some crystal-ball gazing with my wish list / predictions for a future version of Swift Playgrounds. The top 5 are things that would change the scope of what a playground could be quite dramatically; the rest is more minor stuff. A couple of these items might be unlikely to happen, but most are feasible, I think.

Exploring iOS 10 (dev beta 7)
Creating external modules for Swift Playground Books

Being able to compile code on an iPad with Swift Playgrounds in the beta of iOS 10 is a complete joy. With certain coding tasks, however, the performance of the playground code isn’t all that great. It might run a little sluggishly, or you might find it refuses to run when you start to increase the number of objects in your code. I’d noticed this while playing with SceneKit: I found I could only add around 40 cubes to a scene before the playground refused to run. Yet I knew that Apple’s Learn to Code playground books had significantly more complex scenes running in their live views, and that these were powered by Swift/SceneKit code being compiled live on-device. So why wasn’t my playground performing better?

Exploring iOS 10 (beta 2)
UIKit layouts in Swift Playgrounds for iPad

In this post I’m going to further explore the new iPad Swift Playgrounds app, currently available with the beta of iOS 10. As with the previous post, I’m focussing on what iPad Playgrounds is like as a tool for developing code, starting from the in-app blank playground template, and building a responsive UIKit interface. At the end I will touch on creating Playgrounds on the Mac in the new Playgrounds Book format, and bringing those onto the iPad.
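
As a flavour of the starting point (a minimal sketch assuming the iOS 10-era PlaygroundSupport API, not the post's actual code): build a view with Auto Layout constraints and hand it to the playground's live view.

    import UIKit
    import PlaygroundSupport

    // A root view to act as the live view.
    let root = UIView(frame: CGRect(x: 0, y: 0, width: 500, height: 500))
    root.backgroundColor = .white

    let label = UILabel()
    label.text = "Hello from iPad Playgrounds"
    label.translatesAutoresizingMaskIntoConstraints = false
    root.addSubview(label)

    // Auto Layout keeps the label centred however the live view is resized.
    NSLayoutConstraint.activate([
        label.centerXAnchor.constraint(equalTo: root.centerXAnchor),
        label.centerYAnchor.constraint(equalTo: root.centerYAnchor)
    ])

    PlaygroundPage.current.liveView = root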

Exploring iOS 10 (beta 1)
First impressions of Swift Playgrounds for iPad

One of the most exciting announcements at this year’s Worldwide Developers Conference was that Swift Playgrounds, a feature of Xcode on the Mac since version 6, is coming to iPad. Although it was presented at the WWDC keynote primarily as an educational portal, with a storefront where users can download content along the lines of iTunes U or the interactive textbooks of iBooks, the Playgrounds session video makes clear that Playgrounds on iPad is an immensely versatile app. It promises to blur the boundary between developer and consumer, and as such there are at least three ways it can be used. First, there is the aforementioned downloadable content. Second, users can create their own playgrounds in-app, starting from a blank page or from a number of templates. Finally, Xcode developers can author content in the new Playgrounds Book format and will be able to distribute it via the storefront. In this series of posts I’ll just be looking at the second of these use-cases: using iPad Playgrounds as a stand-alone development tool to create playgrounds from scratch, in-app. Throughout the series I’ll focus on the question: is iPad Playgrounds a serious development tool?

Exploring iOS 10 (updated for beta 4)
Sandbox, meet Playground. SceneKit & SpriteKit in Swift Playgrounds for iPad

In the Platforms State of the Union session at WWDC, it was announced that the Playgrounds app would place “the entire iOS SDK at your fingertips.” In theory, that makes it an incredibly powerful tool. At present, the only other iOS app that allows you to “code native” (i.e. using Apple’s extensive API library) is the insanely great Continuous, a C#/F# IDE for iOS that enables you to use Apple’s APIs via the cross-platform Xamarin environment (which was recently purchased by Microsoft). Xamarin offers C# wrappers for the whole of the Apple SDK, and even manages to do this in sync with Apple’s own release cycles (Xamarin support for iOS 9 shipped the same day iOS 9 did). Continuous was released just a few weeks ago, not long after the beta of iOS 10 launched. No sooner had I got used to coding native on my iPad with the new Swift Playgrounds than a second app came along with a similar feature set.

Exploring iOS 10 (dev beta 4)
Why code on an iPad?

Playgrounds as developing environments

Built with Jekyll      © Salt Pig Media