Month: September 2014

Wiglets – The Beauty of Biological Jazz and Artificial Life

Life is about improvising.  We’re given a framework of reality to live in, some innate biology, and then whoosh we’re dropped into the world to do our thing.  What that thing is and how we actually do it is completely up to us to figure out.

Jazz is similar.  Start with a framework of musical concepts, add some instruments, and then jam your way to a magical moment of pure creative expression in music.  And you can even jam with other people.

The connection between jazz and life is immediately apparent to anyone who has watched a newborn try to stand and walk on wobbly legs.  Or followed a flock of birds wheeling across the sky in a coordinated cloud of chaos.  Or the first time your dog finally managed to catch that frisbee, dropping it proudly at your feet and gazing directly into your eyes with a cocked head, ready for another throw.

Being part of such moments gives us joy, because we are surrounded by improvisational life, full of movement and growth and interaction and creativity.

This is why I’m fascinated by Wiglets.  

And why I think the world will be, too.


Wiggle Planet is creating Wiglets as a platform for incredibly fun experiences with mobile devices.  Wiglets are self-animated, moving with all the beauty and jazziness of biological creatures.  They have their own DNA and can evolve through natural or selective breeding.  And they can move freely between on-screen game environments and physical-world locations using augmented reality.

Wiglets combine all the beauty and complexity of biological jazz with the innovative possibilities of artificial life and the fun of games.

At a recent e-Learning conference, I gave a keynote presentation on the power of designing digital experiences that speak to our brain’s natural way of seeing the world.  In short, we love to be surrounded by life.  We are deeply satisfied and feel joy when watching living creatures doing their thing in the world.  And even more so if we can reach out and interact with them through touch.

Wiglets give us a new opportunity to create these magic moments.  Don’t believe me?  Watch these two videos and then think about how they made you feel.

So how can you get your hands on Wiglets?

There are some demo mobile apps you can download right now on the Wiggle Planet website, as well as a Wiglet-enabled book and T-shirt on the Wiggle Planet Store.  Much more will be coming soon, so stay tuned.

If you happen to be a developer who’s into writing games with Xcode and might be interested in taking some Wiglet code for a spin, please drop me a line.  Starting this month I’ll be helping Wiggle Planet with community development and creative advice, so expect to see me blogging more about Wiglets in the future.  Jeffrey and I worked together at Linden Lab, and I’m tickled pink to be able to brainstorm with him again as well as with the rest of the Wiggle Planet team.

Is your curiosity piqued?  Have questions?  Please leave a comment for us, and thank you for your time.

Take care,
 – John “Pathfinder” Lester



The Open Brain

No, this is not a blog post about being open-minded.

Nor is it a blog post about brain surgery.


It’s a blog post about open-sourcing the brain code of wiglets, allowing others to develop their own artificial intelligence (AI) algorithms.


The fact is, even though I have a graduate degree from MIT, I may not be the best one to write the AI code for wiglets.

Okay, maybe I am the best … BUT, do I have the time? Is there any time left in the day as I try to start a company?

Bloody no.

And besides, open-sourcing the AI component of autonomous animated characters is totally reasonable, considering that the primary goal of our technology is to allow for user-generated content: digital goods created by all you people out there in user-land. I want the wonderful world of wiglets to emerge from the populace – not from the board rooms of marketing teams.

Your creativity and interest can be the driving factor for how these critters come into being, and eventually evolve into the muppets of the digital age.

So, how will we make the brain open-source?

The key is to use the four pillars of situated AI: the environment, sensors, the brain, and actuators.


Think of actuators as the body. Your body acts on the environment (and generally changes the environment in the immediate vicinity of the body). The sensors perceive the environment and inform the brain of what’s going on. The brain then takes it in and decides (or not) what to do.
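To make that loop concrete, here is a minimal C++ sketch of the sense-think-act cycle. Every name here is hypothetical – this is an illustration, not actual Wiggle Planet code:

```cpp
#include <cassert>

// The environment: the world the creature lives in.
struct Environment {
    double foodDistance = 5.0;  // distance to the nearest food
};

// What the sensors report to the brain.
struct Senses {
    double foodDistance;
};

enum class Action { MoveTowardFood, Idle };

// Sensors: perceive the environment and inform the brain.
Senses sense(const Environment& env) {
    return Senses{env.foodDistance};
}

// Brain: take the senses in and decide (or not) what to do.
Action think(const Senses& s) {
    return (s.foodDistance > 0.0) ? Action::MoveTowardFood : Action::Idle;
}

// Actuators: the body acts on the environment, changing it.
void act(Action a, Environment& env) {
    if (a == Action::MoveTowardFood && env.foodDistance > 0.0) {
        env.foodDistance -= 1.0;
    }
}

// One tick of the situated-AI loop.
void step(Environment& env) {
    act(think(sense(env)), env);
}
```

Everything upstream and downstream of `think()` is fixed plumbing; the brain in the middle is the part that can be swapped out.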

Here’s the cool part: what happens inside of the brain can be just about anything. When I was at MIT, Marvin Minsky told a bunch of us that the brain is a magnificent hack: there is no single perfect AI algorithm. In fact, there are many, many hacks that have been messily munged together over the course of animal evolution to give us the brains we have.


It’s our frontal lobes that create the illusion that we are making clear, rational decisions – that our brains are well-designed.

This is why some of the early AI programmers made the mistake of looking for the perfect AI. It seemed (to them) that there must be a way to engineer that feeling of perfect clarity we call consciousness and rational thought.

But it’s just a feeling.

Uh, what’s my point? My point is that we can take this fact of animal intelligence and apply it to the simulation of wiglets. You (the folks whom I’d like to put in charge of building the brains of wiglets) get to use whatever you want to make wiglets do what they do.

Think of it as crowd-sourced AI.

You can use neural nets; you can use finite state machines; you can design a thousand if-then statements to account for every combination of stimuli; you can attach a big pipe to Google and use the power of the internet; you can make it completely random and hallucinogenic.
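As a taste of the simplest option on that list, here’s what a finite state machine brain might look like. This is a sketch under made-up names – the real Wiglet brain interface may differ:

```cpp
#include <cassert>

// A toy finite-state-machine brain: the creature wanders until it senses
// food nearby, eats until the food is gone, then wanders again.
enum class State { Wander, Eat };

class FsmBrain {
public:
    // Input: one sensor reading. Output: the brain's current state, which
    // the actuators would translate into motion or eating.
    State update(bool foodNearby) {
        switch (state_) {
            case State::Wander:
                if (foodNearby) state_ = State::Eat;
                break;
            case State::Eat:
                if (!foodNearby) state_ = State::Wander;
                break;
        }
        return state_;
    }

private:
    State state_ = State::Wander;
};
```

Swap this class out for a neural net or a pile of if-then statements, and the rest of the creature never knows the difference.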

Uh, what?


Let’s start a Cambrian Explosion of Brain Design!

I have finished version 1 of the Brain Interface, which implements the sensors and actuators (the inputs and outputs). If you know the C++ language – or know someone who does – and would like to try out our new brain interface, let me know :)


The Future of Augmented Reality Is On Top of Your Nose


Can Muffin see these characters?

No.  She just happened to look in that direction when the picture was taken. But, the person taking the picture saw them. And guess what…

The characters can look back.

There are two reasons why we decided to use augmented reality to show off our characters:

1. We want people to play with our characters. Augmented reality is hot right now (at least among techies and nerds). Why not take advantage of a trend? And who knows – maybe it’s not a trend.

2. Augmented reality is an ideal venue for autonomous characters with geolocation. (What does that mean? I can’t tell you. If I tell you I will have to kill myself).

Holding a smartphone or tablet up to view digital content overlaid onto the camera’s view of reality tires out your arms. But I think this is a transitory form factor for augmented reality. Eventually, the imagery will be overlaid onto glasses that we wear on our heads, freeing up our hands.

The idea of SmartGlasses is not as creepy to me as it used to be.  Maybe it’s because I’ve been thinking about it a lot lately, or maybe it’s because I’m getting used to the idea of Google Glass. But before I get into that, let me show you a video about a new augmented reality glasses product from Epson:

Yes, Google and Epson are pushing the post-human agenda by encouraging us to put technology on our bodies. And yes, this is creepy. But as long as glasses are a device that can be easily taken on and off, I am less worried about augmented reality becoming a predator to normal reality.

For the same reason that I sometimes want to turn off my cell phone and put it in my pocket, I should be able to take off my smartglasses and stick them in my pocket.

But enough about the future of augmented reality. I am more excited about the future of self-animated characters! Here’s a video I recently made about how we will design a book where the characters interact with the reader (using augmented reality – that is… the old-fashioned kind).

Using Real-World Gravity in Augmented Reality

Here in Wiglet Land, we are working on using the accelerometers in smartphones to determine the direction of gravity for our augmented reality apps.

I noticed a short video of an app that appears to detect gravity in order to tell its creepy character to shift its balance.

This is just the beginning of what can be done when real-world directions (such as the direction of gravity) are used as input for animations.
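For the curious: one common way to pull the gravity direction out of accelerometer data is a simple low-pass filter. The accelerometer measures gravity plus the jostling of your hand, and the slowly-changing part of the signal is (approximately) gravity. Here’s a hedged sketch with made-up names, not any real platform’s sensor API:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { double x, y, z; };

// Exponential low-pass filter that isolates the slowly-changing gravity
// component from raw accelerometer samples.
class GravityFilter {
public:
    explicit GravityFilter(double alpha) : alpha_(alpha) {}

    // Feed one accelerometer sample; returns the running gravity estimate.
    Vec3 update(const Vec3& accel) {
        g_.x = alpha_ * g_.x + (1.0 - alpha_) * accel.x;
        g_.y = alpha_ * g_.y + (1.0 - alpha_) * accel.y;
        g_.z = alpha_ * g_.z + (1.0 - alpha_) * accel.z;
        return g_;
    }

private:
    double alpha_;             // closer to 1.0 = smoother but slower response
    Vec3 g_{0.0, 0.0, 0.0};    // running gravity estimate
};
```

Feed it samples every frame and the estimate settles on the gravity vector; normalize that vector and you have a “down” direction your animated characters can balance against.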

But there’s more. Smartphones and tablets are becoming increasingly aware of their orientations in the real world, which includes detecting magnetic north – you know – in the neighborhood of Santa Claus.

For those of you who know the math: if you know two directions (such as the direction of gravity and the direction of magnetic north), you can apply what is known as a “cross product” – plus a few tweaks to make the directions orthogonal – to come up with a pretty good approximation of a third direction (THE third orthogonal direction), which is either east or west.

This is cool, because it means you can tell the augmented reality system the orientation of your “virtual camera”.  The result of this is that you can situate your 3D content more realistically in a real-world setting.
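Here’s a sketch of that math in C++ (illustrative names, not our actual app code): cross magnetic north with “up” to get east, then cross again to re-orthogonalize north, giving a full orthonormal basis for the virtual camera:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { double x, y, z; };

Vec3 cross(const Vec3& a, const Vec3& b) {
    return {a.y * b.z - a.z * b.y,
            a.z * b.x - a.x * b.z,
            a.x * b.y - a.y * b.x};
}

Vec3 normalize(const Vec3& v) {
    double len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / len, v.y / len, v.z / len};
}

// Build an orthonormal (east, north, up) basis from two measured
// directions: gravity ("down") and magnetic north. Magnetic north as
// measured usually dips toward the ground, so the second cross product
// "tweaks" it back to lie flat, orthogonal to the other two axes.
void makeBasis(const Vec3& down, const Vec3& magneticNorth,
               Vec3& east, Vec3& north, Vec3& up) {
    up    = normalize({-down.x, -down.y, -down.z});
    east  = normalize(cross(magneticNorth, up));
    north = cross(up, east);  // re-orthogonalized north, already unit length
}
```

(Whether the cross product hands you east or west depends on the order of its arguments; the order above yields east in a right-handed coordinate system.)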

Stay tuned for updates from Wiggle Planet on the use of real-world gravity in our apps. Meanwhile, here’s a video I made a few months ago that shows a test of the gravity vector, to be used for characters in games: