Experiments with the Leap Motion Controller, one year on.

One morning in March 2013, I woke with excitement and anticipation, all packed and ready to head out to Serbia for the Resonate Festival. By coincidence, that same morning I also received the awesome news that I would be getting a Leap Motion Controller dev kit. I had applied in the summer of the previous year and never expected to actually get one, so this news, combined with the few days ahead at the Resonate conference, had me really excited and looking forward to the next few months of creative potential.

After some correspondence with the Leap team, and a couple of weeks patiently waiting for the device to make its way through UK customs, I received the dev kit and started playing. When I initially applied to be part of the developer programme back in June 2012, I cited my interest in developing apps which incorporate gestural interaction and sound, something I had done a bit of during my MA and, sadly, not much of since. Luckily for me, this turned out to be an area which Leap were interested in pursuing, so a lot of my thoughts about potential projects centred around these themes.

In the 18 months or so since, I have been experimenting with some of the different possibilities that the Leap opens up, of which there are many. This post is a brief overview of some of the demos that I have produced and my thoughts on developing for the device.

In pursuit of the concept that I initially pitched to Leap Motion, I created a few experimental pieces in which I explored different ways to either generate or manipulate existing sounds.

Finger Tenori-on

One of the most obvious and immediate concepts that sprang to mind was the idea of creating a very simple Tenori-on style sequencer. In this example, there is a playhead which constantly moves from left to right across the screen; when the user places one or more fingers in the path of the playhead, a musical note is emitted. The pitch of the note is modulated by the height of the finger, and the overall playback speed is controlled by the position of the hand on the Z axis of the Leap Motion Controller.
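To give a rough flavour of the mapping involved (this is a minimal sketch, not the original demo code), here is how the note triggering might look using the classic Leap Python bindings; the play_note callback, column count and pitch ranges are all illustrative assumptions.

```python
import Leap  # classic Leap Motion Python bindings (v1/v2 SDK)

NUM_COLUMNS = 16            # sequencer columns spread across the X axis
X_RANGE = (-200.0, 200.0)   # rough usable horizontal span, in mm (a guess)
Y_RANGE = (50.0, 400.0)     # rough usable height above the device, in mm

def finger_to_cell(finger):
    """Map a fingertip to a (column, MIDI pitch) pair.

    The column comes from the tip's X position, the pitch from its height (Y).
    """
    x, y = finger.tip_position.x, finger.tip_position.y
    col = int((x - X_RANGE[0]) / (X_RANGE[1] - X_RANGE[0]) * NUM_COLUMNS)
    pitch = 48 + int((y - Y_RANGE[0]) / (Y_RANGE[1] - Y_RANGE[0]) * 24)
    return max(0, min(NUM_COLUMNS - 1, col)), max(48, min(72, pitch))

def step(controller, playhead_col, play_note):
    """Emit a note for every finger sitting in the playhead's column.

    play_note is a placeholder callback into whatever synth is in use.
    Returns a tempo multiplier derived from the hand's Z position.
    """
    frame = controller.frame()
    for finger in frame.fingers:
        col, pitch = finger_to_cell(finger)
        if col == playhead_col:
            play_note(pitch)
    for hand in frame.hands:
        # Hands drawn closer to the user (larger Z) slow the playhead down.
        return max(0.25, 1.0 - hand.palm_position.z / 400.0)
    return 1.0
```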

Sample mixing interface

Building on the work that I did during my MA in Interaction Design, I also liked the idea of experimenting with a sample mixer of sorts, where multiple samples are represented on screen as small nodes, from which the volume, band-pass filter and sample selection can be controlled through gestures. The video below shows a working demo of the kind of interface that could be used with the Leap Motion. Whilst I didn't get round to linking the sound modulation controls up to this, my main goal was to see how effectively a more advanced gesture-based interface could be used to interact with media.
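As a sketch of how the sound controls might eventually be wired up (again assuming the classic Leap Python bindings), a node could map fingertip height to volume and depth to filter frequency; the SampleNode class, its parameter ranges, and the choice to keep node positions in the Leap's millimetre coordinate space are all illustrative assumptions.

```python
import Leap  # classic Leap Motion Python bindings

class SampleNode:
    """One on-screen node representing a sample (illustrative only).

    Node positions are assumed to live in the Leap's millimetre
    coordinate space to keep the sketch simple.
    """
    def __init__(self, x, y, radius=40.0):
        self.x, self.y, self.radius = x, y, radius
        self.volume = 0.5          # 0.0 .. 1.0
        self.bandpass_hz = 1000.0  # band-pass centre frequency

    def contains(self, tip):
        return ((tip.x - self.x) ** 2 + (tip.y - self.y) ** 2) ** 0.5 < self.radius

def update_nodes(controller, nodes):
    """For each fingertip over a node, drive volume with height (Y)
    and the band-pass centre with depth (Z). Ranges are rough guesses."""
    frame = controller.frame()
    for finger in frame.fingers:
        tip = finger.tip_position
        for node in nodes:
            if node.contains(tip):
                node.volume = max(0.0, min(1.0, (tip.y - 50.0) / 350.0))
                t = max(0.0, min(1.0, (tip.z + 150.0) / 300.0))
                node.bandpass_hz = 200.0 + t * 4800.0
```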

Interactive Hologram

In addition to purely software-based interaction demos, I was curious to see how the Leap could be integrated with other hardware. One of the most successful demos that I developed used the Leap Motion Controller as an input device for interacting with holographic displays. Shortly after starting work at 9 Yards Creative, I was shown the DreamOC hologram unit. The DreamOC displays a hologram-like image using the Pepper's ghost effect: a screen is mounted in a box facing downwards onto three two-way mirrors, which are angled at 45 degrees and slope downwards, away from the middle of the screen, in a pyramid-like fashion. The overall effect is that the visual content displayed on the screen appears to float in the air. Working in collaboration with a 3D modeller, I developed a simple Leap-controlled product package which the user could rotate through 360 degrees on both the X and Y axes, and zoom in and out of, by moving their hand over the Leap sensor in front of the model.
This proved popular when we demonstrated it at a client demo day, as it gave users a new and interesting way to interact with products in a digital context.
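The core of that interaction is just frame-to-frame palm displacement. Here is a minimal sketch under the same Leap Python binding assumption; the model object with its rotate and zoom methods, and both sensitivity constants, are hypothetical stand-ins for whatever 3D engine renders the package.

```python
import Leap  # classic Leap Motion Python bindings

ROT_SENSITIVITY = 0.5     # degrees of rotation per mm of palm travel (a guess)
ZOOM_SENSITIVITY = 0.005  # zoom factor change per mm of vertical travel (a guess)

def update_model(controller, model, last_palm):
    """Turn palm movement into model rotation (X/Z travel) and zoom (Y travel).

    model is a hypothetical object exposing rotate(dx_deg, dy_deg) and
    zoom(factor); last_palm is the palm position seen on the previous frame.
    Returns the current palm position so the caller can feed it back in.
    """
    frame = controller.frame()
    for hand in frame.hands:
        palm = hand.palm_position
        if last_palm is not None:
            model.rotate((palm.x - last_palm.x) * ROT_SENSITIVITY,
                         (palm.z - last_palm.z) * ROT_SENSITIVITY)
            model.zoom(1.0 + (palm.y - last_palm.y) * ZOOM_SENSITIVITY)
        return palm
    return None  # no hand in view this frame
```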

Over the past 18 months since its release, the Leap Motion Controller has proven itself to be an innovative device; it's extremely responsive and I think it has real potential. A couple of things have become apparent to me during the time I've spent tinkering with the Leap.

From the outset, it is clear that this is a revolutionary new input device which, for the most part, requires new thinking in the realm of gestural interaction. The extreme accuracy of the Leap also needs to be tamed somewhat to prevent it being over-responsive, so applications need to be designed to anticipate certain user behaviours at certain points during interaction, which takes careful consideration and testing to get right. Often, when working with the Leap, I get the sense that it is far too responsive and doesn't really compensate for the fact that our hands don't naturally stay still, even when we aren't consciously moving them. There is also the fact that the Leap, in its current form factor, needs to be connected to a computer, which limits the applications it can be put to. With the current increase in power, functionality and popularity of portable devices, perhaps a Bluetooth version, or even the integration of the Leap within devices themselves, would do more to extend the possibilities of gestural input.
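One simple way of taming that responsiveness is to run tracked points through a low-pass filter. The sketch below uses an exponential moving average, which damps natural hand tremor at the cost of a little latency; the smoothing constant is just an assumption to be tuned per application.

```python
class SmoothedPoint:
    """Exponential moving average over a 3D point to damp hand jitter.

    alpha close to 0 gives heavy smoothing (more lag); close to 1 passes
    the raw signal through almost untouched. 0.2 is a starting guess,
    meant to be tuned by feel.
    """
    def __init__(self, alpha=0.2):
        self.alpha = alpha
        self.value = None  # (x, y, z), or None before the first sample

    def update(self, x, y, z):
        if self.value is None:
            self.value = (x, y, z)
        else:
            a = self.alpha
            px, py, pz = self.value
            self.value = (a * x + (1 - a) * px,
                          a * y + (1 - a) * py,
                          a * z + (1 - a) * pz)
        return self.value
```

Feeding each frame's palm or fingertip position through a filter like this trades a small amount of lag for stability, which in practice makes pointing and selection feel far less twitchy.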

To begin with, one of my main concerns was that the tracking is extremely responsive, very fast and highly accurate, which is great, as long as you can keep a reference to what you're tracking. The Leap tends to temporarily and sporadically lose sight of fingers and other points in view. The nature of the tracking means that when a point is lost, whether because of lighting, hand or finger overlap, or simply because a point enters or exits the field of view, a new ID is assigned and there is no way to maintain the identity of that point. This prevents the tracking of individual fingers from being very robust, which is what is needed if more advanced interfaces are to be developed in future.

On a positive note, the recent V2 release of the Leap software has dramatically improved the reliability of tracking, and the device is becoming far more enjoyable to work with. The Leap now seems able to maintain tracking of hands in a much more robust way, and can even identify when hands are turned with palms facing up and away from the controller. It is easy to lose sight of the fact that the technology is in its infancy and that advancements and improvements are being made constantly; ultimately, kudos is due to companies like Leap Motion and Thalmic Labs for their pioneering work in the field of gestural interaction.
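A simple mitigation for the ID churn described above, sketched from memory rather than taken from these demos, is to re-associate a freshly assigned ID with the nearest recently lost point, on the assumption that a real finger cannot jump far between frames. The distance threshold here is a guess.

```python
import math

MAX_REASSOC_DIST = 30.0  # mm a point may plausibly move in one frame (a guess)

def reassociate(lost_points, new_id, new_pos):
    """Map a freshly assigned tracking ID back onto a recently lost point.

    lost_points: dict of old_id -> (x, y, z) last-seen position.
    new_pos: (x, y, z) of the point that just appeared with new_id.
    Returns the old ID to keep using, or new_id if nothing is close enough.
    """
    best_id, best_dist = None, MAX_REASSOC_DIST
    for old_id, (x, y, z) in lost_points.items():
        d = math.sqrt((new_pos[0] - x) ** 2 +
                      (new_pos[1] - y) ** 2 +
                      (new_pos[2] - z) ** 2)
        if d < best_dist:
            best_id, best_dist = old_id, d
    if best_id is not None:
        del lost_points[best_id]  # claimed: don't match it twice
        return best_id
    return new_id
```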

In my experience so far, the Leap is a truly great device with a lot of potential, and there is a lot of untrodden ground in terms of learning how to develop new forms of interaction for it. It is very much down to the early adopters to determine how simple gestures such as selection, drag and drop, directional movements and so on should be interpreted. As far as the developer programme is concerned, I'm extremely grateful to have been given the opportunity to work with the Leap throughout the developer preview, and the support of the Leap development team has been second to none. It's been a great opportunity, and I'm excited to be working with some great new tech which I hope will broaden the discipline of interaction design.
