What

Holofunk is an audiovisual, gestural live looper. With no buttons, no remotes, nothing in your hands at all, you can create and conduct a whole ensemble made entirely of yourself.

Holofunk lets you spontaneously and improvisationally create music that you can directly see and manipulate as you create it. It feels something like fingerpainting with sound. It also supports two players, fully ambidextrous two-handed control, and multi-hand gesturing.

Graphically, Holofunk is a Windows 10 desktop Unity 3D app that integrates with the Kinect V2 (the Xbox One version).

On the audio front, Holofunk is a VST plugin host with the demonstrated ability to run full effect chains; it uses the JUCE audio library, which is well tested and robust.
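
(For the technically curious: here is a rough sketch, not Holofunk’s actual code, of what VST hosting looks like with JUCE’s standard plugin-hosting classes. The plugin path, sample rate, and block size are placeholders.)

```cpp
// Minimal sketch of VST hosting with JUCE (not Holofunk's actual code).
#include <juce_audio_processors/juce_audio_processors.h>

std::unique_ptr<juce::AudioPluginInstance> loadEffect (const juce::String& pluginPath,
                                                       double sampleRate, int blockSize)
{
    juce::AudioPluginFormatManager formatManager;
    formatManager.addDefaultFormats();   // registers the built-in plugin formats (e.g. VST3)

    // Ask each format whether it recognizes the file, collecting plugin descriptions.
    juce::KnownPluginList knownPlugins;
    juce::OwnedArray<juce::PluginDescription> found;
    for (int i = 0; i < formatManager.getNumFormats(); ++i)
        knownPlugins.scanAndAddFile (pluginPath, true, found, *formatManager.getFormat (i));

    if (found.isEmpty())
        return nullptr;

    // Instantiate the plugin and prepare it for the audio callback.
    juce::String error;
    auto instance = formatManager.createPluginInstance (*found[0], sampleRate, blockSize, error);
    if (instance != nullptr)
        instance->prepareToPlay (sampleRate, blockSize);

    return instance;   // caller can now call processBlock() on each audio buffer
}
```

A host built this way then calls processBlock() on each loaded plugin in turn for every audio buffer, which is what makes a full effect chain possible.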

The current version just started working again in September 2019 after a three-year rewriting process. Here’s a demo from a recent afternoon at Patchwerks in Seattle. Skip to 16:20 if you want to go right to music making.

I had the pleasure of appearing on the Monster Planet Oddcast recently, and got my audio and video both mixed into the surreal result. Enjoy!

Old (pre-2015) Holofunk

Here I am presenting the previous version (using SharpDX and all-sprite graphics) at the Seattle Mini Maker Faire in 2015.

And here is a bit of my friend Dane and me using it together, two-player style, at the Jigsaw Renaissance makerspace in Seattle in early July 2014:

Who

My name is Rob Jellinghaus. By day I work for Microsoft. Holofunk is my moonlighting project; Microsoft has an explicit moonlighting policy allowing personal projects such as this, for which I’m genuinely grateful. (All Microsoft software components in Holofunk are publicly available.)

Why

I have done a fair amount of a cappella singing, which I’ve always loved, especially harmony. And I’ve also been involved in the rave scene for decades now. But live singing and techno all too seldom came together; my two loves were disunited.

Then I encountered a brilliant UK musician, Beardyman. He recorded himself live, then played himself back, over and over, chopping up the sound in a million ways. This video in particular blew me away when I first saw it four years ago:

There, his entire performance is live, but all the video is (very artfully and painstakingly) cut up and edited after the fact. The concept is so immediately understandable that I started wondering whether it could be done entirely live.

I’ve heard Beardyman say while performing that it’s hard for people to believe he’s doing it all live, because in his live sets, unlike in that video, one can only hear the multiple overlapping loops, not see them.

Holofunk tries to fix that by making each sound into something you can both see and touch. I wanted to make complex music using something that had no buttons, nothing you had to hold still over and peer down at. Holofunk comes closer to this vision than anything else in the world that I presently know of.

When

At this writing it is September 2019.  I started working on Holofunk over eight years ago.  In fact, in September 2011, I demonstrated the very first working version — which at the time used the first Kinect, a Wiimote over Bluetooth, and a wired microphone — to Beardyman himself in Vancouver:

Between 2011 and 2015 I added support for multiple players and multiple monitors/viewpoints, increased the resolution, added sound effect support, ditched the Wiimote in favor of the new Kinect’s awesome hand pose detection, moved from hand-held microphones to wireless headsets, built the current performance rig, upgraded to x64, and shook the bugs out of VST plugin support.

Then in 2016 I realized that it needed a lot of work: The future is further than you think.

In 2017 I rewrote the graphics in Unity, moving it to a 3D environment.  In 2018 I rewrote the audio in C++ with Windows 10 AudioGraph. In 2019 I rewrote the audio again with JUCE, reviving VST support and adding FFT functionality for actual sound visualization.
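
(The FFT part is conceptually simple. Here is an illustrative sketch, not Holofunk’s actual code, of using JUCE’s juce::dsp::FFT to turn a block of samples into per-bin magnitudes that visuals can be driven from; the FFT size here is an arbitrary choice for the example.)

```cpp
// Illustrative sketch of frequency analysis with juce::dsp::FFT (not Holofunk's actual code).
#include <juce_dsp/juce_dsp.h>
#include <algorithm>
#include <array>

constexpr int fftOrder = 10;                // 2^10 = 1024-point FFT (size is an assumption)
constexpr int fftSize  = 1 << fftOrder;

// The transform works in place and needs twice the FFT size of scratch space.
std::array<float, fftSize * 2> fftData {};

void computeMagnitudes (const float* samples)
{
    static juce::dsp::FFT fft (fftOrder);
    static juce::dsp::WindowingFunction<float> window (fftSize,
        juce::dsp::WindowingFunction<float>::hann);

    std::copy (samples, samples + fftSize, fftData.begin());
    window.multiplyWithWindowingTable (fftData.data(), fftSize);

    // After this call, fftData[0 .. fftSize/2] holds the magnitude of each frequency bin,
    // which a visualizer can map to color, shape, or motion.
    fft.performFrequencyOnlyForwardTransform (fftData.data());
}
```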

And that brings us to the present moment: BREAKTHROUGH.
