owenhindley at
+44 7708 322236
+354 762 0292

I'm a freelance creative technologist working with software, sound and electronics
for the web, broadcast, performance, installations & devices.

Based in London & Reykjavík, working worldwide, always looking for adventures.

Here's the story so far...

Public Installation
Atli Bollason
NodeJS, Max/MSP, DMX


Conceived & produced by Atli Bollason.
Developed by:
Yuli Levtov, Reactify
Ragnar Ingi Hrafnkelsson, Reactify
Owen Hindley
Build supervisor:
Jonas Johansson
For a few days in February 2016, visitors to Harpa Music Hall in Reykjavík were invited to play the façade of the building as they would an instrument. A “light organ” was placed on the 4th-floor balcony, with a stunning view of the inside of the geometrical glass front and the downtown area.

Anyone who passed through could learn how to play in blue or red or green, with quick flashes or swelling pads of light, and impress the whole city with an optical performance.

Press:
The Creators Project
Installation art piece
Alex Jenkins and Owen Hindley
February 2016
Tech lead, sound design
Flash, Processing, JS


The Moon Seat is a playful installation that entertains our inner child whilst simultaneously courting long-forgotten childhood fears.

The installation had its first public outing at the e-Luminate festival in Cambridge, UK from 12th-17th February 2016, and was located on the front lawn of the prestigious Cambridge Union Society.

Audience members were invited to sit, at which point the pool of moonlight would instantly open to show their shadow. After a few seconds, their shadow would transform into an animal, still controlled by their bodily movements but with a character all of its own. An ethereal generative soundtrack, reacting to the user’s movements, accompanied the piece.
I’d worked with Alex for a long time at B-Reel, and I was very excited to be asked to get involved in his first independent installation art piece!

It quickly became apparent that we were going to need some more help, so we drafted in ex-B-Reelers and good friends Yi-Wen Lin and Christian Persson.

We pulled in quite a variety of technologies for this one, including Processing (main show), NodeWebkit (gesture detection & tracking), Flash/AIR (character animation & playback) and Pure Data (generative audio & DMX control), all talking together over more than 16 OSC channels.
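As a rough illustration of that OSC glue (the addresses, ports and scaling below are invented, not the production values), the Node.js side might look something like this with the node-osc package:

// Illustrative sketch only: forwarding tracked position data from the
// gesture-detection process to the audio and animation processes over OSC.
const { Client, Server } = require('node-osc');

const audio = new Client('127.0.0.1', 9000);   // Pure Data: generative audio & DMX
const anim = new Client('127.0.0.1', 9001);    // Flash/AIR: character animation

const tracker = new Server(8000, '0.0.0.0');   // listens to the gesture tracker

tracker.on('message', ([address, ...args]) => {
  if (address === '/skeleton/centre') {        // assumed address, not the real one
    const [x, y] = args;
    audio.send('/pad/position', x / 640, y / 480);  // normalised for the audio patch
    anim.send('/character/target', x, y);           // raw pixels for the animation
  }
});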


The Tab: site
The Moon Seat at e-Luminate 2016: site
Interactive browser-based audio experience
February 2016
Lead Development, Design
Javascript/Coffeescript, WebAudio, SVG
mmorph by MassiveMusic


> Case Study Video

mmorph is an adventure into new ways of delivering interactive music in the browser and beyond.

A collaboration between global music agency MassiveMusic, Reactify Music, Grotesk, Enzien Audio and myself, mmorph is an example of a new workflow which we hope will open up many possibilities for interactive audio – first in the browser, and then for games, apps, installations and VR.

The site takes you through an interactive music piece, enabling different musical parts, applying realtime effects, composing and looping a top-line synth, and creating intense build-ups and drops!
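The actual audio engine was a Heavy-compiled Pure Data patch (described further down), but as a rough, plain-WebAudio illustration of the layering idea: every musical part loops continuously from the same start time and is brought in and out with a short gain ramp, so nothing ever drifts out of sync.

// Illustrative sketch only, not the mmorph source.
const ctx = new (window.AudioContext || window.webkitAudioContext)();

function startLayer(buffer) {
  const source = ctx.createBufferSource();
  const gain = ctx.createGain();
  source.buffer = buffer;
  source.loop = true;
  gain.gain.value = 0;              // every layer starts muted
  source.connect(gain);
  gain.connect(ctx.destination);
  source.start(0);                  // all layers share one start time
  return gain;
}

function setLayerActive(gain, active) {
  // A short ramp avoids clicks; the loop itself never stops, so timing holds.
  gain.gain.linearRampToValueAtTime(active ? 1 : 0, ctx.currentTime + 0.05);
}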


MassiveMusic: site
Reactify: site
Enzien Audio: site
Grotesk Studio: site

FWA Insights: site

My role was lead developer on the project, handling the integration of the audio code from Reactify & Enzien Audio (who in turn were working with original music composed by Massive) with realtime SVG graphics and animations art-directed by Grotesk.

The interactive audio was produced via a unique workflow where Reactify worked rapidly and closely with Massive’s in-house composer in Pure Data, a visual programming environment that allows for real-time prototyping and development.

This Pure Data ‘patch’ was then converted to run in the browser via Enzien Audio’s Heavy compiler. The compiler can also transform the same source patch into code suitable for Unity, Unreal Engine, OpenFrameworks, desktop/mobile apps and VR experiences, with little or no alteration to the original.


FWA Site of The Month: site
Awwwards Honorable Mention: site
Online WebGL experience
HTC / Google / B-Reel
February 2016
Lead Developer
Javascript/ES6, WebGL, WebAudio


Get Vive Ready by HTC, Google and B-Reel is a WebGL experience that invites the user to test if they are ‘Vive Ready’, to promote the launch of HTC’s Vive VR headset.

The site uses a mobile device as a controller for the 3D desktop experience, and requires you to chop, dodge, swing and shake your way through four challenging levels.
Users completing all four levels were able to enter a prize draw to win an actual Vive headset and controller set.
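The pairing of phone and desktop is the interesting plumbing: a small Node.js relay matches the two over WebSockets and forwards the phone’s motion data to the 3D scene. A minimal sketch (message names and pairing scheme invented for illustration), using the ws package:

// Illustrative relay: pairs a phone with a desktop session by code and
// forwards device-motion messages straight to the paired desktop.
const WebSocket = require('ws');

const wss = new WebSocket.Server({ port: 8080 });
const sessions = new Map();                      // pairing code -> { desktop, phone }

wss.on('connection', (socket) => {
  socket.on('message', (raw) => {
    const msg = JSON.parse(raw);
    if (msg.type === 'register') {
      const session = sessions.get(msg.code) || {};
      session[msg.role] = socket;                // role: 'desktop' or 'phone'
      sessions.set(msg.code, session);
    } else if (msg.type === 'motion') {
      const session = sessions.get(msg.code);    // gyro/accelerometer data from the phone
      if (session && session.desktop) session.desktop.send(JSON.stringify(msg));
    }
  });
});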

The site involves some cutting-edge WebGL and mobile phone interaction, which we were happy to see recognised in a number of awards, including the FWA’s Site of the Month (following our project mmorph’s award the previous month!).


FWA Site of The Month: site
Awwwards SOTD: site
Massive realtime generative visuals
Sónar Festival Reykjavik
Design, Mgmt, Development
Javascript, Node.js, DMX

> Watch 2016 Video

> Watch 2015 Video

Sónar is an international festival of progressive music and multimedia arts, originally out of Barcelona, but now taking place in several locations worldwide, including Reykjavik since 2013.

It is hosted in the landmark building Harpa, a beautiful structure with a unique front façade of interlocking, cell-like panels (designed by Ólafur Eliasson), each containing an LED light fixture. Combined, they form a large outdoor screen that is visible across much of downtown Reykjavik.

Artist Atli Bollason and I were lucky enough to be the first outside artists to do something on this façade in 2014, with the publicly playable arcade game Harpa PONG at Reykjavík’s Culture Night.

After the success of PONG, Harpa and the Sónar organisers invited Atli and myself to do a repeat installation on the lights of Harpa for the festival in 2015, and again in 2016.


HarpaPONG: site
Sónar Reykjavik: site
Harpa: site

We knew we wanted to do a bit more than simply re-run PONG, so we decided that, in addition to running the game, we would turn the entire building into an audio-reactive light show, taking the music being played inside and using it to drive the visuals outside.

To do this we reached out to other creative developers, providing them with a code framework, a brief, and a challenge: do something amazing with only 36 x 11 pixels. We ended up with more than 12 working visual responses from 8 developers around Europe, far beyond our expectations.
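As a rough, hypothetical illustration of what a ‘visual response’ for a 36 x 11 grid might look like (the function signature and audio input are invented, not the real framework’s interface):

// Fill a 36 x 11 RGB grid every frame, driven by an audio level.
const WIDTH = 36;
const HEIGHT = 11;

// frame: Uint8Array of WIDTH * HEIGHT * 3 bytes; time: seconds; level: 0..1 audio energy
function render(frame, time, level) {
  for (let y = 0; y < HEIGHT; y++) {
    for (let x = 0; x < WIDTH; x++) {
      const i = (y * WIDTH + x) * 3;
      const lit = (HEIGHT - y) / HEIGHT <= level;          // simple bottom-up audio meter
      frame[i] = lit ? 255 : 0;                            // R
      frame[i + 1] = lit ? Math.round(128 * level) : 0;    // G
      frame[i + 2] = 0;                                    // B
    }
  }
}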


Festival Insights: site
VjSpain: (Spanish) site
Yi-Wen Lin (visuals collaborator) process blog post: site
Physical robot design & build
Isobar UK / B-Reel London
Summer 2015
Electronics, Design, Build

International agency Isobar collaborated with B-Reel and Groovy Gecko to create a live, web-connected experience that allowed users to smash a real-life piñata using a robotic arm, in order to promote the new flavours of Pringles Tortillas.

Users in the UK and Germany were able to sign up via Twitter and/or Facebook to take a hit at the unlucky piñata; successful entrants won a Tortillas-related prize!
B-Reel approached me to lead the design and build of the robotic arm – a really fun challenge!

This required a lot of research into fabrication methods, and learning about pneumatics for the first time, combined with electronic control (via Phidgets) and backend development to connect the entire system to the website (being developed by Isobar).

After constructing two beautiful robots (Pedro & Mario, in reserve), we set to work smashing piñatas, the arm pounding each one with 20 psi behind every hit. Over 40 piñatas a day were obliterated live online, in a custom-built Mexican marketplace set deep within East London.

Isobar case study: site
B-Reel case study: site
Immersive online experience / installation
Barbican / FIELD.IO
Summer 2014
Audio + Physics, Sound Design
WebGL, WebAudio, Dart

City of Drones is an interactive digital environment developed by musician John Cale, speculative architect Liam Young and digital artists FIELD. Charting the story of a lost drone drifting through an abstract cityscape, players are invited to pilot a virtual craft and remotely explore this imaginary world. Samples from Cale’s original soundscape compositions echo across the landscape as we see the city through the eyes of the drone, buzzing between the buildings, drifting endlessly, in an ambient audio visual choreography.

City of Drones lives online, and as an installation at the Barbican Centre, London, as part of the Digital Revolution exhibition: site
Liam Young: site
John Cale: site
BXFTYS: site
I was brought on to the project to develop the dynamic audio of the piece with the team at FIELD. This involved remixing original material from John Cale, including bespoke work from BXFTYS, and setting it all in a constantly changing, immersive 3D sonic landscape.

Spot sound effects are triggered by different locations within the environment, and the sounds of drones as they fly past are processed using the WebAudio HRTF panner to accurately position the sounds in 3D space.
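In WebAudio terms, each drone’s sound runs through a PannerNode using the HRTF model, with its position updated every frame. A minimal sketch (variable and function names illustrative):

// One HRTF panner per drone, updated as the drone moves past the listener.
const ctx = new (window.AudioContext || window.webkitAudioContext)();

function attachDroneSound(buffer) {
  const source = ctx.createBufferSource();
  source.buffer = buffer;
  source.loop = true;

  const panner = ctx.createPanner();
  panner.panningModel = 'HRTF';        // binaural positioning rather than simple stereo panning
  panner.distanceModel = 'inverse';    // quieter as the drone recedes

  source.connect(panner);
  panner.connect(ctx.destination);
  source.start(0);
  return panner;
}

// Called every frame with the drone's position in listener-relative space
function updateDronePosition(panner, x, y, z) {
  panner.setPosition(x, y, z);         // newer browsers also expose positionX/Y/Z AudioParams
}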

The background music layer and ambience also change as you fly through the different areas.

I also contributed to the physics of the drones, giving them some freedom of movement as they fly around the space, avoiding buildings and each other, and I developed the on-screen HUD graphics and animation.
Kinetic Sculpture / Accept & Proceed
Summer 2014
Electronics, Build, Install
Arduino, Node.js
Spectra 2, a mesmerising kinetic sculpture created in collaboration with design studio Accept & Proceed, was displayed in the studio’s East London gallery space in Summer 2014.

The piece consists of suspended polished steel segments, controlled by stepper motors & custom electronics, that form rippling terrains inspired by NASA lunar meteor impact data: project page
Accept & Proceed: site
Laurence Symonds: site
Edu Prats Molner (jocabola): site
Collaborating with lead engineer Laurence Symonds, FIELD and A&P, I was responsible for the design & implementation of the motor software control system, which played back a pre-choreographed sequence (pre-visualised and designed in Houdini by FIELD).

I was also very happy to be part of the superhuman build, test & installation team.

Custom Arduino firmware and a unique serial protocol had to be developed in order to synchronise movement across all 48 motors for the duration of the hour-plus sequence. A Node.js server was written to handle playback, synchronisation with a wall-mounted display, and the triggering of audio samples via OSC.
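In outline, the playback server is a fixed-rate loop stepping through the exported choreography and fanning each frame out to the motors, the display and the audio triggers. A simplified sketch (file path, frame rate and OSC address are assumptions; the real system also compensates for clock drift over the hour-plus sequence):

// Simplified playback loop; the custom serial protocol to the 48 steppers is omitted.
const { Client } = require('node-osc');

const osc = new Client('127.0.0.1', 9000);        // wall display + audio sampler listen here
const sequence = require('./sequence.json');      // choreography exported from Houdini (hypothetical path)
const FPS = 30;

function writeFrameToMotors(positions) {
  // Serial write to the motor controllers, not shown here.
}

let frame = 0;
setInterval(() => {
  writeFrameToMotors(sequence[frame]);
  osc.send('/playback/frame', frame);             // keeps display and audio in step
  frame = (frame + 1) % sequence.length;
}, 1000 / FPS);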

Creators Project: article
Massive outdoor game
Harpa Concert & Conference Center, Reykjavik
August 2014
Design, Development
Node.js, DMX, HTML5

PONG is a massive interactive outdoor artwork that allows two people to play the classic game against each other on the monumental façade of Harpa, Iceland’s flagship concert hall in downtown Reykjavik, whose façade was designed by Ólafur Eliasson.
Conceived and produced by Atli Bollason, the project began with a chance meeting between Atli and myself at a birthday party in Reykjavik, and over the next few months developed into a reality. The CEO of the building came on board very quickly, and Ólafur himself gave his blessing, the first time he had allowed a project like this to take over the façade’s lighting system. Vodafone Iceland also agreed to sponsor the project and to supply us with some of the equipment.

The project was launched on Menningarnótt (Culture Night) on 23rd August 2014, and ran for a week afterwards as part of the Reykjavik Dance Festival.
For the launch night, we set up a stage in front of the building, and players used a pair of phones to control their ‘paddles’ on the 43m-high screen.
Afterwards, players were able to join a special Wi-Fi network, transmitted from a hill overlooking the building, which took them directly to the game. This allowed them to join a virtual queue and play against friends or complete strangers.
Our roles were split: Atli handled the concept, creative direction, project management and publicity, while I handled the design, hardware, networking and programming.

Three separate Node.js servers run together to create the experience: one in the basement of Harpa outputs DMX to control the 35×11 pixel display (6 universes of 512 channels each); one in the cloud simulates the game physics and receives WebSocket connections from the players’ phones; and another handles the game queue.
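The DMX server’s main job is turning each rendered game frame into channel values for those universes. A purely illustrative sketch (the channels-per-fixture value and the mapping are assumptions; the real installation used Harpa’s own fixture patch):

// Map a row-major 35 x 11 frame of {r, g, b} pixels onto DMX universe buffers.
const WIDTH = 35, HEIGHT = 11;
const CHANNELS_PER_FIXTURE = 3;                              // e.g. RGB per façade cell (assumed)
const FIXTURES_PER_UNIVERSE = Math.floor(512 / CHANNELS_PER_FIXTURE);

function frameToUniverses(pixels) {                          // pixels.length === WIDTH * HEIGHT
  const universes = [];
  pixels.forEach((p, i) => {
    const u = Math.floor(i / FIXTURES_PER_UNIVERSE);         // fixtures never straddle a universe
    const c = (i % FIXTURES_PER_UNIVERSE) * CHANNELS_PER_FIXTURE;
    if (!universes[u]) universes[u] = new Uint8Array(512);
    universes[u].set([p.r, p.g, p.b], c);
  });
  return universes;                                          // each buffer then goes out to the DMX interface
}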

An HTML5 mobile front-end was also developed to give visual feedback to the player of their paddle, and current score.

The code for the entire project is available on GitHub:
GitHub repo:

Links / Press:

Main site: site
Menningarnótt site (Icelandic): site
Reykjavik Grapevine: site
Iceland Magazine: site
Music Composition & Sound Design
Resonate Festival
Jan 2015
Music composition
Sound design

Resonate is a festival that brings together artists to drive a forward-looking debate on the position of technology in art and culture.

Held annually in Belgrade, Serbia, it draws visual artists, programmers, performers, musicians and curators to present and discuss their work, share new ideas, work and party hard.

For the trailer, FIELD collaborated with director / designer Antar Walker to bring their graphic identity for the festival into motion.
I was approached by FIELD to collaborate with the team on the musical score and sound design for the piece.

I immediately brought my long-suffering friend and collaborator Ragnar Hrafnkelsson on board, and together we composed, mixed and mastered the final soundtrack.

In addition to receiving considerable coverage online, the trailer was also aired on Serbian national TV in the run-up to the festival opening.

Links / Press:

Resonate Festival: site
Ragnar Hrafnkelsson / Reactify Music: site
Antar Walker: site
Online ARG / Cryptic competition
B-Reel / CP+B / Xbox
Server admin, Sound Design, Front-end Dev
HTML5, PHP, LAMP Administration

‘The Glitch’ was a highly secretive and unusual campaign launched by Microsoft, CP+B and B-Reel that took players on a journey across the Internet, challenging them to solve puzzles and crack codes to win a mysterious prize.

This formed part of the UK Xbox One launch, and of Microsoft’s effort to engage directly with hardcore gamers by meeting them on their level.

The Glitch took the form of a one-second ‘disruption’ to the Xbox TV commercial that had already been on regular rotation on UK TV channels for some time. Hidden within it were a number of codes and clues that would set players off on their journey.

B-Reel : site
CP+B London : site
I headed up the development side of the project team at B-Reel, and was responsible for laying out this network of cryptic sites across the web in such a way as to give no clue as to who or what was behind the Glitch.

A sophisticated PHP backend system was built to monitor and control all the various endpoints, allowing us to track players as they moved through the contest and to shut off routes as soon as prizes had been awarded.

In addition, I was also responsible for the glitched-up sound design across all of the routes, which added to the air of mystery and suspense surrounding the campaign.

Or, in the words of one contestant, making him ‘literally shit his pants’.

Interactive art installation
Collaboration with Yi-Wen Lin, Bertrand Carrara for Google DevArt
Sound Design, Music, Weather API
Pure Data, Ableton Live, Node.js

Kuafu is an interactive art installation based on the Chinese myth of the giant that tried to catch the Sun.

The installation itself consists of a large, wide-format projection of a 3D landscape, through which the Giant walks, braving weather, mountains and seas in his quest to catch the Sun, which is causing a drought amongst his people.

The topography, rivers, oceans and weather are all based on actual data drawn in from live APIs and Google Maps, and visitors can control his path using a mounted tablet running a web-based interface showing where in the world Kuafu is walking.

I joined the fantastic Yi-Wen Lin and Bertrand Carrara on this project, in part as an entry for Google’s DevArt competition, in which we made it through to the final 10 entrants!

The project is still in active development.
My role on this project is sound designer and composer, but as you can see from the development diary below, we intend to involve as much generative, code-based audio as possible, to fit with the 3D aesthetic of the visuals. This involves extensive work in Pure Data, as well as controlling Ableton Live via OSC from Node.js.

I also worked on the Weather API, translating the incoming latitude/longitude coordinates from the visual engine into data on wind speed, direction and rainfall, so our Giant would have to face the same elements as he would in the real world at that location!
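A minimal sketch of that translation step, with a hypothetical endpoint and field names standing in for the live weather API we used:

// Hypothetical endpoint and response shape; uses browser or Node 18+ fetch.
async function weatherAt(lat, lon) {
  const res = await fetch(`https://api.example-weather.test/current?lat=${lat}&lon=${lon}`);
  const data = await res.json();
  return {
    windSpeed: data.wind.speed,                   // pushes against the Giant as he walks
    windDirection: data.wind.deg,                 // sets the drift of dust and rain particles
    rainfall: data.rain ? data.rain.volume : 0,   // fades the rain layer in the soundtrack
  };
}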

I also managed the initial motion-capture sessions with live actors, to give Kuafu some real personality in his movements.

You can see the development process here:

Project page: site
Project page on DevArt: site
Yi-Wen Lin: site
Bertrand Carrara: site

Interactive social music installation
Collaboration with Reactify for DevArt
Design, Concepting, Mobile & Server dev
HTML5, Websockets, Node.js

> Watch Video

Dynamics is an installation comprising a room of constantly evolving, generative and reactive music, lights and visuals that visitors can interact with via their smartphones.

Upon joining the installation’s Wi-Fi network, every visitor is presented with a slightly different interface on their smartphone, each with a different level or type of influence over their surroundings. The interaction types vary, ranging from triggering short sounds through to changing the overall mood of the music. Some interfaces encourage interaction with other visitors, prompting teamwork and social (as well as musical) interaction.
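A very reduced sketch of the role-assignment idea (role names and message shapes invented for illustration), with Socket.IO on the Node.js side:

// Hand each new connection a different interaction type, then relay its
// gestures on to the music & lighting engines.
const { Server } = require('socket.io');
const io = new Server(3000);                     // phones connect over the installation Wi-Fi

const ROLES = ['trigger', 'mood', 'duet'];       // hypothetical interaction types
let next = 0;

io.on('connection', (socket) => {
  const role = ROLES[next++ % ROLES.length];
  socket.emit('role', role);                     // the phone renders the matching UI
  socket.on('control', (msg) => {
    io.emit('control', { role, ...msg });        // forwarded to the music/lights engine
  });
});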

Yuli (from Reactify Music) and I developed this project in part as an entry for Google’s DevArt competition.
We collaborated closely, with Yuli mostly looking after the music and lights programming, and myself taking care of the Node.js server and the mobile interface design/development.

You can see the development process here, along with more videos and audio demos:

Project page on DevArt: site
Reactify: site

Immersive interactive installation
Sound design
Pure Data, Node.js

> Star Canvas Case Study

To celebrate B-Reel London’s 5 year anniversary, the team decided to create a large immersive installation, to be installed at the office as we invited the great and the good of the London advertising world down to party and drink with us!

The giant star chart, powered by WebGL in Chrome and controlled using the new Leap Motion controller, was a huge hit at the party, as people were invited to create their own constellations via a roaming iPad interface.

I was brought in towards the end of the project to add sound to the experience, and in a short space of time we connected the Node.js server at the heart of the installation to a Pure Data scene via OSC.

The Pure Data scene itself was set up to involve as few samples or pre-programmed sequences as possible, letting the data from the various constellations (their size and shape) drive the sounds played when they are selected.
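As a rough indication of that mapping (the OSC address and parameters here are assumptions; the real patches are in the repository below), the Node.js side sends a constellation’s basic geometry and lets the Pd patch do the rest:

// Derive simple synth parameters from a selected constellation's geometry.
const { Client } = require('node-osc');
const pd = new Client('127.0.0.1', 9001);        // Pure Data listens here

function onConstellationSelected(stars) {        // stars: [{x, y}, ...] from the WebGL app
  const xs = stars.map((s) => s.x);
  const spread = Math.max(...xs) - Math.min(...xs);
  // Bigger, wider constellations get denser, wider voicings in the patch.
  pd.send('/constellation/selected', stars.length, spread);
}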

You can download the Pure Data patches and samples from the following repository:

StarCanvasAudio on GitHub: link

WebGL code by Yi-Wen Lin: blog
The results, when combined with the massive 4m+ wide dome projection system, were very satisfying, and can be heard below:

‘Clap’ sound effects (played when the user clapped their hands over the Leap Motion)
‘Constellation’ sound effects (played when the user selects a constellation)
‘Drone’ (played on a loop during the experience, all other melodies change relative to the root note)
‘Keyboard’ sound effect (played when the user hits a key on the input keyboard)
Online cross-platform music mashup tool
B-Reel / AMV BBDO / Mercedes-Benz UK
Front-end development
HTML5, Node.js


Launched in conjunction with a nationwide TV commercial, the Sound With Power tool allows the audience to create their own ‘mashup’ music video, using a number of high profile artists’ music and the incredible sound of the E 63 AMG engine.

The site allowed users to dynamically add loops of synchronised video and audio to a composition to create the final track.
The challenge during development was that this had to work across desktop, tablet and mobile (Android & iOS), which at that point had rarely, if ever, been done.

Leading the Sound With Power development team at B-Reel, I was responsible for the planning, technical design and delivery of this project.

B-Reel : site
AMV BBDO : site
Music by Ithica Audio: site
We built a customised Node.js-based rendering system to take the large amount of raw video footage for the loops and convert it into spritesheets ready for playback on the various devices we had to support.
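At its core, that system is Node.js orchestrating ffmpeg over every loop. A cut-down sketch (filter values illustrative, not the production settings):

// Tile one video loop into a single spritesheet image using ffmpeg's tile filter.
const { execFile } = require('child_process');

function renderSpriteSheet(input, output, cols, rows, fps, frameWidth) {
  const filter = `fps=${fps},scale=${frameWidth}:-1,tile=${cols}x${rows}`;
  return new Promise((resolve, reject) => {
    execFile('ffmpeg', ['-y', '-i', input, '-vf', filter, '-frames:v', '1', output],
      (err) => (err ? reject(err) : resolve(output)));
  });
}

// e.g. renderSpriteSheet('loops/guitar01.mp4', 'sheets/guitar01.jpg', 8, 8, 12, 160)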

For the sound, I was also responsible for building a playback system that would remain in time with the video, switching between audio technologies (WebAudio, HTML5 Audio, Flash) depending on what was available on the device.
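The switching itself came down to feature detection at startup, roughly along these lines (an outline, not the shipped code):

// Choose the best available audio backend, in order of preference.
function pickAudioBackend() {
  const AC = window.AudioContext || window.webkitAudioContext;
  if (AC) return { type: 'webaudio', ctx: new AC() };          // sample-accurate scheduling

  const el = document.createElement('audio');
  if (el.canPlayType && el.canPlayType('audio/mpeg')) {
    return { type: 'html5', element: el };                     // coarser timing, corrected against the video
  }
  return { type: 'flash' };                                    // last resort, via a SWF bridge
}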

In the end, we delivered a fun, innovative project that won both the FWA ‘Cutting-edge’ and ‘Site of the Day’ awards, and that we feel pushed the boundaries of what can be done with realtime video and audio in the browser, across platforms.

Physical Installation with global online presence
B-Reel / Google Creative Labs
Front-end development


Chrome Web Lab is a series of interactive experiments by Google Creative Labs, accessible both throughout the world, and at the Science Museum, London.

The aim was to bring the magic of the web to life, both online and in the physical world.
As part of the development team at B-Reel, I was responsible for aspects of the front-end development across the entire site.

This involved cutting-edge HTML5 development, tight integration with a complex back-end system and real-time interaction via WebSockets, all in collaboration with the talented team at B-Reel, as well as Tellart and Google.

TVC / Metropolis Digital
Music Composition, Sound Design
Ableton Live, Pro Tools

>Watch on Vimeo

Catalyst, a nonprofit organisation that aims to expand opportunities for women in business, came to Metropolis with a brief to create a TV commercial to be shown in Canada.

A team of animators including Tom@Fivesnakes, Oded Shein, and Jon Folkard came together to produce this short, but characterful piece.

Catalyst: site
I was tasked with producing both a musical soundtrack to accompany the piece and light sound design to bring the animation to life.

Interactive 3D Multitouch Tables
Hitachi 'Inspire Life' Conference, London/Paris
Nov 2007 - Apr 2008
Hardware design & software development
C#, C++, WPF

I was tasked with building 22 interactive ‘tables’ for Hitachi’s high-profile ‘Inspire Life’ event in London & Paris.

These tables allowed people to explore Hitachi’s products and services using a unique 3D ‘Rubik’s cube’ interface. The interfaces also had to be multitouch, so that a ‘spread’ motion on the surface would explode the cube, showing the product headings, while pressing an individual cube would display information and images related to that area.
Custom FTIR-based multitouch input system

Multitouch input driver using the OSC protocol for Windows / C#

Realtime 3D engine using Windows Presentation Foundation (WPF) / C#

Networking software for synchronised multi-screen ‘train’ animation

On-site installation & maintenance in London & Paris.
Karaoke Interactive
Madame Tussaud's London & Berlin
May 2009
Software Design & Development
C#, WPF, Pure Data (pd)
Created by Flaming Pear Interactive for two new exhibits at Madame Tussaud’s London and Madame Tussaud’s Berlin, this interactive karaoke system allows you to sing along with hits by Zac Efron, Miley Cyrus and The Jonas Brothers (London version), and Take That, Madonna, Beyoncé and Michael Jackson (Berlin version).

The interactives consisted of a touchscreen kiosk plus microphone, and displayed the user’s sung pitch continually against that of the original vocal, with animated graphical feedback to let the user know how they performed.
I was responsible for the development and installation of the two interactives, installing in London myself and liaising with on-site staff in Berlin.

Pitch detection & template creation software in Pure Data (pd), communicating with a C# host via OSC.

Application developed in C#, UI created using WPF.

Owen Hindley 2016