Owen Hindley

I'm a British-born, Iceland-based digital artist creating work for stage, screens and spaces using code, electronics and sound.

I've made and directed VR theatre, interactive music experiences, an enormous game of PONG, a pinata-smashing robot, music videos, a little video game and an ARG for Xbox, amongst other things.

Here's the story so far :

Works & Performances

Dustin O'Halloran - Harmonic Dream Sequence

Audio-visual meditations on an AI-flavoured future
  1. March 2024
  2. Design, development
  3. TouchDesigner

I met Dustin for the first time in late 2023, whilst in the gym at Hafnarhaus (our wonderful co-working space in downtown Reykjavík), a notable encounter not just because I'm so rarely anywhere near a gym, but also because it resulted in him asking me to create a visual piece to accompany a track on his new album, 1 0 0 1.

'Harmonic Dream Sequence', in his words, is a psychedelic depiction of the current naive desire to experiment with technology, with a lack of concern for some possible future where it gains autonomy from its creators and users. This really resonated with me, and so I set about trying to create a visual system that would capture both the complexity and scale of this AI-driven future, hinting at the anxiety-inducing speed, the unknowable biases buried deep within it, and also look really, really cool.

The resulting video is intended to be a poetic visualisation of both the wonder and anxiety produced by our collective ability to develop technology that not only reflects our own world back at us in unsettling, distorted and disruptive ways, but how it can also - through modern neural networks and 'AI' systems - appear to create new worlds for us to explore.

Created in TouchDesigner, and using a mix of real LIDAR scans and simulated (hallucinated?) depth scans, millions of particles represent the raw data we feed into these ever-hungry machines. At first they form strange, uncomfortable shapes which periodically resolve into recognisable forms before collapsing again into chaos. Then structure emerges, as we train the system to recognise connections, patterns and layers in the data, before we dive into it, triggering chaotic lightning flashes of connection between disparate pieces of data, and producing unexpected outputs at incredible speed.

But despite having access to an entire Internet's representation of reality and the results of incredible computing power, we're left with an uneasy feeling: the ghostly, dreamlike sensations the system leaves us with prompt uncomfortable questions about our own minds, and the future it beckons us towards.

  • Dustin O'Halloran:

Tíð / Winter Lights Festival Reykjavík 2024

Audiovisual projection installation with Þorsteinn Eyfjörð
  1. February 2024
  2. Design, development
  3. TouchDesigner

A triptych of giant faces looms over you on a cold winter's night, accompanied by a pulsing soundtrack. Maybe you recognise some of them as they fade through the noise.

'Tíð' is a playful installation by Þorsteinn Eyfjörð and myself for Reykjavík's annual Winter Lights Festival (Vetrarhátíð), inspired by the poem 'Barn' by Steinn Steinarr. The giant floating heads are created from LIDAR scans of members of the public, constantly cycling through their self-reported stages of life.

The piece is both audio-reactive and interactive: we invited members of the public to come and have their faces 'scanned' inside the warmth of Hafnarhaus, then watch the work whilst enjoying a waffle or two.

It was created as part of Hafnar.haus Hringekja, also produced by Þorsteinn and myself: an exhibition of this and two other projected works by Atli Bollason and Heimir Freyr Hlöðversson, plus over 20 video pieces by members of the creative Hafnarhaus community.

  • Hafnarhaus:
  • Þorsteinn Eyfjörð Þórarinsson:
  • Atli Bollason:
  • Heimir Freyr Hlöðversson:

Existential Soup

Live Audio/Visual performance with Þorsteinn Eyfjörð
  1. May 2023
  2. Design, development
  3. TouchDesigner

A work-in-progress live audio-visual performance with sound artist Þorsteinn Eyfjörð Þórarinsson, premiering at Raflost festival 2023.

This was our first time performing as a duo, although Þorsteinn and I had previously collaborated on other projects. Almost without our realising it, the project grew into an entirely new 40-minute audio-visual composition, with new ground explored on both sides - Þorsteinn created a number of absolute bangers, ranging from choral-esque laments to infra-bass-driven sonic workouts, always keeping a thread of 'extreme beauty' throughout.

For my part, I wanted to explore these LIDAR-inspired point-cloud structures further, especially the way they can be applied to 3D-rendered material to give a feeling of detail and movement that exists only in the head of the viewer.

The entire show was performed live with TouchDesigner and a pair of MIDI controllers, with no pre-rendered video used, and only a small amount of audio-reactive input from the music. We're hoping to take this show on the road to festivals in the near future.

  • Raflost:
  • Þorsteinn Eyfjörð Þórarinsson:

Into The Universe Of Technical Images

Video work
  1. April 2023
  2. Design, custom coding
  3. TouchDesigner, Unity

Nine looping 4K video artworks, as part of a group show inspired by Vilém Flusser's 1985 work of the same name.

I was thrilled to be invited to participate in Reykjavík Design March 2023 as part of one of Iceland's first group shows exhibiting digital artists on 9 huge 4K displays.

Curated by Nils Wiberg, the theme of the exhibition comes from Vilém Flusser's philosophical work, which describes the concept of Technical Images: an abstraction above film, video, text and drawings that describes and conveys concepts in ways the previous mediums cannot.

He goes on to describe a possible future in which we're all co-creating these images together in 'darkened rooms' - a chamber orchestra that jointly maps reality - and argues that distributing the means of production of these images could lead us towards a utopian understanding of the world, while warning that centralised control and censorship of them would lead to deep repression and a dystopian future. For me the text, if you squint a bit, offers a really poetic and prescient way of visualising our current media landscape, from the co-created, mass-consumed reality of Instagram and TikTok, to the state-level media control in countries such as China and Russia.

The pieces were inspired by the LIDAR work I've been doing with Volta and Aristokrasia, and made in Unity, with big thanks to Parik Ontkovic for camerawork, and to Liam Cobb, Uta Reichardt, Íris Erlings, René Boonekamp and Hafnarhaus for the recording test process.

  • Exhibition at Design March:

Fallax

VR Film
  1. 2021-current
  2. Co-director, Art Director
  3. Houdini, Unity, VR

A fantastical journey into the Icelandic highlands and beyond with two escaped convicts and their driver.

Performed by circus artists during lockdown, recorded using full-body motion capture.

Work in progress! Upon release, the piece will be available for Oculus Quest and other major VR platforms.

  • Hikapee Circus Theatre :
  • Target3D :

This Machine Throws Pixels

Ongoing performance practice
  1. 2007-current
  2. VJ
  3. TouchDesigner, Unity

Since I was a student I've gone through stages of performing as a visualist alongside musicians for concerts, raves and club events.

It started when Pendulum came to our university in 2007 and I (unprompted) threw together a bunch of pre-rendered footage in Resolume and jammed it out; ever since, I've loved the experience of playing with images live, responding to the music as it comes.

Since then it's been a nice offshoot of my regular practice, which often involves making some kind of real-time system for visuals anyway.

VJ-ing

Since the beginning of 2023 it's received a bit of a boost from a bunch of underground, DIY-ish shows in Reykjavík, and I've developed a TouchDesigner-based system that lets me show up and perform pretty much anywhere, including live control over any lights in the venue, multiple projectors, that kind of thing.

Often the things I learn from doing this make their way back into my 'professional' work: little visual tricks, or some weird combination of effects that you only come up with in the moment.

Parallel People / Hliðstætt Fólk

Interactive VR theatre
  1. 2020-2022
  2. Design, development, direction
  3. Unity, Houdini, VR

A live interactive social theatre experience about trust, communication and questioning your sources of information.

Nominated for Best Immersive Performance at Raindance Immersive 2022.

One day, you wake up and find yourself trapped in a transparent box in a hostile, barren landscape - but it appears you're not alone.

There is the Caretaker, who says they just want to keep you safe inside this box, promising to show you wonders beyond imagining.

There are the Others, individuals trapped in boxes just as you are - some of whom are acting strangely.

And the Voice in your head - together all of them are talking, acting, behaving at odds with one another - so who will you believe?

Based on my theatre company Huldufugl's international award-winning VR experience 'Kassinn / A Box In The Desert' (one of the world's first experiences of its kind), this show expands the narrative to allow five audience members to determine their own story against a barrage of conflicting voices.

My role on this project was lead developer, director and live show operator, with the concept and script co-created by the whole amazing team.

The show takes place entirely in VR using the Oculus Quest headset.

Performances:

  • Þjóðleikhúsið (The National Theatre of Iceland) - April 2022 (Sold out run)
  • Raindance Immersive 2022 - November 2022 (Nominated for Best Immersive Performance)

  • Created by Huldufugl
  • Performers - Nanna Gunnars & Ástþór Ágústsson
  • Design, development & Direction - Owen Hindley, Torfi Ásgeirsson, Steingerður Lóa Gunnars
  • Composer - Íris Thorarins

The Hidden People

Realtime Projections for Aerial Circus Theatre
  1. 2018-2022
  2. Co-producer, Visual designer
  3. TouchDesigner, Unity, Houdini

A collaboration between Hikapee Theatre and Huldufugl, a show for medium to large stages that combines Icelandic folklore, circus arts, theatre and creative technology.

With projected visuals for the entire duration of the show (90 minutes), I used Houdini and Unity to design and render the scenes, working in realtime during rehearsals, then rendering out high-resolution HAP video.

Playback was handled using a custom-made TouchDesigner app, receiving cues via OSC from Qlab, running realtime effects on top of the video, and handling the projection mapping.

Whilst most of the scenes are pure video playback, a few are exported as live Unity apps, controlled with an Xbox gamepad during shows, just to keep things exciting!

Finally, after several years of rehearsals, covid-induced delays and fundraising, the world premiere took place at Worthing Pavilion Theatre in March 2022, with a worldwide tour planned from August 2022.

  • Show information:
  • Hikapee Circus Theatre :

Aristokrasia

Show visuals design + performance
  1. 2022-current
  2. Visual design & development
  3. TouchDesigner, Unity

An audio-visual collaboration with one of Iceland's most established electronic musicians, Úlfur Eldjárn

Premiering at Extreme Chill Festival, Reykjavík, 2022, and ongoing into 2023.

More images + info coming soon.

Out Of Sync

AR Livestream Performance
  1. 2020
  2. Developer, Visual Designer, VJ
  3. TouchDesigner, Unity

Out Of Sync was a slightly mad, experimental performance idea formed in response to the COVID-19 pandemic and climate crisis.

As we all remember, in 2020 many artists across the world were unable to perform to audiences of more than a handful of people (at best), while the countries where audiences could gather in numbers had achieved this largely through (sensibly) strict border restrictions, meaning they were unable to invite international artists to come and play. In addition to this, the global climate crisis remained – albeit backgrounded a bit by the pandemic.

The idea is that a performer in Iceland (at the time gatherings were limited to 10 people or fewer) would live-stream to a club in Taipei, Taiwan. There, a full audience of dancing people (Taiwan had no gathering restrictions at the time) would enjoy the party. However, we had two dancers, or “eyes” inside the club, using mobile phones to provide a two-way video stream back to Reykjavik, where two people at a time would experience the party on the other side of the world inside large wooden booths specially built for the purpose.

Full performance :

  • Event Production : Uta Reichardt, René Boonekamp, Aephie Chen
  • AR Graphics : Owen Hindley & Yuli Levtov (Volta)
  • Performer : Hermigervill / Sveinbjörn Thorarensen
  • Camera : Yuli Levtov, Uta Reichardt
  • Video Editing & Titles : Owen Hindley

Now, a 90-minute performance, regardless of how great the music is or how colourful the artist's onesie, might not be the most visually interesting experience with a simple camera setup. Luckily, I had been working for several months with my long-suffering collaborator Yuli Levtov on an exciting new project called Volta, which aims to make it easy for performing artists to create visually engaging livestream experiences, either for 2D video streaming or in VR.

Even more luckily, the month before, we had integrated Keijiro Takahashi’s outstanding Rcam2 system into Volta’s existing visual system, allowing us to do a full AR performance using only an iPad Pro and a laptop running Unity + TouchDesigner.

We're super happy with the results, which you can watch above. Since then we’ve incorporated a user-facing editor into the Volta product, which you can read more about below. In addition, this same team has applied to produce several more events during 2021 to investigate alternate means of performance and audience experience – both as the world recovers from the pandemic, and as we look forwards to ways to reduce the burden of extensive travel on artists wanting to perform outside of their home countries, whilst still making it a party!

  • Project page :
  • Volta :
  • Rcam2 page :

GALE

Platform game
  1. 2019
  2. Design, Music
  3. Unity

A short platform experience about using the wind to guide your way.

Created for the Isle of Games 002 exhibition (see isleofgames.is) on the theme of “The Tempest”, the game has the player making their way through a pitch-black world, with nothing but the wind to guide their path.

The installation version was designed for two people to cooperate: one using a joystick to control the character, the other using a rotating lava rock + button to control the wind. This version is controlled using the keyboard alone.

Credits:
  • Design & Music by Owen Hindley
  • Character controller magic by @torfias
  • Feedback and general advice from @joonturbo, Steingerður Lóa Gunnarsdóttir and the rest of the Isle of Games team.
Download:

Game can be downloaded from itch.io :

Playthroughs / Coverage :

Kassinn / A Box In The Desert

Interactive VR Theatre
  1. 2017-2018
  2. Direction, Design, Development
  3. Unity, VR

Imagine one day, you wake up inside an invisible box, unable to escape.

An engaging but playful interactive theatre performance with a live actor in virtual reality.

Will you trust this mysterious stranger? Or listen to the voice in your head, trying to get you out?

One of the world's first pieces of its kind, the show explores what happens when future-thinking theatre forms are smashed together with technology.

Created by Huldufugl (an Icelandic / British events company co-founded by myself and writer, actress & producer Nanna Gunnars), the piece is a mix of Virtual Reality, multiplayer gaming and interactive theatre.

The concept initially began as a short play by Nanna, written and performed in a conventional theatre in 2016 – after which I put forward the idea to re-create this piece in VR. We premiered the first version in Reykjavík in August 2017, and developed it for a longer run in July 2018, before taking the show to the Stockholm Fringe Festival in September 2018, and with more international dates planned for 2019!

One person at a time experiences the piece inside of a 2 x 2m physical space, whilst the actor performs in a separate physical space. For our first outing, we used the Xsens motion capture suit (expertly handled by PuppIT), but currently the show involves two Oculus Rift setups, sharing the same virtual space using multiplayer networking.

Show control is handled by an operator, who can observe and control both events in the world and the ‘voice in your head’ via a Qlab interface, and we have a beautiful LED box created by Swedish engineering magicians Svartljus.

After touring the piece throughout Europe and to the USA, we went back into the rehearsal room to re-create the experience for more than one audience member at a time. The result is Hliðstætt Fólk / Parallel People, which you can read more about above.

  • Performers : Nanna Gunnars, Ástþór Ágústsson
  • Music : Iris Thorarins
  • Sound : Ragnar Hrafnkelsson
  • Concept Development : Alexander Dan Vilhjálmsson
  • Additional 3D design, build : Jacob Andersson
Media:
  • Behind the scenes at STOFF 2018 :
  • Kassinn at Menningarnótt 2017 (first iteration) :
Huldufugl project page :

The Moon Seat

Audio-visual public installation
  1. 2016-17
  2. Technical lead, sound design
  3. Flash, Processing
Moon Seat
  • The Tab:
  • The Moon Seat at e-Luminate 2016 :
  • Project Website :

The Moon Seat is a playful installation that entertains our inner child whilst simultaneously courting long forgotten childhood fears. The installation had its first public outing at the e-Luminate festival in Cambridge, UK from 12th-17th February 2016, and was located on the front lawn of the prestigious Cambridge Union Society.

Audience members were invited to sit, at which point the pool of moonlight would instantly open to show their shadow. After a few seconds, their shadow would transform into an animal controlled by their bodily movements, but with a character all of its own. An ethereal generative soundtrack reacting to the user's movements accompanies the piece.

I'd worked with Alex for a long time at B-Reel, and I was very excited to be asked to get involved in his first independent installation art piece! It quickly became apparent that we were going to need some more help, so we drafted in ex-B-Reelers and good friends Yi-Wen Lin and Christian Persson to come on board.

We pulled in quite a variety of technologies for this one, including Processing (main show), NodeWebkit (gesture detection & tracking), Flash/AIR (character animation & playback) and Pure Data (generative audio & DMX control).

Horizons VR

Interactive Musical Journeys in VR
  1. 2016-17
  2. Direction, Design, Development
  3. Unity, C4D, VR

A series of interactive VR music journeys with music from Bonobo, My Panda Shall Fly, and Reuben Cainer.

Press / Exhibitions :

After a chance introduction in Iceland via the wonderful people at Bedroom Community, I was invited to submit a proposal for a launch title for Google’s new VR headset, Daydream.

I contacted my long-term collaborator Yuli Levtov and together with our friends David Li and Leif Podhajsky, we laid out plans to create the first music-focused title for this new platform.

After launch, in collaboration with Ninja Tune heavyweight artist Bonobo, the Horizons team (Yuli Levtov, David Li, Leif Podhajsky and myself) saddled up again to create a brand new psychedelic interactive music journey for the Horizons platform on Google Daydream.

The new scene features the track Outlier from his latest album Migration, and allows you to deeply interact with the music, which in turn has an effect on the landscape you fly through. Flocks of birds join you on your journey through sand dunes, towering mountains and beneath the waves.

After the sad demise of the Google Daydream platform, we were approached by HTC to port the title to their range of devices, and it is now available on HTC Viveport for HTC Vive, Focus and Flow.

Created by:
With incredible support from:

Mmorph

Browser-based interactive audio experience
  1. 2016
  2. Lead Development, Design
  3. JS, WebAudio, SVG Animation
mmorph by MassiveMusic
  • Site link :
  • Case Study Video :
  • FWA PCA of The Year (Nomination):
  • FWA Site of The Month:
  • Awwwards Honorable Mention:
  • FWA Insights:

mmorph is an adventure into new ways of delivering interactive music in the browser and beyond.

A collaboration between global music agency MassiveMusic, Reactify Music, Grotesk, Enzien Audio and myself, mmorph is an example of a new workflow which we hope will open up many possibilities for interactive audio – first in the browser, and then for games, apps, installations and VR.

The site takes you through an interactive music piece, enabling different musical parts, applying realtime effects, composing and looping a top-line synth, and creating intense build-ups and drops!

My role was lead developer on the project, handling the integration of the audio code from Reactify & Enzien Audio (who in turn were working with original music composed by Massive) with realtime SVG graphics and animations art-directed by Grotesk.

The interactive audio was produced via a unique workflow in which Reactify worked rapidly and closely with Massive’s in-house composer in Pure Data, a visual programming environment that allows for real-time prototyping and development.

This Pure Data ‘patch’ was then converted to run in the browser via Enzien Audio’s Heavy compiler. This compiler can also transform the same source patch into code suitable for Unity, Unreal engine, OpenFrameworks, desktop/mobile apps and VR experiences, with little or no alterations to the original.
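
In mmorph itself the audio logic lives inside the Heavy-compiled patch, but the basic idea of enabling and disabling synchronised musical parts in the browser can be sketched with plain Web Audio nodes. This is an illustration only, not the project's implementation:

    // Illustration only: toggling synchronised stems with Web Audio gain nodes.
    // mmorph's real audio engine is a Heavy-compiled Pure Data patch.
    const ctx = new AudioContext();

    function createStem(buffer, startTime) {
      const source = ctx.createBufferSource();
      source.buffer = buffer;
      source.loop = true;

      const gain = ctx.createGain();
      gain.gain.value = 0;                         // every stem starts muted

      source.connect(gain).connect(ctx.destination);
      source.start(startTime);                     // start all stems together so they stay locked

      return {
        enable: () => gain.gain.setTargetAtTime(1, ctx.currentTime, 0.05),
        disable: () => gain.gain.setTargetAtTime(0, ctx.currentTime, 0.05),
      };
    }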

  • MassiveMusic:
  • Reactify:
  • Enzien Audio:
  • Grotesk Studio:

Harpa Pong

Massive outdoor game
  1. 2014
  2. Design, Development
  3. Nodejs, DMX, HTML5
  • Vice Magazine :
  • Reykjavik Grapevine :
  • Iceland Magazine:
  • Homepage :
  • Menningarnótt site (Icelandic) :
  • Github :

PONG is a massive interactive outdoor artwork that allows two people to play the classic game against each other on the monumental facade of Harpa, Iceland’s flagship concert hall in downtown Reykjavik, designed by Ólafur Eliasson.

Conceived and produced by Atli Bollason, the project began as a chance meeting between myself and Atli at a birthday party in Reykjavik, and over the next few months developed into a reality: not only did the CEO of the building come on board very quickly, but so did Ólafur himself, allowing a project to take over the facade's lighting system for the first time. Vodafone Iceland also agreed to sponsor the project, and to supply us with some of the equipment.

The project was launched on Menningarnótt (Culture Night) on the 23rd August 2014, and ran for a week afterwards as part of the Reykjavik Dance Festival. For the launch night, we set up a stage in front of the building, and players used a pair of phones to control their ‘paddles’ on the 43m-high screen. Afterwards, players could join a special Wi-Fi network, transmitted from a hill overlooking the building, which took them directly to the game. This would allow them to join a virtual queue, and play against friends or complete strangers.

Our roles were split with Atli handling the concept, creative direction, project management and publicity, and me handling the design, hardware, networking and programming.

Three separate Node.js servers run together to create the experience: one running in the basement of Harpa, which outputs DMX to control the 35×11 pixel display (6 universes of 512 channels each); one running in the cloud, which simulates the game physics and receives WebSocket connections from the players' phones; and another which handles the game queue.

An HTML5 mobile front-end was also developed to give visual feedback to the player of their paddle, and current score. The code for the entire project is available on github.
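
To give a flavour of the DMX end of this kind of setup, here's a minimal Node.js sketch that packs one frame of pixel brightness values into universe buffers. The channels-per-cell count, the sequential patch order and the sendUniverse callback are assumptions for illustration only, not the actual Harpa fixture map or output driver (the real display spanned 6 universes).

    // Sketch: pack a 35x11 frame of brightness values (0-255) into DMX universes.
    // Channel layout and patch order are illustrative assumptions only.
    const WIDTH = 35;
    const HEIGHT = 11;
    const CHANNELS_PER_CELL = 3;          // e.g. RGB per facade cell (assumed)
    const CHANNELS_PER_UNIVERSE = 512;

    function frameToUniverses(frame) {    // frame: Uint8Array of WIDTH * HEIGHT values
      const totalChannels = WIDTH * HEIGHT * CHANNELS_PER_CELL;
      const universeCount = Math.ceil(totalChannels / CHANNELS_PER_UNIVERSE);
      const universes = Array.from({ length: universeCount },
        () => new Uint8Array(CHANNELS_PER_UNIVERSE));

      for (let cell = 0; cell < WIDTH * HEIGHT; cell++) {
        for (let c = 0; c < CHANNELS_PER_CELL; c++) {
          const channel = cell * CHANNELS_PER_CELL + c;   // sequential patch (assumed)
          const universe = Math.floor(channel / CHANNELS_PER_UNIVERSE);
          // Same brightness on every channel of the cell (i.e. white on an RGB fixture).
          universes[universe][channel % CHANNELS_PER_UNIVERSE] = frame[cell];
        }
      }
      return universes;
    }

    // Hypothetical usage: hand each buffer to whatever DMX/Art-Net driver is in use,
    // e.g. universes.forEach((data, i) => sendUniverse(i, data)), at ~30 fps.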

Collaborations & Commercial

More commercial projects where I've been brought in as a contractor or collaborator, often on the technical side of things.

Imogen Heap - Last Night Of An Empire

Mixed Reality Music Video
  1. 2021
  2. VFX, Developer
  3. Unity, Houdini

During my time with Volta, we were thrilled to collaborate with world-renowned musician and innovator Imogen Heap to create the official video for her single Last Night Of An Empire.

Volta had at that point been developing its AR capabilities, particularly by integrating Keijiro Takahashi's Rcam2 library, which allowed us to easily integrate a real camera feed from a LIDAR-equipped iPad with 3D objects and visuals generated in Unity, and be able to freely move the camera around the space.

True to her experimental and open spirit, Imogen decided that the recording of the video would also be live-streamed, meaning everything from the camera movement to the AR visual effects and Imogen's choreography was repeated and refined over and over again and streamed to YouTube, whilst we searched for the perfect take.

Imogen's loyal legion of Heapsters provided encouragement and comment via the stream, and after just over three hours, the final take was in the bag. But it didn't end there - we also produced a standalone VR-only version of the video to be included as an archived performance inside of Volta.

Ahead of the recording at Imogen's barn studio space in Essex, I got to work creating visual elements in Houdini + Unity, and exposing parameters for on-set creative technologist Alexis Michallek to hook up to the Ableton session running the show so he could choreograph the AR visuals in time with the playback.

For the VR version, along with several new visual effects, I built several extensions to the Unity Timeline system to allow me to work quickly with the existing Volta setup and create a tightly synced experience (separate to the livestreamed video).

The VR video is unfortunately only available inside Volta VR, but you can catch a 2D recording of it here

  • Volta :
  • Official Video :
  • Mini-documentary :
  • Livestreamed making-of recording :

Lambchild Superstar

Music Creation Experience for VR
  1. 2017-2021
  2. Lead Developer
  3. Unity

A psychedelic music-making experience by OKGO, Within and Oculus Studios.

Get it on the Oculus Store here

In 2017, my studio Horizons (with co-founders Yuli Levtov and David Li) was approached by Chris Milk and Aaron Koblin's VR powerhouse Within to help them develop an incredibly ambitious VR project in collaboration with Oculus Studios, and the musical / creative force of OKGO.

The task was to create an experience in VR where the user could produce their very own pop song, complete with drums, bass, guitar, synth, vocals (recorded and auto-tuned by the user) and a special cow-based freestyle solo instrument.

The final piece includes a staggering amount of layers of animation, machine learning, audio analysis and processing, a custom sound engine, VR interaction and musical theory.

In 2018 we were very excited to give the world a peek of the final piece, at both Tribeca Film Festival and Sonar+ in Barcelona.

Two audience members at a time, carefully guided by white-coated docents, could collaborate on building a song together in a purpose-built booth. The piece attracted a significant amount of positive attention, being described as 'Social VR's Rock Band' by Wired, and others.

My role as lead developer ranged from prototyping musical interaction concepts at the start of the project, to co-ordinating the fantastic development team across several different cities and time zones, overseeing the asset pipeline of animated creatures from the art team into Unity, and working directly on various parts of the project.

Released on 25th December 2022, we're excited to see what people create with this fantastic musical tool!

  • Full project description on OKGO.net :
  • Within / Supernatural :
  • Horizons Studio:
  • Oculus Studios:

Unconfined

Samsung S8 Launch at Milan Design Week 2017
  1. 2017-2018
  2. Developer
  3. Unity

Interactive installation by Universal Everything and Zaha Hadid Architects, commissioned by Samsung.

Presented at Milan Design Week 2017 to accompany the launch of the Galaxy S8 phone.

I've been a massive fan of Universal Everything's work ever since Matt Pyke formed the studio post-Designers Republic, so being invited to join them as a developer on this adventure was a bit exciting, to say the least!

The brief was to produce a real-time, interactive artwork projected onto seven monumental ‘petals’ designed by Zaha Hadid Architects, inhabiting a space also of their design, within a relatively short timeframe and with a lot of press attention on the final result: no small order.

With a tight timeline, the lead developer Chris Mullany and I developed a system for simulating choreographed bird-like motion onto thousands of unique ‘avatars’ – playful, generative striped characters who would fly and dance around the space.

We worked rapidly in Unity, with Chris handling the creation and rendering of the Avatars, and me handling the flocking motion and choreographies that would play in synchronicity with composed music from Simon Pyke / Freefarm.

The final setup in Milan required an impressive assortment of hardware, handled by an amazing crew of projectionists, d3 operators, sound and lighting designers, and our playback rig – 6 high-end PCs running Unity apps, all synchronised via network from a single server, from which we could adjust and refine all aspects of the show once we arrived in Milan.

HTC Vive Launch Site

Global launch site for HTC Vive
  1. 2016
  2. Lead Development
  3. JS, GLSL, WebGL
> viveready.htcvr.com
  • FWA PCA of The Year (Nomination):
  • FWA Site of The Month:
  • Awwwards SOTD:

Get Vive Ready by HTC, Google and B-Reel was a WebGL experience that invited the user to test whether they were ‘Vive Ready’, to promote the launch of HTC’s Vive VR headset.

The site uses a mobile device as a controller for the 3D desktop experience, and requires you to chop, dodge, swing and shake your way through four challenging levels.

Users completing all four levels were able to enter a prize draw to win an actual Vive headset and controller set.

The site involved some cutting-edge WebGL and mobile phone interaction, which we were happy to see recognised in a number of awards, including the FWA’s Site of the Month (which followed our project Mmorph’s award the previous month!).

Sónar Reykjavík Visuals

Building-scale generative visuals
  1. 2014-16
  2. Design, Coordination, Development
  3. NodeJS, DMX
> Watch 2016 Video
> Watch 2015 Video
  • HarpaPONG:
  • Sónar Reykjavik:
  • Harpa:
  • Press:
  • Festival Insights:
  • VjSpain: (Spanish)
  • Yi-Wen Lin (visuals collaborator) process blog post :

Sónar is an international festival of progressive music and multimedia arts, originally out of Barcelona but now taking place in several locations worldwide, including Reykjavik since 2013. It is hosted in the landmark building Harpa, a beautiful structure with a unique interlocking cell-like front facade (designed by Ólafur Eliasson), each cell containing an LED light fixture. Combined, they form a large outdoor screen that is visible across much of downtown Reykjavik.

Artist Atli Bollason and I were lucky enough to be the first outside artists to do something on this facade in 2014, with the publicly-playable arcade game Harpa PONG at Reykjavík’s Culture Night. After the success of PONG, Harpa and the Sónar organisers invited Atli and myself to do a repeat installation on the lights of Harpa for the festival in 2015, and again in 2016.

We knew we wanted to do a bit more than simply re-run PONG, so instead we decided – in addition to running the game – we would turn the entire building into an audio-reactive light show, taking the music being played inside and use it to drive the visuals outside.

To do this we reached out to other creative developers, providing them with a code framework, brief, and challenge to do something amazing with only 36 x 11 pixels. We ended up with over 12 functioning visual responses from 8 developers from around Europe, which was way beyond our expectations.
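
To give an idea of the constraint, a visual response for a canvas this size can be very small indeed. The render(frame, audio) signature below is a hypothetical stand-in for the framework's actual interface; it just sketches an audio-level-driven meter across a 36 × 11 grid.

    // Sketch of a minimal audio-reactive visual response for a 36x11 pixel facade.
    // The render(frame, audio) signature is a hypothetical stand-in for the real
    // framework's API.
    const WIDTH = 36;
    const HEIGHT = 11;

    // frame: Uint8Array(WIDTH * HEIGHT) of brightness values (0-255)
    // audio: { level } - smoothed loudness from the stage feed, 0..1 (assumed shape)
    function render(frame, audio) {
      const litRows = Math.round(audio.level * HEIGHT);  // rows to light in each column
      for (let x = 0; x < WIDTH; x++) {
        for (let y = 0; y < HEIGHT; y++) {
          const on = y >= HEIGHT - litRows;              // fill upwards from the bottom row
          frame[y * WIDTH + x] = on ? 255 : 0;
        }
      }
    }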

The Light Organ

Audio-visual public installation
  1. 2016
  2. Development
  3. NodeJS, Max/MSP, DMX
  • Vice Magazine / The Creators Project :

HARMONY IS COLOUR.
PITCH IS POSITION.
STRENGTH IS BRILLIANCE.

For a few days in February 2016, visitors of Harpa Music Hall in Reykjavík were invited to play the façade of the building as they would an instrument. A “light organ” was placed on the 4th floor balcony, with a stunning view of the inside of the geometrical glass front and the downtown area.

Anyone who passed through could learn how to play in blue or red or green, with quick flashes or swelling pads of light, and impress the whole city with an optical performance.

Conceived & produced by Atli Bollason.
Developed by:
Yuli Levtov, Reactify
Ragnar Ingi Hrafnkelsson, Reactify
Owen Hindley
Build supervisor:
Jonas Johansson

Resonate 2015 Theme

Festival music composition + Sound Design
  1. 2015
  2. Composer, sound designer

Resonate is a festival that brings together artists to drive a forward-looking debate on the position of technology in art and culture.

Held annually in Belgrade, Serbia, it draws visual artists, programmers, performers, musicians and curators to present and discuss their work, share new ideas, work, knowledge and plum brandy.

For the trailer, FIELD collaborated with director / designer Antar Walker to bring their graphic identity for the festival into motion.

I was approached by FIELD to collaborate with the team on the musical score and sound design for the piece. I immediately brought my long-suffering friend and collaborator Ragnar Hrafnkelsson on board, and together we composed, mixed and mastered the final soundtrack.

In addition to receiving considerable coverage online, the trailer was also aired on Serbian national TV in the run-up to the festival opening.

  • Resonate Festival:
  • Ragnar Hrafnkelsson / Reactify Music:
  • Antar Walker:
  • FIELD.io:

The Piñata Smashing Robot

Physical/digital advertising stunt
  1. 2015
  2. Electronics, Design, Build
  3. Node.js

International agency Isobar collaborated with B-Reel and Groovy Gecko to create a live, web-connected experience that allowed users to smash a real-life piñata using a robotic arm, in order to promote the new flavours of Pringles Tortillas.

Users in the UK and Germany were able to sign up via Twitter and/or Facebook to take a hit at the unlucky piñata, and successful entrants would win a Tortillas-related prize!

B-Reel approached me to lead the design and build of the robotic arm – a really fun challenge!

This required a lot of research into fabrication methods, and learning about pneumatics for the first time, combined with electronic control (via Phidgets) and backend development to connect the entire system to the website (being developed by Isobar).

After constructing two beautiful robots (Pedro & Mario, in reserve), we set to work smashing piñatas, the arm pounding each one with 20psi of force per hit. Over 40 piñatas a day were obliterated live online, in a custom-built Mexican marketplace set deep within East London.

  • Isobar case study:
  • B-Reel case study:

Spectra 2

Kinetic sculpture by FIELD
  1. 2014
  2. Electronic engineering, developer
  3. Arduino, Node.js
  • Creators Project :
  • Field.io :
  • Accept & Proceed :
  • Laurence Symonds :
  • Edu Prats Molner (jocabola) :

Spectra 2, a mesmerising kinetic sculpture by FIELD.io in collaboration with design studio Accept & Proceed, was displayed in the studio’s East London gallery space in Summer 2014.

The piece consists of suspended polished steel segments, controlled by stepper motors & custom electronics, that form rippling terrains inspired by NASA lunar meteor impact data.

Collaborating with lead engineer Laurence Symonds, FIELD and A&P, I was responsible for the design & implementation of the motor software control system, which played back a pre-choreographed sequence (pre-visualised and designed in Houdini by FIELD). I was also very happy to be part of the superhuman build, test & installation team.

Custom Arduino firmware and a unique serial protocol needed to be developed in order to synchronise movement across all 48 motors for the duration of the hour-plus long sequence. A Node.js server was written to handle playback and synchronisation with a wall-mounted display, and triggering of audio samples via OSC.
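
The actual protocol isn't published here, but the general shape of the problem (getting 48 synchronised target positions down a serial line every frame) can be sketched like this; the start byte, little-endian layout and checksum are illustrative assumptions, not the real firmware spec.

    // Sketch: encode one frame of 48 stepper target positions for a serial link.
    // Frame layout (start byte + 48 x 16-bit positions + checksum) is an
    // illustrative assumption, not the actual Spectra 2 protocol.
    const MOTOR_COUNT = 48;
    const START_BYTE = 0xa5;

    function encodeFrame(positions) {          // positions: array of 48 ints, 0-65535
      const buf = Buffer.alloc(2 + MOTOR_COUNT * 2);   // start + payload + checksum
      buf[0] = START_BYTE;
      let checksum = 0;
      positions.forEach((pos, i) => {
        buf.writeUInt16LE(pos, 1 + i * 2);
        checksum = (checksum + pos) & 0xff;
      });
      buf[buf.length - 1] = checksum;
      return buf;
    }

    // A playback loop would step through pre-choreographed keyframes at a fixed
    // rate, write each encoded frame to the serial port, and fire OSC messages
    // for the audio samples at the same timestamps.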

XBox Glitch ARG

Online ARG / Cryptic competition
  1. 2014
  2. Concepting, development and sound design
  3. HTML5, PHP, Server administration
  • Huffington Post :
  • Campaign Live :

‘The Glitch’ was a highly secretive and unusual campaign launched by Microsoft, CP+B and B-Reel that took players on a journey across the Internet, challenging them to solve puzzles and crack codes to win a mysterious prize.

This formed part of the UK Xbox One launch, and part of Microsoft’s effort to engage directly with hardcore gamers by working on their level. The glitch took the form of a one-second ‘disruption’ to the Xbox TV ad that had already been on regular rotation on UK TV channels for some time. Hidden within this were a number of codes and clues that would set players off on their journey.

I headed up the development side of the project team at B-Reel, and was responsible for laying out this network of cryptic sites across the web in such a way as to give no clue as to who, or what was behind the Glitch.

A sophisticated PHP backend system was built to monitor and control all the various endpoints, allowing us to track players as they moved through the contest and to shut off routes as soon as prizes had been awarded.

In addition, I was also responsible for the glitched-up sound design across all of the routes, which added to the air of mystery and suspense surrounding the campaign.

  • B-Reel :
  • CP+B London :

City Of Drones

Audio-visual experience for web + installation
  1. 2014
  2. Audio and Physics developer
  3. WebGL, WebAudio, DART

City of Drones is an interactive digital environment developed by musician John Cale, speculative architect Liam Young and digital artists FIELD. Charting the story of a lost drone drifting through an abstract cityscape, players are invited to pilot a virtual craft and remotely explore this imaginary world.

Samples from Cale’s original soundscape compositions echo across the landscape as we see the city through the eyes of the drone, buzzing between the buildings, drifting endlessly, in an ambient audio visual choreography.

City of Drones lives online, and as an installation at the Barbican Centre, London, as part of the Digital Revolution exhibition.

I was brought on to the project to develop the dynamic audio of the piece with the team at FIELD. This involved remixing original material from John Cale, including bespoke work from BXFTYS, and setting it all in a constantly changing, immersive 3D sonic landscape.

Spot sound effects are triggered by different locations within the environment, and the sounds of drones as they fly past are processed using the WebAudio HRTF panner to accurately position the sounds in 3D space.

The background music layer and ambience also change as you fly through the different areas. In addition, I contributed to the physics of the drones, giving them some freedom of movement as they fly around the space, avoiding buildings and each other, and I developed the on-screen HUD graphics and animation.
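
The core of the 3D positioning is the Web Audio PannerNode with its HRTF panning model. This isn't the project's actual code, just a minimal sketch of the API involved:

    // Minimal Web Audio sketch: position a looping drone sound in 3D space using
    // HRTF panning. Not the project's actual code - just the core API calls.
    const ctx = new AudioContext();

    function createDroneVoice(buffer) {
      const source = ctx.createBufferSource();
      source.buffer = buffer;
      source.loop = true;

      const panner = ctx.createPanner();
      panner.panningModel = 'HRTF';        // head-related transfer function panning
      panner.distanceModel = 'inverse';    // volume falls away naturally with distance

      source.connect(panner).connect(ctx.destination);
      source.start();

      // Call every frame with the drone's position relative to the listener.
      return (x, y, z) => {
        panner.positionX.value = x;
        panner.positionY.value = y;
        panner.positionZ.value = z;
      };
    }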

  • Field.io :
  • Liam Young :
  • John Cale:
  • BXFTYS:

Project Archive

Older projects that define my early career and interests.

Google Web Lab
London Science Museum installation + online experience
  1. 2013
  2. London

Chrome Web Lab is a series of interactive experiments by Google Creative Labs, accessible both throughout the world, and at the Science Museum, London.

The aim was to bring the magic of the web to life, both online and in the physical world.

As part of the development team at B-Reel, I was responsible for aspects of the front-end development across the entire site.

This involved cutting-edge HTML5 development, tight integration with a complex back-end system, and real-time interaction via WebSockets, all in collaboration with the talented team at B-Reel, as well as Tellart and Google.

Ray Ban 75th Anniversary website
HTML5 brand experience
  1. 2012
  2. London

To celebrate the 75th Anniversary of the Ray-Ban brand, a site was commissioned that allowed the user to literally scroll through the decades, highlighting on the way inspirational stories of individuals that represented the brand's “NEVER HIDE” philosophy.

As part of the team at B-Reel, I acted as lead developer on the project, delivering a premium HTML5 experience across tablet and desktop.

Subsource Dubumentary
Sound supervisor on independent music documentary
  1. 2011
  2. Guildford

Directed by Colin Arnold at The Surgery Productions, Subsource: A Dubumentary follows UK band Subsource on tour across the UK and Europe, giving a unique insight into what it means to be making it as a touring, gigging group of musicians in 2011.

The film was toured around film festivals in Europe and the UK. I was brought in to mix and edit the audio in post-production for cinema release.

Say What
Mobile lyric matching & rhythm game
  1. 2011
  2. London

Say What?! is a fast and fun new music game that gets your brain and your fingers moving…And helps you learn the right lyrics to your favourite songs!

On its worldwide launch, the game immediately hit the No.1 spot in the Top Free-Music category on the App Store, and has received coverage in the Guardian & London's Metro newspapers.

I was part of the two-man development team at 8linQ responsible for delivering the app. As well as the game itself, a back-end infrastructure was also developed in-house to deliver in-app purchase content and provide a Facebook-integrated leaderboard mechanic.

Wizarding World of Harry Potter Online
Launch site for Universal Orlando's new Harry Potter attraction
  1. 2010
  2. USA

Award-winning site involving interactive realtime 3D graphics, large quantities of video and backed up by a national TV advertising campaign.

I joined the team at Mediastation as a Flash/AS3 front-end developer and server-side .NET specialist to assist in building the site, and ensuring it was delivered fully integrated within the client's existing IT infrastructure.

Flash AS3 Developer, Back-end ASP.NET development, IT Liaison to Universal Orlando IT. Also responsible for sound design & music editing across the site.

The Beatles iTunes launch
iTunes LP authoring
  1. 2010
  2. London

The Fab Four's much-publicised arrival on iTunes was accompanied by a set of exclusive iTunes LPs to accompany album purchases.

Users downloading entire albums would receive an interactive digital pack (the iTunes LP) containing artwork, liner notes and historical notes.

I was part of the in-house team at Metropolis Group that authored each of the 17 LPs in time for the release. iTunes LP authoring, extensive HTML/JavaScript/CSS3 development and WebKit animation.

Promo video for You Heard Nothing
Promo video creative direction
  1. 2009
  2. Guildford

I managed and performed with 'You Heard Nothing', a group of DJs, VJs, designers and filmmakers based in Guildford, Surrey. We regularly played nights around the Surrey and London areas, having supported acts such as Mary Anne Hobbs, Zane Lowe, The Freestylers and Pendulum. We held a monthly residency at The Boileroom, Guildford.

I wanted to produce a video that illustrated, in an interesting way, the sensation of a night out at You Heard Nothing, and came up with the concept for this video, 'Last Night'. Shot over a weekend in Guildford, it follows the series of trailers, interviews and idents that can be seen on our website.

Concept, Casting, Location, Creative Direction, Sound Design & additional Motion Graphics.

Virtual Guru
Museum exhibit & teaching tool for classical Indian vocal music
  1. 2009
  2. London

Created by Flaming Pear Interactive for the Asian Music Centre in West London, The Virtual Guru is a piece of educational software that guides the user through a number of classical Indian raags. The user can first listen to the melody, and then attempt to recreate it with or without musical accompaniment. Their performance is indicated on the screen by a continuous line showing their pitch, overlaid upon a similar line showing the correct or “guru's” performance.

I was tasked with the development and installation of the software, in time for the visit by HRH The Prince of Wales, which made the headlines.

Pitch detection & template creation software in Pure Data (pd), communicating with a C# host via OSC.

Application developed in C#, UI created using WPF.

RAVE realtime video editor
Final year project BEng Audio Media Engineering
  1. 2009
  2. Guildford

RAVE gives VJs, conference graphics operators, editors and other video professionals a unique method of editing and displaying video content live, in real-time.

Using a networked system of ordinary PCs, the user can create frame-accurate sequences from existing source material, combine them with other clips or sequences, then instantly assign them for playback on any or all of the attached machines.

The interface is multitouch-compatible, which required (in the pre-Windows 7 world) re-writing the standard Windows input API to take advantage of the additional input.

Concept, interface design, video playback engine and networking were created specifically for the project.

Madame Tussauds Teen Idol - Karaoke machine
Museum exhibit
  1. 2009
  2. London & Berlin

Created by Flaming Pear Interactive for two new exhibits at Madame Tussaud's London and Madame Tussaud's Berlin, this interactive karaoke system allows visitors to sing along with hits by Zac Efron, Miley Cyrus and The Jonas Brothers (London version), and Take That, Madonna, Beyonce and Michael Jackson (Berlin version).

The interactives consisted of a touchscreen kiosk plus microphone, and displayed the user's sung pitch continually against that of the original vocal, with animated graphical feedback to let the user know how they performed.

I was responsible for the development & installation of both interactives: in person in London, and liaising with on-site staff in Berlin.

Pitch detection & template creation software in Pure Data (pd), communicating with a C# host via OSC.

Application developed in C#, UI created using WPF.

The Fallow Field
Sound supervisor on independent horror film
  1. 2009
  2. Guildford

Amnesiac Matt Sadler awakes alone in the middle of a wilderness with no recollection of the past seven days. Again.

Shot over 8 days in the Surrey Hills, this collaboration between first-time director Leigh Dovey and producer Colin Arnold matches the beauty of the English countryside at harvest time with the savage and the supernatural. The film was snapped up for international distribution shortly after completing post-production, and has appeared at film festivals across the UK.

Sound Supervisor; on-location sound recordist, responsible for all post-production audio, including foley, sound design, music editing, mixdown, mastering & delivery of the final soundtrack.

Hitachi - Inspire Life Exhibition
Multi-touch interactive installation
  1. 2008
  2. London & Paris

I was tasked with building 22 interactive 'tables' for Hitachi's high-profile 'Inspire Life' event in London & Paris.

The tables allowed people to explore Hitachi's products and services using a unique 3D 'Rubik's cube' interface.

The interfaces would also have to be multitouch, so that a 'spread' motion on the surface would explode the cube, showing the product headings. Pressing an individual cube would display information and images related to that area.

  • Custom FTIR-based multitouch input system
  • Multitouch input driver using the OSC protocol for Windows / C#
  • Realtime 3D engine using Windows Presentation Foundation (WPF) / C#
  • Networking software for synchronised multi-screen 'train' animation
  • On-site installation & maintenance in London & Paris

Talks & Education

When I've been asked to lecture or run workshops for students or institutions about my work or projects.

Making Live Visual Machines In TouchDesigner
Visuals workshop at Raflost 2023
  1. 2023
  2. Hafnarhaus, Iceland
VR Masterclass
Lecture / Workshop
  1. 2022
  2. Þjóðleikhúsið (National Theatre of Iceland)
All Watched Over By Lines Of Loving Grace
Rant/discoveries about Unity line rendering
  1. 2021
  2. Innovation House, Reykjavík
Who are Huldufugl?
Lecture
  1. 2021
  2. Berghs School of Communication, Stockholm
Worlding Deserts
Panel discussion about virtual worlds
  1. 2020
  2. Night Of Ideas, Institut Français in London
Industry Expert Introduction
Lecture
  1. 2020
  2. Hyper Island, Stockholm
Designing Tiny for Massive
Short course for Graphic Design students
  1. 2019
  2. Listaháskóli Íslands (Icelandic University of the Arts)
Tools for Collaboration
Lecture
  1. 2017
  2. Resonate Festival 2017, Belgrade
Processes for creativity
Lecture
  1. 2017
  2. Listaháskóli Íslands (Icelandic University of the Arts)
Creative Technology Workshop
Arduino, web, and creative technology for Design + Communication students
  1. 2017
  2. Hyper Island, Stockholm
Designing for VR
Lecture
  1. 2017
  2. Tækniskólinn, Reykjavík
Dynamic Architecture
Lecture
  1. 2016
  2. Sónar Reykjavík
Creative Coding For Designers
Course instructor for 2nd year Graphic Design students
  1. 2016
  2. Listaháskóli Íslands (Icelandic University of the Arts)
About & Bio

Owen (he/him) is a collaborative digital artist, working with software, sound, and electronics for the web, VR, performances, devices and installations.

His commercial work has included serving as technical lead on large, cutting-edge projects for clients such as Oculus Studios, Google, Microsoft/Xbox, Samsung and Mercedes, collaborating with digital creatives such as Universal Everything, FIELD and B-Reel. These projects have ranged from Web Lab, an interactive web-connected musical installation for Google at the London Science Museum, to a pneumatic live-streamed piñata-smashing robot.

He is co-founder of Huldufugl, a Reykjavik-based theatre and events company experimenting with new mediums in theatre, both technological and immersive.

He is also co-founder of Horizons Studio, a London-based creative studio producing interactive musical experiences in VR. Their debut title, Horizons VR was commissioned by Google as a launch title for their Daydream platform, with releases on Oculus and HTC platforms planned for 2019.

He was also Sound Supervisor and on-set sound recordist on two short independent films, The Fallow Field and Who We Are, and sound editor on the Subsource Dubumentary.

Awards
Nominations
Links
Recent Employment history
  • 2020-2021 Senior Developer, Volta XR
  • 2014-2020 Freelance Developer / Artist
  • 2011-2014 Senior Developer, B-Reel London