Professor Rob Toulson (University of Westminster) and Professor Justin Paterson (London College of Music at University of West London) have developed a unique interactive-music release format for iOS – variPlay. They tell us about the possibilities of interactive music, academic research, what variPlay offers and why this is relevant to the BPI Innovation Hub.

Most of us are all too aware that streaming currently carries the greatest momentum for delivering music to the consumer. We welcome its convenience and often happily pay, if not for a subscription, then perhaps for a mobile data plan that facilitates freemium listening. Vinyl resurgence apart (another story), gone are the days of the physical clutter of CDs and the virtual clutter of an enormous library of downloads, with all the hassle of syncing bits of it to mobile devices. Playlists are great too, but beyond such benefits, how has streaming improved the listener experience? We still just listen to a version of a track that in essence plays its music exactly the same as Emile Berliner’s Gramophone in the 1880s, or a CD some 100 years later.

Of course, playback quality improved, we evolved to stereo, then to digital, but the nature of the music that we hear has largely remained cast in stone. However, it's interesting to note how DJs subverted this with dual turntables, and how much consumers (at least those who were dancing) enjoyed it. In response, electronic music composers often designed their musical arrangements to expect such treatment, and in turn, such music further empowers the DJ. Video-game composers also go to great lengths to make their music adapt to gameplay, perhaps changing length and arrangement, in order to 'fit'. Now consider going to a rock concert; if the band sounds exactly the same as the recording that you know, it might feel a little dull. Much of the excitement comes from the energy of performance and the variations that this introduces. The broadcast industries are moving towards object-based models where content can be reassembled at the point of delivery, optimised not just for the playback device, but also tailored in runtime length to consumer preference.

Still, despite all this, the vast majority of music playback is static. Why? Perhaps the main reason is the mode of delivery… we have the definitive version of a song and we encode it on a physical medium or a server. Yet a typical smartphone – perhaps the most common playback device today – has the power not just to play back a stereo version, but to do many more things as well. It might also stream a video or create engaging graphics in real time. Why not use this computing power to take the music itself in exciting new directions, empowering the artist’s creative vision and offering the listener an enhanced and more exciting experience? Today, recorded music can be interactive, and this opportunity has been considered by a number of visionary artists, app developers and academics.

In 1993, Todd Rundgren released ‘No World Order’, which claimed to be the world’s first interactive-music CD-ROM – yup, it’s been happening since then! Some 20 years later, the CD-ROM has long been superseded by the app: for instance, Björk gave us the exotic Biophilia, and Peter Gabriel released his MusicTiles, both of which offered degrees of user interactivity. These were generally bespoke standalone products, and although MusicTiles hinted at being available for other artists, it largely remained a one-off. There have been numerous other interactive-music apps developed commercially, and whilst some have offered unusual modes of changing the music, such as geolocation, many have replicated the classic 'recording-studio approach' and allowed users to alter the volumes of a number of music stems to create unique mixes. Whilst ‘uniqueness’ is certainly possible, it is difficult to create a really interesting mix, or certainly one that is better than the original that the artist and producer intended – all volumes unchanged.

We might immediately think of start-ups when it comes to industry innovation, but academia has shown considerable momentum in interactive music. There has been long-standing interest in algorithmic composition, commonly working under MIDI control, and it is commonplace for performers to build bespoke software interfaces to manipulate their music in real time, using languages such as Max or SuperCollider. This has built a well-established knowledge base that can also deliver interactive systems to the consumer. For example, it was Henrik Hautop Lund, a professor of robotics at the Technical University of Denmark, who did much of the development of MusicTiles, and Professor Mick Grierson of Goldsmiths, University of London led the creation of BRONZE, the algorithmic system that powers the 24-hour-long ‘Route One’ by Sigur Rós – without it ever repeating. Whereas MusicTiles is user controlled, BRONZE is totally machine controlled.

Our personal contribution to this world started in 2014, funded by the Arts and Humanities Research Council (AHRC). We conducted an investigation into interactive music playback, and then developed an iOS app – subsequently christened ‘variPlay’ – that was user influenced, but machine controlled. We identified the need for artist/producer quality control of the sound, and so rejected any MIDI-based solutions. We also wanted to create a framework that artists could populate with their own creative responses, thus offering a wide range of innovative reactions to the technology and effectively creating a release format. We were also wary of ‘gamification’ – that is, requiring the listener to ‘do something’ to the music in order to benefit from any enhanced listening experience.

As with numerous other app-based systems, we elected to use stems as the sound source, but we wanted to bypass the clumsiness of the faders and other controls typical of the recording studio. This demanded a bespoke and ultra-simple interface that controls the 36-track audio engine, underpinned by hidden-yet-intelligent management of audio crossfades for a totally smooth user experience. Artists could deploy this system at a range of levels: recycling instrumental tracks that were muted when the final mix was printed in the studio, integrating authorised remixes, creating bespoke content to extend the possibilities of the app playback, utilising alternative studio takes, and more. Some artists could use this technology to allow users to segue between genres in real-time listening, whereas others could create the effect of a live band that just plays a little bit differently every time.
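To give a flavour of the kind of hidden crossfade management described above, here is a minimal sketch of an equal-power crossfade between two stems. This is purely illustrative – it is not the variPlay implementation, and the function name and sample-buffer representation are our own assumptions for the example.

```python
import math

def equal_power_crossfade(outgoing, incoming, fade_len):
    """Blend the tail of one stem into the head of another.

    Equal-power gains (cosine/sine) keep perceived loudness steady,
    avoiding the dip in level that a linear crossfade would cause.
    (Illustrative sketch only -- not the variPlay engine itself.)
    """
    assert len(outgoing) >= fade_len and len(incoming) >= fade_len
    mixed = []
    for i in range(fade_len):
        t = i / max(fade_len - 1, 1)       # progress 0.0 -> 1.0 across the fade
        g_out = math.cos(t * math.pi / 2)  # outgoing gain fades 1 -> 0
        g_in = math.sin(t * math.pi / 2)   # incoming gain fades 0 -> 1
        mixed.append(outgoing[i] * g_out + incoming[i] * g_in)
    return mixed

# Example: fading between two constant-level stems of eight samples each
fade = equal_power_crossfade([1.0] * 8, [1.0] * 8, 8)
```

Because the squared gains always sum to one, a listener hears a seamless handover rather than a momentary drop in volume – the kind of detail that has to be invisible for the experience to feel like a single continuous performance.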

It was important that the app could respond to user guidance as to roughly how a given track might come out, but also that it could make autonomous decisions to produce myriad variations that relate to the arrangement of a song; rather like live-yet-automatic remixing. This meant that the listener could do the dishes yet still hear a unique version of the song being created in real time – every time, and in a style that the user chose; maybe rock on Saturday evening or chill-out electronica on Sunday morning. variPlay also features comprehensive rich-media content: credits, lyrics, artist bio, animated GIF buttons and links to all-things-Internet, from YouTube to Spotify playlists and the artist website – overall, giving an integrated hub for the fans. The app also features many screens of interface that might carry branding.
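The idea of user-influenced yet machine-controlled playback can be sketched in a few lines: at each section of a song, the engine usually honours the listener's chosen style, but occasionally departs from it so that no two plays are identical. The section names, style list and probability scheme below are entirely hypothetical – a toy model of the behaviour, not variPlay's actual algorithm.

```python
import random

# Hypothetical song sections and style palette -- illustrative only.
SECTIONS = ["intro", "verse 1", "chorus 1", "verse 2", "chorus 2", "outro"]
STYLES = ["rock", "acoustic", "electronic"]

def plan_arrangement(preferred_style, stickiness=0.7, seed=None):
    """Choose a playback style for each song section.

    The user's preference biases -- but does not dictate -- each choice,
    so every playback yields a slightly different arrangement.
    """
    rng = random.Random(seed)
    plan = []
    for section in SECTIONS:
        if rng.random() < stickiness:
            style = preferred_style      # usually honour the user's choice
        else:
            style = rng.choice(STYLES)   # occasionally surprise the listener
        plan.append((section, style))
    return plan

# One possible playback of a 'rock'-flavoured listen
plan = plan_arrangement("rock", seed=42)
```

Setting `stickiness` to 1.0 would reproduce a fixed, user-dictated version, while lower values hand more of the arrangement over to the machine – the same user-influenced/machine-controlled balance described above.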

Of course, streaming services now offer incredibly valuable listener data that reports how many people are listening to a track, and where. variPlay’s content and capability go way beyond this, phoning home not just equivalent data but also listener behaviour, including which styles or versions were being listened to in which sections of a song. So if, for instance, a certain territory seems to be enjoying an unplugged version of a song, there is certainly no point paying for the orchestra on that leg of the tour; or if there is a general preference for versions with a less grandiose chorus, then this could influence future artist development.
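The kind of per-section behavioural report described above might be aggregated along these lines. The event fields, track title usage and territory codes here are invented for illustration; they do not reflect variPlay's actual telemetry schema.

```python
from collections import Counter

# Hypothetical listening events phoned home by the app: which style was
# heard in which section of a song, and in which territory.
events = [
    {"track": "Red Planet", "section": "chorus", "style": "unplugged", "territory": "SE"},
    {"track": "Red Planet", "section": "chorus", "style": "unplugged", "territory": "SE"},
    {"track": "Red Planet", "section": "chorus", "style": "rock", "territory": "UK"},
]

# Aggregate style preference per territory -- the kind of report that
# could inform touring or artist-development decisions.
by_territory = Counter((e["territory"], e["style"]) for e in events)
unplugged_in_se = by_territory[("SE", "unplugged")]
```

Even this toy aggregation shows how section-level behaviour goes beyond a simple play count: the label learns not just that a track was played, but which version of it resonated where.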

The project culminated in the release of an app for the artist Daisy and The Dark – ‘Red Planet EP’ – along with an academic book chapter that documented the work, and a patent application. The AHRC has since provided further funding to help commercialise the variPlay format, and we are currently releasing a number of other artists around the world, including a song that went to viral chart No. 1 in nine countries. The next scheduled release is ‘Plan (Something Good)’ by Defab1 ft. Vanessa Knight. This is an electro-funk single that features five different base remixes, including an all-acoustic version. variPlay then creates 25 remixes from this source material, and by automatically switching between them, aligned to the song structure, hundreds of variations are possible. The app can be downloaded from the Apple App Store from mid September 2018.

We are currently looking for further artists and labels that would like to release on the format, and we have significant funding that can support this. Visit http://www.variplay.com/ for more details.

Work like this aligns perfectly with the mission of the BPI Innovation Hub. The hub seeks to introduce new tech ideas to forward-thinking labels in order to enhance what they can offer the consumer. The variPlay format offers a unique listener experience, enables artists to offer a deeper creative vision to their fans, gives labels novel data on listener behaviour, and provides a product that they can either sell or distribute as a powerful marketing tool. Perhaps it will not be long before music listeners view static playback as belonging with the gramophone and the CD…