PlanMixPlay


Summary

PlanMixPlay is, undoubtedly, my biggest undertaking yet. I became inspired when I saw 2manydjs and their Under the Covers tour. It's the kind of artistic work that seems so intuitive and direct. With performing DJs having become a mainstay of the music industry, it baffles me that more artists aren't trying to do more with the visuals. I think there's a lot of potential, and perhaps even more potential hiding behind what a live performance interface could be when the whole screen is malleable.

While not exactly ubiquitous yet, larger touch devices are becoming more affordable and widespread with every passing day. It won't be too long before these larger devices reach consumer-friendly prices (provided there's a demand), and I think leveraging such a device for live performances is incredibly exciting. Interactive surfaces have a number of significant shortcomings that need to be addressed, but these matter the most when reimplementing existing interaction metaphors. I think it's time we move beyond virtual vinyl spinning behind a solid piece of glass.

Associated Publications

  • A Multi-Touch DJ Interface with Remote Audience Feedback
    Lasse Farnung Laursen, Masataka Goto, Takeo Igarashi, in ACM MM '14: The 22nd ACM International Conference on Multimedia, pp. 1225-1228
    Published PDF

Quick Facts

  • Built from the ground up in C++11.
  • Relies on a number of solid libraries, including Cinder and BASS.
  • Utilized for live performances on a semi-weekly basis on Slayradio.
  • Implements variable immediacy - an industry first!
  • Has its own website.

The Vision

Most popular performance instruments that use pre-recorded media are of a different generation. They are evolving, but they remain firmly rooted in a device that is decades old: the vinyl turntable. I think there's a lot of room for innovation. I'm not talking about something that will directly improve upon or supplant existing tools, but rather something that adds completely new possibilities to the landscape of performance instruments - something that allows for previously impossible, or simply unfeasible, events and interactions. This is where PlanMixPlay comes in. I see three incredibly potent areas of development with PlanMixPlay:

Simultaneous Live Audio/Video Performance

In very general terms, most performance instruments are primarily concerned with outputting either audio or video. Each medium individually offers seemingly endless depths of what can be expressed and, consequently, seen or heard by an audience. Instruments that produce both audio and video typically generate one as a function of the other. A notable exception is Max/MSP/Jitter. It would take several paragraphs of carefully crafted text to detail exactly how PlanMixPlay differs, but the most accurate brief statement is that PlanMixPlay adopts editing metaphors (common in production environments) and adapts them for the purposes of a live performance. Most other performance instruments do not provide a performer with the same flexibility when dealing with both audio and video. Max/MSP/Jitter does, and provides an encompassing yet quite complex method of doing so. One way of looking at it is to say that Max/MSP/Jitter is more of a toolkit for building performances and/or performance instruments.

I will emphasize that providing audio and/or video output as a function of the other is a natural evolution of existing interaction patterns. However, when the foundation for those interactions is intrinsically tied to either audio or video, the interactions will likely continue to favor one over the other. PlanMixPlay has been designed from the ground up to balance its interaction patterns so that they favor both audio and video equally.
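As a purely illustrative sketch of that balance - not PlanMixPlay's actual architecture, and with every type and member name below invented for the example - one way to express "audio and video as co-equal outputs" in C++11 is to let a single performable element own both media and have a single trigger drive them together:

    #include <string>
    #include <utility>

    // Hypothetical sketch: the audio half of a performable element.
    struct AudioPart {
        std::string samplePath;  // e.g. a sample decoded by an audio library such as BASS
        double      gain = 1.0;
    };

    // Hypothetical sketch: the video half of the same element.
    struct VideoPart {
        std::string clipPath;    // e.g. a texture or movie handled by a framework such as Cinder
        float       opacity = 1.0f;
    };

    class MediaClip {
    public:
        MediaClip(AudioPart audio, VideoPart video)
            : mAudio(std::move(audio)), mVideo(std::move(video)) {}

        // One gesture affects both media at once: neither output is derived
        // from the other, they are simply performed together.
        void trigger(double songPositionBeats) {
            startBeat = songPositionBeats;
            playing   = true;
            // Real code would start mAudio playback and begin presenting mVideo here.
        }

        bool   playing   = false;
        double startBeat = 0.0;

    private:
        AudioPart mAudio;
        VideoPart mVideo;
    };

The point of the sketch is only the shape of the abstraction: the audio and video parts sit at the same level, rather than one being computed from the other.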

Variable Immediacy

One of the most fascinating aspects of live performances, to me, is that of liveness, perhaps most deeply explored by Auslander. The term is most easily explained in conjunction with immediacy: the time between an action and the reaction it causes. In a performance context, this would be a DJ scratching a record and the audience perceiving that record scratch via the attached speakers. Another example would be a drummer striking a drum and the sound being immediately audible. Liveness in a performance context often refers to performances that have low (fast) immediacy and utilize instruments that maximize expressiveness. That last aspect is important, as otherwise one could use the definition to claim that a CD player is the epitome of liveness.

Sergi Jordà contributes a concrete formula defining the efficiency of a musical instrument as follows:

MusicInstrEffic_corrected = (MusicOutputComplexity × PerformerFreedom) / ControlInputComplexity
where
PerformerFreedom = PerfFreedMovement × PerfFreedChoice

In very brief terms, his formula states that a musical instrument's efficiency is maximized when its output complexity and the performer's freedom are maximized, while the control input complexity is minimized. In other words, an efficient musical instrument is one that allows for a great deal of flexibility in terms of both output and performer freedom, yet demands as little input complexity from the performer as possible.
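To make the formula concrete, here is a deliberately crude comparison; the numbers are invented purely for illustration and are not measurements of any kind. A CD player produces highly complex output from trivially simple input, but offers the performer almost no freedom, whereas a guitar offers enormous freedom at the cost of more complex input:

    CD player : MusicInstrEffic_corrected = (10 × 0.1) / 1 = 1
    Guitar    : MusicInstrEffic_corrected = (10 × 10) / 5  = 20

The PerformerFreedom term is what keeps the CD player from scoring as the "perfect" instrument, mirroring the point about liveness above.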

With all that out of the way, we can finally return to variable immediacy. It's a method by which each performer can separate themselves from liveness to the degree they are comfortable with. Imagine for a moment that you're a novice guitar player. Playing a single note is difficult at first, but eventually possible to do in a timely fashion. As the number of notes that must be hit increases, and as they grow in complexity (chords), the act of playing the guitar becomes far more difficult. Learning to do this expediently and accurately is all part and parcel of learning to play the guitar. But what if you could separate yourself from time? What if you could (seamlessly) move into the future, play a few chords, and then return to right now and continue playing? That is variable immediacy: separating performers from time, allowing them to build and perform more complex compositions.
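As a rough illustration of the underlying idea - a minimal C++11 sketch, not PlanMixPlay's actual implementation, with all names below hypothetical - variable immediacy can be thought of as scheduling actions at a chosen offset into the future, where an offset of zero corresponds to a fully "live" action:

    #include <functional>
    #include <queue>
    #include <utility>
    #include <vector>

    // A performer action scheduled for some point on the performance timeline.
    struct ScheduledAction {
        double                executeAtBeat;  // when the action should take effect
        std::function<void()> action;         // e.g. start a clip, apply an effect
    };

    // Order the queue so the earliest scheduled action is always on top.
    struct EarliestFirst {
        bool operator()(const ScheduledAction& a, const ScheduledAction& b) const {
            return a.executeAtBeat > b.executeAtBeat;
        }
    };

    class ImmediacyScheduler {
    public:
        // offsetBeats == 0.0 means maximal immediacy (the action fires right away);
        // larger offsets let the performer "work in the future".
        void schedule(double currentBeat, double offsetBeats, std::function<void()> action) {
            mQueue.push({ currentBeat + offsetBeats, std::move(action) });
        }

        // Called every audio/render tick with the current timeline position.
        void update(double currentBeat) {
            while (!mQueue.empty() && mQueue.top().executeAtBeat <= currentBeat) {
                mQueue.top().action();
                mQueue.pop();
            }
        }

    private:
        std::priority_queue<ScheduledAction, std::vector<ScheduledAction>, EarliestFirst> mQueue;
    };

A purely "live" performer would keep every offset at zero, while someone building a denser audio/video sequence can lay out actions a few beats ahead and let them fire on their own - choosing, per action, how far from immediacy they want to be.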

Audience/Social Media Integration

The vast majority of live performance instruments do not integrate any sort of feedback from the audience. Not only are there several good reasons for this, there are also credible arguments that this shouldn't change. Personally, as "dangerous" as it can be to integrate audience feedback, there are situations where doing so is becoming increasingly attractive. Remote broadcasting/streaming has grown at an incredible rate over the past years, and it's a trend I'm sure most agree will continue. A distributed audience is becoming a common fixture, and while performance artists could previously rely on their entire audience being right in front of them to see and hear, such certainty is quickly becoming a thing of the past.

I think there's incredible potential in integrating audience feedback directly into a performance interface. There are - of course - many pitfalls to avoid, as well as careful considerations to make in terms of how to balance and best encourage interaction between the performer and the audience. But it's a source of potential I think is worth exhaustively exploring.

Acknowledgments and Links

Numerous individuals were involved in both testing and providing helpful feedback as part of this project. I'd like to thank Phil Huey, Makoto Nakajima, David Roy, Ryan Modality, and Jan Rod for helping with early interviews, participating in later user studies, and taking part in the first private in-house party use of PlanMixPlay. Thank you also to Mark Jackson for participating in the latter event.



© Lasse Laursen 2015 - 2021