This is our story...

In a very special guest write-up and video premiere for Nest HQ, Andy of The M Machine (pictured above) talks about his work on the group’s intricate live visual set-up, as demonstrated in the new “LIVE video mix” of Shinichi Osawa’s “Tiny Anthem” remix. In addition to the visuals, Andy supplies the lead vocals on this track. Check out his guest write-up and the new video below:


When we first conceived of The M Machine, we decided that the project would place a heavy emphasis on visuals, particularly during our live performances. During the project’s infancy, I designed, built, and programmed a large LED light wall in the shape of our “M”, and we made it the central focus of our live show. For every piece of audio we triggered live, I programmed a corresponding light sequence on the M. By syncing audio clips in Ableton with their corresponding light sequences, we could showcase a perfect visual representation of every sound we performed.
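The pairing described above, where each triggered audio clip plays back its own light sequence, can be sketched roughly like this. This is a hypothetical illustration, not The M Machine’s actual code: the sequence names, the `PanelDisplay` class, and the frame structure (36 brightness values, one per “pixel” in a panel) are all assumptions for the sake of the sketch.

```python
from typing import Dict, List

# One frame = 36 brightness values, one per "pixel" in a light panel.
Frame = List[int]

# Hypothetical mapping from Ableton clip names to light sequences.
SEQUENCES: Dict[str, List[Frame]] = {
    "bass_stab": [[255] * 36, [0] * 36],  # flash fully on, then off
}

class PanelDisplay:
    """Stand-in for the LED hardware driver."""
    def __init__(self) -> None:
        self.history: List[Frame] = []

    def show(self, frame: Frame) -> None:
        # A real driver would push this frame out to the LED panels.
        self.history.append(frame)

def on_clip_triggered(name: str, display: PanelDisplay) -> None:
    """Play the light sequence paired with a triggered audio clip."""
    for frame in SEQUENCES.get(name, []):
        display.show(frame)
```

Keeping the audio clip and its light sequence keyed by the same name is what keeps the two in lockstep: triggering one always plays the other.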

After touring with the “M” exhaustively (including a marathon month-long tour across North America with Porter Robinson), we decided that it was time to start conceiving a new way to represent our music visually and present a more immersive and cinematic live experience.

Frankly, I felt that I had exhausted the creative possibilities of my “light instrument” – there are only so many ways to represent a sound with 36 “pixels” (which is essentially what each light panel comprising the “M” is). I wanted to greatly expand my creative landscape while maintaining the aesthetic of a perfectly synchronized visual representation of our sound. Enter the virtual “M”.

Using TouchDesigner, a real-time, node-based 3D and visual composition tool, we created a virtual “M” that can represent the sequenced light patterns exactly as our physical “M” can, with the added ability to display video on the face of the “M”. By compositing the 3D generative “M” with fully rendered scenes and camera movement created by our “Metropolis” teaser artist Scott Pagano, we created an engine that allows us to zoom in and out of the visual world displayed on the face of the “M” in real time.

Using video to create visual sequences adds an incredibly powerful new dimension to our live show. In this example, I’ve created a video remix of the music video for our track “Tiny Anthem”, re-envisioning the video in the context of Shinichi Osawa’s frantic remix of the track. After chopping our music video into numerous clips, I load them into Resolume Avenue and “sequence” them into a cohesive narrative from Ableton via OSC. By automating clip speed, video effects, and masking in Ableton, I can create synchronized video mixes that expressively represent the audio in nearly limitless ways. And by mapping the video onto the face of our virtual “M” in TouchDesigner, I can zoom in and out of the “M” in real time, creating a truly immersive visual experience that more perfectly represents our sound and our approach to pushing the boundaries of creative technology as far as we can.
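The Ableton-to-Resolume link above rides on OSC, which is just small UDP packets with a simple binary layout. As a minimal stdlib-only sketch (not the band’s actual code), here is an encoder for basic OSC messages and an example send; the `/composition/layers/.../connect` address and the port 7000 follow Resolume’s documented OSC conventions, but treat both as assumptions to check against your own Resolume OSC input settings.

```python
import socket
import struct

def osc_message(address: str, *args) -> bytes:
    """Encode a minimal OSC message supporting int and float arguments."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated, then padded to a 4-byte boundary.
        return b + b"\x00" * (4 - len(b) % 4)

    msg = pad(address.encode("ascii"))
    # Type tag string: "," followed by one tag per argument (i=int32, f=float32).
    tags = "," + "".join("i" if isinstance(a, int) else "f" for a in args)
    msg += pad(tags.encode("ascii"))
    for a in args:
        msg += struct.pack(">i", a) if isinstance(a, int) else struct.pack(">f", a)
    return msg

# Example: trigger clip 1 on layer 1. Resolume listens on UDP port 7000 by
# default (an assumption -- verify in Preferences > OSC). UDP sendto succeeds
# even if no listener is running, so this is safe to run anywhere.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
packet = osc_message("/composition/layers/1/clips/1/connect", 1)
sock.sendto(packet, ("127.0.0.1", 7000))
```

Clip speed and effect parameters ride on the same mechanism: Ableton automation is translated into a stream of messages like these, one float per parameter change, which is what keeps the video mix locked to the audio.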

The M Machine
