Friday, April 30, 2021

Safwan Chowdhury Project 4 — Music Visualization



Concept:

A good friend of mine and frequent artistic collaborator, Trevor Dean Stewart, is a composition student at Columbia College Chicago. He shared a recent piece, Beam, written for the New Music Ensemble there, and asked me to develop some visuals for it.

Beam is an extremely complex composition: it combines a great deal of synthesized, electro-acoustic material written and performed by Trevor himself with the analog work of 19 other musicians in the Ensemble, and it ended up spanning over 50 tracks when arranged in Logic Pro and Ableton Live.

He and I sat down, listened through an initial mix of the composition, and divided it into six movements, each with a distinct sonic and emotional character; some were mostly digital, others mostly analog. We talked through his ideas for the visualization and the things that came to mind as he wrote, arranged, and listened to the music.

After I walked him through the many options for generative, modular, and parametric media native to MadMapper, I moved on to designing and programming the visualization.

Process:

Splitting each movement into individual sections (defined by a particular musical phrase or the predominance of a particular instrument, for example), I began to design visual compositions and record them to Scenes in MadMapper.

Over the course of the composition, I created nearly 300 surfaces (quads, triangles, and circles) in MadMapper. I assigned different media and applied different effects and blend modes to individual surfaces or groups of surfaces, with the goal of giving each movement its own visual elements that seemed to "map" onto, or follow, particular musical phrases or instruments.

After recording 25 scenes (some 30 seconds long, some 2.5 seconds, some instant transitions), I began programming the show control in QLab.

The first step was to "slice" the audio file in QLab, which made it easy to get timecodes in a format compatible with QLab's auto-follow cueing. I divided the track into sections, each equal in length to one of the movements of the composition.
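Since auto-follows are relative (each cue waits a fixed interval after the one before it), the absolute slice timecodes have to be turned into gaps between consecutive cues. Here is a minimal sketch of that bookkeeping in Python, with made-up timestamps; this is just the arithmetic behind the cue list, not QLab's API:

```python
def to_seconds(timecode: str) -> float:
    """Parse a 'MM:SS.mmm' timecode into seconds."""
    minutes, seconds = timecode.split(":")
    return int(minutes) * 60 + float(seconds)

# Hypothetical scene-change points, as absolute positions in the track.
scene_times = ["00:00.000", "00:30.000", "00:32.500", "01:05.250"]

absolute = [to_seconds(t) for t in scene_times]

# Each auto-follow delay is simply the gap between one cue and the next.
delays = [later - earlier for earlier, later in zip(absolute, absolute[1:])]

for cue, delay in enumerate(delays, start=1):
    print(f"cue {cue}: auto-follow after {delay:.3f} s")
```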


From that point, I timed each MadMapper scene and scene transition against the timestamp of the song as it played as an audio cue within QLab. Each OSC cue fires a particular scene in MadMapper in time with the song, and the fades then take place internally in MadMapper. I could instead have handled the fades within QLab rather than MadMapper; either choice would have worked.
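For anyone curious what one of those OSC cues actually does on the wire, here is a rough sketch using the python-osc library (pip install python-osc). The port and OSC address below are assumptions and placeholders; in practice you would use whatever input port MadMapper is listening on and copy the exact address of the control you want to fire from MadMapper itself:

```python
from pythonosc.udp_client import SimpleUDPClient

MADMAPPER_HOST = "127.0.0.1"   # MadMapper running on the same machine
MADMAPPER_PORT = 8010          # assumed OSC input port; check your settings

client = SimpleUDPClient(MADMAPPER_HOST, MADMAPPER_PORT)

# Fire a scene; "/scenes/beam_movement_2" is a hypothetical address.
# MadMapper handles the fade between scenes internally, so one message
# per scene change is all the show control needs to send.
client.send_message("/scenes/beam_movement_2", 1)
```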


This image displays all of the cues and their control times on the right-hand side.


This image displays some of the OSC controls, as well as how the show control operates once the song is playing and the MadMapper composition has begun.


Interpretation:

I feel I was able to augment the sonic achievement of Trevor's complex composition with what I think is a pretty complex MadMapper visualization of my own. I learned a lot about using MadMapper as an animation tool, and I've become familiar with the ins and outs of cueing changes to the built-in media as well as to the geometry of surfaces.

As for the show control, I'm not sure the chain of auto-follows in QLab was the smartest way to time each cue's action. It definitely worked. However, changing the timing of an individual cue was challenging, because each change could force adjustments to every cue that fell after it.
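One alternative I'd consider next time is keeping a single table of absolute timestamps and scheduling every scene change against the start of the song, so that editing one entry never shifts the cues after it. A rough sketch, reusing the same hypothetical OSC addresses and assumed port as above:

```python
import threading
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 8010)  # assumed MadMapper OSC port

# (seconds from song start, OSC address) -- edit any row independently.
cue_sheet = [
    (0.0,  "/scenes/beam_movement_1"),
    (30.0, "/scenes/beam_movement_2"),
    (32.5, "/scenes/beam_movement_3"),
]

def start_show() -> None:
    """Schedule every cue relative to 'now' (the moment the song starts)."""
    for offset, address in cue_sheet:
        threading.Timer(offset, client.send_message, args=(address, 1)).start()

start_show()
```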

Nonetheless, I'm happy with the composition, and I'm glad to have supported a talented artist with techniques developed over the course of this semester.
