Monday, March 19, 2018

Will Russek - Assignment III - Classroom Mapping


KALEIDOSCOPE


Concept

With this project, I had originally wanted to continue working with MIDI controls to affect parameters and visuals through MadMapper. My first thought seemed a bit ambitious, but I was excited to attempt to work it out. Since the four dual-projector setups each ran on a separate computer, I needed to find a system that would let them sync MIDI with each other. Initially I felt that a hardware MIDI interface would solve the problem, but that would only be useful for sending a handful of MIDI data to a single computer. I looked into syncing the computers via a Virtual MIDI Network, seeing as Ableton Live 10 allows you to host a single Live session on multiple systems via virtual networks (Wi-Fi). The bump I ran into with this technique was simple: we are running three Windows machines and one Mac. While the Mac has a driver for virtual networking, apparently most Windows systems do not. After learning this, I decided to find another way to approach this project.

Instead of starting with the technical aspect, I decided to begin this project by thinking about what I wanted to portray creatively. With 8 projectors washing the entire length of the room, I wanted to create something environmental and heavily textured.

Process

I remembered that a friend had given me great 1080p drone footage he had bought for a music video: bird's-eye views of beautiful oceans and forests, with lots of rich greens and deep orange sunsets.



I wanted to make this footage more rhythmic and to strike a different balance between texture and color. Keeping the texture of the trees and rocks was important, but I also wanted the piece to be just as much about the natural colors.

Taking the footage into Premiere Pro, I scaled it down and repeated it in a radial motion around the center. The number of sections varies from clip to clip, but I used anywhere from 3 to 24 copies of the same video.
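The radial repetition above boils down to simple angle math: with N copies of the same clip rotated around the center, each copy sits 360/N degrees from the last. A minimal sketch (the function name is my own, not anything from Premiere Pro):

```python
def rotation_angles(sections):
    """Rotation, in degrees, for each repeated copy of the clip
    when the copies are spread evenly around the center."""
    step = 360 / sections
    return [i * step for i in range(sections)]

# 3 sections puts copies at 0, 120, and 240 degrees;
# 24 sections puts a copy every 15 degrees.
print(rotation_angles(3))   # [0.0, 120.0, 240.0]
print(rotation_angles(24)[1])  # 15.0
```

In Premiere these angles would be keyed by hand on each duplicated layer's Rotation property.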



To 'soft-sync' all of the footage, I rendered my final composition four separate times. Since the visuals are set to music, I gave the beginning of each render a countdown before the next station starts (Station 2, which runs the music, has 16 bars of notice; Station 1 has 12 bars; Station 3 has 8 bars; and Station 4 has 4 bars). This lets me press play with slightly more accuracy than timing my starts in seconds.
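The staggered lead-ins above are easy to convert into seconds once you know the tempo. A quick sketch, assuming 4/4 time and an example tempo of 120 BPM (the post doesn't state the actual BPM):

```python
def bars_to_seconds(bars, bpm, beats_per_bar=4):
    """Seconds of lead-in for a given number of bars at the given tempo."""
    return bars * beats_per_bar * 60 / bpm

# Bars of notice per station, from the soft-sync scheme described above.
stations = {"Station 2": 16, "Station 1": 12, "Station 3": 8, "Station 4": 4}
for name, bars in stations.items():
    # At the assumed 120 BPM, 16 bars = 32 s, 12 bars = 24 s, etc.
    print(name, bars_to_seconds(bars, 120))
```

Each station's render is just the same composition padded with that much extra lead-in, so pressing play on them in order keeps everything within a beat of each other.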

One thing that irritated me about working in Premiere Pro is the lack of grid options within the timeline. Since everything snaps to video frames (30 fps), it is impossible to tweak the video by BPM (beats per minute). My only solution was to snap to the closest frame marker, which throws some of the video renders off by roughly 5 milliseconds.
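The few-millisecond drift comes from the mismatch between the beat grid and the frame grid: at 30 fps a frame lasts 33.3 ms, and a beat rarely lands exactly on a frame boundary. A small sketch of the rounding error, using 128 BPM as an assumed example tempo:

```python
def snap_error_ms(beat_index, bpm, fps=30):
    """Error, in milliseconds, from snapping a beat to the nearest frame."""
    beat_time = beat_index * 60 / bpm          # exact beat time in seconds
    frame_time = round(beat_time * fps) / fps  # nearest frame boundary
    return abs(beat_time - frame_time) * 1000

# At the assumed 128 BPM, beat 1 falls about 2.1 ms off the 30 fps grid;
# the worst possible case is half a frame, about 16.7 ms.
print(snap_error_ms(1, 128))
```

The error varies beat by beat depending on where each one falls relative to the frame grid, which is why the renders end up a handful of milliseconds off rather than drifting steadily.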


The music is a remix I am putting together.



