<music: electro-acoustic & algorithmic>


"My interest in algorithmic composition, and electronic music in general, came as an outgrowth of my interests in recording a producing popular music. I am not interested in replacing performers of traditional acoustic instruments with electronics, but, instead, writing music in which musicians do what they do best, and computers do what they do best. Software systems and scores are generally published through my company Clear Blue Media and registered with ASCAP, with source code components contributed to my EAMIR project through the EAMIR SDK." ~V.J.

electro-acoustic & algorithmic


featured works:

delayed to rest: for guitar and computer

"Delayed to Rest" is a work for solo electric guitar and a computer running custom software written for this piece. As the guitarist performs, the computer takes the live sound of the guitar, heard through the center speaker, delays its output by a few beats, then plays it through the two side speakers. This produces and echoing effect. The performer uses a footswitch to tell the computer how many beats to delay the sound, and to control other delay-based processes. Every sound heard originates in real-time from what the guitarist performs live; nothing is prerecorded, sampled, or synthesized. Additionally, nothing that the computer does is random; it uses specific delay times, rhythms, and signal levels throughout the piece according to the score.

DOWNLOAD Standalone App

Score  |  Source Code  |  Max for Live version

Source code requires Max/MSP/Jitter 6 or later and the EAMIR SDK
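
The original software is a Max/MSP patch, but the beat-synced delay idea described above can be sketched in a few lines of Python. This is an illustrative approximation, not the published patch: the tempo, the block-based processing, and names such as BeatDelay and set_delay_beats are assumptions made only for the example.

```python
import numpy as np

SR = 44100          # sample rate (Hz)
TEMPO_BPM = 90      # assumed tempo; the score dictates the real values

def beats_to_samples(beats: int, bpm: float = TEMPO_BPM, sr: int = SR) -> int:
    """Convert a delay expressed in beats into a delay in samples."""
    return int(round(beats * 60.0 / bpm * sr))

class BeatDelay:
    """Circular-buffer delay line whose length is a whole number of beats."""

    def __init__(self, max_beats: int = 8):
        self.buffer = np.zeros(beats_to_samples(max_beats))
        self.write_pos = 0
        self.delay_samples = beats_to_samples(2)  # default: two-beat echo

    def set_delay_beats(self, beats: int) -> None:
        """Called when the performer taps the footswitch."""
        self.delay_samples = min(beats_to_samples(beats), len(self.buffer) - 1)

    def process_block(self, dry: np.ndarray) -> np.ndarray:
        """Return the delayed (wet) signal for one block of live guitar input."""
        wet = np.empty_like(dry)
        for i, x in enumerate(dry):
            read_pos = (self.write_pos - self.delay_samples) % len(self.buffer)
            wet[i] = self.buffer[read_pos]    # side speakers: delayed guitar
            self.buffer[self.write_pos] = x   # center speaker: dry guitar
            self.write_pos = (self.write_pos + 1) % len(self.buffer)
        return wet

if __name__ == "__main__":
    delay = BeatDelay()
    delay.set_delay_beats(1)                      # "footswitch": one-beat echo
    t = np.arange(2 * SR) / SR
    guitar = np.sin(2 * np.pi * 220 * t)          # stand-in for the live guitar
    echoed = delay.process_block(guitar)          # what the side speakers play
    print("expected delay (samples):", beats_to_samples(1))
    print("first audible echo sample:", int(np.argmax(np.abs(echoed) > 1e-6)))
```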

disappearing.god.trick: for solo voice

"This system is designed to take real-time user input through a microphone and process it in novel ways. The user needs only to open the program and begin speaking. The spoken voice is recorded and analyzed using an FFT, and the spectral data is converted into a 3D matrix using a technique by my former teacher, Luke Dubois. This software is based heavily on his model. The 3D matrix is then scanned to resynthesize the original spoken sample. The novelty of resynthesizing from an image is that visual effects can be applied which change its sonic properties. Over time, the original sample will change substantially--there are no other sound sources or effects introduced in this system: everything heard is derived from the original sample. In this piece, the user says "God" one time. It is then processed according to a deterministic processing score that the software follows; nothing is random. "

DOWNLOAD

Source Code

Source code requires Max/MSP/Jitter 5 or later and the EAMIR SDK
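
The analyze/image/resynthesize chain described above is built in Max/MSP/Jitter, but the underlying idea can be sketched in Python. The sketch below is an assumption-laden stand-in: it uses a 2-D spectral "image" rather than the 3-D matrix the note describes, a simple blur in place of the patch's visual effects, and illustrative names throughout.

```python
import numpy as np
from scipy import ndimage, signal

SR = 16000

def analyze(sample: np.ndarray, nperseg: int = 512):
    """STFT: return magnitude (the spectral 'image') and phase of the sample."""
    _, _, Z = signal.stft(sample, fs=SR, nperseg=nperseg)
    return np.abs(Z), np.angle(Z)

def blur_image(mag: np.ndarray, width: int = 9) -> np.ndarray:
    """Apply a visual-style effect (a horizontal blur) to the spectral image."""
    return ndimage.uniform_filter1d(mag, size=width, axis=1)

def resynthesize(mag: np.ndarray, phase: np.ndarray, nperseg: int = 512) -> np.ndarray:
    """Turn the altered image back into audio using the original phase."""
    _, out = signal.istft(mag * np.exp(1j * phase), fs=SR, nperseg=nperseg)
    return out

if __name__ == "__main__":
    # Stand-in for the recorded word: one second of shaped noise.
    rng = np.random.default_rng(0)
    spoken = rng.standard_normal(SR) * np.hanning(SR)

    mag, phase = analyze(spoken)
    processed = resynthesize(blur_image(mag), phase)
    print("input samples:", len(spoken), "output samples:", len(processed))
```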

discourse: for Eb clarinet & computer

"Discourse is a piece for Clarinet in Eb and interactive music system. The original software, which was also written by the composer, takes the clarinet signal as input and processes the sound in real-time. All of the processing happens live, so there are no prerecorded sounds in the piece. In other words, the computer's performance, including its timbre, is data-driven by the clarinet's performance. The clarinet begins a conversation with the computer which responds by imitating the clarinet and expanding upon its ideas. These two voices continue their dialogue: formulating responses to the other's last statement, pausing to listen to each other, and even stuttering at times while trying to answer appropriately."

requires a microphone (and a clarinetist, of course)

DOWNLOAD

SCORE

Source Code

Source code requires Max/MSP/Jitter 5 or later and the EAMIR SDK
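
The call-and-response behaviour described in the program note can be illustrated with a small Python sketch. This is not the composer's system: the silence-based phrase segmentation, the reversed-echo "response", and every name and threshold below are assumptions chosen only to make the idea concrete.

```python
import numpy as np

SR = 44100

def split_phrases(live: np.ndarray, frame: int = 1024, thresh: float = 0.01):
    """Split the live clarinet signal into phrases separated by near-silence."""
    trimmed = live[: len(live) // frame * frame]
    rms = np.sqrt(np.mean(trimmed.reshape(-1, frame) ** 2, axis=1))
    active = rms > thresh
    phrases, start = [], None
    for i, is_active in enumerate(active):
        if is_active and start is None:
            start = i * frame                    # phrase begins
        elif not is_active and start is not None:
            phrases.append(live[start:i * frame])  # phrase ends at a pause
            start = None
    if start is not None:
        phrases.append(live[start:])
    return phrases

def respond(phrase: np.ndarray) -> np.ndarray:
    """Imitate and expand: echo the phrase, then append a quieter reversal."""
    return np.concatenate([phrase, 0.6 * phrase[::-1]])

if __name__ == "__main__":
    t = np.arange(SR) / SR
    # Stand-in for the clarinet: one short phrase followed by silence.
    clarinet = np.where(t < 0.4, np.sin(2 * np.pi * 440 * t), 0.0)
    for phrase in split_phrases(clarinet):
        answer = respond(phrase)
        print("heard", len(phrase), "samples; answering with", len(answer))
```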


nil: for guitar & computer

"nil is a composition for solo classical guitar and interactive music system. In this piece, the opening notes of the guitar performance are sampled by the computer and placed into a small buffer. From within the buffer, the audio sample is trimmed to remove any silence around the file. The audio sample is then manipulated to generate all of the computer sounds in the piece. As the performer continues to perform from a notated score, the computer begins manipulating qualities of the audio sample according to a score that it follows. The computer also processes the live input of the performer in a number of ways. All of the computer processing and sound generation for nil occurs in real-time and is driven by the performance of the guitarist; nothing is prerecorded."

requires a microphone (and a guitarist, of course)

Standard Versions

SCORE

Performance Optimized Versions

Note: this version of the software includes features intended primarily for live performance, such as auto-recording.

Source Code

Source code requires Max/MSP/Jitter 5 or later and the EAMIR SDK
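
The opening step described in the note, capturing the first notes into a small buffer and trimming the surrounding silence, can be sketched in Python as follows. The buffer length, amplitude threshold, and function names are illustrative assumptions; the original software is a Max/MSP patch.

```python
import numpy as np

SR = 44100

def capture(live: np.ndarray, seconds: float = 2.0) -> np.ndarray:
    """Copy the opening of the live guitar signal into a small buffer."""
    return live[: int(seconds * SR)].copy()

def trim_silence(buf: np.ndarray, thresh: float = 0.02) -> np.ndarray:
    """Remove near-silent samples from both ends of the captured buffer."""
    loud = np.flatnonzero(np.abs(buf) > thresh)
    if loud.size == 0:
        return buf[:0]                        # nothing but silence captured
    return buf[loud[0]:loud[-1] + 1]

if __name__ == "__main__":
    t = np.arange(3 * SR) / SR
    # Stand-in for the guitarist: silence, then a decaying plucked note.
    guitar = np.where((t > 0.5) & (t < 1.5),
                      np.sin(2 * np.pi * 196 * t) * np.exp(-(t - 0.5) * 3), 0.0)
    sample = trim_silence(capture(guitar))
    print("trimmed sample length (s):", round(len(sample) / SR, 2))
```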


squares: for control surface

"squares is an electro-acoustic work composed by V.J. Manzo. It was originally performed with the squares interactive music and live image processing system also created by Manzo. The system uses the Korg padKontrol’s 16 touch-sensitive pads to begin generating algorithmic composition processes when touched. The velocity of each pad sets the initial velocity of the process. The system takes live video input from two sources. In the original performance, two cameras were used: one on the performer’s hands, and one on his face. The number of pads currently held down relates to the number of rows and columns used to display the matrix input from the first camera. Each of these squares contains a scaled-down image of the overall matrix. The each square is tinted to resemble the second input matrix, the second camera. Each pad is also assigned a color that, when pressed, causes the matrix color to change. The performance begins (fades in) with the touch of a single pad. If no pads are being touched, the video fades to black (this is subsequently how the performance ends). The performer specifies an initial mode. The first knob at the top of the padKontrol controls the chord root and quality which drives the algorithmic composition processes. The second knob controls the change to another mode related to the initial mode sharing six of its seven pith classes. The padKontrol’s XY pad is used to play a monophonic melody in which the X value controls pitch classes of the specified mode, across a 2 octave range. The Y value controls velocity; a position held closer to the top will yield a higher velocity value. When there is no position held on the XY pad, the velocity will equal zero. This, therefore, yields a note-off state when the XY pad is not in use. A sustain mode can be enabled by pressing the HOLD button on the controller. This will cause notes played on the XY pad to sustain until another XY coordinate replaces it. The velocity (Y) will only change when a new note is triggered. This video was generated in real-time from within the software system and was not edited in any way."

software requires a Korg padKontrol & external synths

DOWNLOAD
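
The XY-pad melody mapping described above (X selects a pitch from the current mode across two octaves, Y sets velocity, and releasing the pad produces a note-off unless HOLD is engaged) can be sketched in Python. The mode table, MIDI root, and function names below are assumptions for illustration, not the original Max/MSP implementation.

```python
# Two example seven-note modes as semitone offsets from the root.
MODES = {
    "ionian": [0, 2, 4, 5, 7, 9, 11],
    "dorian": [0, 2, 3, 5, 7, 9, 10],
}

def xy_to_note(x: float, y: float, root: int = 60, mode: str = "ionian",
               touching: bool = True, hold: bool = False):
    """Map a normalized XY-pad position to a (MIDI note, velocity) pair."""
    if not touching and not hold:
        return None, 0                        # pad released: note-off state
    degrees = MODES[mode]
    steps = 2 * len(degrees)                  # two octaves of the mode
    index = min(int(x * steps), steps - 1)    # X position picks a scale step
    octave, degree = divmod(index, len(degrees))
    note = root + 12 * octave + degrees[degree]
    velocity = max(1, min(127, int(y * 127))) # Y position sets velocity
    return note, velocity

if __name__ == "__main__":
    # Sweep the X axis at a constant Y: a rising two-octave melody.
    for x in (0.0, 0.25, 0.5, 0.75, 0.99):
        print(xy_to_note(x, y=0.8))
    print(xy_to_note(0.5, 0.8, touching=False))   # note-off when released
```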

