Beacon Tech Specs for Composers
Some info to get you started in imagining what the Beacon can do
Hardware/Software
Horn: I primarily play on a plastic model trumpet (pTrumpet), with a stable prototype for a standard brass horn coming soon. There is no sound difference once processing is involved. The pTrumpet does not have a 3rd-valve kick slide, so tuning the low C# and D is done manually with lip control. No other mutes can be used inside the trumpet bell.
Mute/pickup: Yamaha Silent Brass (SB7J). The mute causes some marginal timbre compression, and the horn becomes more challenging to play with full control (tuning, general output). Even so, the trade-off is worth it: the Silent Brass offers freedom of motion, a brace design that holds it to the horn even through the most violent physical gestures (compared to a clip-on mic), and 100% prevention of any kind of feedback.
Sensor: Arduino Nano 33 BLE Sense Rev2 — includes an accelerometer, magnetometer, gyroscope, color sensor, light sensor, proximity sensor, gesture sensor, microphone, barometric pressure sensor, temperature sensor, humidity sensor, and TinyML (AI) support
Software: Max 9 — I have pre-built patches, but you can design your own.
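To give a feel for the raw data these sensors produce, here is a minimal Arduino sketch, assuming the Rev2 board's stock libraries (Arduino_BMI270_BMM150 for the IMU, Arduino_APDS9960 for the color/light/proximity/gesture chip). It simply prints two of the streams and is illustrative, not the Beacon's actual firmware.

```cpp
// Illustrative sensor tour for the Nano 33 BLE Sense Rev2.
#include <Arduino_BMI270_BMM150.h>
#include <Arduino_APDS9960.h>

void setup() {
  Serial.begin(115200);
  while (!Serial) {}
  if (!IMU.begin() || !APDS.begin()) {
    Serial.println("sensor init failed");
    while (true) {}
  }
}

void loop() {
  float ax, ay, az;
  if (IMU.accelerationAvailable()) {
    IMU.readAcceleration(ax, ay, az);  // units of g
    Serial.print("accel: ");
    Serial.print(ax); Serial.print(' ');
    Serial.print(ay); Serial.print(' ');
    Serial.println(az);
  }
  if (APDS.proximityAvailable()) {
    // 0 = closest, 255 = farthest (e.g. a plunger held over the sensor)
    Serial.print("prox: ");
    Serial.println(APDS.readProximity());
  }
}
```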
Gestures
Fast whip to the left, or to the right (instant threshold triggering; see the sketch after this list)
Difficult to maintain face contact
Fast whip down, valves facing forward, bell facing the floor (instant threshold triggering)
Cannot be done with ANY face contact
Fast whip UP is theoretically possible, but a pure 90-degree angle is impossible, and the rotation of the wrists makes it difficult to comfortably reach the same speed threshold as a downward whip.
Rotation (with horn facing forward) (gradational/controllable data)
Can rotate up to 85 degrees clockwise, 75 degrees counterclockwise
Becomes more difficult to maintain face contact beyond 45 degrees either direction
Aiming upward or downward (gradational/controllable data)
Can aim at 90 degrees straight up or down
Difficult to play with control past 75 degrees
Very difficult to engage comfortably with front-of-horn sensors past 45 degrees UP: extending the arm to cover the sensors is hard and puts more pressure on the right hand (thumb and pinky) to maintain full control over the trumpet, making valve interaction difficult.
Covering the color/light/proximity sensor (gradational/controllable data through to instant threshold triggering)
Plunger mute is my tool of choice, but anything easy to hold in the left hand works
Can also aim the horn directly at on-stage objects or move toward/away from them; the magnetometer can also be used to this end
*Front-facing sensors should be designed with performance-space variability in mind (e.g., light triggering with a glove-mounted LED works in a dark room, but not in a well-lit space)
Many more complex actions coming soon, along with video documentation of everything…
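To make "instant threshold triggering" concrete, here is a minimal sketch of how a fast whip could be detected from acceleration magnitude. The 2.5 g threshold and 300 ms debounce are illustrative guesses, not the Beacon's calibrated values.

```cpp
// Illustrative whip detector: fires when acceleration magnitude
// exceeds a threshold, with a debounce so one whip = one trigger.
#include <Arduino_BMI270_BMM150.h>

const float kWhipThresholdG = 2.5;      // hypothetical trigger threshold
const unsigned long kDebounceMs = 300;  // hypothetical refractory period
unsigned long lastTrigger = 0;

void setup() {
  Serial.begin(115200);
  while (!Serial) {}
  if (!IMU.begin()) { while (true) {} }
}

void loop() {
  float x, y, z;
  if (!IMU.accelerationAvailable()) return;
  IMU.readAcceleration(x, y, z);        // units of g
  float mag = sqrt(x * x + y * y + z * z);
  if (mag > kWhipThresholdG && millis() - lastTrigger > kDebounceMs) {
    lastTrigger = millis();
    // Sign of the dominant axis can distinguish left vs. right whip
    // (which axis that is depends on how the board is mounted).
    Serial.println(x > 0 ? "whip-right" : "whip-left");
  }
}
```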
Sound Processing, Effects, Design Potential
All sensor data can be read by Max 9 and has been calibrated and stabilized in advance (including trigger thresholds), though these can be adjusted. The infrastructure patch can be provided, but without a Beacon for testing you will likely need some back and forth with me, which I am happy to do! A cheap and easy way to test gesture triggering at home is to purchase an Arduino Nano 33 BLE Sense and connect it to your computer with a physical cable. I can provide the necessary code if needed, and can help you orient it in real space to simulate the Beacon minus a trumpet. If you are new to Arduino, I will help; even with no coding background, I'll make it easy for you to get set up! If you are new to MaxMSP, I recommend practicing some basics before designing for the Beacon as your first project, as the gesture-triggering aspect may be a major roadblock in the more typical Max learning pattern of quick prototyping with instant feedback.
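For a sense of what that test code might look like, here is a minimal sketch, assuming a USB-serial connection, that streams accelerometer readings as plain text for Max's [serial] object. The baud rate and message format are illustrative, not the code I distribute.

```cpp
// Illustrative USB-serial bridge: stream accelerometer values as
// plain text lines that a Max [serial] object can parse.
#include <Arduino_BMI270_BMM150.h>

void setup() {
  Serial.begin(115200);   // match this rate in Max: [serial a 115200]
  while (!Serial) {}
  if (!IMU.begin()) { while (true) {} }
}

void loop() {
  float x, y, z;
  if (IMU.accelerationAvailable()) {
    IMU.readAcceleration(x, y, z);
    // One line per reading: "x y z" for easy parsing on the Max side
    Serial.print(x, 3); Serial.print(' ');
    Serial.print(y, 3); Serial.print(' ');
    Serial.println(z, 3);
  }
  delay(10);  // ~100 Hz, plenty for gesture prototyping
}
```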
You can include the dry trumpet input in any effect or bypass it entirely (with any gradation in between).
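Under the hood this is just a crossfade between the dry and processed signals; here is a minimal sketch of the math (the function name and linear taper are illustrative, not taken from my patches):

```cpp
// Linear dry/wet crossfade: g = 0 is fully dry, g = 1 is fully wet.
// A sensor stream (e.g. rotation angle) can drive g continuously.
float dryWetMix(float dry, float wet, float g) {
  if (g < 0.0f) g = 0.0f;
  if (g > 1.0f) g = 1.0f;
  return dry * (1.0f - g) + wet * g;
}
```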
There are countless effects you can build that map cleanly to various gestures. Some can be sonic-forward; some can rely more on the motion of the Beacon as a physical controller. You can also interact with multimedia, other performers, or the stage itself. Anything that can interface with Max is on the menu. A starter list of processing effects and interdisciplinary possibilities could include (a mapping sketch follows the list):
pitch shifting
ADSR manipulation
delay looping, pitch storing
spectral freezing
use of left hand to trigger a front-facing sensor
fixed media triggering/interaction
reactive interaction with live video, generative visuals, stage lighting, etc.
ability to process other instruments
space interactivity (w/ magnetometer/stage design)
spatial/immersive audio design
live DJing
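As one sketch of how gradational data can drive any of these, here is an illustrative mapping from the horn's up/down aim to a normalized effect parameter (say, pitch-shift depth). The 75-degree clamp mirrors the playable limits listed above; everything else is assumed.

```cpp
// Illustrative mapping: horn elevation angle -> normalized parameter.
// Elevation can be estimated from gravity on the accelerometer axes;
// here we assume it arrives already in degrees, -90 (down) to +90 (up).

// Clamp to the comfortably playable window (+/-75 degrees, per the
// aiming notes above), then normalize to 0..1 for an effect parameter.
float aimToParam(float elevationDeg) {
  const float kLimit = 75.0f;
  if (elevationDeg > kLimit)  elevationDeg = kLimit;
  if (elevationDeg < -kLimit) elevationDeg = -kLimit;
  return (elevationDeg + kLimit) / (2.0f * kLimit);  // 0 = down, 1 = up
}
```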
Some limitations and design elements to consider:
The Bluetooth Low Energy connection to the laptop hosting Max has a limited range (approximately the same as your AirPods to your laptop)
There is some latency to creatively manage
I suggest thinking about how long you want effects to last, and whether you want a way to shut them off.
Lastly, consider writing for the Beacon the way you would write for harp, or the way you would write choreography. Every musical idea and gesture should not only be "danced out loud" to understand timings and physicality (where the player is from moment to moment); I also suggest sketching a detailed flow of sequential or overlapping effects (a la harp pedaling) to make sure nothing impossible comes up.
Set your expectations for Beacon design to involve constant collaboration; we will make your ideas work!