M1D1 is a project I’ve been working on in my free time over the past year. It is powered by an Arduino Due microcontroller board and programmed in Arduino C++. It builds on the core code library I developed for my Music Spiral project, greatly expanding it with a deep implementation of music theory, new control and output interfaces, and some machine learning capabilities.
From an animatronic and visualization standpoint, it incorporates:
1. An Anthropomorphic Robot Head
About 8 inches high, built from custom 3D-modeled and printed parts, aluminum, and electronic components.
LED Lights: The robot head has around ninety addressable RGB LEDs in its cranium, eyes, and mouth, plus internal LEDs that light up fiber optics. The LEDs respond dynamically to chromatic note mappings, across multiple octaves and from multiple devices, with a variety of selectable rendering modes that alter how the display behaves. Modes include color-mapping different musical scales/keys, assigning separate colors to different instruments, color-mapping based on synesthesia frequency relationships, and more.
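The core of any chromatic mapping like this is reducing a MIDI note number to its pitch class and octave, then assigning a color per pitch class. The sketch below shows one way to do that with an evenly spaced 12-step hue wheel; the function names and the specific color layout are my assumptions for illustration, not the actual M1D1 code.

```cpp
#include <cassert>
#include <cstdint>

struct Rgb { uint8_t r, g, b; };

// Pitch class of a MIDI note: 0 = C, 1 = C#, ... 11 = B (MIDI 60 = middle C -> 0).
int pitchClass(int midiNote) { return midiNote % 12; }

// Octave number in standard MIDI convention (middle C, MIDI 60, is C4).
int octaveOf(int midiNote) { return midiNote / 12 - 1; }

// Spread the 12 pitch classes evenly around a 360-degree hue wheel,
// then do a coarse hue-to-RGB conversion at full saturation/value.
Rgb colorForPitchClass(int pc) {
    int hue = pc * 30;                                  // 0..330 degrees
    int sector = hue / 60;                              // which 60-degree sector
    uint8_t ramp = (uint8_t)((hue % 60) * 255 / 60);    // position within sector
    switch (sector) {
        case 0:  return {255, ramp, 0};
        case 1:  return {(uint8_t)(255 - ramp), 255, 0};
        case 2:  return {0, 255, ramp};
        case 3:  return {0, (uint8_t)(255 - ramp), 255};
        case 4:  return {ramp, 0, 255};
        default: return {255, 0, (uint8_t)(255 - ramp)};
    }
}
```

On the hardware side, the resulting Rgb value would be pushed to the addressable LED strip by whatever driver library is in use.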
Ultrasonic Mist: Hidden inside the robot’s head is a modeled reservoir that holds water, along with a capillary-action tube with sponge material. An ultrasonic mister can be engaged by a variety of inputs (I usually map it to MIDI channel pressure). When the user plays a key on the synth and applies aftertouch, the head generates ultrasonic mist, which rises out of the top of the cranium.
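A trigger like this usually wants hysteresis so the mister doesn’t chatter on and off when the aftertouch value hovers near the threshold. Here is a minimal sketch of that idea; the class name and the threshold values are assumptions, not the M1D1 implementation.

```cpp
#include <cassert>

// Gate an output (the ultrasonic mister) from MIDI channel pressure (0..127):
// turn on when pressure crosses an upper threshold, off only when it falls
// below a lower one, so values hovering near one threshold don't flicker.
class MisterGate {
public:
    explicit MisterGate(int onAt = 64, int offAt = 32)
        : onAt_(onAt), offAt_(offAt), on_(false) {}

    // Feed each incoming channel-pressure value; returns the mister state.
    bool update(int pressure) {
        if (!on_ && pressure >= onAt_)       on_ = true;
        else if (on_ && pressure <= offAt_)  on_ = false;
        return on_;
    }

private:
    int onAt_, offAt_;
    bool on_;
};
```

On the Arduino side, the returned state would simply drive the digital pin powering the mister.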
Servo-Controlled Eyes: A servo mechanism connected to the eyes allows them to move back and forth to the beat.
2. A Chromatic, Archimedean Spiral Light Display
The body section is an Archimedean spiral about 12 inches in diameter.
LED Lights: The spiral has about 100 LEDs embedded in its construction, grouped chromatically in arrays and mapped to visualize five octaves of MIDI notes. As on the robot head and my previous logarithmic Music Spiral project, various display algorithms can be selected.
OLED Display: The OLED display above the body section renders real-time information about what is being musically input to and/or output from the M1D1 robot. It has an extensive music theory library built in: when you play basic or very complex chords, it instantly recognizes the chord being played, including inversions, and displays it in the interface. For example, if you hit a handful of keys and don’t know what to call the result, it might show you “A min 7 over E”. It does this even with multiple instruments playing simultaneously; for instance, if a bassist plays a note along with three notes from a keyboard or guitar MIDI system, it categorizes the chord from the combined musical signature of the instruments.
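One common way to build a recognizer like this is pitch-class set matching: collapse all held notes (from any instrument) into a 12-bit pitch-class mask, try each of the 12 possible roots against a table of chord templates, and let the lowest sounding note name the inversion. The sketch below shows that approach with a handful of illustrative templates; it is my assumption of the technique, not the M1D1 library.

```cpp
#include <cassert>
#include <string>
#include <vector>
#include <algorithm>

static const char* NOTE_NAMES[12] =
    {"C","C#","D","D#","E","F","F#","G","G#","A","A#","B"};

// Chord quality as a pitch-class bitmask relative to the root (bit 0 = root).
struct ChordTemplate { const char* name; int mask; };

static const ChordTemplate TEMPLATES[] = {
    {"maj",   (1<<0)|(1<<4)|(1<<7)},
    {"min",   (1<<0)|(1<<3)|(1<<7)},
    {"min 7", (1<<0)|(1<<3)|(1<<7)|(1<<10)},
    {"7",     (1<<0)|(1<<4)|(1<<7)|(1<<10)},
    {"maj 7", (1<<0)|(1<<4)|(1<<7)|(1<<11)},
};

// Returns e.g. "A min 7 over E", or "" if no template matches.
std::string identifyChord(const std::vector<int>& midiNotes) {
    if (midiNotes.empty()) return "";
    int mask = 0;
    for (int n : midiNotes) mask |= 1 << (n % 12);
    int bass = *std::min_element(midiNotes.begin(), midiNotes.end()) % 12;
    for (int root = 0; root < 12; ++root) {
        // Rotate the sounding pitch classes so `root` sits at bit 0.
        int rotated = ((mask >> root) | (mask << (12 - root))) & 0xFFF;
        for (const ChordTemplate& t : TEMPLATES) {
            if (rotated == t.mask) {
                std::string out = std::string(NOTE_NAMES[root]) + " " + t.name;
                if (bass != root)
                    out += std::string(" over ") + NOTE_NAMES[bass];
                return out;
            }
        }
    }
    return "";
}
```

Because the mask merges notes from every source before matching, a bass note and three keyboard notes are classified together, as described above.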
In addition, this is where data from M1D1’s active listening is shown. The M1D1 bot catalogs what is played over time and uses some machine-learning-style intelligence to determine the overall key signature of a song and detect key changes.
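A simple baseline for this kind of key estimation is a pitch-class histogram: count how often each of the 12 pitch classes is played, then score each candidate key by how much of the histogram falls inside that key’s scale. The sketch below does this for major keys only; running it over a sliding window of recent notes would let the estimate track key changes. This is my simplified assumption of the technique, not the actual M1D1 logic.

```cpp
#include <cassert>
#include <array>

// Semitone intervals of the major scale, measured from the tonic.
const int MAJOR_SCALE[7] = {0, 2, 4, 5, 7, 9, 11};

// counts[pc] = how many times pitch class pc was played.
// Returns the best-scoring major-key tonic (0 = C, 7 = G, ...).
int estimateMajorKey(const std::array<int, 12>& counts) {
    int bestTonic = 0, bestScore = -1;
    for (int tonic = 0; tonic < 12; ++tonic) {
        int score = 0;
        for (int step : MAJOR_SCALE)
            score += counts[(tonic + step) % 12];   // in-scale notes add to score
        if (score > bestScore) { bestScore = score; bestTonic = tonic; }
    }
    return bestTonic;
}
```

Weighting the histogram with a key profile (rather than a flat in-scale count) and handling minor keys would be natural refinements.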
3. An Intelligent Bass Note Accompaniment System
The third element shown above is a control interface enclosure with five large tactile faders, three potentiometer knobs, and two buttons.
MIDI Output: A MIDI bassline accompaniment is output in real time.
Control Interface / Mixer: The faders and knobs let the user control how the bass accompaniment is played stylistically, adjusting overall intensity, dynamics, and note density. It can also lock onto an incoming MIDI drum groove.
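One plausible way a density fader could shape the accompaniment: scale the raw analog reading to a 0–127 value, then only play candidate notes whose per-step weight (strong beats weighted higher) clears the current density setting. The names and ranges below are assumptions for illustration.

```cpp
#include <cassert>

// Map a raw 10-bit ADC fader reading (0..1023, as from Arduino analogRead)
// to a MIDI-style 0..127 density value.
int faderToDensity(int raw) { return raw * 127 / 1023; }

// A candidate note carries a weight (0..127; downbeats high, ghost notes low).
// Higher density settings admit progressively weaker notes.
bool playNote(int weight, int density) { return weight >= 127 - density; }
```

At density 0 only the strongest notes survive; at full density every candidate note is played.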
Footswitches: In addition, standard ¼-inch footswitches can be plugged in, offering extra controls, including the ability to tap in and out to teach M1D1 a new progression/groove. A recorded groove is automatically normalized to the key signature it’s played in and transposed appropriately as the player makes music. The recorded groove can also be density-mixed or quantized to certain scales.
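The normalize-then-transpose step can be sketched very compactly: store each recorded note as a semitone offset from the root it was captured in, then replay those offsets against whatever root the player is currently on. This is an assumption of the general technique, with illustrative names, not the M1D1 source.

```cpp
#include <cassert>
#include <vector>

// A groove note stored relative to its key: semitones above the recorded root.
struct GrooveNote { int offset; };

// Normalize a recorded note list against the root it was played in.
std::vector<GrooveNote> normalize(const std::vector<int>& midiNotes,
                                  int recordedRoot) {
    std::vector<GrooveNote> g;
    for (int n : midiNotes) g.push_back({n - recordedRoot});
    return g;
}

// Replay the normalized groove transposed to the current root.
std::vector<int> transpose(const std::vector<GrooveNote>& groove, int newRoot) {
    std::vector<int> out;
    for (const GrooveNote& gn : groove) out.push_back(newRoot + gn.offset);
    return out;
}
```

Quantizing the offsets to scale degrees instead of raw semitones would keep the transposed groove diatonic when moving between keys.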
YouTube Video Demo of the M1D1 Head Functionality with Sequential Take 5 Synth:
For more information contact:
Jason Cooper - jason@jasoncooper.com