March 2026

0-2-1

0-2-1 marks the 1.0 release of Orphic-FM. The big architectural shift: a thin C++ audio bus that wraps the Mutable Instruments OSS DSP code directly, replacing the previous Kotlin-only signal path. This unlocks native performance on all platforms (Android, iOS, Desktop) and makes it trivial for Claude to wire up new instruments against the MI engines.

Track 3

Itchy Scratchy

Itchy Scratchy puts the new DJ turntable front and center. The C++ DSP layer captures audio into a circular buffer and plays it back with cubic Hermite interpolation, giving scratches that warm, vinyl-like pitch response. Each deck can tap any source — Synth, Drums, Bass, or Master — and a constant-power crossfader blends between them without volume dips at center. On the Kotlin side, an MVI ViewModel drives a 60Hz physics simulation for platter momentum, while the Compose UI renders radial waveform platters and a vertical crossfader in the Cleveland Guardians palette.
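The scratch path described above can be sketched as follows. This is an illustrative sketch, not the actual Orphic-FM C++ code: `hermite`, `readScratch`, and `crossfade` are hypothetical names, showing a 4-point cubic Hermite (Catmull-Rom) read from a circular buffer plus an equal-power crossfade law.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// 4-point cubic Hermite interpolation between y1 and y2, frac in [0,1).
float hermite(float y0, float y1, float y2, float y3, float frac) {
    float c1 = 0.5f * (y2 - y0);
    float c2 = y0 - 2.5f * y1 + 2.0f * y2 - 0.5f * y3;
    float c3 = 0.5f * (y3 - y0) + 1.5f * (y1 - y2);
    return ((c3 * frac + c2) * frac + c1) * frac + y1;
}

// Read the circular capture buffer at a fractional position (the platter
// angle maps to a sample position; scratching moves it back and forth).
float readScratch(const std::vector<float>& buf, double pos) {
    int n = static_cast<int>(buf.size());
    int i = static_cast<int>(std::floor(pos));
    float frac = static_cast<float>(pos - i);
    auto at = [&](int k) { return buf[((k % n) + n) % n]; }; // wrap index
    return hermite(at(i - 1), at(i), at(i + 1), at(i + 2), frac);
}

// Constant-power crossfade: sine/cosine gains keep the summed power flat,
// so there is no volume dip when the fader sits at center (x = 0.5).
float crossfade(float a, float b, float x) {
    const float half_pi = 1.57079632679f;
    return a * std::cos(x * half_pi) + b * std::sin(x * half_pi);
}
```

Interpolating rather than rounding to the nearest sample is what gives variable-speed playback its smooth, vinyl-like pitch response instead of zipper noise.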

View commit

A new dual turntable module with scratch physics and crossfading brings DJ-style performance to the synth engine.

#human #dj #turntable #scratch
Track 4

Lazy Susan

Lazy Susan showcases the new Horn (Leslie rotary speaker) DSP effect. The C++ engine splits the input through a Linkwitz-Riley crossover into separate horn and drum rotor channels, each spinning with independent inertia to model the real-world lag of heavy rotating elements. Stereo amplitude modulation recreates the Doppler-like swirl that defines the Leslie sound. On the Kotlin side, a physics-based animation drives the rotor visuals in the HornPanel, with tactile controls for Speed, Ratio, Depth, Amount, Mix, and Brake. Rotor phase and audio peaks feed back into the engine’s visualization ring buffers, so you can watch the rotors spin in sync with what you hear.
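The rotor-inertia idea can be sketched with a one-pole lag per rotor. This is a hypothetical illustration, not the Orphic-FM engine code: `Rotor` and `rotorAM` are made-up names, and the coefficient formula is an assumption, chosen to show how a heavier drum rotor lags behind the lighter horn when both chase the same target speed.

```cpp
#include <cassert>
#include <cmath>

constexpr double kTwoPi = 6.283185307179586;

// Each rotor's speed chases its target with a one-pole lag, modeling the
// mechanical mass of the horn (light, fast) vs. the bass drum (heavy, slow).
struct Rotor {
    double phase = 0.0;   // current angle, radians
    double speedHz = 0.0; // current rotation speed
    double inertia;       // per-sample lag coefficient; closer to 1 = heavier

    explicit Rotor(double inertiaCoeff) : inertia(inertiaCoeff) {}

    // Advance one sample toward targetHz; returns the new phase.
    double tick(double targetHz, double sampleRate) {
        speedHz += (1.0 - inertia) * (targetHz - speedHz); // one-pole lag
        phase = std::fmod(phase + kTwoPi * speedHz / sampleRate, kTwoPi);
        return phase;
    }
};

// Stereo amplitude modulation: as the rotor sweeps past each virtual mic
// the level rises and falls, approximating the Doppler-like swirl.
void rotorAM(double phase, double depth, float in, float* outL, float* outR) {
    *outL = in * static_cast<float>(1.0 - depth * 0.5 * (1.0 - std::cos(phase)));
    *outR = in * static_cast<float>(1.0 - depth * 0.5 * (1.0 + std::cos(phase)));
}
```

The same lag behavior is what makes a Brake control feel right: releasing the brake lets each rotor spin back up at its own rate rather than snapping to speed.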

View commit

The Leslie Horn Panel adds chorus you can feel with your ears. Headphones on for this one.

#human #leslie #cowbell
Track 2

Insight

Insight showcases the new portamento and legato behavior in the bass unit. A three-tier gate system now classifies each step: rest (≤0.3), slide (0.3–0.7), or normal trigger (>0.7). During slide steps, pitch smoothing (portamento) glides between notes with the glide time controlled by the envelope parameter. The envelope generator was updated to hold a sustain floor while the gate stays open, enabling smooth legato transitions without retriggering. The trigger logic skips retriggering entirely on slide steps, allowing the signal to flow continuously — the bass line breathes instead of stuttering between notes.
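The three-tier classification and slide behavior can be sketched like this. The thresholds match the text; `StepKind`, `classifyGate`, and `BassVoice` are illustrative names, not the actual bass-unit code, and the glide is shown as a simple one-pole step toward the target pitch.

```cpp
#include <cassert>
#include <cmath>

// Three-tier gate classification: rest (<= 0.3), slide (0.3-0.7),
// normal trigger (> 0.7), as described in the track notes.
enum class StepKind { Rest, Slide, Trigger };

StepKind classifyGate(float gate) {
    if (gate <= 0.3f) return StepKind::Rest;
    if (gate <= 0.7f) return StepKind::Slide;
    return StepKind::Trigger;
}

struct BassVoice {
    float pitchHz = 110.0f;

    // On a Trigger step, jump to the target pitch (and, conceptually,
    // retrigger the envelope); on a Slide step, glide toward it without
    // retriggering; on a Rest, hold everything.
    void step(float gate, float targetHz, float glideCoeff) {
        switch (classifyGate(gate)) {
            case StepKind::Trigger: pitchHz = targetHz; break;
            case StepKind::Slide:   pitchHz += glideCoeff * (targetHz - pitchHz); break;
            case StepKind::Rest:    break; // gate closed: no change
        }
    }
};
```

Skipping the retrigger on slide steps is the key detail: the envelope holds its sustain floor, so the pitch glides under a continuously sounding note instead of restarting the attack.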

View commit

Sliding into the groove. Portamento and legato bring the bass to life with smooth pitch transitions and sustained expression.

#human #bass #portamento #bugs
Track 1

WYSIWYG

WYSIWYG is the first track produced entirely on the new C++ audio bus — the result of a massive engine migration that removed JSyn and all Kotlin DSP infrastructure, deleting ~28,000 lines of code across 214 files. Every plugin was stripped down to a pure state container that forwards parameters to C++ via NativeDspBridge, unifying Desktop (JNI + miniaudio), Android (Oboe), and WASM (Emscripten Worker) under a single native engine. The hard-clipped master output was replaced with tanh() soft saturation, letting the Bassline funk and PolyLFO orchestra stack without digital distortion. The signal visualization reads directly from the C++ peak monitor flow, giving a true picture of what the sound actually looks like — what you see is what you get.
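The difference between the old and new master stage is easy to show. A minimal sketch, with hypothetical function names and an assumed drive parameter; the actual engine code is not shown here:

```cpp
#include <cassert>
#include <cmath>

// The old behavior: anything past full scale folds flat against the rail,
// producing harsh digital distortion when sources stack.
float hardClip(float x) {
    return x > 1.0f ? 1.0f : (x < -1.0f ? -1.0f : x);
}

// The new behavior: tanh() compresses peaks smoothly. Near zero it is
// almost linear (tanh(x) ~ x), so quiet material passes untouched, while
// loud stacked sources round off instead of clipping hard.
float softSaturate(float x, float drive = 1.0f) {
    return std::tanh(drive * x);
}
```

This is why the Bassline and PolyLFO orchestra can stack on the master bus: their combined peaks get rounded rather than sheared off.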

View commit

Seeing is believing. A Bassline funk combined with a PolyLFO orchestra produces sonic bliss.

#human #bass #signal-viz #lfo
February 2026

Crossing the Chasm

Crossing the Chasm was feature-driven – still Kotlin, but with a new focus on front-end architecture and shipping as many features as possible. Self-registering UI panels, Panel Sets, the full Plaits engine suite, Dattorro plate reverb, speech synthesis, the Marbles-based Flux module, MediaPipe hand tracking, and ASL Maestro Mode all landed during this album. The codebase went from prototype to something resembling a real product.

Track 5

The Balch Hotel

Written at the Balch Hotel in Dufur, Oregon with Mt. Hood visible in the background. The day before (February 27th), the Mt. Hood lava visualization was added to the synth and rendering performance was optimized. On recording day, control event handling was unified via setPluginControl. The evening in the cozy, charming 1907 hotel room, watching the sunset behind Mt. Hood through the window with my fabulous wife Benedicte, is captured perfectly in this visualization and song.

View commit

This was written in Dufur, OR at the fabulous Balch Hotel with Mt. Hood in the background.

#human #strings #bender #MtHood
Track 4

Goodnight Brasi

This track showcases four new oscillators and the Bender toy on Desktop. The song was inspired by the big Block news of that day (February 26th), which left the community reeling. An Aquarium visualization was added to the synth along with enhanced AI agent integration; the aquarium for some reason reminded me of The Godfather. I experimented with AR, but so far it is not working as expected. Gesture control is still a huge WIP, as it easily gets confused between M, S, and H.

View commit

This song features 4 new oscillators and a new Bender toy on Desktop. Orpheus sleeps above the fishes.

#human #bender #strings #looper #fishes
Track 3

Handwavey

The first track performed using Maestro Mode – conducting the synth with hand gestures via MediaPipe. Camera-based hand tracking was added on February 21st, along with CPU-saving inactive DSP plugin disabling. On recording day (Feb 22nd), Maestro Mode was reworked to use individual voice gating with Thumbs-Up hold control, ASL gesture control routing was made context-dependent, and state snapshotting was centralized for REPL and gestures. The core:foundation module had also been decomposed into specialized modules with a JVM 21 upgrade just two days prior. The aspect ratio (1006/1080) captures the portrait-mode camera view of the hand tracking in action.

View commit

MediaPipe + ML + ASL = Maestro Mode. In this master-POS, Orpheus is accompanied by the Ant-Band and conducted for the first time using the new ASL Maestro Feature. This song is sponsored by Terro.

#human #maestro #strings #Ant-Band
Track 2

Piece of Ice Cream Cake

A Valentine’s Day session with TTS and the brand-new Flux module. On February 13th, the full Marbles engine was implemented in Flux, self-registering UI panels were introduced (decoupling UI from layout), the Speech Panel got an animated multi-phase readout with avatar animation, and Panel Sets were added for UI layout management. On recording day (Feb 14th), the codebase saw keyboard input decoupled from ViewModels, SharingStarted strategy made configurable, and feature documentation centralized into SynthFeature. The front-end architecture was maturing fast.

View commit

Orpheus was feeling a little Pink on Valentine's Day and decided to chill with a piece of Ice Cream Cake.

#human #tts #flux #delay #reverb
Track 1

6-7

Orpheus speaks for the first time. The Speech Synthesis Engine and TTS Player were committed on February 8th – the same day this track was recorded. The day before saw a massive Plaits engine integration: five new physical modeling and granular synth engines, complete gain staging, percussive mode, audio-rate modulation, and the Dattorro plate reverb (ported from Mutable Instruments Rings). The reverb and delay atmosphere in this track gives these brand-new effects their first real workout. Also notable: audio thread heap allocations were eliminated that same day, a critical performance fix.

View commit

Orpheus utters his first words, using his new Plaits-inspired speech synthesis module. A healthy dose of reverb with a touch of delay and lfo provide the atmosphere for this new timeless classic fad hit.

#human #speech #reverb #delay #lfo
January 2026

Bootstrap

Bootstrap was cut on a raw prototype – all Kotlin, all the time. The synth ran on JSyn for desktop audio, had 8 oscillators, a Rings-inspired resonator, an 808-style drum machine, and a brand-new AI agent that could barely hold a tune. CPU was slow, the architecture was monolithic, and every recording session doubled as a stress test. By the end of this album the Ports DSL, plugin system, and multi-model AI support had all taken shape.

Track 1

Encapsulation

This track was recorded in the middle of the biggest architecture overhaul of the Bootstrap era. The Ports DSL – a type-safe nested DSL for defining DSP port connections – was taking shape, along with PortRegistry for unified plugin parameter management. The Duo LFO got speed multipliers, the ViewModel layer was overhauled with SynthController controlFlow, and drum synthesis was abstracted using Plaits engines. The resonator and LFO you hear in this track were literally being refactored as it was being recorded.

View commit

Melodic buildup to sweet droney howl aided by the clang of the resonator while being guided by a tasteful amount of feedback and lfo.

#human #drums #drone #lfo #resonator
Track 5

Welcome to ML

A fully AI-generated drone – Gemini Flash 3.0 with no human intervention. On January 25th, the Looper and Resonator panels were integrated into the Desktop UI, Warps routing was changed to use pre/post-master effects sends, the evolution prompt system got its first-prompt fix (ensuring the AI started generating immediately), and LiveCode was encapsulated into its own panel. The grinding, metallic sound comes from the AI pushing the Warps meta-modulator to extremes the routing wasn’t originally designed for.

View commit

Grinding, Metallic, Bending drone generated entirely by Orpheus using Gemini Flash 3.0

#ai #flash3 #drone #machine
Track 3

AGP 9.0

A celebration song for the Android Gradle Plugin 9.0 migration – hence the name and the @Preview annotation in the description. On January 24th, the Gradle wrapper was upgraded to 9.3.0, DSP plugin access was centralized, and the Matrix meta-modulator (based on Mutable Instruments Warps) got Duo LFO modulation. The live debug log view also landed that day, which was immediately useful for monitoring the CPU-hungry desktop build.

View commit

@Preview: Celebration song for the Gradle Plugin migration

#human #drums #organ #pitch #cowbell #fm #drone
Track 4

Reprise Surprise

One of the earliest AI-composed tracks. The day before, AI feature code was consolidated into commonMain (enabling Kotlin Multiplatform AI support) and the Sonnet model was added alongside centralized model definitions. Gemini Flash 3.0 drove this composition, but the heavy drum backing exposed the limits of the AI’s mix sensibility at this stage.

View commit

Gemini Flash 3.0 with a (too) heavy drum backing

#ai #flash3 #drums #swirl
Track 2

Daisy 9000

The first-ever recording on the Orphic-FM synth. On January 11th alone there were 16 commits – AI agent hardening, multi-provider support (OpenAI and Anthropic Haiku3), MIDI learn, generic control routing for Tidal and AI agents, and the drum machine mix control. The 808-style drum synth and physical modeling resonator (based on Mutable Instruments Rings) had landed just the day before. Everything was raw and untested – this recording was as much a shakedown cruise as a composition.

View commit

First track recorded on the Orphic-FM Synth. This manual composition was inspired by HAL 9000 as means to show AI how to express itself through music.

#human #drums #drone