Music that listens before it plays
Omix runs a real-time adaptive audio engine built in Rust. It watches how you work, scores your focus level, and reshapes every layer of the music to match, without ever sending data off your machine.
Activity Detection
Omix listens to system-level keyboard and mouse events to understand how you're working. But raw event counting isn't enough. Holding down the spacebar looks very different from actually typing a sentence.
The engine tracks keystroke variety: it needs to see multiple distinct keys in a short window before input counts as real typing. This means holding a single key or tapping a modifier doesn't register as productive input. Mouse movement requires actual displacement, and scroll events are throttled to avoid inflating activity during casual browsing.
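To make that concrete, here's a rough sketch in Rust of how a keystroke-variety check can work. The window length, threshold, and names (`TypingDetector`, `on_key`) are illustrative, not Omix's actual internals:

```rust
use std::collections::HashSet;
use std::time::{Duration, Instant};

/// Illustrative sliding-window check: input only counts as "real typing"
/// once several *distinct* keys appear within a short window, so holding
/// one key or tapping a modifier never registers.
struct TypingDetector {
    window: Duration,
    min_distinct_keys: usize,
    events: Vec<(Instant, u32)>, // (timestamp, key code)
}

impl TypingDetector {
    fn new() -> Self {
        Self {
            window: Duration::from_secs(3), // assumed window length
            min_distinct_keys: 4,           // assumed variety threshold
            events: Vec::new(),
        }
    }

    fn on_key(&mut self, key_code: u32, now: Instant) -> bool {
        self.events.push((now, key_code));
        // Drop events that have fallen out of the window.
        self.events.retain(|(t, _)| now.duration_since(*t) <= self.window);
        // Count how many distinct keys remain in the window.
        let distinct: HashSet<u32> = self.events.iter().map(|(_, k)| *k).collect();
        distinct.len() >= self.min_distinct_keys
    }
}

fn main() {
    let mut detector = TypingDetector::new();
    let now = Instant::now();
    // Holding a single key never counts as typing.
    for _ in 0..10 {
        assert!(!detector.on_key(32, now)); // e.g. spacebar
    }
    // A burst of varied keys does.
    let typed = [72, 69, 76, 76, 79, 33].iter().any(|&k| detector.on_key(k, now));
    println!("real typing detected: {typed}");
}
```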
The result is a clean activity signal that distinguishes between focused work and idle fidgeting, without recording what you actually type.
Productivity Scoring
Input signals and app context feed into a scoring engine that produces a single number: your current productivity level, from zero to one. This score drives everything downstream.
The engine uses scene-based hysteresis to keep transitions smooth. Rather than jumping between states, it moves through four natural phases (Idle, Warming Up, Flow, and Deep Focus) with minimum dwell times at each level. Rising into a higher state is fast (you earned it). Falling back is deliberately slow (a quick pause shouldn't kill your momentum).
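Here's a simplified Rust sketch of that hysteresis. The score thresholds and the 45-second dwell time are placeholder values chosen to show the shape of the logic, not the engine's real tuning:

```rust
use std::time::{Duration, Instant};

/// The four phases the engine moves through.
#[derive(Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Debug)]
enum Scene { Idle, WarmingUp, Flow, DeepFocus }

/// Illustrative scene state: promotion is immediate, demotion has to
/// wait out a minimum dwell so a short pause can't drop you a level.
struct SceneState {
    current: Scene,
    entered_at: Instant,
}

impl SceneState {
    fn target_for(score: f32) -> Scene {
        match score {
            s if s >= 0.8 => Scene::DeepFocus,
            s if s >= 0.5 => Scene::Flow,
            s if s >= 0.2 => Scene::WarmingUp,
            _ => Scene::Idle,
        }
    }

    fn update(&mut self, score: f32, now: Instant) {
        let target = Self::target_for(score);
        let dwell = now.duration_since(self.entered_at);
        let should_switch = if target > self.current {
            // Rising is fast: switch as soon as the score supports it.
            true
        } else if target < self.current {
            // Falling is slow: require a minimum dwell before dropping.
            dwell >= Duration::from_secs(45)
        } else {
            false
        };
        if should_switch {
            self.current = target;
            self.entered_at = now;
        }
    }
}

fn main() {
    let mut state = SceneState { current: Scene::Idle, entered_at: Instant::now() };
    state.update(0.9, Instant::now());
    println!("{:?}", state.current); // DeepFocus: rising is immediate
}
```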
There's also a focused reading detector: a state machine that recognizes when you're reading (scrolling with minimal keyboard activity) and holds the score steady instead of letting it decay. Reading is focus too.
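A stripped-down version of that detector might look like the following; the scroll and keystroke thresholds are assumptions for illustration:

```rust
/// Illustrative reading detector: steady scrolling plus near-zero typing
/// looks like reading, and while that holds, the score is frozen instead
/// of decaying. Thresholds here are placeholders, not Omix's values.
#[derive(PartialEq)]
enum ReadingState { NotReading, Reading }

struct ReadingDetector {
    state: ReadingState,
}

impl ReadingDetector {
    /// `scrolls` and `keystrokes` are counts over the last few seconds.
    fn update(&mut self, scrolls: u32, keystrokes: u32) {
        self.state = if scrolls >= 2 && keystrokes <= 1 {
            ReadingState::Reading
        } else {
            ReadingState::NotReading
        };
    }

    /// Apply decay only when we're not in a reading phase.
    fn decayed_score(&self, score: f32, decay_per_tick: f32) -> f32 {
        if self.state == ReadingState::Reading {
            score // hold steady: reading is focus too
        } else {
            (score - decay_per_tick).max(0.0)
        }
    }
}

fn main() {
    let mut det = ReadingDetector { state: ReadingState::NotReading };
    det.update(5, 0);                             // scrolling, barely typing
    println!("{}", det.decayed_score(0.7, 0.05)); // 0.7 — held steady
}
```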
Ambient Scenes
Running alongside the music engine is a separate ambient audio engine that layers environmental soundscapes underneath the music. Choose from six scenes: Mountain Rain, Forest Camp, Buzzy Cafe, Flowing Creek, Chalet Fireplace, and Spaceship Hum.
The ambient engine uses gapless looping and smooth crossfades when you switch scenes, so there's never a jarring cut. Like the music, ambient volume is driven by the orchestrator. When you're idle, ambient sounds are at their fullest, creating a calm, enveloping backdrop. As your productivity rises, they fade back to let the music take over.
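The volume relationship is roughly an inverse mapping from productivity to ambient gain, sketched below alongside an equal-power crossfade (a common way to keep the overall level steady while switching scenes). The gain range and curve are illustrative, not the engine's actual values:

```rust
/// Illustrative mapping from productivity to ambient gain: loudest at idle,
/// fading back as the score rises so the music can take over.
fn ambient_gain(productivity: f32) -> f32 {
    let p = productivity.clamp(0.0, 1.0);
    let max_gain = 1.0;  // full ambient bed at idle
    let min_gain = 0.25; // quiet floor in deep focus
    max_gain - (max_gain - min_gain) * p
}

/// Equal-power crossfade between two ambient scenes so switching never cuts.
/// `t` goes from 0.0 (all old scene) to 1.0 (all new scene).
fn crossfade_gains(t: f32) -> (f32, f32) {
    let t = t.clamp(0.0, 1.0);
    let theta = t * std::f32::consts::FRAC_PI_2;
    (theta.cos(), theta.sin()) // (old scene gain, new scene gain)
}

fn main() {
    println!("idle ambient gain: {}", ambient_gain(0.0));       // 1.0
    println!("deep focus ambient gain: {}", ambient_gain(1.0)); // 0.25
    println!("mid-crossfade gains: {:?}", crossfade_gains(0.5));
}
```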
The Neuro Layer
Underneath the music, Omix generates subtle binaural beats: each ear hears a slightly different frequency, and your brain perceives a gentle pulse at the difference between the two. Research suggests these can help guide brainwave entrainment toward states associated with focus and calm.
The neuro engine is adaptive too. The carrier frequency, beat rate, and volume all shift based on your productivity level. During idle states, slower beat frequencies encourage a relaxed baseline. As you enter flow, the frequency rises to support sustained attention.
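In code, the idea looks something like this. The carrier frequencies, beat rates, and gains are textbook-style placeholder values, and `neuro_params` / `binaural_sample` are hypothetical names, not Omix's API:

```rust
use std::f32::consts::TAU;

/// Illustrative binaural parameters per phase. The actual values Omix uses
/// are not published; these are typical relaxation-vs-focus choices.
fn neuro_params(productivity: f32) -> (f32, f32, f32) {
    // (carrier_hz, beat_hz, gain)
    if productivity < 0.3 {
        (200.0, 6.0, 0.10)  // idle: slower, theta-range beat
    } else if productivity < 0.7 {
        (220.0, 10.0, 0.08) // warming up / flow: alpha-range beat
    } else {
        (240.0, 14.0, 0.06) // deep focus: low-beta-range beat
    }
}

/// One left/right sample pair: each ear gets a slightly different
/// frequency, and the brain perceives a pulse at the difference (beat_hz).
fn binaural_sample(t: f32, carrier_hz: f32, beat_hz: f32, gain: f32) -> (f32, f32) {
    let left = (TAU * carrier_hz * t).sin() * gain;
    let right = (TAU * (carrier_hz + beat_hz) * t).sin() * gain;
    (left, right)
}

fn main() {
    let (carrier, beat, gain) = neuro_params(0.85); // deep focus
    let sample_rate = 48_000.0;
    let (l, r) = binaural_sample(1.0 / sample_rate, carrier, beat, gain);
    println!("first sample L={l:.6} R={r:.6} (beat {beat} Hz)");
}
```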
A follow system monitors the volume of the main music bus and adjusts the binaural layer to stay perceptible but never dominant. It sits below conscious awareness, like a subtle guide rather than a distraction.
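One simple way to build such a follower is to track the music bus level in decibels and hold the binaural layer a fixed offset below it, clamped between a floor and a ceiling. The offset, limits, and smoothing factor in this sketch are assumptions:

```rust
/// Illustrative follower: keep the binaural layer a fixed number of dB
/// below the music bus, within a floor and ceiling.
struct NeuroFollower {
    offset_db: f32,  // how far under the music bus to sit
    floor_db: f32,   // never quieter than this (stays perceptible)
    ceiling_db: f32, // never louder than this (never dominant)
    smoothing: f32,  // 0..1, higher = slower reaction
    current_db: f32,
}

impl NeuroFollower {
    fn update(&mut self, music_bus_db: f32) -> f32 {
        let target = (music_bus_db - self.offset_db)
            .clamp(self.floor_db, self.ceiling_db);
        // One-pole smoothing so the layer follows the music without pumping.
        self.current_db = self.smoothing * self.current_db
            + (1.0 - self.smoothing) * target;
        self.current_db
    }
}

fn main() {
    let mut follower = NeuroFollower {
        offset_db: 18.0,
        floor_db: -42.0,
        ceiling_db: -24.0,
        smoothing: 0.9,
        current_db: -42.0,
    };
    // Music bus sitting around -12 dBFS: the layer eases up toward -30 dB.
    for _ in 0..20 {
        follower.update(-12.0);
    }
    println!("binaural gain: {:.1} dB", follower.current_db);
}
```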
In December 2025, Stanford's SHAPE Lab published research showing that real-time audio feedback anchors attention more effectively than static sound, with particularly strong results for people with ADHD. Omix's neuro layer works on this same principle: continuous, responsive feedback rather than a fixed signal.
Hear the difference
Try Omix free for 7 days. No credit card required.
Mac & Windows.