Transforming Design from Visual Thinking to Sensory Composition

Design has advanced in every sensory domain, yet the tools we use still isolate vision, sound, motion and scent into separate workflows. Human perception does not: it integrates sensory signals into a single state within milliseconds, weighting them by intensity, timing, expectation and context.

This mismatch has shaped the entire history of experience design: we can design individual modalities, but we lack a shared system for composing them together. There is no common structure that allows a visual event, a motion cue, a sound pattern and an olfactory signal to be authored with one logic, aligned across time and intensity.

The result is a field that depends on intuition, metaphor or subjective interpretation. Designers construct multisensory experiences without the equivalent of a notation system, a translation framework, or a method to express how modalities influence one another. The absence of a unified language is the core bottleneck that limits multisensory design. 

The Sensory Code™ addresses this gap. [1]
Composing Perception: The New Grammar of Sensory Design

Developed by designer and researcher Carolin Vedder, The Sensory Code™ introduces a paradigm shift away from modality-specific design towards a unified framework for sensory composition. It addresses a long-standing problem: while neuroscience confirms that human perception is inherently multimodal, our design tools are not. 

We can specify a precise color with RGB values and a musical event with MIDI data, but we have lacked any equivalent system to define a relationship between them. The Sensory Code™ provides this missing link, proposing a unified representation for multisensory states. It acts as a foundational grammar that allows designers to author a complete sensory experience in which vision, sound, motion, and even scent are coordinated through a single, explicit logic.
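The RGB-to-MIDI analogy can be made concrete with a small sketch. The mapping rule below is hypothetical, chosen only to illustrate how two modality-specific encodings can be related through a shared perceptual property (intensity); it is not the published Sensory Code™ specification.

```python
# Hypothetical illustration: deriving a MIDI parameter from an RGB color by
# anchoring both to a shared perceptual property (intensity). The mapping
# rule is an assumption for demonstration, not the actual framework.

def rgb_to_midi_velocity(r: int, g: int, b: int) -> int:
    """Map perceived brightness (Rec. 601 luma) onto MIDI velocity 0-127."""
    luma = 0.299 * r + 0.587 * g + 0.114 * b   # perceived intensity, 0-255
    return round(luma / 255 * 127)

print(rgb_to_midi_velocity(255, 255, 255))  # maximum intensity -> 127
print(rgb_to_midi_velocity(0, 0, 0))        # no intensity -> 0
```

The point is not this particular formula but the principle: once both channels are expressed on a common intensity scale, a relationship between them becomes explicit and reproducible.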
Poster published in 2025 through REACT Vienna and Springer. [2][3]
The thinking behind this innovation is both pragmatic and profound. It reframes perception itself as a composable medium. 

The model’s foundation, the Multimodal Translation Model, establishes measurable relationships between senses by anchoring them to shared perceptual properties: intensity, duration, temporal dynamics, and spatial characteristics. A sharp, bright visual flash can be systematically mapped to a high-frequency auditory pulse or a brief, intense olfactory burst because they share underlying patterns of intensity and velocity. This allows for a new level of creative and functional expression through four core operations: binding multiple sensory channels, substituting one for another while maintaining perceptual meaning, modulating their intensity in concert, and sequencing their interactions over time.
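The four core operations can be sketched in code. Everything below is an assumption for illustration: the event fields (channel, intensity, duration, onset) are a simplified stand-in for the shared perceptual properties named above, and the function names are hypothetical, not the framework's actual API.

```python
# A minimal sketch of the four core operations over a simplified event
# model. All names and fields are hypothetical illustrations.
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class SensoryEvent:
    channel: str      # e.g. "visual", "auditory", "olfactory"
    intensity: float  # normalized 0.0-1.0
    duration: float   # seconds
    onset: float      # seconds from sequence start

def bind(*events):
    """Bind multiple channels by aligning them to a common onset."""
    start = min(e.onset for e in events)
    return [replace(e, onset=start) for e in events]

def substitute(event, new_channel):
    """Swap the channel while preserving the perceptual profile."""
    return replace(event, channel=new_channel)

def modulate(events, factor):
    """Scale intensity across all bound channels in concert."""
    return [replace(e, intensity=min(1.0, e.intensity * factor)) for e in events]

def sequence(events, gap=0.0):
    """Lay events end to end along a shared timeline."""
    t, out = 0.0, []
    for e in events:
        out.append(replace(e, onset=t))
        t += e.duration + gap
    return out

# A sharp visual flash mapped to an auditory pulse with the same profile:
flash = SensoryEvent("visual", 0.9, 0.2, 0.0)
pulse = substitute(flash, "auditory")
print(pulse.channel, pulse.intensity)  # auditory 0.9
```

The substitution example mirrors the flash-to-pulse mapping described above: the channel changes, while intensity, duration and timing carry the perceptual meaning across.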

This framework moves design beyond aesthetics into the realm of structured, perceptual engineering. It provides a system that is not only creative but also auditable, comparable, and governed by clear rules. The implications of this are vast. It enables the future of truly adaptive and accessible design. An interface could substitute visual cues with tactile patterns for a visually impaired user with verifiable consistency. A brand’s identity could be expressed coherently across packaging, sound design, and the scent of a retail space. Digital and physical environments could be orchestrated to guide attention, modulate cognitive load, and evoke specific emotional states with precision.

The Sensory Code™ is a new way of thinking. It provides the notation for a future where designers are composers of perception, orchestrating entire sensory worlds with intent and precision. As it evolves from a complete research framework into a published crossmodal translation language, a taught methodology, and a digital toolset, it promises to equip a new generation of creators with the language to design not just for the eyes or the ears, but for the integrated human sensorium. It is a necessary step towards a future where the design of experience is as rich and interconnected as perception itself.

The Sensory Code™ creates:
Standardized packets for perceptual information
Interoperability between devices and software
A human-readable format (.senscx) and a machine-optimized format (.sensc)
Timeline sequencing (.sensq for sequences)
Short codes, analogous to hex color codes
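To make the human-readable versus machine-optimized distinction tangible, here is a sketch of one packet in both forms. The actual .senscx and .sensc schemas are not reproduced here; this assumes a JSON-based text form and a packed binary form purely for illustration.

```python
# Illustrative only: assumed JSON text form (in the spirit of .senscx) and
# packed binary form (in the spirit of .sensc) for a single-event packet.
# The real file formats may differ entirely.
import json
import struct

packet = {"channel": "visual", "intensity": 0.9, "duration": 0.2}

# Human-readable form: self-describing, easy to author and inspect.
senscx = json.dumps(packet, indent=2)

# Machine-optimized form: 1-byte channel id plus two little-endian
# 32-bit floats, compact enough to stream between devices.
CHANNEL_IDS = {"visual": 0, "auditory": 1, "olfactory": 2, "tactile": 3}
sensc = struct.pack("<Bff", CHANNEL_IDS[packet["channel"]],
                    packet["intensity"], packet["duration"])

print(len(senscx.encode()), "bytes human-readable")
print(len(sensc), "bytes machine-optimized")  # 9 bytes
```

A converter between the two forms, as listed under the tools in production, would amount to a lossless round-trip between representations like these.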

In Production:
The Sensory Code™ Picker (analogous to a color picker)
Converter tools (.sensc ↔ .senscx)
Visual editor and playground
Tools and tutorials
Workshops

[2] REACT Vienna 2025 (link follows)
[3] Springer 2025 (link follows)

© 2025 CAROLIN VEDDER
For questions or collaborations, feel free to reach out.