Axel investigates scientific-artistic sound installations that connect man and machine through sound, movement and neuromorphic technology. Thanks to neuromorphic computing, their brains work in similar ways, and he explores the dialogue that arises between them. The machine's perception of the visitor, and the visitor's perception of the machine, are both based on attention and neural communication. Both influence the soundscape in an audiovisual feedback loop, and this dialogue creates unique atmospheres. The visual output is generated from the artificial brain activity and blurs the technology into sensations.
For the first time, motion tracking is performed using neuromorphic computing, an emerging AI technology inspired by the human brain. Unlike conventional AI, it simulates the fine details of biological neurons, making the machine's processing remarkably similar to a human's.
Neuromorphic event-based cameras transmit the signatures of movements rather than entire images. The person is de facto unrecognizable; only the traces of their movements are visible.
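To give an idea of what the camera transmits, here is a minimal sketch of the event-based principle: each pixel fires an independent event when brightness changes, so static scenery produces nothing and only moving edges leave a trace. The event fields and the accumulation scheme below are a generic illustration, not the specific format of the installation's camera.

```python
from dataclasses import dataclass

@dataclass
class Event:
    """One event from an event-based camera: pixel location,
    timestamp (microseconds) and brightness-change polarity (+1/-1)."""
    x: int
    y: int
    t: int
    polarity: int

def motion_signature(events, width, height):
    """Accumulate events into a 2D frame. Only pixels that saw a
    brightness change appear, i.e. the trace of the movement."""
    frame = [[0] * width for _ in range(height)]
    for e in events:
        frame[e.y][e.x] += e.polarity
    return frame

# Two events left by a hand sweeping across pixel row 1
events = [Event(x=2, y=1, t=1000, polarity=1),
          Event(x=3, y=1, t=1200, polarity=1)]
frame = motion_signature(events, width=5, height=3)
```

Because a face or clothing that does not move never generates events, the person's identity is simply absent from the data stream.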
The network of spiking neurons runs on a neuromorphic chip that mimics a biological brain. Our goal is to confuse the visitor by showing this neural similarity between them and the machine, via sound and light feedback, by visually representing the activity of the machine's neurons in an abstract way and blurring the technological separation.
Another special feature of neuromorphic technology is its very low energy consumption. It is also referred to as "Green AI". The neuromorphic chip is presented in such a way that it takes center stage. Its small dimensions and low energy consumption embody the frugality of green artificial intelligence.
The projected visual feedback is the direct output of the camera (blue and green shape edges), combined with the “heatmap” produced by the spiking convolutional neural network.
The LED installation reacts to the heatmap values of active joints, which also trigger sound clicks, enabling the visitor to hear and see neuron spiking activity.
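The heatmap-to-feedback idea can be sketched as follows: per-joint activation values drive LED brightness, and crossing a threshold triggers an audible click. The joint names and threshold value are illustrative assumptions, not the installation's actual parameters.

```python
def render_feedback(heatmap, threshold=0.5):
    """Map per-joint heatmap activations (0..1) to LED brightness
    levels (0..255), and collect the joints whose activation crosses
    the threshold: these trigger a click, making the network's
    spiking activity both visible and audible."""
    led_levels = {joint: int(value * 255) for joint, value in heatmap.items()}
    clicks = [joint for joint, value in heatmap.items() if value >= threshold]
    return led_levels, clicks

# Hypothetical heatmap values for three joints
heatmap = {"left_wrist": 0.9, "right_wrist": 0.2, "head": 0.6}
leds, clicks = render_feedback(heatmap)
# left_wrist and head exceed the threshold and trigger clicks
```

In the real installation the same activation values feed both channels, which is why the clicks stay in sync with the LED flashes.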
We use pose estimation with a neuromorphic camera to compute a localized skeleton representation of the visitor. The neuromorphic chip runs the encoder part of the AI algorithm, which generates very sparse data and uses little energy. The visitor’s body movements, as well as their location and pose, are tracked and control changes in the soundscapes. The installation has a dual purpose. It exposes the internals of the neuromorphic brain’s activity in a pure yet not obvious visual feedback. It also revisits the stylistic exercise of body-controlled sound, with a more direct dialogue with the machine and richer soundscapes.
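How a tracked skeleton can drive sound parameters might look like this: each joint is a 2D pixel coordinate, and simple geometric features (here, normalized wrist height) are mapped onto control values. The joint names and the volume/pitch mappings are hypothetical examples; the installation's actual mappings differ per soundscape.

```python
def soundscape_controls(skeleton, frame_height):
    """Derive sound-control parameters from a 2D skeleton given as
    {joint_name: (x, y)} with y growing downward in pixels.
    Raising a wrist (smaller y) pushes the control value towards 1.0."""
    def height01(joint):
        x, y = skeleton[joint]
        return 1.0 - y / frame_height

    return {
        "volume": height01("left_wrist"),
        "pitch": height01("right_wrist"),
    }

# Hypothetical skeleton in a 240-pixel-high frame
skeleton = {"left_wrist": (100, 60), "right_wrist": (220, 180)}
controls = soundscape_controls(skeleton, frame_height=240)
```

Keeping the mapping this direct is what lets visitors discover their "super powers" by experimentation rather than instruction.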
TONUS publication: Neuromorphic human pose estimation for artistic sound co-creation
The TONUS installation proposes different sound atmospheres. It features self-recorded, royalty-free and licensed samples. Through their body movements, depending on the soundscape, visitors control sample loops, effects, pitch or volume. They discover their "super powers" progressively by interacting with the machine and observing both audio and visual feedback. The audio tracks below were recorded during an interactive session at the Deutsches Museum in July 2024. In every track, you can hear clicks that respond to the visitor's body activity, in sync with the flashing LED feedback. In some tracks, you hear white noise at the start: this noise vanishes when the visitor enters the room and moves to the optimal location. It is meant as guidance, but also as an acknowledgment from the machine that it senses the visitor.
Our highlight soundscape, based on Stefan Bodzin's excellent Boavista sample set (used with authorization). A closed feedback loop between the machine and the visitor creates a techno loop. The visitor’s arm movements trigger new sample loops or generate effects, and their level of activity raises the percussion. The techno loop is built up step by step into an increasingly exciting sound.
This sound experience immerses the visitor in a nostalgic atmosphere. Time is the binding factor between the visitor and the neuromorphic algorithm, which processes time sequences. The famous song by Jeanne Moreau is sung a cappella by my wife Anais Cousin, while sounds of pleasant remembrances pop up with the visitor’s arm and leg movements.
Here the connection between the visitor and the neuromorphic algorithm is direct. The visitor’s limb movements are transformed by the algorithm into sounds, creating a unique dialogue. The clicks triggered by the neural processing of body activity are very present here, further strengthening the physical link between visitor and machine.
The visitor composes a multitrack song by triggering instruments with their arm and leg movements: guitar, drums, saxophone... The visitor plays with the dynamics of this song as if they were conducting an orchestra.
TONUS was presented in July 2024 for four days at the prestigious Deutsches Museum in Munich during the Festival der Zukunft (Festival of the Future) 2024. TONUS was prominently installed in the artistic part of the Experience Area, together with renowned interactive media artists.
TONUS was also exhibited at the IJCNN conference in July 2025 in Rome, at the magnificent Gregorian University. The audience, mainly scientific and international, with a few artistic profiles, responded with great enthusiasm to the installation. The realization that the system responds to the visitor's movements is not always immediate, as we let the visitor discover it for themselves. This makes the moment of realization all the more exciting, and the reactions are very emotional and joyful.
A scientific paper describing the neuromorphic pose estimation that powers the installation was presented at IJCNN 2025 in the grand auditorium, during the special track on Human-AI Interaction in Creative Arts and Sciences.
Check out our paper: TONUS: Neuromorphic human pose estimation for artistic sound co-creation.
Axel von Arnim studied engineering and computer science in France from 1994 to 2000, obtaining two master's degrees in these subjects. In 2024, he obtained his doctorate in electronics on optical communication. Since 2020, he has led the neuromorphic computing research group at the fortiss research institute in Munich. He has a passion for real-world applications of science and is recognized for his creativity in this regard.
He has a classical piano education (12 years of private lessons with, among others, Mrs. Brigitte Bouchateau-Guillard, who was also Alexandre Tharaud's teacher). He has also been connected to the art scene for many years through his wife Anais Cousin, who studied painting at the Academy of Fine Arts in Munich and is now an artist. In addition to playing the piano and singing in chamber choirs, he directed a chamber choir project for baroque sacred music in 2018 (see Soundcloud link at the end of the document).
He has long been interested in the artistic application of technology, particularly in interactive design installations, e.g. by the agencies Art+Com, and in the visual arts.
Axel von Arnim: concept, scientific and artistic direction, visual and sound design, audio programming
Jules Lecomte: movement tracking, neuromorphic vision, integration
Jules Lecomte studied computer science in Paris and Munich and has been a researcher in the fortiss Neuromorphic Computing group. He is a trained gymnast with a passion for movement and body control.
Konrad Zinner: sound design, audio programming, visual programming
Konrad Zinner studied music and composition at the Hochschule für Musik und Theater in Munich. He is a tech addict and has a passion for combining music, sensors and computing into live experiences.