Alexander Refsum Jensenius is a musician and researcher in the fields of Embodied Music Cognition and New Interfaces for Musical Expression. He is head of the Department of Musicology at the University of Oslo, Norway, where he also teaches. With a background spanning computer science, mathematics, musicology, piano, and music technology, he is an active member of the music technology research community and was one of the main organizers of NIME 2011 in Norway. He also occasionally performs on electronic instruments and piano, and is a member of the Oslo Laptop Orchestra (OLO) and the Oslo Mobile Orchestra (OMO). More about his research can be found on his website.
What is your design process for new interfaces for musical expression (NIME)? Do other people take part in this process? What are their roles?
It differs. For some of my NIMEs I have started out with a set of constraints. The CheapStick was a pre-Arduino type of instrument proving that it is, indeed, possible to create very cheap functioning instruments (Jensenius CMMR 2006). So the design process was really about trying to overcome these limitations with what we had available.
The Music Balls were built as a series of instruments starting from a conceptual idea: instruments that should be simple to use, soft to touch, ball-like in shape, and with no visible electronics. These NIMEs are still traditional in that they are built as a separate, physical controller and a sound engine, with fairly simple mappings between the two.
Most of my recent work has involved using full-body movement to control musical sound. Here the conceptual idea has remained constant, while the physical implementation has changed from piece to piece. I have been using many different types of motion tracking technologies (regular and infrared cameras, inertial sensors, etc.), but always with the core idea of translating body movement into musical sound in different ways. Obviously, the technologies at hand will also influence the final outcome, but I think I have succeeded fairly well in creating a series of “invisible” NIMEs. Even though there are no physical controllers in these instruments, the challenge of creating mappings between motion and sound remains.
[Regarding collaboration] Yes, I often collaborate with others. I have built many of my NIMEs together with students, but also with artists and musicians I work with.
In the NIME project at the Norwegian Academy of Music, for example, I worked together with percussionist Kjell Tore Innervik and composer Ivar Frounberg, as well as some professional glass experts, to create a series of glass instruments. Here we had a fairly traditional division of labour, in which the glass blowers made the physical instruments, I developed the electronics, Ivar composed and Kjell Tore performed.
During my collaboration with violinist Victoria Johnson for the piece Transformation, we worked very closely together both in designing the physical setup as well as programming the sound engine and interaction, and creating the piece. So even though I did most of the programming and she did the performance, it was very much a collaborative effort from beginning to end.
That is also the case for the micro-movement project Sverm, in which I worked very closely with two musicians, two dancers and a light designer. Here we can really talk about creating a meta-NIME, since the performance space itself also became the “instrument” that we performed in/with (mainly by standing still).
Also, through my teaching I see that many of my own ideas inspire students to create NIMEs that they then take out of academia and into their regular artistic activities. In that way, my academic work helps feed the underground music scene, and may, hopefully, also lead to some commercial devices at some point in the future.