
Alexander Refsum Jensenius is a musician and researcher in the fields of Embodied Music Cognition and New Interfaces for Musical Expression. He is head of the Department of Musicology at the University of Oslo, Norway, where he also teaches. With a background spanning informatics, mathematics, musicology, piano, and music technology, he is an active member of the music technology research community and was one of the main organizers of NIME 2011 in Norway. He also occasionally performs on electronic instruments and piano, and is a member of the Oslo Laptop Orchestra (OLO) and the Oslo Mobile Orchestra (OMO). More about his research can be found on his website.


What is your design process for new interfaces for musical expression (NIME)? Do other people take part in this process? What are their roles?

It differs. For some of my NIMEs I have started out with a set of constraints. The CheapStick was a pre-Arduino type of instrument proving that it is, indeed, possible to create very cheap functioning instruments (Jensenius CMMR 2006). So the design process was really about trying to overcome these limitations with what we had available.
http://www.arj.no/2006/06/01/building-low-cost-music-controllers/

The Music Balls were built as a series of instruments starting from a conceptual idea: instruments that should be simple to use, soft to touch, ball-like in shape, and with no visible electronics. These NIMEs are still traditional in that they are built as a separate, physical controller and a sound engine, with fairly simple mappings between the two.
http://www.arj.no/2012/05/23/music-balls-at-nime-2012/

Most of my recent work has involved using full-body movement to control musical sound. Here the conceptual idea has remained constant, while the physical implementation has changed from piece to piece. I have been using many different types of motion tracking technologies (regular and infrared cameras, inertial sensors, etc.), but always with the core idea of translating body movement into musical sound in different ways. Obviously, the technologies at hand will also influence the final outcome, but I think I have succeeded fairly well in creating a series of “invisible” NIMEs. Even though there are no physical controllers in these instruments, the challenge of creating mappings between motion and sound remains.

[Regarding collaboration] Yes, I often collaborate with others. I have built many of my NIMEs together with students, and also with artists and musicians I work with.

In the NIME project at the Norwegian Academy of Music, for example, I worked together with percussionist Kjell Tore Innervik and composer Ivar Frounberg, as well as some professional glass experts, to create a series of glass instruments. Here we had a fairly traditional division of labour, in which the glass blowers made the physical instruments, I developed the electronics, Ivar composed and Kjell Tore performed.
http://www.innervik.no/pages/research/projects/glass/glass_instruments_NIME2010.html

During my collaboration with violinist Victoria Johnson for the piece Transformation, we worked very closely together both in designing the physical setup as well as programming the sound engine and interaction, and creating the piece. So even though I did most of the programming and she did the performance, it was very much a collaborative effort from beginning to end.
http://www.arj.no/2013/01/08/transformation-cmj/


That is also the case for the micro-movement project Sverm, in which I worked very closely with two musicians, two dancers and a light designer. Here we can really talk about creating a meta-NIME, since the performance space itself also became the “instrument” that we performed in/with (mainly by standing still).
http://www.uio.no/english/research/groups/fourms/projects/sverm/index.html


What are the main challenges in designing NIMEs?
Mappings! For a long time I have worked from the idea that mappings need to be good perceptual couplings between actions and sounds. This principle is violated all the time (imagine controlling a continuous violin sound with a discrete key), and it has become a mantra of mine that we need to start by looking at the affordances of our instruments when creating mappings to sound features. Or, conversely, to create physical controllers that match the imagined sounds. It sounds simple, but it is very hard.
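The idea of coupling a continuous action to a continuous sound parameter (rather than triggering it discretely) can be sketched in a few lines. This is a hypothetical illustration only; the function, sensor, and parameter ranges are assumptions, not code from any of the instruments discussed here:

```python
# Hypothetical sketch of a continuous action-to-sound mapping.
# A continuous sensor value (e.g. accelerometer magnitude) is
# rescaled into a continuous sound parameter (a filter cutoff),
# preserving the perceptual coupling between motion and sound.

def map_range(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly rescale a sensor value into a sound-parameter range."""
    value = max(in_lo, min(in_hi, value))  # clamp to the sensor's range
    norm = (value - in_lo) / (in_hi - in_lo)
    return out_lo + norm * (out_hi - out_lo)

# Example: acceleration of 0-2 g continuously drives a
# filter cutoff between 200 and 5000 Hz (assumed ranges).
for accel in [0.0, 0.5, 1.0, 1.5, 2.0]:
    cutoff = map_range(accel, 0.0, 2.0, 200.0, 5000.0)
    print(f"accel={accel:.1f} g -> cutoff={cutoff:.0f} Hz")
```

A discrete mapping would instead quantize the sensor value into a few trigger states, breaking the continuous coupling between the performer's energy and the resulting sound.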


“The NIMEs we create are not used outside of academia.” What do you think about this statement?
I think that is partly true, but it varies from person to person. The statement is certainly true for my own work, but I am not sure if I see it as a problem. To me, building NIMEs is part of my research activity, both as a *music researcher* focusing on creating knowledge, but also as a *research musician* focusing on creating art. Thus most of my projects result in a physical instrument, software, concerts and academic papers.

Through my teaching, I also see that many of my own ideas inspire students to create NIMEs that they certainly take out of academia and into their regular artistic activities. In that way, my academic work helps feed the underground music scene, and may, hopefully, also lead to some commercial devices at some point in the future.