Tangibility of the Digital - Die Fühlbarkeit des Digitalen
Jussi Ängeslevä
Bodies, Mnemonics and Machines
What happens if a creative mind grapples with the problem of displaying digital data outside the computer in such a way that they fit seamlessly into the analogue world of our everyday perception and imagination, and that we can deal with digitally generated processes quite naturally?
Jussi Ängeslevä's projects have repeatedly received global recognition and have been acquired for important collections. For the most part they are traded under the label of "interactive art". In short texts, Ängeslevä presents a selection of his projects, which on the one hand allow us to trace the path along which he grappled with the problem of interaction. On the other hand, they give us a good insight into his highly individual way of dealing with the interplay of the digital world and analogue action - sometimes ironically, sometimes politically and sometimes in an explanatory fashion. In the process, he neither shies away from a "fake" nor believes that digital technology is all-powerful. Ängeslevä was born in Finland and successfully completed his first course of study there, but ever since his time at the Royal College of Art in London, the world has been his home. We met him in Berlin, where he still works for ART+COM as a senior concept designer, holds the office of visiting professor at the Universität der Künste (University of the Arts) and also still does some teaching "on the side" in China.
A project resulting from a challenge to design a museum interface, making complex data more meaningful to a non-expert user. Having developed projects in the field of merging physical and virtual environments, Body Scanner was a natural continuation of this work. At the time, I was researching interesting content for a museum exhibit and stumbled upon the Visible Human Database at the Science Museum in London. The database, consisting of high-definition scans of the cross-sections of human bodies (one male and one female), created from deep-frozen and sliced dead bodies, was a perfect challenge for developing a more intuitive interface: at the museum at the time, the data was shown simply as a video loop one could play or pause with the toggle of a button.
I decided that to make the data most meaningful, it would have to be mapped onto the body of the visitor him/herself. Mapping onto the body through the visual channel, however, would be rather cumbersome as an interface, since the user would have to try to look at him/herself, leaving large areas invisible. After a series of studies of different possibilities, I ended up using proprioception as the connecting factor instead: the user would move a "clipping plane" in physical space, and then see a representation on screen of that clipping plane as it slices through the database.
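At its core, this coupling can be thought of as a linear mapping from the tracked height of the clipping plane to the nearest cross-section in the database. The Python sketch below is purely illustrative; the function name, the calibration against the visitor's body extent and the nearest-slice rounding are my assumptions, not the installation's actual code:

```python
# Illustrative sketch: map the tracked height of a handheld "clipping plane"
# onto the nearest cross-section slice of the scan database.

def height_to_slice(plane_height_m, body_top_m, body_bottom_m, num_slices):
    """Linearly map a physical height (metres) to a slice index.

    plane_height_m: tracked height of the clipping plane
    body_top_m / body_bottom_m: calibrated extent of the visitor's body
    num_slices: number of cross-sections in the database (index 0 = head)
    """
    # Normalise position within the body's extent (0.0 = top, 1.0 = bottom)
    t = (body_top_m - plane_height_m) / (body_top_m - body_bottom_m)
    t = min(max(t, 0.0), 1.0)  # clamp when the plane leaves the body
    return round(t * (num_slices - 1))
```

The perceptual trick lies entirely in this mapping: because the slice shown on screen tracks the plane's position on the visitor's own body, proprioception does the work of connecting the image to the self.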
My first prototypes proved this approach very effective, creating the illusion that you are really looking inside your own body. In the end, it turned out to be a bit too effective, as maybe 30 percent of the audience truly believed they saw their own innards. This, of course, was beside the point: the illusion was supposed to be the contextualising factor for a scientific database. Moreover, when I talked with the head of design of the Science Museum afterwards, despite really enjoying the installation, he said that they could never show it in the museum, as it was seemingly "lying", which is something the Science Museum cannot do.
The development of the Body Scanner changed my way of seeing the importance of embodiment, and was a fundamental influence on the Body Mnemonics concept, initiated at the Royal College of Art and carried through to hardware and software development at the Media Lab Europe. Looking at the meaning we associate with our bodies, from the physical as well as the emotional and cultural perspective, and using it as a "memory scaffolding", Body Mnemonics began to take shape. During my research, I stumbled upon various early medieval mnemonic devices, in which different physical or cultural spaces were used by orators and others to recall long poems and other information. For example, Gesualdo describes in his Plutosofia a method whereby the ancestors of Jesus Christ can be associated with locations on the body, and hence recalled easily.
As an interface design concept for the modern world, the miniaturisation of technology, together with the rigidity of the body, proved to be the most appropriate context in which to merge these ideas: using the body as a memory palace (also called "the method of loci and imagines") for organising information in a portable device, such as a mobile phone. The users could associate any information or functionality of their portable device with any location on their body, creating meaningful but personal associations for easy recall and quick access. The user interaction would simply be to move the device to the desired location where the information would be stored or, subsequently, from where it would be recalled.
As a cognitive model, this approach proved very effective. After having conducted several studies on the breadth of meaningful associations one could have in one's body space, as well as on the recall performance of previously stored personal items, we were certain that the approach would work, should we be able to provide an elegant technological solution.
For the hardware requirement, a key feature was that everything would have to be self-contained in the portable device itself. The users would not be required to carry any additional devices to facilitate the tracking, and no markers on the body or clothing. Having evaluated the options, we decided to proceed with inertial sensing, equipping the portable device with a three-axis accelerometer capable of tracking gestures.
For the software tracking, the inherent limits on the kinds of motions one can perform with a single wave of an arm turned out to be very useful for categorising the motions and for creating a tracking system robust and rapid enough to store and retrieve locations in the body space.
To summarise, the user's mental model was to think of body locations as the storage, whereas the tracking was based solely on relative inertial measurements. This discrepancy, however, caused the users no confusion or frustration once it had been explained.
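A minimal sketch of that storage-and-recall loop might look like the following Python. The double integration into an end point and the nearest-neighbour matching are illustrative assumptions standing in for our actual approach, which categorised motions rather than computing precise positions:

```python
import math

# Illustrative sketch of Body Mnemonics' storage-and-recall loop: integrate
# accelerometer samples into a rough gesture end point, then match that end
# point against body locations the user has previously stored.

def integrate_displacement(samples, dt):
    """Double-integrate (ax, ay, az) samples into an end-point displacement.

    In practice raw double integration drifts badly; this stands in for a
    categorising gesture recogniser, purely to show the mental model.
    """
    vx = vy = vz = 0.0
    x = y = z = 0.0
    for ax, ay, az in samples:
        vx += ax * dt; vy += ay * dt; vz += az * dt  # acceleration -> velocity
        x += vx * dt;  y += vy * dt;  z += vz * dt   # velocity -> position
    return (x, y, z)

def nearest_location(point, stored):
    """Return the stored body-location label closest to the gesture end point."""
    return min(stored, key=lambda label: math.dist(point, stored[label]))
```

Here `stored` would map labels such as "hip pocket" to reference end points recorded when the user first placed an item there; recall simply repeats the gesture and takes the nearest match.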
At the time the work took place, inertial sensing was non-existent in handheld devices, yet we correctly predicted that it would become commonplace in the near future. Body Mnemonics has hence become more of a software question and, perhaps even more interestingly, a cultural consideration. Would one begin to operate portable devices in public places with gestures? Even if gesturing provided true usability improvements over the alternatives?
Aichi Expo - BMW engine evolutionary optimisation
Whilst working at ART+COM in Berlin, I had the opportunity to design interfaces for several interactive installations for the German pavilion at the World Expo in Aichi, Japan. One such installation dealt with the evolutionary optimisation of a cylinder chassis. The material provided by the client was an animation sequence of gradually lighter, but still equally strong, chassis. For the installation, the challenge was to create a tangible interface for navigating through this data. Having learned from the experience with the Body Scanner installation, combining physical space with digital information, I decided to approach this project the same way. In our initial sketches, a moving display was placed between two glass cubes holding sculptures of the start and end states of the evolution; the double-sided screen would move between these to display the animation. Due to engineering constraints, namely moving the screen perpendicular to its plane, the installation was not realistic to execute this way. The final iteration instead exploited a clever physical property, allowing movement along the display's plane yet still producing an understandable motion from the back side of the first sculpture to the front side of the second: by aligning the sculptures with the display in such a way that the display would slide sideways between them, motion along the horizontal axis of the screen was perceived as motion perpendicular to the screen.
The physical space was hence used to skew the virtual coordinate system to fit the technological constraints.
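Put concretely, the renderer only needs to remap the screen's sideways travel onto the animation's depth axis. The Python sketch below illustrates that skew under stated assumptions; the names and the simple linear interpolation are mine, not the installation's actual software:

```python
# Illustrative sketch of the coordinate skew: the screen physically slides
# sideways between the two sculptures, but the content advances through the
# evolution animation as if the screen moved perpendicular to its own plane.

def frame_for_position(x_m, x_start_m, x_end_m, num_frames):
    """Map the screen's sideways position (metres) to an animation frame."""
    # 0.0 at the first sculpture, 1.0 at the second
    t = (x_m - x_start_m) / (x_end_m - x_start_m)
    t = min(max(t, 0.0), 1.0)  # clamp at the ends of the track
    return int(t * (num_frames - 1))
```

Because the visitor reads the sculptures as the start and end states, sliding the screen between them is perceived as pushing a clipping plane through the evolution, even though the physical motion is entirely in-plane.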