As we already know from working with him, where Herigstad is involved anything is possible and everything is worldwide. This interview was conducted via email while he was in Mumbai.
It's been a year since you guest-edited MEDIA. What have you been up to?
The world of digital media keeps shifting its axis and I've been continuing to spend time observing the changes, as well as working on several intriguing projects around [over-the-top] TV. And, I've been enjoying doing that in the UK and in Europe.
What, if anything, has changed about the way you think of media and design since then?
I've been thinking a lot about the role of editorial in dynamic media design and even what it means to author things. A collection of juxtaposed ideas, words, images [and] video can assemble to create a meaningful consumption experience. Apps like Flipboard, for example, begin to demonstrate how unique media content experiences can gather around certain brands, social groupings or people. I can see the increasing importance of metadata, the stuff that is attached to media that allows for meaningful assemblages to occur.
If you were guest-editing now, is there anything you would do differently?
I would certainly have liked to create a digital iPad motion version of the magazine, or at least a part of it, to explore what that might feel like.
This year's issue focuses on the brain, and how people connect cognitively, emotionally and physiologically to various screens. How much do you think about the neural process when designing for screens?
I think a lot about how to make navigating content intuitive. I think about the very basic and primitive elements of screen design, represented by direction and spatial placement, and about how we can't stop using our hands to communicate much of what we feel. Now that we have moved into an era of gesture, direction and natural hand movements are becoming part of the language of navigation. The interaction design task is partly to assign logical meaning to movement and direction.
What do you think about the potential for emotive and/or neural interfaces that connect media directly to our brains?
I think there are a number of ways to tap the many layers of information we give off constantly, which can reveal our mood, how we feel, what we like, what we don't like. I've worked with the Emotiv headset that can detect mood and disposition so that a game or other media context can react accordingly. In Emotiv's case, the headset detects and interprets brain-wave patterns. But more and more sensitive cameras and sensors can also "read" our mood and reaction from body language, how we move and other natural responses we are emitting constantly.
I've recently been giving thought to the new term "gamification." What I think is that even if some device could detect all our moods, reactions and wishes, it might still be more interesting to speak, move our hands and "interact" with people or objects, making the experience more engaging and fun. Engagement is probably the key word here.