
I am writing a story about how our brains perceive and process content across various media screens, and I can't help thinking about what my brain is doing at this very moment, even as the words I write appear on the screen of my computer. Or, for that matter, what your brain will do when you read them in this magazine, or on some of the screens this story will eventually appear on. That's because people who understand how these things work are explaining them to me, and making me cognizant of things people don't normally think about, but which fundamentally shape the way we think, feel, write, read, or do anything related to media.
Even as I wrote that opening paragraph, areas of my brain were firing millions of neurons that were telling me things, most of which were not relevant to this story, but some of which formed the thoughts I just wrote. Millions more neurons are now firing in your brain, telling you a bunch of other stuff, including how to think about what I just wrote.
The thoughts both of us (or perhaps all of us, if this is on a public screen) are having are coming from the prefrontal cortex and language areas of our brains, says Dr. Carl Marci, CEO and chief scientist at Boston-based Innerscope Research and one of the experts in a relatively new branch of neuroscience that is trying to understand how communication is influenced by the way our brains process the information we get from media, and how that, in turn, influences the way we behave as consumers of advertising, brands and media content.
"This  is typical activity that requires focused attention," Marci says,  describing the process you are experiencing while reading
this article  as a sort of a "seesaw effect" in which one part of your brain - the  part that processes visual information and language - begins working  harder, while the emotion centers begin
working less. 
One of the remarkable things about our brains is that I can make that seesaw flip in reverse simply by writing something that triggers a strong emotional response. For example, if you are passionately conservative, I could write something that might incite your emotions - like, say, "Sarah Palin is an idiot." Okay, so maybe that reference didn't exactly cause the neurons in the emotion center of your brain to sparkle like fireworks, but that is more a function of how good a writer I am than of the nature of the medium. And that's one of the key points neuroscientists like Marci say we need to keep in mind when considering the ability of our brains to process content on various screens: The type of content being displayed on those screens is an important variable.
In fact, everything is - including when, where and why you are experiencing a particular piece of content on a particular media screen, what's going on in the environment at that time, and what kind of memories and emotional baggage you bring to the viewing experience. There likely are far too many variables to ever be scientifically modeled - at least with our current brain power - but Dr. Marci and his partner, senior scientist Caleb Siefert, have developed a model to explain some fundamental variations in how we experience content across various screens.
They call it the Brand Immersion Model, and it utilizes a simple curve plotted on a graph showing the relationship between two key determinants: the "flexibility" of the content and screen experience (shown on the vertical axis), and the level of "immersion" you experience with it (shown on the horizontal axis).
By flexibility, Marci and Siefert mean the degree to which a particular screen experience gives people options for their brains to engage in things other than the content, message or story appearing on the screen. Immersion is the opposite: an experience that fully immerses the user in what's being displayed on the screen at that moment.
The model may seem simple, but it's not. That's because our brains are complex, and the relationship between those two variables - flexibility and immersion - isn't static. Remember Marci's seesaw metaphor? Marci and Siefert can literally show that effect, using MRI images that reveal how various screen experiences generate neural activity in different areas of the brain corresponding to cognitive and emotional processes.
Marci started thinking about these things when he was doing research to understand the communication process between doctors and patients while working for Massachusetts General Hospital in Boston (where he continues to serve as director of social neuroscience). Certain areas of his brain became stimulated enough that he made the leap from the medium of doctors' offices to the kind more normally associated with Madison Avenue. He founded Innerscope to apply some of that learning to the world of marketing and media, utilizing specialized biometric technologies that can measure how humans react physiologically to media stimuli.
To explain how brain activity can vary in ways that are important for marketers and media companies to understand, Marci gives me a relatively simple, but painfully effective, example of how our brain's ability to experience empathy works. Needless to say, empathy is a critical component of how we relate to many forms of content and to brand images and messages, so the illustration is a good one to keep in mind.
When we experience pain - say, an electrical shock - the neurons in specific areas of our brain will "light up like a Christmas tree," Marci notes. When we watch someone else being shocked in the same manner, the same areas of our brain light up, but to different degrees depending on how much empathy we feel for the person being shocked. If it's a close loved one suffering, the neural activity is almost identical to experiencing the shock firsthand. If it's a stranger to whom we have no emotional connection, there is some neural activity, but to a much lower degree.
In fact, it is even possible for me to trigger those neurons
simply by writing about the effects of someone being shocked, but Marci says I would have to write it in a way that describes the pain in vivid detail, which shifts the activity from the language
centers to the pain centers of your brain by tapping into your mirror neuron network - the part of your brain that generates empathic responses. It only works if what I am writing describes something
you can relate to emotionally.
Personal relevance is a big factor in triggering such emotional responses. To illustrate this point, Marci explains his own reaction to the Sept. 11 terrorist attacks, which he observed from his hometown of Boston. Because he had gone to school at Columbia University, and considers New York City a second home, Marci says he felt a great deal of empathy for New Yorkers when the attacks took place. But when he traveled to San Francisco days later, he says he was "stunned" by how little the West Coasters seemed to care, and how much less prominently the story was being played in their local media coverage. "Proximity," a form of geographic relevance, "was a big factor," he says.

Proximity can also play an important role in how we perceive content displayed on a screen. Not so much the proximity of the screen to our eyes - though that clearly is a factor too - but the way we perceive the proximity of things on a screen. As a rule, Marci says size matters - even relative size. When something appears larger on a screen, our brains perceive it as being closer. When it is small, we perceive it to be more distant. As a result, content displayed at different sizes on a screen - even a very small handheld screen - can produce markedly different emotional responses.
"If you see a bear two miles away, you feel one way about it. If you see a bear two feet away, you feel very differently about it," he says.
In fact, when Innerscope monitors biometrics - heart rate, perspiration, eye movement, and other physiological responses - it can actually measure how our bodies and minds respond to changes in screen content. And when something appears larger and closer on a screen, pulse rates accelerate and viewers become more emotionally aroused. That's important for programmers and marketers to understand, Marci says, because when our pulse rates accelerate and we become more emotionally charged, we also react more intensely to the content being displayed on a screen.
"Things that are liked tend to be liked more. Things that are disliked are disliked more," he explains. 
It's not just the size of a screen, its proximity, or the size of the content appearing on it that determines how our brains react. Screens not only come in many sizes and at many focal distances, but also offer different levels of interactivity. And the more you interact with a screen, the more likely you are to be using what Marci and Siefert describe as flexibility, and the less likely you are to be emotionally immersed in the content.
To illustrate this point, Marci says to think about watching a powerful drama, or a visually stunning film like Avatar, on two different screens: a giant 3-D Imax screen and a tiny handheld smartphone screen. Watching the film on the big screen, with little around to distract you, you are much more likely to be immersed in the content. Watching it on an iPhone, with the opportunity to run other applications, text someone, or be distracted by something in the peripheral surroundings of the real world, will give you a much more flexible experience. While that example illustrates Innerscope's Brand Immersion Model perfectly, Marci says that when it comes to content, not all things are created equal. It's not just size, but the inherently immersive or flexible nature of the content that matters.
Hold the image of Avatar in your head, and think about how your viewing experience might differ if you watched it on the 3-D Imax screen or the iPhone. Now introduce a very different form of content, something that is highly flexible in nature, maybe a TV news show like PBS' NewsHour with Jim Lehrer. Now imagine how you might view that differently on the two screens. Needless to say, many of us wouldn't necessarily want to watch Jim Lehrer on a big Imax screen, and, given our druthers, probably wouldn't watch Avatar on an iPhone.
Screen experiences have become such an important part of the way in which we consume and process media content that we've actually become cognitively aware of those trade-offs when we watch certain forms of content. Siefert learned this firsthand when he took a friend to see the movie Iron Man in an Imax theater. "When we were leaving, my friend turned to me and said, 'Thank God, we saw that in the theater,' implying that he was actually aware of how it enhanced his experience."
Siefert says we're not always cognitively aware of the role that screen sizes play in our perception of and immersion in screen content, because their effects are generally more sensory. "I recognize Tina Fey as Tina Fey whether I see her on a 16-inch screen or a gigantic movie screen," he says. "However, I'm still influenced by differences in the bottom-up sensory experience and this may have subtle but important effects on my overall experience of what I'm seeing."
In fact, it is because we are not aware of some of these sensory effects that they can have such a powerful impact on our brains. Punctuating the point, Siefert introduces a screen-related sense we haven't addressed so far - sound - which can have a strong influence on how we experience what we see on screens, but which is often considered incidental to the full-motion action.
"We often don't consider how music in a movie affects us, but it can
have a strong impact on how we feel," he explains. "Watching a Hitchcock movie with sound is an entirely different experience than watching it without."
Think about that.