I’d like you to give me your undivided attention. I’d like you to -- but you can’t.
First, I’m probably not interesting enough. Second, you no longer live in a world where that’s possible. And third, even if you could, I’m not sure I could handle it. I’m out of practice.
The fact is, our attention is almost never undivided anymore.
Let’s take talking, for example. You know: old-fashioned, face-to-face, sharing the same physical space communication. It’s the one channel that most demands undivided attention. But when is the last time you gave 100% of your attention to a conversation?
I actually had one this past week, and I have to tell you, it unnerved me. I was meeting with a museum curator, and she immediately locked eyes with me and gave me her full, undivided attention. I faltered. I couldn’t hold her gaze. As I talked, I scanned the room we were in. It’s probably been years since someone did that to me. And nary a smartphone was in sight.
If this is true when we’re physically present, imagine the challenge in other channels.
Take television, for instance. We don’t watch TV like we used to. When I was growing up, I would be verging on catatonia as I watched the sparks fly between Batman and Catwoman (the Julie Newmar version -- with all due respect to Eartha Kitt and Lee Meriwether).
My dad used to call it the “idiot box.” At the time, I thought it was a comment on the quality of programming, but I now realize he was referring to my mental state. You could have dropped a live badger in my lap, and I wouldn’t have batted an eye.
But that’s definitely not how we do things now. A recent study indicates that 177 million Americans have at least one other screen going -- usually a smartphone -- while they watch TV. According to Nielsen, there are only 120 million TV households. That works out to an average of 1.48 adults per TV household dividing their attention between at least two screens while watching "Game of Thrones."
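That 1.48 figure is simply the first number divided by the second; a quick back-of-the-envelope sketch, using the 177 million and 120 million figures cited above:

```python
# Back-of-the-envelope check of the second-screen figure cited above.
second_screeners = 177_000_000  # Americans with a second screen going while watching TV
tv_households = 120_000_000     # Nielsen's count of U.S. TV households

per_household = second_screeners / tv_households
print(per_household)  # → 1.475, i.e. roughly 1.48 per household
```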
My daughters and wife are squarely in that camp. Ironically, I now get frustrated because they don’t watch TV the same way I do: catatonically.
Now, I’m sure watching TV does not represent the pinnacle of focused mindfulness. But this could be a canary in a coal mine. We simply don’t allocate undivided attention to anything anymore. We think we’re multitasking, but that’s a myth. We don’t multitask; we mentally fidget. We have the average attention span of a gnat.
What price are we paying for living in this attention-deficit world?
Well, first, there’s the problem when we do decide to communicate. I’ve already described how unnerving it was for me to have someone’s laser-focused attention. But the opposite is also true. Research shows that our ability to communicate effectively erodes quickly when we’re not getting feedback that the people we’re talking to are actually paying attention to us. Effective communication requires an adequate allocation of attention on both ends; otherwise it spirals downward.
It’s not just communication that suffers when we're attention-deficit. It's hard to focus on anything when we're continually tempted to pick up our smartphone and check it.
Boise State Professor Nancy Napier suggests a simple test to demonstrate the cost of our mythical multitasking. Draw two lines on a piece of paper. While having someone time you, write “I am a great multitasker” on one, then write the numbers from 1 to 20 on the other.
Next, repeat this same exercise, but this time, alternate between the two: write “I” on the first line, then “1” on the second, then go back and write “am” on the first, “2” on the second, and so on. What’s your time? It will probably be double what it was on the first part.
Every time we try to mentally juggle, we’re more likely to drop a ball.
Attention is important. But we keep allocating thinner and thinner slices of it. Why? A big part of the answer is the smartphone probably within arm’s reach of you right now, and something called intermittent variable rewards. Slot machines use this concept -- and that’s probably why slot machines make more money in the U.S. than baseball, movies and theme parks combined.
Tristan Harris, taking technology to task for hijacking our brains, explains: “If you want to maximize addictiveness, all tech designers need to do is link a user’s action (like pulling a lever) with a variable reward. You pull a lever and immediately receive either an enticing reward (a match, a prize!) or nothing. Addictiveness is maximized when the rate of reward is most variable.”
Your smartphone is no different. In this case, the reward is a new email, Facebook post, Instagram photo or Tinder match. Intermittent variable rewards -- together with the fear of missing out -- make your smartphone as addictive as a slot machine.
I’m sorry, but I’m no match for all of that.
Of COURSE the Julie Newmar version.
But Josh, at least Julie made her presentations using a single "screen".
As for the idea that about half of all TV viewing time involves the simultaneous use of another screen---usually the smartphone---you've got to be kidding. Nielsen has such data; why not ask it to verify this statement? The actual figure for the TV audience as a whole, not just a handful of techno-elitists, is probably around 5-10%---and I'm not talking about once in a while but, in effect, every time TV is viewed, which is the claim being made either directly or inferentially. Come on, Nielsen, let's get some facts on this.
This isn't just about TV. Beyond people falling into fountains because they're walking without paying attention, this will take us down. We don't learn as much as we need to (counting the number of times you lace your shoes doesn't count), and we don't pay attention---and that is going to cost even more than it already has, more than we will be willing to pay.
Excellent. As I have said in the past... Because we now have so many ways to reach consumers, it also means they have so many ways to ignore you. Just posted it on AdScam.
Ed, you are missing a zero for each of your numbers... it's more like 50%-100% of consumers have their devices with them during television viewing. It's an existential problem for the medium.
Michael, I'm talking about using other screens at the same time that a TV set is tuned in---and, of course, during commercial breaks---not having the devices available nearby or in an adjacent room. While the research shows that avoidance via various means rises during commercials relative to program content, a significant percentage of the TV audience is still effectively reached when a commercial appears on the viewer's screen. Not 100%, not 75%, not 60%, but around 50% of the audience actually stays put and notes or even attentively views the average TV ad. And there's plenty of research that documents this. Your underlying assumption is that very few TV viewers watch the average commercial and that most or all of them are, instead, using a smartphone or some other device whenever a commercial starts its spiel. That's simply not true. If anyone is interested in the research about commercial exposure---including a number of very recent observational studies---they will find this info, along with a full discussion of the subject, in our upcoming annual, "TV Dimensions 2018".
Ed, to Gord's point, I don't know a single person who isn't within arm's length of their iPhone... it's sad, actually. I must be an anomaly, then. IF I am watching linear television (and not Netflix or Amazon Prime), I can honestly say that 75% of the time a commercial break arrives, I am on my iPhone/iPad reading, researching, etc. It has to be higher than 5-10%.
Michael, I, too, mute the sound or otherwise avoid many more commercials than the average TV viewer---who, we must remember, is not a college grad, tends to be over the age of 55, and is not employed outside of the home. Others watch, but less often, by a wide margin. So it would appear that both you and I are not "typical". In fact, I have reduced my daily diet of TV fare by at least 50% over the past five years and spend much more time online than with TV.
The fact remains that a surprising percentage of the average TV telecast's audience consists of heavy viewers, and as these tend to like all forms of TV content, they are far more inclined to consume the commercials than is realized. This shows up in all of the ad awareness and ad recall research.
An interesting point from an advertising perspective: if you and I are typical of a light-viewing, commercial-avoiding segment, we see fewer ads for just about all brands, including those competing with each other. Consequently, a typical TV brand's "share of voice" is fairly constant relative to its direct competition. Its ads are seen less often by light viewers, but so are those of rival brands---including by many who try to avoid ads more often than the norm. Result: its share of market (sales) posture remains about the same in the light-viewing group as it is among frequent viewers. The latter see your ad far more frequently, but they also see the rival brands' commercials more often---so the disparity between heavy and light viewers in terms of ad avoidance doesn't matter so much. It all tends to cancel out, on a competitive basis.
Michael, sadly, like myself, you appear to live in that techno-bubble as well.
We're probably 10% of the population, though. So your 75% of the time is probably in the ballpark: 75% of 10% is 7.5%---making Ed's 5%-10% pretty much right.
I know that one of the US colleges did a study of what media professionals thought was 'usual' and then compared it to the validated measurement currencies. I think it might have been one of Mike Bloxham's studies. Boy, are we the outliers or what!
Since I believe that we must always examine any available research on such an important subject, I reread the published findings of the Council For Research Excellence's late 2016 observational study of the evening viewing of 126 adults, using cameras to determine where the viewers' eyes were focused. Bearing in mind that this is merely a piece of exploratory research, with a tiny sample, the cameras noted that during program content the viewers' eyes were on the TV screen 59% of the time and this dropped to 39% when commercials ran. In the latter case, 48% of the viewers were classified as "distracted", with other screens accounting for about half of the distractions--or 23-25%. Since the totals do not add to 100%, I assume that the remainder---13%---were not in the room.
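For readers checking the arithmetic: the percentages quoted from the CRE study fit together as follows (the "half of distractions" split is the approximation stated above):

```python
# Sanity check of the CRE study's commercial-break percentages quoted above.
eyes_on = 39     # % of time viewers' eyes were on the TV screen during ads
distracted = 48  # % of viewers classified as "distracted"

other_screens = distracted * 0.5          # other screens were about half the distractions
not_in_room = 100 - eyes_on - distracted  # the unaccounted-for remainder

print(other_screens)  # → 24.0, the middle of the 23-25% range
print(not_in_room)    # → 13
```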
What a full-fledged national study with a much larger sample would show, especially if it included all of the other TV dayparts, not just the evening, is open to question, but it certainly would be very interesting to see such findings. It would also be imperative to add some indicator of ad impact---like verified recall and message registration---as there is more to TV viewing than is revealed by an average-second "eyes-on" finding. It is likely that many viewers shift their visual attention back and forth during portions of content; in the case of commercials, it is logical to expect that many of the viewers whose eyes were not on-screen in any given second did watch parts of the commercials and got something from them. The same point applies to program content.
Average-second metrics indicate the degree of visual-focus tonnage, but not, necessarily, whether the viewer watched enough of the content to have a satisfying experience. In other words, I would consider the 39% eyes-on finding in this CRE study---if replicated by a much larger project---to represent the average extent of fully attentive viewing. Quite a few additional viewers would, no doubt, be found to be "partially attentive"---enough to recall some or more of the message and get its basic point. Naturally, the fully attentive segment would perform better than the partially attentive in terms of ad recall, but the latter would certainly not score at zero.
Thanks Ed. That is VERY handy.
But do you know what the 'gaze threshold' was? The normative standard is 200ms (a fifth of a second).
To put that into perspective: the batter stands 60 feet from the pitcher, who pitches at 100 mph. The ball gets to the batter in about 400ms. In that time the batter has to see the ball, gauge its speed and trajectory, and then hit a ball that is 2.9 inches in diameter with a bat that is 2.75 inches in diameter. A somewhat more difficult task than watching TV.
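The timing claim checks out: at 100 mph over the full 60 feet, the ball arrives in about 409 ms (the pitcher's release point is actually a few feet closer to the plate, which is why roughly 400 ms is the usual figure):

```python
# Travel time of a 100 mph pitch over 60 feet.
distance_ft = 60
speed_mph = 100

speed_fps = speed_mph * 5280 / 3600  # mph to feet per second (~146.7 ft/s)
travel_ms = distance_ft / speed_fps * 1000

print(round(travel_ms))  # → 409, roughly the 400 ms cited above
```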
When we built MOVE---our Out-Of-Home audience measurement system---we used 200ms. We found that eliminated around one-third of passers-by, who weren't "eyes on" the sign long enough. From memory, the next third were "eyes on" for around 3 seconds, and the rest (e.g. pedestrians, car passengers) were 3+ seconds.
What would be REALLY nice would be similar studies for newspapers, magazines, PCs, tablets, smartphones and even cinema. My suspicion is that TV's 60% wouldn't be that different from the other 'attentive/reading' media, but I suspect cinema would top the list (apart from oldies like me who have been known to nod off).
John, your OOH findings seem very close to those I've seen in the States. As for other media, I've never encountered anything of this sort for newspapers, and the only eye-tracking research I am familiar with for magazines involved very small-sample laboratory designs where the "reader" wears a bulky, helmet-like configuration to accommodate the tracking mechanism---hardly a typical situation.
As it happens, we make estimates of probable ad exposure levels for all media, including digital, as well as probable ad recall norms, in our upcoming edition of "Intermedia Dimensions 2018". Generally speaking, we agree that the magazine full-page, four-color noting level is probably equal to TV's, with radio and newspapers rating lower. The digital situation is complicated by the viewability issue. If a video ad is fully visible---the ad runs from start to finish on the screen---we rate its notability as fairly high, but not as high as TV's, due to smaller screens and the presence of other material on many digital ad pages.