
A majority of Americans believe journalists play a crucial role in society, though many think their influence is declining, according to a new Pew Research Center study.

The survey of more than 9,000 U.S. adults found that 59% see journalists as “extremely” or “very” important to society’s well-being. Yet …