A top Google executive spoke out Tuesday against online censorship, arguing that protecting children from viewing objectionable material is the family's job, not the government's.
"If we try to pass individual or family responsibility on to the government, we risk making a big mistake," said Elliot Schrage, Google's vice president of global communications and public affairs.
"We take this stuff incredibly seriously. If we mess this up, we're doomed," he added, speaking Tuesday afternoon in New York at a panel of experts and executives convened by Commonsense Media
and the Aspen Institute to address the question "Does the Internet Change Everything?"
Other panelists agreed with Schrage that there was no need for new regulations aimed at keeping children
from viewing certain content. Governmental involvement, they said, should be limited to policing a few well-defined problem areas, like child predators.
"Most of us would prefer not to have the
law intrude in the new digital media," said Howard Gardner, a professor of cognition and education at the Harvard Graduate School of Education. But Gardner quickly added a warning: "If the Internet
doesn't self-police, it will happen through external forces... with results that none of us will like."
Liz Perle, the editor in chief of Commonsense Media, added that a survey by Commonsense
found that parents believed the Internet was simultaneously their children's best educational resource--and their most dangerous source of exposure to objectionable content.
At the same time, the panelists
acknowledged that asking parents to police Internet usage was a tall order, given the speed of change and variety of activities available. Perle couched it in the most dramatic terms, noting the
problems posed by user-generated content: "The inmates are running the asylum. They're creating content now, and they're far out ahead of us."
But Perle also argued that schools can help parents
by teaching children to use the Internet safely: "We have Driver's Ed, but we don't have 'Internet Ed.'" In her impassioned conclusion, Perle emphasized: "The movement has to shift focus from external
control to internal tools" that allow children to navigate the Internet under their own steam.
Schrage suggested a "two-by-two" matrix for categorizing Internet interactions by the intent of
content users and producers. Schrage imagined four basic scenarios and recommended a different remedy for each.
First: when a child using the Internet isn't looking for objectionable content,
and doesn't encounter any, no action is needed. Second: when a child using the Internet isn't looking for objectionable content, but comes across it inadvertently, Schrage suggested that Internet
companies provide better tools for parents to control Internet usage--citing Google's "Safe Search" as an example.
Third: when a child is seeking objectionable content and finds it, Schrage said
"that's a place where we can offer tools," but warned the solution must come primarily through "families and parenting." Finally, in a situation where a child using the Internet isn't seeking
objectionable content, but someone deliberately forces it on them, "that's exploitation--and then the government definitely should be involved." But such interactions are just a fraction of the whole,
added Schrage, emphasizing that "one-size-fits-all is a terrible idea."