Commentary

The Editorial Purview Of Algorithms

Algorithms oversee a lot of what we see on screens: the TV shows suggested by Netflix, the Web pages deemed winners by Google, and the posts deemed worthy by Facebook. 

Why? The proliferation of personalized content, and the inadequacy of human-based content programming, meant that machines would have to help us choose what content to consume. 

In the face of a deluge of consumer-generated content (social media mainly), our necessary adaptation is computer-generated editorial: an algorithm. 

The stakes are high, because content and advertising ultimately influence our purchases and votes. The algorithms can sift through billions of posts or videos or photos in a second, but the goals (yes, algorithms have goals) are fine-tuned to line the pockets of their owners. For example, paid search’s goal is to get you to click on an ad.
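To make "algorithms have goals" concrete: a feed ranker's goal is simply the objective function it maximizes. A minimal sketch of the idea, where the field names and weights are invented for illustration and do not reflect any real platform's logic:

```python
# Toy illustration: a feed "ranker" whose objective is revenue, not truth.
# All field names and weights below are hypothetical.

def score(post):
    # Engagement predicts ad impressions; sponsored posts earn directly.
    return 0.7 * post["predicted_clicks"] + 0.3 * post["ad_revenue"]

def rank_feed(posts):
    # Note what is absent: nothing here checks accuracy or social harm.
    # The ranking reflects only the objective it was given.
    return sorted(posts, key=score, reverse=True)

feed = rank_feed([
    {"id": "fact_check",   "predicted_clicks": 0.2, "ad_revenue": 0.0},
    {"id": "outrage_bait", "predicted_clicks": 0.9, "ad_revenue": 0.1},
    {"id": "sponsored",    "predicted_clicks": 0.4, "ad_revenue": 0.8},
])
print([p["id"] for p in feed])  # outrage_bait first, fact_check last
```

The point of the sketch is that "fine-tuned to line the pockets of their owners" is not a metaphor; it is literally the quantity being optimized.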

Smells Like Spin Spirit 

Everything has spin, but you can generally believe content that is regulated to be true. Advertising might be the only type of content that is truthful by law. 

We have “truth in advertising” laws, yet truth in political advertising goes largely unpoliced. With editorial content (most web pages), what you should believe depends on the integrity and reputation of the editors. The essence of a publishing brand is the equity in its editorial; biases exist even in quality editorial, and they should. It remains incumbent on the reader to sort out gaslight from daylight. 

Luckily, many of us have been educated. Education teaches critical thinking; it helps us distinguish fact from fiction.  More importantly, education teaches us how to weigh the risks of believing the wrong thing. 

For example, “Tide’s in, dirt’s out,” a 50-year-old tagline, is generally true. You could make the same claim about water, of course, but the basic idea is true.

Importantly, though, believing that statement carries little risk. By contrast, believing fake statistics that support a pillar of white supremacy is obviously harmful. 

Algorithms are too dumb to see the difference; they may be fast, but they are not the sharpest tools in the shed, AI or otherwise. So, while algorithms have a lot of power, they are also deeply flawed, non-transparent, and inherently greedy. 

The Role of Ad Platforms

Access to targeted communications, now ubiquitous, enables the activation of virtually any agenda. Platforms like Facebook have created the capability for anyone to send any message, targeting anyone at any time, inexpensively. 

But algorithms determine who sees what. Forget editorial. Now we have algo-torial!  Our society is defenseless against algo-torial because algorithms can’t be put in jail, fined, or identified. There is no accountability. 

To reference the classic limit on free speech, can an algorithm yell “fire” in a crowded theater? Yes, because it decides who gets what message.  Worse, the algorithm operates in an environment of moral and legal ambiguity.   

There is currently nothing stopping Facebook from deciding to delete (or amplify!) messages that promulgate racist points of view, or from banning the people who posted those messages. After all, what is a platform's social responsibility to do, or not do, with respect to content that passes across its grid? 

Scarily, nobody knows — not Congress, not the justice system, not the administration. 

Congress suffers from paralysis. After all, regulation might limit members' own use/abuse of social advertising in their re-election campaigns. The administration, at the nadir of narcissism, is frustrated that searches for “Trump” yield negative articles. The justice system has its hands tied until some law is violated. So, basically, anyone can say anything to anyone until further notice. 

Democracy may need a little tune-up to handle all this. What might that look like?

We already know that editors have the right to include or exclude any fact or point of view. But we do not know of any obligation, human or algorithmic, regarding the presentation of facts, or for that matter, any limitations on computer programs that modify audience or content. 

The First Amendment is rather a soft safeguard. What other strictures contain or constrain those with an agenda and the infinite toolset of today’s media?

Fraud requires proof of deception for gain. Is getting elected a kind of gain? And where does accountability sit if someone, or some thing, fraudulently supports a point of view that causes a riot or swings an election? 

Congress should learn about, and legislate, the question of how ubiquitous information, distributed by algorithms one-to-one, intersects with free speech and social responsibility. 

Advertisers fear they might be funding terrorism. Instead, they might, via social networks, be funding the implosion of democracy. 

3 comments about "The Editorial Purview Of Algorithms".
  1. James Smith from J. R. Smith Group, October 4, 2018 at 9:48 a.m.

    Nice thought-piece Ted!

  2. Paula Lynn from Who Else Unlimited, October 4, 2018 at 8:17 p.m.

    How about declaring, or whatever works, Google and all online publishers responsible for what is published on their sites? How that is done proficiently and correctly is in their laps.

  3. Ted Mcconnell from Independent Consultant replied, October 4, 2018 at 11:45 p.m.

    That would be interesting. In MZ's congressional testimony he was very careful to say that FB is not a publisher. One might imagine this is fair because after all, it was a person who said something, not FB. However, the algorithm in effect published it to an audience. So the algo was acting like a publisher. If it quacks like a duck .... 

    It does not seem fair to collect $Billions based on content created by "consumers", and bear no responsibility for any outcome of anyone seeing it. It also does not seem fair that they should have all my data, and then gift it to criminals by virtue of sloppy security procedures. 

